| Input (string, length 251-41.6k) | Output (string, length 137-9.7k) | input_ids (list, length 157-2.05k) | attention_mask (list, length 157-2.05k) | labels (list, length 157-2.05k) |
---|---|---|---|---|
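The columns follow the usual layout of a tokenized summarization dataset: `Input` holds the raw review text, `Output` the target summary, and `input_ids`/`attention_mask`/`labels` the corresponding token-id lists. Below is a minimal sketch of how such a row could be inspected with the Hugging Face `datasets` library; the repository name `user/peer-review-summarization` is a hypothetical placeholder, not the actual dataset behind this preview.

```python
from datasets import load_dataset

# Hypothetical repository name; substitute the real dataset path.
ds = load_dataset("user/peer-review-summarization", split="train")

row = ds[0]
print(row["Input"][:200])           # raw review text (string)
print(row["Output"][:200])          # target summary (string)
print(len(row["input_ids"]))        # tokenized sequence, here up to ~2.05k ids
print(len(row["attention_mask"]))   # 1 for every real (non-padding) token
print(len(row["labels"]))           # token ids used as the training targets
```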
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper builds the connection between various steerable cnn structures based on group representation theory and filter transformations using the discrete rotation and reflection group as an example the paper establishes ways to construct steerable convolutional filters that transform features in trivial regular and irreducible representations in particular recent works on steerable cnns such as orn rotdcf tipooling and roteqnet can all be explained under such framework the paper is generally well written and well organized however i reckon the material would be quite dense for readers interested in equivariant cnns but not wellversed in group representation theory so a bit more explanation either in text or in the appendix would be helpful the main theoretical results in the paper on the considered discrete group transformation are interesting and can potentially lead to other development in the community the numerical experiments seem to be limited though this phenomenon seems to be the issue for most of the papers in this area pros 1 well written and technically correct 2 interesting theoretical results that might inspire other works in the field cons to be explained in more details in questions 1 some notation and abbreviation are used without proper explanation which might be confusing 2 limited experimental results questions and comment 1 some of the notations and abbreviations are used without proper explanation for example k in equation 12 ocr on page 1 and missing reference on r2conv i would encourage the authors to make a greater effort to clarify their ideas and results 2 the theory seems to be built only on discrete group does it generalize to continuous group transformation such as so2 before discretization 3 also does the theory generalize to noncompact groups such as scaling and shearing 4 the experiments seem to be limited even though this seems to be a common issue in papers in this field however i do recommend the authors to present the equivariant loss when using their proposed filtra considering that discretization and interpolation might cause a problem in their setting unlike other means of steerable cnns such as rotdcf and harmonic netdocsepthis paper studies the connection between steerable cnn and filter transformation the authors show theoretically that filter transformation can be used to realize steerable filters over different group representations the authors also empirically show that the filter transformation based steerable cnn performs on par with the implementation based on harmonic bases overall i think this paper is hard to follow the presentation and notation depends on that of prior works and is hard to understand without being familiar with steerable cnn compared with prior works this paper focuses more on the mathematical derivation but falls short of explaining their implication the result is therefore hard to understand for those that are not familiar with the theoretical basis also the authors do not clearly show why the contribution of this work is significant the theoretical connection doesnt seem to provide a significant advantage nor solve the problems in existing methods similarly the experiment doesnt provide much information and only shows that the proposed implementation works on simple datasets the authors should try to highlight how this work may benefit other research or applications and verify that the proposed method works on more complicated and realistic data another aspect that can be improved is to discuss how the theory 
generalizes when multiple convolution layers are applied in general the theoretical properties of steerable convolution does not automatically generalize to multiple convolution layers this is particularly important in real world data because the transformation may happen at different scales and may need to be accounted for at different layers in the network therefore the results derived from a single convolution operation may not be sufficient docsepsummary the paper outlines the construction of filters in a group equivariant convolutional network equivariant to the groups cn cyclic group of order n n rotations and dn dihedral group of order n n rotations and flipping the authors achieve this by taking linear combinations of harmonic basis filters as was shown in weiler et al 2018 they then proceed to show explicitly how one can build filters between particular representations of each group namely the trivial irreducible and regular representations for experiments they demonstrate that the filters perform en par with those of weiler and cesa 2019 who had performed a nearexhaustive comparison among representations of se2 r pros technically i believe the work to be sound i did not see any mistakes jump out at me i also think that it is laudable that the authors included a high amount of detail in their exposition which lays bare the exact mechanisms by which one would effectively go about building an equivariant layer experimentally it appears that the authors chose a sensible baseline and that they compared on dataset which are relevant to the topic and commonly used in the equivariance literature the authors also provide timing information on how fast each filter can be generated which is something i have not seen before in the equivariance literature and which i appreciate seeing cons and constructive feedback perhaps my largest criticism is that it is not obvious exactly what the contribution of the paper is meant to be perhaps this originates from my not knowing what the authors mean by filter transform and steerable cnn my understanding is that the authors believe it is currently unknown how to construct a steerable cnn for the cyclic and dihedral groups they provide explicit constructions in section 3 but to my knowledge these are already provided in the long appendix of weiler and cesa 2019 furthermore they can be found in cohen and welling 2016 cohen and welling 2017 and bekkers et al 2018 and most notably weiler et al 2018 who were to first to show how to construct equivariant filter layers some linear combinations of harmonic basis filters as a result i am unsure of how to gauge the novelty of this contribution in the experiments of table 2 classification and regression the authors compare their framework with an r2conv representative harmonic based convolution and a regular translationally equivariant network i would like to know what exactly is a representative harmonic based convolution is that meant to refer to the work of weiler and cesa 2019 in the results tables there are no error bars if possible i would have liked to have seen them since they are not there is it very hard to judge the efficacy of the results which differ from the baselines by very small amounts given the proximity of the work to weiler and cesa 2019 i would like to know exactly what differentiates the two works please include exactly what the functions roll flipped and circulant actually do mathematically i found these difficult to parse the mathematical level of the paper is pretty heavy for the 
uninitiated it may be advisable for the authors to include a glossary of terms if not short descriptions in an appendix if this is too much at least please point to other papers which have easily readable background sections for those not already wellread in group theory the authors may wish to have the submission proofread for spelling and grammar post rebuttal review having read through the rebuttal the updated submission and the reviews of the other reviewers i have upgraded my review from a reject to marginally below acceptance this is for two main reasons 1 the authors have vastly improved the presentation of the submission which now looks a lot easier to read 2 the authors have clarified for me at least what the main contribution of the work is that said i am not entirely sure what this contribution adds to the equivariance community hence why my recommendation still leans towards reject as far as i am aware solving the equivariance equations is not the large bottleneck to progress in our community they are linear equations and there is work back into the 80s solving them check out people like pietro perona patrick teo i think more importantly we need to focus on pushing the boundaries in areas such as extension to noneuclidean manifolds convolution over noncompact groups learning symmetries etc while this work is clearly mathematically sound and the authors have demonstrated deep knowledge of the area it feels a little like retracing prior works that said if the other reviewers disagree then i dont mind this paper being accepted perhaps since i have worked in this area what appears as obvious to me is not generally acknowledged and this paper may serve as a useful clarification for those wishing to dive into the literaturedocsepequivariant steerable cnns for 2d3d rotationreflectiontranslation groups have generally been implemented as a filter transformexpansion step followed by a standard convolution the filter expansion step involves taking a linear combination of steerable basis filters these basis filters are precomputed before network training by solving a linear system or by sampling the continuous analytical solution this can take a few minutes depending on the chosen group representation wrt which the network layer is equivariant a different filter basis will emerge but in general one can see that the basis filters come out as rotated and flipped copies of some basis filters with the occasional sign flip this has been stated in some earlier works the precise way in which a basis filter is to be rotated and flipped to obtain the steerable filter basis for the 2d case had not been worked out before to my knowledge and this is one of the contributions of this paper the analysis is done for each input output representation type chosen from trivial irreducible regular representations having worked this out the paper proposes to use this as a way of implementing the filter expansion step starting from a basis filter and rotatingflipping it to obtain an expanded filter bank the proposed method filtra does not require a precomputation step if i understand correctly which is a significant practical advantage experiments further show that the method is similar or faster at filter expansion finally the method is validated by training networks on benchmark tasks and shown to perform similarly to or better than the steerable cnn implementation of weiler cesa which is the best existing implementation the paper briefly mentions that filters cannot be rotated exactly on a discrete grid but i 
didnt figure out how the authors propose to deal with this issue how exactly are the filters rotated i think the method proposed in the paper is useful as it is both faster and better than existing steerable cnn implementations the paper itself is fairly well written and technically correct as far as i can tell but may be challenging to read for those who are not yet knowledgeable about steerable cnns those readers however are unlikely to be interested in learning about the implementation details of steerable cnns anyway so perhaps this is fine the reason i am not giving a higher rating is that i think that although this is a useful contribution to the literature on steerable cnns which are being used in an increasing number of applications the paper does not represent a major breakthrough and although the calculations are nontrivial does not contain highly unexpected or deep theoretical results typos cadestrian cartesian irreduciable irreducible equity equality edit having read the reviews rebuttal and updated paper i have decided to maintain my score of 6
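The reviews above repeatedly describe the "filter transform" view of steerable CNNs: a single base filter is expanded into rotated (and possibly flipped) copies, and a standard convolution is then applied to the expanded filter bank. The toy NumPy/SciPy sketch below illustrates that mechanism for the cyclic group C4; it is a simplified illustration of the general idea the reviewers refer to, not the filtra method or the Weiler & Cesa implementation discussed in these reviews.

```python
import numpy as np
from scipy.signal import convolve2d

def c4_filter_bank(base_filter):
    """Expand one base filter into its four 90-degree rotations (regular rep. of C4)."""
    return [np.rot90(base_filter, k) for k in range(4)]

# Toy base filter and input image with arbitrary values.
rng = np.random.default_rng(0)
base = rng.standard_normal((3, 3))
image = rng.standard_normal((9, 9))

# "Filter expansion" followed by a standard convolution per rotated copy.
responses = np.stack([convolve2d(image, f, mode="same") for f in c4_filter_bank(base)])

# Rotating the input rotates each response map and cyclically permutes the four
# channels (up to boundary effects), which is the equivariance property at stake.
rotated = np.stack([convolve2d(np.rot90(image), f, mode="same")
                    for f in c4_filter_bank(base)])
print(responses.shape, rotated.shape)  # (4, 9, 9) (4, 9, 9)
```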
### Summary:
|
this paper introduces an approach based on filter transform for designing networks equivariant to different transformation groups especially the authors rely on the harmonic analysis view of steerable cnns given in weiler cesa 2019 to design an equivariant filter bank by computing simple transforms over base filters the reviewers find the paper technically solid but difficult to read and with a limited contribution the ac carefully reads the paper and discussions although the connection between steerable cnns and filter transform is interesting the ac considers that the main contributions of the paper should be consolidated especially the positioning with respect to weiler cesa 2019 therefore the ac recommends rejection
|
[4278, 275, 3533, 337, 690, 14951, ..., 45230, 253, 913, 32636, 18235] (input_ids: 2,048 token ids) |
[1, 1, 1, ..., 1] (attention_mask: 2,048 ones) |
[4278, 275, 3533, 337, 690, 14951, ..., 45230, 253, 913, 32636, 18235] (labels: 2,048 token ids) |
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper proposes a conditional independence test in a linear gaussian setting but the variance of the error term may not be a constant but affected by a function of a subset of its parents or sampling index eg individualdependent variance or a timeseries like setting hence the authors designed heteroskedasticityaware weighted partial correlation where the weights come from inverse of variances which is estimated using a windowing approach however this test works with expert knowledge about heteroskedasticity whether it is caused by sampling index or heteroskedastic parent in a causal discovery setting not knowing the parents of variables it only assumes that the heteroskedasticy comes from sampling index strengths wellmotivated problem and a theoretically correct solution under some assumptions a simple solution to an important problem weaknesses as the authors acknowledged assumption 1 and 5 for pc algorithm requires expert knowledge it seems hard for me to find weaknesses of the paper considering that the authors did a right thing to solve a problem and acknowledged limitations clearly that are not due to the authors but comes from the nature requires prior knowledge on the reasons for heteroskedasticity docsepa conditional independence test is proposed based on partial correlation but adapted to the case of heteroskedasticity it is also studied how this test can be used in the pc algorithm all results a rigorously proved as well as illustrated using experiments this contribution is of high theoretical quality it is also of great practical relevance as homoskedasticity assumptions are often violated in real datasets there are currently some major limitations which are clearly mentioned in the paper mainly that the proposed partial correlation test only works if for each variable there is at most one source of heteroskedasticity and for theoretical consistency of the pc algorithm that this source is not another variable improvement in these areas is listed as future work the paper is clearly written i especially appreciate the illustrations of the problems in figure 1 the application to causal discovery is original though if i understand correctly the test itself is quite similar to existing work on heteroskedasticity yes docsepin this work authors consider heteroskedasticity in a structural causal model scm and develop an adapted weighted leastsquares wls partial correlation variant as a conditional independence ci test which is able to deal with heteroskedastic noise with expert knowledge about the heteroskedastic relationships are given combined with the pc algorithm the proposed ci test can also identify the causal relationships between variables under some assumptions regarding the heteroskedastic relationships both theoretical consistency analysis and simulation experiments demonstrate that the proposed ci test achieves better false positive rates and also improves upon the detection power as compared to the standard partial correlation ci test strengths 1 the authors propose an adapted wls partial correlation test for conditional independence testing in the presence of heteroskedastic noise 2 the authors provide a detailed analysis of the wls methods consistency with the proposed weight approximation method and further inject the proposed ci test into the pc algorithm to learn the causal structure 3 experiments on simulated datasets demonstrate that the proposed wls partial correlation ci test achieves better performance than the ols partial correlation ci test in the presence 
of heteroskedastic noise combined with the pc algorithm the proposed wls partial correlation ci test has a smaller false positive rate in identifying causal relationships weaknesses 1 the idea of using the inverse of errors variance is intuitive however it would be great if the authors could further discuss why wls can deal with heteroskedastic noise and some possible better reweighting strategies in detail 2 the authors formulate several assumptions in the paper however the heteroskedastic relationships require expert knowledge the reviewer wonders if the authors could discuss further how to relax the heteroskedastic relationships assumption authors have discussed the limitations and societal impacts in the conclusion section
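The reviews describe the paper's core construction: a weighted least-squares variant of the partial-correlation conditional-independence test, where each sample is weighted by the inverse of its (estimated) noise variance. The sketch below is a minimal, hypothetical illustration of that idea in NumPy/SciPy; the function name, the use of the true variances as weights, and the Fisher-z calibration are assumptions made for illustration (the paper instead estimates the variances with a windowing approach and pairs the test with the PC algorithm).

```python
import numpy as np
from scipy import stats

def weighted_partial_corr_test(x, y, z, weights):
    """WLS residuals of x and y given z, then a Fisher-z test on their correlation."""
    w = np.sqrt(weights)                                    # WLS = OLS on sqrt-weight-scaled data
    Z = np.column_stack([np.ones_like(x), z])
    Zw, xw, yw = Z * w[:, None], x * w, y * w
    rx = xw - Zw @ np.linalg.lstsq(Zw, xw, rcond=None)[0]   # weighted residuals of x given z
    ry = yw - Zw @ np.linalg.lstsq(Zw, yw, rcond=None)[0]   # weighted residuals of y given z
    r = np.corrcoef(rx, ry)[0, 1]
    cond_dim = Z.shape[1] - 1
    zstat = np.sqrt(len(x) - cond_dim - 3) * np.arctanh(r)  # Fisher z transform
    return r, 2 * stats.norm.sf(abs(zstat))

# Toy data where the noise scale grows with the sample index; the true variances
# are used as weights here, unlike the window-based estimate in the paper.
rng = np.random.default_rng(0)
n = 500
z = rng.standard_normal(n)
sigma = 0.5 + 2.0 * np.arange(n) / n
x = z + sigma * rng.standard_normal(n)
y = z + sigma * rng.standard_normal(n)
# x and y are conditionally independent given z, so the p-value should be large.
print(weighted_partial_corr_test(x, y, z, weights=1.0 / sigma**2))
```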
### Summary:
|
in real problems data often exhibit the heterogeneity property as a consequence conditional independence tests that rely on the data homoskedasticity assumption may perform suboptimally leading to inaccurate causal discovery results this paper adapts the partial correlation tests to account for heteroskedastic noise and provides some necessary theoretical guarantees and empirical results reviewers agree that the studied problem is well motivated and that the solution is sensible
|
[30003, 310, 1677, 2278, 273, 247, ..., 326, 253, 2900, 310, 24600] (input_ids: 865 token ids) |
[1, 1, 1, ..., 1] (attention_mask: 865 ones) |
[30003, 310, 1677, 2278, 273, 247, ..., 326, 253, 2900, 310, 24600] (labels: 865 token ids) |
Below is given review of a research paper from cnoference journal. Please write a summary the review.
### Review:
summary this paper studies an interesting inversegraphics problem it proposed a novel method to learn 3d shape reconstruction using pretrained 2d image generative adversarial networks given an image containing one single object of interest it first predicts the graphics code eg viewpoint lighting depth and albedo by minimizing the reconstruction error using a differentiable renderer the next step is to render many pseudo samples by randomization in the viewpoint and lighting space while keeping the predicted depth and albedo fixed a pretrained 2d image gan is further used to project the pseudo samples to the learned data manifold through ganinversion finally these projected samples are added to the set for the next round optimization experimental evaluations have been conducted on several categories including face car building and horse comments overall this is a very interesting paper with good presentations promising experimental results and solid quantitative comparisons with the previous work reviewer would like to point out the potential weakness of the paper as follows w1 though impressed by the results especially the proposed method works for horse and building reviewer suspects the paper only works in a very simplified setting 1 the gan was previously trained on a large amount of 2d images of a single category with many variations in identity viewpoint and lighting 2 the initialization or step 1 in section 31 step seems very critical to the overall performance and 3 viewpoint and lightning randomization seems have to be handtuned reviewer would like to see the discussions on the underlying assumptions more explicitly in addition reviewer would like to know how does the method generalize to dirty data people with sunglasses people with noticeable earrings people partially occluded by wavy long hair and people with a side view looks like the input has to be a frontal face image same question applies to those nonconvex shapes a convertible car or a car with the window open reviewer suspects the method in the current form cannot handle them well w2 some important experimental settings are neither presented nor clarified for example it is not clear what is the difference between ours 3d and ours gan which should be clarified for image editing see figure 6 and figure 8 reviewer sees a noticeable change in background color sometimes eg second row in figure 6 it would be good to give a very detailed explanation of the image editing process eg whats the input and output format in each stage as ellipsoid was used to initialize the face shape reviewer would like to know what was the initialization for other categories such as building and car see figure 10 w3 it would be good to report the time spent on the computation and optimization and how it is compared to the baselines in table 1 and table 2 this is very important metric to report as a fair comparison to the previous work postrebuttal comments i am raising my score from 7 to 8 as author responses addressed my comments well especially answer to w1 and the figure 13 than expecteddocseppros 1 this is the first work that attempts to reconstruct 3d shape from 2d image in an unsupervised way using gans the idea is neat use networks to predict four 3d parameters and use gan to generate synthesize the images corresponding to a set of parameters then these synthesized images can be used as pseudo ground truth to train the 3d parameter network 2 the experiments are comprehensive to support the effectiveness of the proposed pipeline 2 tasks are 
evaluated 3d shape reconstruction and objectaware image manipulation on 3d shape reconstruction performances are reported on two datasets and demonstrated it outperforms sota method by a large margin on image manipulation the visualization results look reasonable and visually better than previous method cons questions 1 the authors claim in the introduction section that this proposed method has advantage over previous method as it doesnt assume symmetry of the instance but in this proposed method a symmetrical ellipsoid is used as the shape prior is this a stronger implicit assumption than the symmetry assumption is there any experiment to explore how the shape prior affects the models training results eg what if a nonconvex shape prior is provided what if asymmetrical prior is provided 2 in the introduction section the authors use building as an example to show that previous methods symmetry assumption cannot work but in the experiments and comparison with previous works all data used are symmetrical human face animal face cars etc visualizations on buildings are shown in appendix but there is no quantitative analysis or comparing with current sota method 3 how is the generator initialized as the generator is always fixed during training i assume the generator is using some pretrained network if so can we still call this network fully unsuperviseddocsepthis paper proposes an iterative method that jointly estimates viewpoints light directions depth and albedo from single images by projecting intermediate renderings to the nautral image manifold intuitively the method works by generating with pretrained gans multiple views of the same object under different lightings and then inferring 3d shapes from those variants the key idea is to use pretrained 2d gans to make such data generation photorealistic the authors also demonstrate 3d edits such as 3d rotation and relighting that one can perform after running their model i like this paper because 1 it presents the novel idea of generating by gan inversion photorealistic multiview multilight data of the given real object from which the 3d shape can then be estimated 2 extensive evaluations were performed to demonstrate the high quality achieved and 3 one can perform 3d edits such as 3d rotation and relighting on top of the model outputs having explicit 3d understanding for relighting makes a lot of sense to me and this paper presents a new angle of doing so by gan inversion in terms of drawbacks this paper would benefit from the following experiments or clarifications 1 how crucial is the size of the dataset for gan pretraining for example if the dataset is small not covering many of the face poses i imagine the gan projections may not always look realistic thereby causing the shape estimation to degrade such failure cases or studies should be shown so that the reader understands what impact the dataset bias or size has on the final results an example would be a plot of shape reconstruction error wrt the dataset size or face pose coverage 2 how robust is the algorithm to the shape initialization ellipsoid shape more importantly what about its location what if the offtheshelf scene parsing model fails so that the ellipsoid is placed off the main object if the model fails because of bad initializations what do the failure modes look like are the shapes completely garbage or something that looks like the initialization 3 why is the viewing direction parameterized in r6 i presume the start and end xyzs shouldnt it be in s2 just like the light 
direction if they are in r6 then results on zooming inout should be included otherwise there seems to be no point in defining them in r6
### Summary:
|
the paper proposes to use pretrained 2d ie image gans as a mechanism for recovering 3d shape from a single 2d image the work demonstrates impressive results on not only human and cat faces but also cars and buildings the method is demonstrated with qualitative results and quantitative results on multiple datasets and tasks the reviewers were persuaded by the novelty and neatness of the idea and the ac is in agreement as well as the results at submission time there were some concerns with experimental details for instance there was a question of how carefully the settings have to be tuned always a concern with unsupervised methods as well as an overarching concern about the initialization and whether the method will work on less clean data the reviewers and the ac seem to think that these have been sorted out in discussion all three reviewers were in favor of acceptance and the area chair is inclined to agree with the reviewers in particular the ac finds the work interesting and compelling while there is an updated version already uploaded during the discussion the ac encourages the reviewers to double check all the questions from the reviewers and include the answers from the discussion into the camera ready even these results are in the appendix
|
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, ..., 1543, 403, 275, 253, 30762 ] |
[ 1, 1, 1, ..., 1 ] |
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, ..., 1543, 403, 275, 253, 30762 ] |
Below is given review of a research paper from cnoference journal. Please write a summary the review.
### Review:
this paper proposes a neural network the outputs of which create a bayesian network of sigmoids this is for use in massively multilabel situations where the class outputs are connected to some ontology by using the ontology the performance on longtail classes with few examples should be improved i like the method it is intuitive and easy to implement the only issue i have with the features and model is when in 312 the weights for the medical labels as input to the encoder are said to be tied to the output label embeddings i would like to have seem more justification for this perhaps just the keeping number of parameters down or evaluation as to whether it helped i think it may have hindered because of how the output label is used in the dot product the stated difference between this work and previous hierarchical softmax models is that they use dag structures not just tree structures however of the two datasets they try only one proteins has a dag structure to the ontology and they do not have a baseline comparison with a hierarchical softmax model the method does show improvements at low number of examples against a flat sigmoid model in the small disease prediction and protein function prediction but other results are mixed especially for the proteins with the dag ontology which is where one would have liked to see an advantage i would like to have seen performance against a hierarchical soft max framework or on some openly available or benchmark datasets otherwise it is hard to judge the utility of the method docsepthe authors propose a new training scheme for training neural networks for multilabel prediction tasks by introducing ontology relationships between labels the paper motivates very well by the observations that some labels include very small amount of data points however the authors dont really investigate why such labels are rarely observed and the experiments dont include any significance thus overall i dont think the paper is ready for publishing for iclr yet below are some more detailed comments 1 the authors discuss nicely about the intuition to introduce the bayesian networks in the tasks of disease prediction essentially the probability of assigning the label leaf node should be account for the probability of it being observed namely the prior thus it is not surprising that for the rare labels the proposed method would yield higher precision however the experiments dont really include any significance measurement especially for such tasks where the number of testing examples with rare labels is small 510 positive examples significance measurement or some forms of hypothesis testing is a musthave in order to draw conclusion about the performance comparison answering such significance issue with tests for overfitting would be nice 2 my other major concern is for the protein function prediction task the reason of why for certain labels the number of instances is small could be due to that a there dont exist much biological weblab evidence or b among the population there indeed only exist small number of proteins associated with such labels the proposed method can address b but not necessarily address a 3 the paper discusses the other results very briefly in section 5 but doesnt include any experiment comparison thus it is not convincing that the proposed method is making contribution to the field of disease prediction protein function prediction or even general multilabel prediction i would suggest to include the comparison with the state of the art methods for each 
application 4 particularly for protein function prediction another line of studies is to use protein protein interaction networks or other sources such as functional pathways rather than using sequence information alone ref below some discussion would be nice schwikowski benno peter uetz and stanley fields a network of proteinprotein interactions in yeast nature biotechnology 1812 2000 1257 cao mengfei et al new directions for diffusionbased network prediction of protein function incorporating pathways with confidence bioinformatics 3012 2014 i219i227 docsepthis is a clear and well written paper that attempts to improve our ability to predict in the setting of massive multilabel data which as the authors highlight is an increasingly import problem in biology and healthcare strengths the idea of using the hierarchical structure of the labels is innovative and wellmotivated the experimental design and description of the methods is excellent weaknesses overall the results are not consistently strong and there is a key baseline missing the approach only seems help in the rare label small data regime which limits the applicability of the method but is still worthy of consideration my biggest reservation is that the authors did not include a baseline where the classes are reweighted according to their frequency multilabel binary crossentropy is very easy to modify to incorporate class weights eg upweight the minority class for each label and without this baseline i am unable to discern how well the method works relative to this simple baseline one more dataset would also strengthen the results and since i am suggesting more work i will also try to be helpful and be specific predicting mesh terms from abstracts would qualify as a massive multilabel task and there is plenty of public data available here httpswwwnlmnihgovdatabasesdownloadpubmedmedlinehtml finally there is one relevant paper that the authors may wish to consider in their review section httpswwwbiorxivorgcontentearly20180710365965
### Summary:
|
the paper proposes a nice approach to massively multilabel problems with rare labels which may only have a limited number of positive examples the approach uses bayes nets to exploit the relationships among the labels in the output layer of a neural nets the paper is clearly written and the approach seems promising however the reviewers would like to see even more convincing empirical results
|
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, ..., 625, 21414, 16774, 1543, 209 ] |
[ 1, 1, 1, ..., 1 ] |
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, ..., 625, 21414, 16774, 1543, 209 ] |
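The token-level columns in each row appear to mirror the text columns: `input_ids` looks like a tokenization of the concatenated Input and Output text, `attention_mask` is all ones (the rows are unpadded), and `labels` is a copy of `input_ids`, which is the usual causal-language-model setup. A minimal sketch of how such a row could be rebuilt, assuming the HuggingFace `transformers` tokenizer API; the checkpoint name below is a placeholder, since the dump does not say which tokenizer produced these IDs:

```python
from transformers import AutoTokenizer

# Placeholder checkpoint: the dump does not identify the tokenizer that
# actually produced the input_ids shown above.
tokenizer = AutoTokenizer.from_pretrained("gpt2")

def build_row(input_text: str, output_text: str) -> dict:
    """Rebuild one row: the two text columns plus the three token-level columns."""
    ids = tokenizer(input_text + output_text)["input_ids"]
    return {
        "Input": input_text,
        "Output": output_text,
        "input_ids": ids,
        "attention_mask": [1] * len(ids),  # all ones because the rows are unpadded
        "labels": list(ids),               # causal-LM target: a copy of input_ids
    }
```

Under this reading, no separate shifted target column is needed: a causal-LM trainer shifts `labels` by one position internally, so storing a plain copy of `input_ids` is sufficient.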
Below is given review of a research paper from cnoference journal. Please write a summary the review.
### Review:
the authors combined existing ophthalmology datasets and introduced additional biomarkers labels the authors identified ml tasks that are relevant to patient care and benchmarked classification performance on image data biomarkers and multimodal inputs the authors identified good ml tasks relevant to ophthalmology patient care and how they can be incorporated into patient care the authors showed good summaries of the labels biomarkers created for the data it would be nice if the authors can elaborate more on the clinical significance of the biomarkers why were these markers chosen and what are the value ranges of the biomarkers and their implications it is unclear about the graders qualification and how the biomarkers are acquired is there only one grader for each scan or are there multiple graders for each scan there are standard deviations for the balanced accuracy why arent they included for other metrics it is unclear if there are any domain shift between the data collected from two different studies they study different conditions which are also the labels of the classification task any domain shift between the datasets would compromise the classification result it is also unclear why these two datasets are selected and what are the clinical impact of such a combined dataset table 1 is confusing it is unclear if there is any relationships between the merged rows in image modalities and label modalities time series data in the dataset correspond to visits which have an average of 16 data points and may have different frequencies and intervals it us unclear how these data contribute to the classification of patients and there are no benchmarks included in the paper based on the time series data docsepauthors presented a longitudinal multimodal dataset comprising of 2d fundus image 3d oct scans clinical labels and biomarkers collected from patients undergoing treatment for diabetic retinopathy or diabetic macular edema authors have also presented baseline results for tasks such as drdme detection disease classification biomarker detection and clinical outcome prediction longitudinal data covering multiple modalities facilities research in several different directions disease detection and classification treatment progression as well as understanding the relationship between multiple modalities authors explicitly state this data is collected from one geographical location and could be biased sample size in terms of the number of patients docsepthis paper presents olives an oct and nearir fundus dataset that includes clinical labels biomarker labels disease labels and timeseries patient treatment information from associated clinical trials the dataset contains the information of 96 eyes averaged over a period of at least two years with each eye treated for an average of 66 weeks and 7 injections benchmark experiments benchmark models and baseline results are presented this dataset contains 96 eyes and an average of 16 visits per patient and 1268 fundas eye images figure 1 clearly illustrates the clinical practice described in section 1 this dataset is collected over a long period of time the long spanning of time series data allows future researchers to perform experiments on predictive models good level of details on data collection and hyperparameters used the authors have discussed related work in different aspects all mentioned research was properly referenced compared with existing datasets olives contains a comprehensive set of modalities and is large enough in volume to be leveraged 
by ml algorithms according to the paper this is currently the largest and most diverse dataset of its kind the entire paper is hard to follow for reviewers who are not experts in biology because of the extensive use of abbreviations of biological terminologies i understand that this paper is targeted toward an audience in the domain of biologymedicine still to facilitate interdisciplinary research it would be great if the authors could include in their appendix the corresponding full names of the abbreviations used in the paper i would suggest the authors reorganize section 4.1 table 2 presents experiments with increasing balanced accuracy but section 4.1 presents different tasks in different orders which makes it hard for readers to follow it would be great if the authors could indicate which ml model is used for which task in the tables table 3 is unclear at first glance it would be clearer if the authors could discuss the first three models in detail in the corresponding section also it would be better to mention that table six is in the appendix at first glance i thought the authors forgot to present table six in the paper also in the last section of section 4 there is no figure c3 in appendix c3 the overall paper needs more careful review in the discussion section the authors should consider elaborating more upon the ethical implications of this study docsepthe paper provides medical data from different modalities with potentially positive impact on medical research and treatments the authors train different ml models to analyze the ability of the presented data to detect the relevant diseases drdme as well as predicting the effects of the successive treatment and the final ocular state they explain the technical details of their experiments and their outcomes the paper and the presented dataset seem well grounded and valuable to me but it is hard for me as someone without any medical background to evaluate the medical analysis and justifications made in this paper it is unclear to me why they trained the vision models used to test the abilities of the dataset with a resnet 18 backbone which is pretty small and old compared to 2022 ml standards docsepthe authors provided an ophthalmic dataset with oct and nearir fundus images including clinical labels biomarker labels disease labels and timeseries patient treatment information from associated clinical trials the authors introduced the olives dataset to bridge the gap between existing ophthalmic datasets and the clinical diagnosis and treatment process the paper is well written and correctly addresses the problem statement the paper has introduced a dataset with three modalities and shown its scope in the field of ml 1 the size of the dataset introduced is small 2 the data is collected from two trials prime and trex the authors have not mentioned the differences which may or may not affect the model evaluation with collected samples 3 the significance of the clinical features such as bcva etc should have been better explained to draw the comparison across the modalities 4 clinical labels and biomarkers are associated with each eye how is the relation
across the two modalities developed for the datasets per my understanding there should be a patient id with left and right eyes and corresponding clinical labels and biomarkers associated with each sample eye 5 do the mentioned three modalities correspond to the same patient this would mean there are three samples across three modalities for each patient 6 results in terms of sensitivity and specificity are missing which are important for evaluating the ml model for disease diagnosis a brief illustrative sketch of these metrics follows this review 7 in table 3 many inputs used to train the model have shown a random accuracy for binary classification this proves the insignificance of these features and contradicts the authors claims similar results are found in table 7 in the supplementary 8 data collected from a single centre might introduce data bias
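point 6 above asks for sensitivity and specificity in addition to balanced accuracy, and an earlier reviewer asks why standard deviations are reported only for balanced accuracy. the sketch below shows how these three quantities relate for a binary dr/dme-style detection task and how per-seed means and standard deviations could be reported for all of them. it is illustrative only, not the authors' evaluation code, and the labels, predictions, and "seeds" in it are made up.

```python
# illustrative sketch only: not the authors' evaluation code; the labels and
# predictions below are made up, and the three "seeds" stand in for repeated
# training runs over which a mean and standard deviation could be reported.
import numpy as np

def binary_metrics(y_true, y_pred):
    """sensitivity, specificity and balanced accuracy for a 0/1 detection task"""
    y_true = np.asarray(y_true, dtype=bool)
    y_pred = np.asarray(y_pred, dtype=bool)
    tp = np.sum(y_pred & y_true)     # diseased, predicted diseased
    tn = np.sum(~y_pred & ~y_true)   # healthy, predicted healthy
    fp = np.sum(y_pred & ~y_true)
    fn = np.sum(~y_pred & y_true)
    sensitivity = tp / (tp + fn)               # recall on the diseased class
    specificity = tn / (tn + fp)               # recall on the healthy class
    balanced_accuracy = 0.5 * (sensitivity + specificity)
    return sensitivity, specificity, balanced_accuracy

# hypothetical per-seed predictions for a binary detection task
per_seed = [
    ([1, 1, 0, 0, 1, 0], [1, 0, 0, 0, 1, 1]),
    ([1, 1, 0, 0, 1, 0], [1, 1, 0, 1, 1, 0]),
    ([1, 1, 0, 0, 1, 0], [0, 1, 0, 0, 1, 0]),
]
runs = np.array([binary_metrics(t, p) for t, p in per_seed])
print("mean (sens, spec, bal-acc):", runs.mean(axis=0))
print("std  (sens, spec, bal-acc):", runs.std(axis=0))
```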
### Summary:
the reviewers struggled to find a consensus for this paper concerns about applicability of the dataset due to domain shift issues with the data collection size of the dataset and clarity of the paper were raised at the same time i believe that despite its size longitudinal data for diagnostics is extremely valuable to the community and the authors have made efforts to improve the readability therefore i recommend accept
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper proposes a novel crosslingual multitasking framework based on a dualencoder model that can learn crosslingual sentence representations which are useful in monolingual tasks and crosslingual tasks for both languages involved in the training as observed on the experiments for three language pairs the main idea of the approach is to model all tasks as inputresponse ranking tasks and introduce crosslingual representation tying through the translation ranking task introduced by guo et al 2018 all components of the framework are quite standard and dejavu but i like the paper in general and the results seem quite encouraging i have several comments on how to further strengthen the paper and improve the presentation of the main findings the proposed framework does not offer any substantial modeling contribution ie all major components are based on sota models but the framework is still quite interesting as a mixture of these sota components i believe that some additional experiments would make the main contributions clearer and would also provide additional insights into the main properties of the proposed framework 1 crosslinguality and 2 multitasking most of all i am surprised not to see any ablation studies for instance what happens if we remove one of the two monolingual tasks in each language how does that reduced model compare to the full model which monolingual task is more beneficial for the final performance in downstream tasks can we think of adding another monolingual task to boost performance further i think that this sort of experiment would be more beneficial for the paper than a pretty long analysis from section 5 this analysis is still valid but should be shortened substantially evaluating only multitasking without any crosslingual training would also be very beneficial to recognise the extent of improvement achieved by adding crosslinguality to the model how much does the proposed architecture depend on the choice of the encoding model for the function g have the authors experimented with other recent and nearsota encoding models i would like to see a comparative analysis of this hyperparameter i would like to see more experiments on more distant language pairs this would make the paper even more interesting imho i am also curious whether there would be a drop in performance reported conditioned on the distanceproximity between two languages in a language pair i would like to see a more detailed description of the two best performing sts systems ecnu and bit in what respect are these systems stateoftheart feature engineered and mixed i am not sure what this means without providing any additional context to the claim and description how does the monolingual english sts model trained with the crosslingual multitask framework compare to the work of conneau et al emnlp 2017 which also used snli as the task on which to learn universal sentence representations this would be a good experiment imho as it would show how much we gain from crosslingual training and multitasking minor page 3 could you add a short footnote discussing how hardnegatives for the translation ranking task are selected how do you compute similarity here do you expect performance to improve further by training multinli instead of snli or combining the two datasets all hyperparameters are tuned based on preliminary experiments on a development set what is used as the development set more details needed finally as an additional training heuristic we multiply the gradients to the word and character embeddings 
by a factor of 100 how is the value for the embedding gradient multiplier determined is there an automatic procedure to finetune this hyperparameter or has this been done in a completely empirical way table 1 please define the task abbreviations before showing them in the table it is not clear what each task is by relying only on the abbreviation this dataset was not available at the time of the submission but for the revision it would make sense to also evaluate on the new xnli dataset of conneau et al emnlp 2018 for multilingual nli experiments after the first revision i have raised the score after the very detailed author response thanks for that but this is also conditioned on the authors making the actual revisions promised in their response i am still quite interested to check how well the method works in a setup with distant language pairsdocsepsummary in this paper authors explore learning of crosslingual sentence representations with their proposed dualencoder model evaluation conducted with learned crosslingual representations on several tasks such as monolingual crosslingual and zeroshotfewshot learning shows the effectiveness of the proposed approach also they provide a graphbased analysis of the learned representations three positive and negative points of the paper are presented as follows pros 1 crosslingual representation learning by combining ideas from learning sentence representations and crosslanguage retrieval 2 multitask setup of different tasks for improving crosslanguage and monolingual tasks 3 lot of experimental results cons 1 the claim that it works for monolingual tasks in the target language such as zeroshot learning for sentiment classification and nli and also for crosslingual sts and the eigensimilarity metric is hard to retrieve from the paper 2 many terms and datasets are used without being referenced 3 usage of existing approaches to build a single model for many tasks comments to authors 1 the dualencoder architecture is inspired by guo et al 2018 which uses encoding of source and target sentence with a deep neural network however it is here extended into a multitask dualencoder model 2 what are the tasks that are very specific to the source language 3 equation 1 is basically a logistic regression or softmax over phi however phi is a dot product of encodings similar to deep averaging networks iyyer et al 2015 an illustrative sketch of this reading is given after the reviews below 4 in section 2 it is unclear what symmetric tasks means do they use parallel corpora 5 in section 2.1 it is mentioned that word embeddings are learned endtoend does this mean they are not initialized with pretrained ones 6 in section 2.1 it is mentioned that word and character embeddings are learned in a computationally efficient way what does this represent do they use fewer parameters are they parallelizable 7 why only three layers of transformer it is understood that 6 to 12 layers are required for effective encoding of sentences alrfou et al 2018 8 in the model configuration how is convergence decided any stopping criterion 9 what are the splits for the reddit and wikipedia datasets 10 in table 1 what does mrcr etc refer to they are never mentioned before do all tasks use only english overall it is an interesting paper which explores a multitask model for simultaneously improving both monolingual and crosslingual tasks however due to missing information and lacking clarity in some details it is hard to accept at this point of time minor issues 1 sentences are very long and not easily comprehensible 2 target language and response are used without referencing each other better to use one of them for better
tracking 3 no common notation for the model it has been referenced with different names crosslingual multitask model multitask dualencoder modeldocsepthe paper presents an intuitive architecture for learning crosslingual sentence representations i see weaknesses and strengths i the approach is not very novel using parallel data and similarity training siamese adversarial etc to facilitate transfer has been done before see 0 and references therein sharing encoder parameters across very different tasks is also pretty standard by now going back to 1 or so ii the evaluation is strong with a nice combination of standard benchmark evaluation downstream evaluation and analysis iii while the paper is on crosslingual transfer the authors only experiment with a small set of highresource languages where transfer is relatively easy iv i think the datasets used for evaluation are somewhat suboptimal eg a crosslingual retrieval and multilingual sts are very similar tasks other tasks using sentence representations and for which multilingual corpora are available include discourse parsing support identification for qa extractive summarization stance detection etc b instead of relying on agic and schluter 2017 why dont the authors use the xnli corpus 2 c translating the english sts data using google nmt to evaluate an architecture that looks a lot like google nmt sounds suspicious v while i found the experiment with eigensimilarity a nice contribution there are a lot of alternatives seeing whether there is a linear transformation from one language to another using procrustes for example seeing whether the sentence graphs can be aligned using gans based only on jsd divergence looking at the geometry of these representations etc did you think about doing the same analysis on the representations learned without the translation task but using target language training data for the tasks instead the question would be whether there exists a linear transformation from the sentence graph learned for english while doing nli to the sentence graph learned for german while doing nli minor comments table 3 on page 5 should be table 2 table 2 seems unnecessary since the results are not interesting on their own but simply a premise in the motivating argument i would present these results intext 0 http://aclweb.org/anthology/W18-3023
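comment 3 in the second review above reads equation 1 as a softmax (logistic regression) over dot-product scores phi between the two encoders' outputs, and the first review asks how hard negatives for the translation ranking task are selected. the following is a minimal hedged sketch of that reading, using the other targets in the batch as negatives; the function name and the synthetic encodings are illustrative assumptions, not the paper's implementation.

```python
# illustrative reading of the ranking loss comment 3 describes: phi(x, y) is a
# dot product between the two encoders' outputs and the loss is a softmax over
# candidate responses, with the other targets in the batch acting as negatives.
# this is an assumption about the setup as described in the review, not the
# paper's implementation; the encodings below are synthetic.
import numpy as np

def in_batch_ranking_loss(src_enc, tgt_enc):
    """mean negative log-likelihood of the correct (i, i) pairing in a batch"""
    scores = src_enc @ tgt_enc.T                          # phi(x_i, y_j) for all pairs
    scores = scores - scores.max(axis=1, keepdims=True)   # numerical stability
    log_z = np.log(np.exp(scores).sum(axis=1, keepdims=True))
    log_probs = scores - log_z
    return -np.mean(np.diag(log_probs))

rng = np.random.default_rng(0)
src = rng.normal(size=(4, 8))               # 4 source-sentence encodings, dim 8
tgt = src + 0.1 * rng.normal(size=(4, 8))   # their (noisy) aligned responses
print(in_batch_ranking_loss(src, tgt))

# one common way to get the "hard negatives" the first review asks about would
# be to keep, for each source, the highest-scoring non-matching targets in the
# batch as extra negatives; the x100 embedding-gradient multiplier mentioned
# above is an orthogonal training heuristic and is not modelled here.
```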
### Summary:
pros a new framework for learning sentence representations solid experiments and analyses an enzh xnli dataset was added addressing the comment that no distant languages were considered and ablation tests were also added cons the considered components are not novel and their combination is straightforward the set of downstream tasks is not very diverse see r2 only high resource languages are considered it would be interesting to see the method applied to real low resource languages all reviewers agree that there is no modeling contribution overall it is a solid paper but i do not believe that the contribution is sufficient
281,
897,
581,
273,
731,
323,
1805,
12544,
495,
642,
1846,
14951,
323,
253,
1566,
352,
310,
644,
23378,
342,
1027,
4454,
2831,
1981,
780,
1554,
262,
1945,
1566,
1554,
262,
1945,
8746,
36465,
1566,
7152,
339,
431,
248,
2929,
10262,
271,
27350,
10336,
323,
4715,
2831,
1981,
780,
6197,
14237,
891,
923,
32213,
285,
20544,
50275,
74,
253,
2746,
310,
417,
1077,
4460,
970,
7529,
941,
285,
14259,
3733,
4927,
1443,
70,
48960,
3966,
281,
12454,
3700,
556,
644,
2218,
1078,
923,
470,
285,
10414,
15308,
9628,
32049,
3602,
2439,
1077,
1027,
8892,
310,
671,
3965,
2629,
407,
1024,
1469,
896,
281,
337,
390,
594,
50276,
2886,
253,
7103,
310,
2266,
342,
247,
5322,
5019,
273,
2629,
22791,
7103,
15450,
7103,
285,
1783,
50276,
12211,
1223,
253,
2929,
310,
327,
2831,
1981,
780,
3700,
253,
4477,
760,
3368,
342,
247,
1355,
873,
273,
1029,
15024,
11515,
835,
3700,
310,
4942,
3477,
50276,
400,
891,
1158,
253,
15302,
908,
323,
7103,
403,
8489,
749,
29776,
24088,
50276,
66,
2831,
1981,
780,
25064,
285,
1554,
39661,
331,
84,
403,
1077,
2074,
8892,
643,
8892,
970,
6197,
14237,
285,
323,
534,
1554,
39661,
5944,
66,
403,
2130,
2486,
25200,
29072,
1329,
8137,
323,
2805,
66,
4908,
422,
10405,
1320,
22567,
5481,
3966,
50276,
67,
3185,
273,
22128,
327,
639,
280,
285,
5807,
77,
18696,
4240,
2139,
13414,
253,
4477,
897,
253,
1269,
79,
965,
20689,
374,
260,
42477,
253,
48087,
331,
84,
941,
970,
17899,
295,
6917,
281,
7472,
271,
10336,
326,
4453,
247,
2257,
751,
17899,
295,
6917,
7835,
247,
20634,
50276,
87,
1223,
891,
1119,
253,
3368,
342,
299,
17731,
303,
1858,
414,
247,
5322,
7680,
627,
310,
247,
2257,
273,
18075,
6523,
1880,
627,
310,
247,
4872,
9261,
432,
581,
3448,
281,
1529,
970,
354,
7083,
461,
265,
323,
1650,
6523,
1880,
253,
6197,
14580,
476,
320,
15616,
970,
305,
507,
1754,
760,
327,
480,
8289,
23279,
2819,
387,
253,
12087,
273,
841,
14237,
3966,
858,
368,
1158,
670,
2509,
253,
1072,
1783,
327,
253,
14237,
6311,
1293,
253,
10234,
4836,
533,
970,
2303,
3448,
3733,
941,
323,
253,
8892,
3185,
253,
1953,
651,
320,
1880,
627,
4961,
247,
4872,
9261,
432,
253,
6197,
4216,
6311,
323,
48087,
1223,
2509,
295,
965,
281,
253,
6197,
4216,
6311,
323,
305,
8592,
1223,
2509,
295,
965,
50275,
37585,
5701,
50275,
2420,
495,
327,
3239,
608,
943,
320,
2829,
374,
50275,
2420,
374,
3133,
15279,
1580,
253,
1543,
403,
417,
4722,
327,
616,
1211,
533,
3365,
247,
26536,
275,
253,
15265,
839,
4154,
891,
651,
1246,
841,
1543,
1101,
633,
50275,
17,
3944,
29404,
7585,
2061,
14718,
1497,
88,
1093,
1229,
1508,
187,
187,
4118,
18435,
27,
856,
84,
50275,
66,
747,
7792,
323,
4715,
6197,
14237,
50276,
23706,
4679,
285,
6260,
50276,
12586,
73,
50276,
89,
79,
965,
10895,
369,
2879,
15974,
253,
4385,
326,
642,
13392,
11515,
497,
2783,
671,
28913,
5216,
50276,
5040,
50274,
783,
2783,
4295,
403,
417,
4460,
285,
616,
5019,
310,
15246,
50275,
783,
873,
273,
15450,
8892,
310,
417,
1077,
11117,
923,
391,
19,
50275,
7483,
1029,
7741,
11515,
403,
2783,
4722,
281,
923,
352,
3732,
281,
1524,
1698,
7741,
11515,
50276,
455,
30628,
5194,
326,
627,
310,
642,
14053,
7680,
50276,
1189,
455,
352,
310,
247,
4891,
2929,
533,
891,
513,
417,
2868,
326,
253,
7680,
310,
4209,
209
] |
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
an approach to pruning and growing probabilistic circuits is proposed which is empirically shown to be useful for achieving higher effective utilization of the weights of the circuit and thus better likelihoods on various datasets originality the approach to pruning ends up being quite simple evaluating the probability of sampling along any given branch of the circuits tree and using that as an upper bound for the influence that portion of the tree can have on the overall joint density then pruning out lowinfluence branches i have not seem similar work in pcs but would not be surprised to be able to find something related if looking at work in trees graphs etc in fact it would be nice if we could point to some work on decision trees or graphnetwork flow as algorithmically related work in this space relative to other approaches to learning pc structure the prune and grow approach seems possibly simpler and the empirical results suggest it is more robust quality the work is well done and experiments seem typical for the pcs density estimators space mnist family ptb the overall construction of the paper is sound approach is well explained the paper needs proofreading multiple misspellings grammar issues lines 30 31 38 59 87 105 198 240 279 rather than term on lines 171 172 181 185 perhaps expression would be a better word line 194 isnt it impossible to get a more expressive pc by pruning not just difficult line 271 i would use k portion or k fraction instead of k because k makes it ambiguous whether k of 1 means 1 fraction or 001 fraction clarity presentation is clear and the experiments demonstrate the value incremental to multiple competitive baselines presumably these results would also map to more effective lossless pcbased compression which might have been a nice story to tag onto the experiments significance this work is important in the space of density estimation probabilistic circuits which historically has been of interest to a broad swath of conference participants achieves new sotas in this class for bpd the technique is also relevant in the wider space of model pruning which is also of broad interest line 229 addresses a limitation sparse pc is not vectorizationfriendly and requires custom gpu kernels these are provided in oss no societal impact noted by authors or reviewer docsepthis work suggests a method for exploiting sparsity in probabilistic circuits pcs the authors motivate the paper by showing pc parameters being close to zero in the majority of pcs sum edges thus these models could be reduced with a potentially negligible loss in representation loglikelihood the paper suggests two methods for compressing a pc while minimizing its representation loss pruning and growing pruning identifies and removes parts of the pc that can be shown as nonimportant under the pcrepresented distribution similarly growing mitigates representation loss by adding new edges with noisy parameters which can be optimized to achieve better performance applying pruning and growing can significantly reduce the models size while not losing much performance some of this works strengths are its comprehensive solution and clear practical relevance regarding the comprehensive solution this paper puts forth the problem of computation sparsity in pcs this problem involves both tasks of removing unnecessary parts of this computation while minimizing loss in performance the solution suggested in this manuscript tackles both tasks by providing modular techniques called pruning and growing these techniques 
allow for finetuning the gains and losses of compression and loglikelihood respectively and they are individually grounded on theoretical results for instance the pruning technique called eflow is a good approximation for the upper bound of the loglikelihood drop when removing edges as shown in theorem 1 and 2 these comprehensive discussions of the sparsity problem including theoretical analysis strengthen the work here this papers practical relevance is evident in its motivation and provides stateoftheart results on benchmark experiments indeed the manuscript emphasizes the impact of sparsity in the model size as depicted in figure 1 while also showing methods for maximizing efficiency as discussed in section 5 thus the work being suggested here is relevant for practical applications this premise is confirmed with stateoftheart results in challenging datasets furthermore the extended discussion in section 62 is exciting and it shows the capabilities of pruninggrowing operations on finetuning the tradeoff between pc size and accuracy one minor weakness in this papers presentation lies in the intuition behind the theoreticallygrounded metric for the importance of pc edges the manuscript says that the global influence of edges on the pcs output is highly related to pc semantics this point is positive for the use of pcs compared to other models such as some competitors in table 1 since pcs semantics are a differentiator characteristic however theoretical discussions in section 4 are focused on one pruning method eflow which bases itself on pcs sampling process thus it is unclear from the text how the theoretical analysis uses pcs unique semantics this paper does not explicitly discuss its limitations since the experimental results are a crucial draw for this work it could be beneficial to add a short discussion on how the metrics used to measure exploiting sparsity benefit or hurts the different models being evaluated docsepthe authors present an algorithm to distill probabilistic circuits pcs in this particular case they present pruning and growing operations removing nodes that are not significant for the estimator and adding nodes where needed the authors present and prove multiple variants for pruning describe structure learning via pruning and growing and parameter estimation via stochastic minibatch em furthermore the strong empirical results indicate that this approach can have significant impact in the learning of pcs this approach also allows practitioners to control between model size and performance the paper presents a solid contribution as shown by the empirical evaluation pushing stateoftheart in pcs the paper is wellwritten and technically sound there are some minor weaknesses in 1 the authors propose a pruning approach based on the lottery ticket hypothesis for pcs i believe this might be relevant here there is no comparison to other methods that work on regularization of pcs such as 2 1 ventola fabrizio et al residual sumproduct networks international conference on probabilistic graphical models pmlr 2020 2 shih andy dorsa sadigh and stefano ermon hyperspns compact and expressive probabilistic circuits advances in neural information processing systems 34 2021 85718582 no negative societal impact docsepthis work addresses the scalability issue of the probabilistic circuits they propose a technique of pruning and growing to scale and improve the expressiveness of the pcs for pruning they propose pruning edges based on the probability that it will be activated during sampling 
eflow strategy in alg1 to counter the reduction in the model capacity done by the pruning step they introduce a grow alg2 strategy to obtain a compressed pc they alternate between the pruning growing operations based on optimizing for the loglikelihood pros cons 1 it is a wellwritten paper i appreciate the authors introducing the literature and prior work about pcs in a concise and efficient manner 2 the formulation of pruning by circuit flows is a natural followup question to the pruning by generative significance the sampled conditioned parameterization is one approach which the authors introduced in this work this may introduce a biasoverfitting for the training data i wonder if there are other approaches that can better handle the same problem what if we do unconditional sampling and then measure the probability of observing the sampled data with the observed data use this information to weigh the flow this is good work although i feel the scalability performance is still far from using it on larger datasets at least from the experiments added in the paper as mentioned above will be good to know the scale of the data this method can currently handle
### Summary:
|
the paper introduces sparsenessinducing techniques for probabilistic circuits pcs leading to novel structure learning approaches for pcs the reviewers were very positive about this paper found it wellwritten and to be improving stateoftheart the techniques are novel and shown to be effective on generative modeling tasks
|
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this work considers twoteam zerosum games where two teams with opposite objectives are facing each other while the literature is quite extensive on learning in zerosum games very little is known for twoteam zerosum games this paper first shows that this problem is much harder than classical zerosum games by showing that the computation of a ne is cls hard besides showing the failure of classical optimization methods for zerosum games the authors propose an optimisation algorithm better suited to this setting which converges locally to a ne for carefully tuned hyperparameters i have read the authors answer and the other reviews i am satisfied with their answers and do not have any concern other than the lack of clarity at some points i am thus still in favour of its acceptance i globally find this work very interesting well written and i think the problem of twoteam zerosum games deserves careful consideration my main concern is about the comparison with related work which seems weirdincomplete when reading the related paper from daskalakis and panageas for example they already claim that gda and ogda may not converge to ne in zerosum games in last iterate a main part of this work is a similar claim in twoteam zerosum games thm 35 but it trivially holds if it is already the case in classical zerosum games as a consequence i think the paper should better focus on comparing to average iterates of these algorithms that are known to converge to equilibria or to methods that are known to converge in last iterate such as omwu for example the case of average iterates is only mentioned in figure 4 and remark 36 but having a longer discussion or a theoretical result on this would be great in my opinion for example figure 2 makes less sense as soon as we are considering average iterates the other main weakness of this paper is the fact that theorem 37 only holds under specific choices of the matrices k and p that depend on the problem parameters this is mentioned by the authors in the conclusion and i agree with them that this can be left as future work minor comments p2 is computationally harder of finding pure ne in a congestion game isnt it a typo than instead of of typos p3 introduces introduced their they are p5 captures capture p6 nontrivial dont you mean trivial p9 small number of neurons achieves to the end of the sentence is missing could you please explain the meaning of equation 24 with words the figures are lacking legends and are thus hard to interpret for some of them theorem 31 i think it would be nice to define cls hard before theorem 37 what is exactly nablaxx this paper is overall interesting and significant for the problem of twoteam zerosum games my main concern is how it relates to previous works and i hope the authors could clarify this point docsepthe main contributions of the paper are as follows first the authors show that the computation of nash equilibrium in twoteam zerosum games is clshard as a result gda and its variants including optimistic gda and extragradient cannot in general be used to converge to the nash equilibrium then the authors propose a stabilized version of gda called kpvgda obtained through certain stabilization techniques in control theory overall i am a bit conflicted for this paper i appreciate the clshardness result and like the idea of using control theory to stabilize the dynamics of gda at the same time i think the paper is lacking in many aspects 1 first of all i was surprised to see no mention of the dichotomy between team maxmin equilibrium
3 which to my understanding is what the authors are focusing on versus its convexification tmecor 12 until the appendix tmecor allows for correlated strategies to be used within the team as independent strategies are a special case of correlated strategies tmecor always allows for higher social welfare compared to tme against the same opponent also the use of correlation is not an obstacle as long as the team members can agree on a common shared signal to use to seed the same random number generator for example the team member having access to the current time would often be good enough but above all beyond higher welfare and limited additional assumptions tmecor is a convex concave problem so i think that tme makes for a pretty non compelling solution concept compared to tmecor showing hardness of tme is definitely interesting but i believe that a robust discussion about tmecor and its benefits should be present in the paper realistically its hard for me to imagine a practical case where id rather use tme than tmecor 2 the paper is a bit hard to read ive numbered here a few questions that would help clarify a few of the obscure points in the rebuttal a it was not immediately clear to me what converge locally to a point means b is kpvgda defined in the introduction the same as gdakpv mentioned in theorem 37 c in the conclusions you mention that your stabilized algorithm manages to stabilize around nash equilibria what does around mean here in theorem 37 you talk about local convergence to a nash equilibrium so it that the same thing d what can be said about the matrices k and p in theorem 37 does it mean that the theorem is purely an existential result or is there an algorithm to compute the required matrices 3 i think the paper would benefit from polishing the language that goes beyond typos i think the overall language of the paper needs to be made more precise as it often came off to me as vague and handwavy for example beyond all examples mentioned above in point 2 in the abstract you write we present a family of team games whose induced utility is nonmultilinear with nonattracting perse mixed nash equilibria what does perse bind to here the term is only used once and it was not clear to me if it was a technical term i was not familiar with a perse nash equilibrium or if it instead referred to perse notattracting which again was not clear is the point attracting or not attracting or something else that needs to be explained also if perse referred to the latin expression i believe it does not need a hyphen in the abstract again you write is not possible using gda its optimistic variant and extra gradient i think it would have been clearer to say its extragradient variant the paper is inconsistent between twoplayer zerosum and two player zerosum in the introduction you talk about a certain minmax approach missing the critical component of the collective strategy making what is the critical component i wish you had been way more specific introduction computing local nash equilibrium are is introduction is computationally harder of finding than introduction from optimization perspective from an i encourage the authors to improve the figures which are hard to read due to the fonts very low contrast the text looks pale instead of black and font sizes used section 34 our machiner machinery experiments iter 3000 refers to data generated by 3000 episodes worth of training it would be great to dive a more complete description as to how the experiments were run in the experimental section of the 
paper 1 basilico et al teammaxmin equilibrium efficiency bounds and algorithms 2 celli gatti computational results for extensiveform adversarial team games 3 von stengel koller team maxmin equilibria i appreciated the theoretical results of the paper though i believe more needs to be done to justify the model i also think the writing could be significantly improved overall i am somewhat on the fence i welcome a thorough discussion with the authors docsepthis paper studies the equilibrium computation in twoteam zerosum games finding the per player nash is proved to be clshard and many popular gradientbased algorithms are proved to be not stable a vaned gradient descent ascent algorithm is proposed to address the instability shown to locally converge strengths 1 the setting of twoteam games is important even the zerosum version has many applications 2 the techniques used in the paper are wellmotivated and as far as i checked are correct weaknesses 1 the detailed proofs are relatively simple although not trivial most conclusions are not surprising the reduction for clshardness is somewhat straightforward the local convergence of the proposed kpvgda is proved by showing the existence of suitable k and p but no construction is provided 2 there are several issues related to the presentation the general organization is fine but there are many typos and misused capital characters the most important one is the ambiguity in section 4 last two sentences basically i cannot understand the settings and the results provided in the experiments although this does not influence the theoretical parts 3 one minor concern about the illustrative example on multiagent gans the nash equilibrium in this paper is defined in the sense of per player ie each individual player cannot improve itself by unilaterally changing its strategy however it seems mgans should in fact care about the equilibrium where each team cannot improve itself by unilaterally changing all its team members strategies still restricted in the space of cartesian product of simplices the latter just called per team ne as i do not have the official name is defined similarly as the team maxmin equilibrium except both teams can have multiple players each per team ne although it does not necessarily exist must be a per player ne but not vice versa i think this paper is below the bar of acceptance currently i am negative on it due to the weaknesses i mentioned but may change to be positive if those issues can be addressed
mckelvey 1988 mckelvey et al 2014 gemp et al 2021 which cannot guarantee the convergence in general but will converge to a nash equilibrium with special settings that is the similar result presented in this paper therefore this paper cannot make us understand twoteam zerosum games better mckelvey rd 1998 a liapunov function for nash equilibria mckelvey richard d mclennan andrew m and turocy theodore l 2014 gambit software tools for game theory version 1601 httpwwwgambitprojectorg gemp i savani r lanctot m bachrach y anthony t everett r tacchetti a eccles t and kramr j 2021 samplebased approximation of nash in large manyplayer games via gradient descent arxiv preprint arxiv210601285 the presentation can be improved 1 it will be better to explain the given equations eg the gradient formulation in section 2 by the way what is k12 in extra gradient method 2 how to obtain eq25 from eq24 what is mu in eq25 3 what is measure zero in theorem 32 4 the number in figures is unclear 5 need to explain how the experimental results in section 4 validate the theoretical findings or which finding minor there are many grammar issues in the abstract moreover in in section 1 firstly computing local nash equilibria ne in general nonconvex nonconcave games are ppadcomplete mdgan hardy et al 2019 are in section 32 we prove an important theorem in section 33 coursecorrelated equilibria cce in section 4 on the other hand our architecture with a small number of neurons achieves to we defer an execution of multigenerators multidiscriminators architectures for cifar10 again to the papers supplement about the related work on tme the following two latest papers should be discussed zhang y an b and ern j 2021 may computing ex ante coordinated teammaxmin equilibria in zerosum multiplayer extensiveform games in proceedings of the aaai conference on artificial intelligence vol 35 no 6 pp 58135821 farina g celli a gatti n and sandholm t 2021 july connecting optimal exante collusion in teams to extensiveform correlation faster algorithms and positive complexity results in international conference on machine learning pp 31643173 pmlr this paper cannot make us understand twoteam zerosum games better after the discussion it seems that the authors have misunderstood the relation between different solution concepts the statement the notion ofper player nash equilibria can be seen as a smoother figure of merit for the performance of a defensive team against an adversary is not correct because a perplayer nash equilibrium may not be a team maxmin equilibrium tme similar to tme a twoteam maximin equilibrium is still a perplayer nash equilibrium but a perplayer nash equilibrium may not be a twoteam maximin equilibrium this paper focuses on minmax optimization in team zerosum games but it only achieves a perplayer nash equilibrium i think this is inappropriate
### Summary:
|
in this paper the authors study team zerosum games where two teams are facing each other with opposite objectives the main result is that the complexity of finding an equilibrium is cls hence probably not polynomial this result is obtained via a reduction to some congestion games three reviewers gave a mild positive score 6 while the fourth one had more concerns i tend to agree with the first three reviewers with a personal opinion around 56 the paper is interesting but could benefit from polishing here and there i acknowledge that the related work section is more precise after discussion this said i also kind of agree with the last reviewer in the sense that the result of this paper is a bit narrow also not really surprising but we cannot always have breathtaking results and i am also not sure that most of the iclr community will be interested in this kind of result this is not really a criticism but this paper is really borderline and this is what makes it fall into the rejection pile for instance i think this paper would be more suited to some other conferences more concerned about games and computations for instance or even a journal
|
[
263,
50276,
19,
253,
2929,
310,
247,
2372,
1892,
281,
1239,
209,
422,
31050,
1060,
247,
1643,
3533,
326,
651,
1361,
19148,
247,
1643,
273,
253,
26591,
2792,
275,
253,
30080,
22559,
50276,
66,
352,
369,
417,
4745,
2590,
281,
479,
752,
29623,
12171,
281,
247,
1127,
2097,
50276,
67,
310,
465,
45270,
72,
1473,
2931,
275,
253,
10199,
253,
1072,
347,
305,
69,
518,
45270,
5393,
275,
10012,
5345,
50276,
68,
... ] |
[ 1, 1, ... ] |
[ ... ] |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this work addresses the largescale pretraining in the opendomain texttovideo generation the major challenge to this task is the lack of large textvideo paired datasets available and the weak relevance between them in this paper the authors propose a 9bparameter transformer model cogvideo which is trained by inheriting the learned spatial semantics from a pretrained texttoimage model cogview2 in this paper they claim that the key difference between the texttovideo generation and the texttoimage generation is that the former needs a huge amount of paired data to infer both spatial and temporal correlation between two modalities while the latter only requires the learning of spatial correlation therefore they propose a dualattention channel that adds an additional attention layer based on the original structure of cogview2 to address the learning of temporal correlation as illustrated in figure 3 during training it only optimizes the parameters of the newly added temporal attention layer while keeping all the parameters of cogview2 frozen to address the weak alignment of text and variablelength video the example in line 3236 gives a good illustration of this issue they propose multiframerate hierarchical training concretely it involves a twostage generation process in stage 1 they propose to add a framerate token to the text to generate the image frames at low frame rate then in stage 2 they introduce another frame interpolation transformer to generate the immediate transition frames between the generated frames of the first transformer model in stage 1 the proposed hierarchical generation framework looks promising on long videos given a text input the idea of an interpolation transformer model that can utilize bidirectional frame context to finish the interpolation of the current frame also makes sense however there are some critical weaknesses and questions listed below

strengths 1 the proposed multiframerate hierarchical training has the potential on the texttovideo generation especially the idea of a frame interpolation transformer model that can utilize the bidirectional frame context to generate the immediate frame it indeed looks a promising solution to handle long video generation given a text input 2 the proposed cogvideo inherits the parameters of a pretrained texttoimage model and avoids the expensive pretraining from scratch 3 in addition the authors claim that they will opensource the proposed largescale texttovideo generation model this is another contribution to the community

weaknesses 1 in the main experiment results in table 1 it seems that cogvideo obviously underperforms tatsbase 9 on the ucf101 dataset and trivdganfp 16 on kinetics600 this conflicts with the claim in the abstract is that due to the issue of the metric itself or other reasons could you explain more about the results in table 1 since i cannot find related information in the main text 2 it is not clear how much the temporal attention channel contributes to the performance gains of cogvideo the ablation study does not include the comparison of with and without the temporal attention layer ie compare the performance gap of directly finetuning cogview2s weights and keeping cogview2s weights frozen while tuning the temporal attention layer 3 the qualitative evaluation only includes one case analysis it would be more convincing to provide more visualized comparison between different variants since the visualization samples are much more intuitive than evaluation metrics to directly judge the real performance of the model

the major limitation is how long the generated videos by the proposed cogvideo can last the submitted demos show that the generated videos can only last several seconds it would be more exciting if cogvideo could generate longer videos for a given text input

this paper is the first work to propose an opensource pretrained transformer to solve the texttovideo task the authors propose a twostage framework cogvideo by finetuning a texttoimage model and avoiding pretraining from scratch to reduce the training cost the idea of multiframerate ensures the flexibility and accuracy of the generated video experiments and visualized samples show the effectiveness of the method on video generation

1 strengths the paper proposes the pretrained twostage sequential generation and recursive interpolation pipeline and better reconstructs the alignment relation in a video and the twostage method does help the model converge better and faster which is proved by the line chart of training loss shown in the ablation studies i think the cogvideo pipeline has its value in realworld application scenarios also the paper is well written

2 weaknesses first the paper is not the first attempting to investigate a hierarchical transformer structure in the texttovideo challenge for example 1 also exploits autoregressive and interpolation transformers and gains better performance on ucf101 however the authors do not explain the difference or innovation compared with 1 the authors have claimed that the cogvideo pipeline can solve the texttovideo generation in the general domain but the experiment results in tab1 are all attained on human action video datasets ucf101 and kinetics600 which may be somewhat weak to prove the above conclusion i guess maybe general domain represents that the motions in the text have some temporal relation if the concept of general domain can be defined in the paper it will be much better

1 s ge t hayes h yang x yin g pang d jacobs jb huang and d parikh long video generation with timeagnostic vqgan and timesensitive transformer arxiv preprint arxiv220403638 2022

no and i do not see any serious concerns

in this paper the authors propose a largescale pretraining model for texttovideo generation called cogvideo which is trained by inheriting a pretrained texttoimage model cogview2 a multiframerate hierarchical training strategy is designed to better align text and video clips the proposed method is validated on standard benchmarks such as ucf101 and kinetics600

strengths 1 this work explores an important topic textconditional video generation and presents an effective method for it 2 a multiframerate hierarchical training strategy is proposed to improve the generation quality and control the intensity of changes during generation 3 the authors verified the proposed model through both quantitative experiments and qualitative analyses

weaknesses 1 the novelty and technical contributions of this work are quite limited it simply assembles several existing algorithms such as cogview2 6 and swin transformer 14 2 the performance of the proposed cogvideo is not as strong as the authors claimed as shown in tab 1 the is and fvd metrics of cogvideo fall behind previous methods it is not acceptable to only compare with publicly available models 3 the authors trained the cogvideo on a large dataset of 54 million captioned videos but did not give any detailed information regarding this dataset what are the sources of the videos and captions how are they collected will the dataset be released 4 the authors claimed that they greatly enhance parallelism and accelerate inference but no results about the inference speed of the cogvideo model are provided 5 in sec 51 the authors finetune cogvideo on the whole dataset ucf101 for 10000 iterations is it appropriate in this field

yes the authors adequately addressed the limitations and potential negative societal impact of the work

this paper proposed a largescale pretrained texttovideo model and a multiframerate hierarchical training strategy to better align text and video clips

the strengths of this paper are as follows 1 this paper is well written and easy to follow 2 the proposed model is the largest and the first opensource pretrained transformer for texttovideo generation in the general domain 3 the experimental results show that the proposed methods can outperform a number of existing baselines

the weaknesses of this paper are as follows 1 the main limitation of the work is the huge consumption of gpu memory but the related information type and number of gpus is not provided i hope this information can be given

i am not an expert in this field and do not tap into dalle and cogview i will adjust the final score with other reviewers comments and the response from the author none
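the first review above describes the core mechanism of cogvideo as a dual-attention channel: the spatial attention inherited from the pretrained cogview2 stays frozen while a newly added temporal attention layer is the only part being trained. a minimal sketch of that idea is given below, assuming a pytorch-style implementation; the class name, the learned mixing gate alpha, and the tensor shapes are illustrative assumptions rather than cogvideo's actual code.

```python
import torch
import torch.nn as nn


class DualChannelAttention(nn.Module):
    """Illustrative dual-attention block: a frozen spatial channel standing in
    for a pretrained text-to-image layer, plus a trainable temporal channel.
    Layout, gating, and shapes are assumptions for illustration only."""

    def __init__(self, dim: int, num_heads: int = 8):
        super().__init__()
        # Spatial attention stands in for an inherited, pretrained layer; frozen.
        self.spatial_attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        for p in self.spatial_attn.parameters():
            p.requires_grad = False
        # Newly added temporal attention; the only attention weights being trained.
        self.temporal_attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        # Learnable mixing gate, initialized at zero so the frozen path dominates early.
        self.alpha = nn.Parameter(torch.zeros(1))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, frames, tokens_per_frame, dim)
        b, f, t, d = x.shape

        # Spatial channel: attend across tokens within each frame.
        xs = x.reshape(b * f, t, d)
        spatial, _ = self.spatial_attn(xs, xs, xs)
        spatial = spatial.reshape(b, f, t, d)

        # Temporal channel: attend across frames at each token position.
        xt = x.permute(0, 2, 1, 3).reshape(b * t, f, d)
        temporal, _ = self.temporal_attn(xt, xt, xt)
        temporal = temporal.reshape(b, t, f, d).permute(0, 2, 1, 3)

        # Residual combination of the two channels.
        return x + spatial + torch.tanh(self.alpha) * temporal


if __name__ == "__main__":
    block = DualChannelAttention(dim=64, num_heads=8)
    video_tokens = torch.randn(2, 5, 16, 64)  # batch=2, 5 frames, 16 tokens per frame
    print(block(video_tokens).shape)  # torch.Size([2, 5, 16, 64])
```

in this sketch only temporal_attn and alpha receive gradients, mirroring the reviewers' description of keeping all cogview2 parameters frozen while optimizing the added temporal layer.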
### Summary:
|
the authors address an important problem of texttovideo generation in this work although there is a set of solid contributions that are specific to video generation the authorreviewer discussion as well as the following reviewerarea chair discussion did not address all the concerns raised by the reviewers furthermore the lacking and superficial address of the ethical concerns raised by the reviewers as well as the ethics reviewers strongly suggests that the authors need to substantially revise the manuscript along both technical and ethical aspects the paper is thus recommended to be rejected
|
[ ... ] |
[ 1, 1, ... ] |
[ ... ]
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the authors present a system defuse which is geared towards identifying and correcting classifier performance when labels are assigned incorrectly there are three phases that are used to design defuse 1 identify unrestricted adversarial examples using variational auto encoders 2 use a clustering approach to distill the above examples into failure scenarios and 3 correct the classifier predictions overall the idea of using adversarial examples to correct incorrect classifications is very interesting the choice of certain algorithms and their parameters needs to be justified clearly while it is understandable that a nonparametric model be used for the clustering step it it not clear why a dirichlet process is the best fit how does this choice compare with other clustering approaches do the results generalize the paper should be rewritten to have sufficient details of experiments in the text rather than delegating them to the appendix a the motivation of why certain parameters are chosen for experiments should be discussed for example we sample 10 instances from each cluster in the distillation step we ask 5 workers to label the instance why are these choices appropriate description of the annotation task ought to be more detailed labeling them ourselves who constitutes ourselves what is the agreement between the annotators minor comments 1 section 32 how we identity how we identify 2 section 323 the paragraph ends with for instance the sentence needs to be completed and an example provided 3 section 41 32x32 should be replaced with 32x32 similarly 128x128 should be replaced with 128x128docsepthe technique is described in sufficient detail and the paper is easy to read experimental results involving three datasets mnist street view house numbers and german traffic signs the experimental results show that the proposed technique finds significant failures in all datasets including critical failure scenarios after correction the performance of the method improves an interesting aspect of the method which distinguishes it from similar techniques is involvement of usersexperts in the training process to indicate the classification errors in order to improve the performance of the method in the future engaging users in the training of classifiers has its advantages and disadvantages for example it can make easier to create personalised classification models that could be applied eg in recommender system or information retrieval where finding a perfect item depends on users subjective perception of certain qualities at the same time user involvement in the training process can be tricky if it requires expert judgment as they may not always be available as the authors demonstrated in the case of their third dataset consisting of german traffic signs further involving user generated assessments requires well defined procedures in terms of requirement of assessors determining the appropriate number of assessors resolving disagreements between assessors to ensure robustness of the final classifier in the examples provided in the paper the authors state that they used 5 workers annotators and the majority vote was used to decide the final label what was the interannotator agreement since using human labellers is a crucial part of the proposed method i would like to see more discussion of this aspectdocsepthe paper proposes a method to identify and correct regions on the data manifold in which a trained classifier fails the identification phase is based on clustering classification failure regions in a 
gan latent space and the correction phase is based on finetuning the classifier with additional synthetic samples from the gan the proposed method is strongly based on zhao et al 2018 generating natural adversarial examples a method to generate onmanifold blackbox adversarial examples using a gan the authors of the current paper describe some differences of their identification step from zhao et al end of section 321 but in my opinion they are minor the main contribution of the current paper over zhao et al seems to be clustering the adversarial examples using gmm and using them to finetune the classifier this in my opinion is potentially an interesting idea however the authors do not show sufficient evidence of its success specifically the authors claim to achieve near perfect failure scenario accuracy with minimal change in test set accuracy but they do not provide any details eg table of accuracy values on the train test and adversarial sets before and after the finetuning i would also expect to see an ablation study comparing the proposed method to simply including the adversarial examples found using zhao et al wo gmm fitting and sampling as additional training example a standard adversarial defense approach see eg 1 perhaps more importantly the objective of the proposed method is not in my opinion clear the title and abstract describe the goal as debugging a classifier and correcting fail regions however the described method seems like a defense against onmanifold adversarial attack if the method as claimed helps debugging and correcting the classifier i would expect to see an improved accuracy on the natural unseen test set not just on the synthetically generated adversarial examples the quality and clarity of the writing can be improved as well a lot of space is allocated to describing wellknown methods eg vae gmm however critical information about the experimental results are missing im also not sure all the formally defined algorithms and equations actually help in the understanding eg algorithm 1 equation 2 some of the mathematical notations are not standard minor comment the norm in definition 31 is a regular vector norm l2 and not a matrix norm to summarize pros interesting idea clustering onmanifold failures labeling them and then using them to improve the classifier cons contribution over zhao et al not well established insufficient and inaccurate experimental results general quality of writing not sure actual work and experiments match the stated objective significance update following the authors response i upgraded my rating but i still think there are critical issues with the paper the most problematic point in my opinion is the onlymarginal improvement on the test data indicating that the suggested training method only improves the specific failure scenarios making it is similar to adversarial training methods used to gain adversarial robustness however the abstract and introduction indicates that the paper helps in debugging in fixing failures in general which i think should have been evident in improved test accuracy 1 zhang hongyang et al theoretically principled tradeoff between robustness and accuracy icml 2019
### Summary:
|
the manuscript describes a method for identifying and correcting classifier failures when labels are assigned incorrectly the identification is based on clustering classification failure regions in a vae latent space and the correction phase is based on finetuning the classifier with additional synthetic samples from the vae reviewers agreed that the manuscript is not ready for publication the main issue is that the suggested training method is similar to adversarial training methods used to gain adversarial robustness the method does not help in debugging and fixing failures in general
|
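The row above describes a three-phase debugging pipeline: search a generative model's latent space for inputs the classifier gets wrong, distil those failures into scenarios with a Dirichlet-process mixture, and finetune the classifier on corrected samples. The sketch below is only a rough illustration of how such a loop can be wired together, not the reviewed system's code; the encoder, decoder, and classifier are invented stand-ins.

```python
# Rough illustration of the three-phase pipeline described above; every model
# and function here is a placeholder, not the reviewed system's implementation.
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

rng = np.random.default_rng(0)
W_enc = rng.standard_normal((32, 8))   # stand-in for a trained generative encoder
W_dec = rng.standard_normal((8, 32))   # stand-in for the matching decoder

def encode(x):
    return x @ W_enc

def decode(z):
    return z @ W_dec

def classify(x):
    # stand-in for the classifier being debugged
    return (x.sum(axis=1) > 0).astype(int)

# Phase 1: perturb latent codes and keep samples whose prediction flips.
x_test = rng.standard_normal((200, 32))
y_ref = classify(x_test)
z_adv = encode(x_test) + 0.1 * rng.standard_normal((200, 8))
flipped = classify(decode(z_adv)) != y_ref
failures = z_adv[flipped]

# Phase 2: distil the failures into scenarios with a truncated
# Dirichlet-process mixture (scikit-learn's variational approximation).
dpgmm = BayesianGaussianMixture(
    n_components=10,
    weight_concentration_prior_type="dirichlet_process",
    random_state=0,
).fit(failures)
scenarios = dpgmm.predict(failures)

# Phase 3: take a handful of instances per scenario; in the reviewed system
# these would be relabelled by annotators and used to finetune the classifier.
for k in np.unique(scenarios):
    candidates = decode(failures[scenarios == k][:10])  # show to annotators
```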
[
30003,
310,
1677,
2278,
273,
247,
2561,
2929,
432,
260,
2369,
1793,
6698,
15,
7764,
3630,
247,
6010,
253,
2278,
15,
187,
4118,
8439,
27,
187,
783,
4477,
1246,
247,
985,
809,
2327,
534,
310,
48526,
4404,
12488,
285,
35827,
30410,
3045,
672,
13301,
403,
7922,
30833,
627,
403,
1264,
12475,
326,
403,
908,
281,
2216,
809,
2327,
337,
4271,
48566,
48960,
6667,
970,
39762,
6753,
2349,
351,
398,
374,
897,
247,
17524,
2746,
281,
940,
408,
253,
1840,
6667,
715,
4433,
15216,
285,
495,
3451,
253,
30410,
13650,
50276,
1189,
455,
253,
2934,
273,
970,
48960,
6667,
281,
3451,
13583,
43394,
310,
1077,
4722,
50275,
783,
4327,
273,
2176,
11333,
285,
616,
3602,
3198,
281,
320,
17285,
4518,
1223,
352,
310,
34007,
326,
247,
1327,
36928,
1566,
320,
908,
323,
253,
17524,
3213,
352,
352,
417,
2590,
2139,
247,
14035,
42878,
1232,
310,
253,
1682,
4944,
849,
1057,
436,
4327,
7277,
342,
643,
17524,
7274,
513,
253,
1543,
39970,
50276,
783,
2929,
943,
320,
35993,
281,
452,
4209,
4278,
273,
4679,
275,
253,
2505,
2581,
685,
12016,
839,
731,
281,
253,
30762,
247,
50275,
783,
16038,
273,
2139,
2176,
3602,
403,
6777,
323,
4679,
943,
320,
5469,
323,
1650,
359,
3410,
884,
10872,
432,
1016,
7368,
275,
253,
940,
21755,
3213,
359,
1642,
608,
5820,
281,
5203,
253,
4227,
50276,
22309,
403,
841,
10165,
4569,
50276,
10008,
273,
253,
22581,
4836,
12758,
281,
320,
625,
7000,
50276,
1968,
272,
731,
9361,
50276,
10002,
16988,
9361,
752,
310,
253,
4345,
875,
253,
12182,
2392,
50276,
37585,
5701,
50276,
18,
2593,
4567,
849,
359,
6489,
50276,
5430,
359,
4271,
50276,
19,
2593,
32597,
253,
12494,
7637,
342,
323,
4227,
253,
6197,
3198,
281,
320,
6312,
285,
271,
1650,
2530,
495,
2593,
7609,
4567,
89,
1237,
943,
320,
7932,
342,
4567,
89,
1237,
12014,
12842,
89,
8196,
943,
320,
7932,
342,
12842,
89,
8196,
7152,
339,
431,
248,
5853,
310,
2529,
275,
4209,
2508,
285,
253,
2929,
310,
3477,
281,
1239,
5661,
1543,
7668,
1264,
15302,
278,
79,
382,
6406,
1859,
2419,
3904,
285,
305,
8592,
7137,
7871,
253,
5661,
1543,
921,
326,
253,
4081,
5853,
9010,
1534,
20101,
275,
512,
15302,
1690,
4619,
4433,
15216,
846,
10618,
253,
3045,
273,
253,
1332,
19132,
50276,
266,
4722,
4809,
273,
253,
1332,
534,
44587,
352,
432,
2074,
5609,
310,
10171,
273,
2608,
11523,
468,
1641,
275,
253,
3733,
1232,
281,
5224,
253,
9162,
6332,
275,
1340,
281,
3157,
253,
3045,
273,
253,
1332,
275,
253,
2852,
15966,
4212,
275,
253,
3733,
273,
49996,
556,
697,
11361,
285,
23797,
323,
1650,
352,
476,
1056,
6927,
281,
2794,
3367,
1701,
9162,
3210,
326,
812,
320,
3732,
24088,
275,
3818,
3109,
985,
390,
1491,
25064,
835,
4560,
247,
3962,
5382,
7024,
327,
4212,
17854,
13071,
273,
2176,
18701,
387,
253,
1072,
673,
2608,
10171,
275,
253,
3733,
1232,
476,
320,
28190,
604,
352,
4419,
6485,
3883,
347,
597,
778,
417,
1900,
320,
2130,
347,
253,
4477,
5183,
275,
253,
1083,
273,
616,
2626,
10895,
11253,
273,
305,
8592,
7137,
7871,
2007,
7668,
2608,
4561,
20215,
4419,
973,
2931,
7259,
275,
2426,
273,
8284,
273,
2939,
641,
8925,
253,
4569,
1180,
273,
2939,
641,
30426,
10009,
36559,
875,
2939,
641,
281,
5416,
31640,
273,
253,
2457,
30410,
275,
253,
6667,
2530,
275,
253,
2929,
253,
4477,
1375,
326,
597,
908,
608,
5820,
12182,
2392,
285,
253,
5020,
6273,
369,
908,
281,
7617,
253,
2457,
5203,
752,
369,
253,
734,
11423,
1080,
4345,
1580,
970,
1966,
5188,
20945,
310,
247,
9560,
629,
273,
253,
4081,
1332,
891,
651,
751,
281,
923,
625,
5955,
273,
436,
4809,
7152,
339,
431,
248,
2929,
29328,
247,
1332,
281,
4271,
285,
3451,
4811,
327,
253,
941,
16751,
275,
534,
247,
10166,
30410,
10224,
253,
8137,
3408,
310,
1754,
327,
17524,
9162,
4433,
4811,
275,
247,
36827,
21624,
2317,
285,
253,
10618,
3408,
310,
1754,
327,
1442,
292,
25004,
253,
30410,
342,
3081,
13506,
3530,
432,
253,
36827,
50276,
783,
4081,
1332,
310,
7052,
1754,
327,
1182,
31035,
1162,
355,
4765,
11365,
3626,
48960,
6667,
247,
1332,
281,
6635,
327,
38556,
2806,
3364,
48960,
6667,
970,
247,
36827,
253,
4477,
273,
253,
1655,
2929,
6266,
690,
3910,
273,
616,
8137,
3213,
432,
1182,
31035,
1162,
355,
990,
273,
2593,
33251,
533,
275,
619,
4743,
597,
403,
5884,
50276,
783,
2022,
7680,
273,
253,
1655,
2929,
689,
1182,
31035,
1162,
355,
3133,
281,
320,
17524,
253,
48960,
6667,
970,
305,
2188,
285,
970,
731,
281,
1442,
292,
2517,
253,
30410,
436,
275,
619,
4743,
310,
7826,
271,
4722,
2934,
2299,
253,
4477,
513,
417,
921,
4209,
1941,
273,
697,
2323,
5742,
253,
4477,
1750,
281,
5115,
2822,
3962,
4433,
10076,
7200,
342,
8723,
1818,
275,
1071,
873,
7200,
533,
597,
513,
417,
2085,
667,
4278,
24088,
2829,
273,
7200,
2193,
327,
253,
6194,
1071,
285,
48960,
5239,
1078,
285,
846,
253,
1442,
292,
25004,
891,
651,
671,
1902,
281,
923,
271,
28913,
1263,
10941,
253,
4081,
1332,
281,
3365,
1690,
253,
48960,
6667,
1119,
970,
1182,
31035,
1162,
355,
32063,
305,
2188,
13532,
285,
10491,
347,
3081,
3733,
1650,
50276,
66,
2629,
48960,
5684,
2746,
923,
24088,
337,
50276,
30875,
625,
15538,
253,
8103,
273,
253,
4081,
1332,
310,
417,
275,
619,
4743,
2590,
253,
4060,
285,
12002,
6266,
253,
4736,
347,
33146,
247,
30410,
285,
35827,
1891,
4811,
2299,
253,
2529,
1332,
3133,
751,
247,
5684,
1411,
327,
38556,
48960,
2983,
604,
253,
1332,
347,
7558,
7729,
33146,
285,
35827,
253,
30410,
891,
651,
1902,
281,
923,
271,
5520,
7200,
327,
253,
3626,
39709,
1071,
873,
50276,
1439,
816,
327,
253,
5132,
85,
1037,
4561,
48960,
6667,
50276,
783,
3290,
285,
19843,
273,
253,
4028,
476,
320,
5520,
347,
973,
247,
2257,
273,
2317,
310,
18564,
281,
12930,
973,
4304,
3082,
24088,
362,
3348,
305,
2188,
2299,
4619,
1491,
670,
253,
5661,
1543,
403,
5816,
516,
671,
417,
2119,
512,
253,
19186,
2931,
11333,
285,
7424,
2686,
1361,
275,
253,
4685,
24088,
5933,
337,
5150,
374,
690,
273,
253,
15965,
41818,
403,
417,
2629,
50276,
37585,
4385,
253,
5222,
275,
5426,
4562,
310,
247,
3963,
4972,
5222,
298,
19,
285,
417,
247,
4315,
5222,
50276,
936,
26799,
50276,
856,
84,
50276,
47606,
2934,
17524,
327,
38556,
20101,
21473,
731,
285,
840,
970,
731,
281,
3157,
253,
30410,
50276,
5040,
50276,
1987,
2382,
689,
1182,
31035,
1162,
355,
417,
973,
4232,
50276,
968,
86,
2276,
285,
31215,
5661,
1543,
50276,
16691,
3290,
273,
4028,
50276,
1439,
2119,
4588,
789,
285,
4679,
3761,
253,
4767,
8103,
50276,
9188,
40348,
50276,
11183,
1563,
253,
4477,
2380,
891,
29101,
619,
13716,
533,
891,
1335,
1158,
627,
403,
4619,
3374,
342,
253,
2929,
253,
954,
20276,
1127,
275,
619,
4743,
310,
253,
760,
78,
1662,
989,
7756,
327,
253,
1071,
941,
7809,
326,
253,
5125,
3733,
1332,
760,
19132,
253,
2173,
4433,
15216,
2403,
352,
310,
2074,
281,
48960,
3733,
3082,
908,
281,
6351,
48960,
31640,
2299,
253,
12002,
285,
10199,
6492,
326,
253,
2929,
7729,
275,
33146,
275,
18505,
20101,
275,
2087,
534,
891,
1158,
943,
452,
644,
8943,
275,
5520,
1071,
7200,
50276,
18,
1182,
12109,
288,
543,
31524,
1162,
355,
28055,
3505,
74,
6216,
5454,
2727,
875,
31640,
285,
7200,
17857,
1686,
6247,
187,
187,
4118,
18435,
27,
783,
7714,
8631,
247,
1332,
323,
12488,
285,
35827,
30410,
3045,
672,
13301,
403,
7922,
30833,
253,
8137,
310,
1754,
327,
17524,
9162,
4433,
4811,
275,
247,
362,
3348,
21624,
2317,
285,
253,
10618,
3408,
310,
1754,
327,
1442,
292,
25004,
253,
30410,
342,
3081,
13506,
3530,
432,
253,
362,
3348,
50276,
15337,
398,
5821,
326,
253,
7714,
310,
417,
4704,
323,
9311,
253,
2022,
2523,
310,
326,
253,
5125,
3733,
1332,
310,
2074,
281,
48960,
3733,
3082,
908,
281,
6351,
48960,
31640,
253,
1332,
1057,
417,
1361,
275,
33146,
285,
18505,
20101,
275,
2087,
209
] |
[
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1
] |
[
30003,
310,
1677,
2278,
273,
247,
2561,
2929,
432,
260,
2369,
1793,
6698,
15,
7764,
3630,
247,
6010,
253,
2278,
15,
187,
4118,
8439,
27,
187,
783,
4477,
1246,
247,
985,
809,
2327,
534,
310,
48526,
4404,
12488,
285,
35827,
30410,
3045,
672,
13301,
403,
7922,
30833,
627,
403,
1264,
12475,
326,
403,
908,
281,
2216,
809,
2327,
337,
4271,
48566,
48960,
6667,
970,
39762,
6753,
2349,
351,
398,
374,
897,
247,
17524,
2746,
281,
940,
408,
253,
1840,
6667,
715,
4433,
15216,
285,
495,
3451,
253,
30410,
13650,
50276,
1189,
455,
253,
2934,
273,
970,
48960,
6667,
281,
3451,
13583,
43394,
310,
1077,
4722,
50275,
783,
4327,
273,
2176,
11333,
285,
616,
3602,
3198,
281,
320,
17285,
4518,
1223,
352,
310,
34007,
326,
247,
1327,
36928,
1566,
320,
908,
323,
253,
17524,
3213,
352,
352,
417,
2590,
2139,
247,
14035,
42878,
1232,
310,
253,
1682,
4944,
849,
1057,
436,
4327,
7277,
342,
643,
17524,
7274,
513,
253,
1543,
39970,
50276,
783,
2929,
943,
320,
35993,
281,
452,
4209,
4278,
273,
4679,
275,
253,
2505,
2581,
685,
12016,
839,
731,
281,
253,
30762,
247,
50275,
783,
16038,
273,
2139,
2176,
3602,
403,
6777,
323,
4679,
943,
320,
5469,
323,
1650,
359,
3410,
884,
10872,
432,
1016,
7368,
275,
253,
940,
21755,
3213,
359,
1642,
608,
5820,
281,
5203,
253,
4227,
50276,
22309,
403,
841,
10165,
4569,
50276,
10008,
273,
253,
22581,
4836,
12758,
281,
320,
625,
7000,
50276,
1968,
272,
731,
9361,
50276,
10002,
16988,
9361,
752,
310,
253,
4345,
875,
253,
12182,
2392,
50276,
37585,
5701,
50276,
18,
2593,
4567,
849,
359,
6489,
50276,
5430,
359,
4271,
50276,
19,
2593,
32597,
253,
12494,
7637,
342,
323,
4227,
253,
6197,
3198,
281,
320,
6312,
285,
271,
1650,
2530,
495,
2593,
7609,
4567,
89,
1237,
943,
320,
7932,
342,
4567,
89,
1237,
12014,
12842,
89,
8196,
943,
320,
7932,
342,
12842,
89,
8196,
7152,
339,
431,
248,
5853,
310,
2529,
275,
4209,
2508,
285,
253,
2929,
310,
3477,
281,
1239,
5661,
1543,
7668,
1264,
15302,
278,
79,
382,
6406,
1859,
2419,
3904,
285,
305,
8592,
7137,
7871,
253,
5661,
1543,
921,
326,
253,
4081,
5853,
9010,
1534,
20101,
275,
512,
15302,
1690,
4619,
4433,
15216,
846,
10618,
253,
3045,
273,
253,
1332,
19132,
50276,
266,
4722,
4809,
273,
253,
1332,
534,
44587,
352,
432,
2074,
5609,
310,
10171,
273,
2608,
11523,
468,
1641,
275,
253,
3733,
1232,
281,
5224,
253,
9162,
6332,
275,
1340,
281,
3157,
253,
3045,
273,
253,
1332,
275,
253,
2852,
15966,
4212,
275,
253,
3733,
273,
49996,
556,
697,
11361,
285,
23797,
323,
1650,
352,
476,
1056,
6927,
281,
2794,
3367,
1701,
9162,
3210,
326,
812,
320,
3732,
24088,
275,
3818,
3109,
985,
390,
1491,
25064,
835,
4560,
247,
3962,
5382,
7024,
327,
4212,
17854,
13071,
273,
2176,
18701,
387,
253,
1072,
673,
2608,
10171,
275,
253,
3733,
1232,
476,
320,
28190,
604,
352,
4419,
6485,
3883,
347,
597,
778,
417,
1900,
320,
2130,
347,
253,
4477,
5183,
275,
253,
1083,
273,
616,
2626,
10895,
11253,
273,
305,
8592,
7137,
7871,
2007,
7668,
2608,
4561,
20215,
4419,
973,
2931,
7259,
275,
2426,
273,
8284,
273,
2939,
641,
8925,
253,
4569,
1180,
273,
2939,
641,
30426,
10009,
36559,
875,
2939,
641,
281,
5416,
31640,
273,
253,
2457,
30410,
275,
253,
6667,
2530,
275,
253,
2929,
253,
4477,
1375,
326,
597,
908,
608,
5820,
12182,
2392,
285,
253,
5020,
6273,
369,
908,
281,
7617,
253,
2457,
5203,
752,
369,
253,
734,
11423,
1080,
4345,
1580,
970,
1966,
5188,
20945,
310,
247,
9560,
629,
273,
253,
4081,
1332,
891,
651,
751,
281,
923,
625,
5955,
273,
436,
4809,
7152,
339,
431,
248,
2929,
29328,
247,
1332,
281,
4271,
285,
3451,
4811,
327,
253,
941,
16751,
275,
534,
247,
10166,
30410,
10224,
253,
8137,
3408,
310,
1754,
327,
17524,
9162,
4433,
4811,
275,
247,
36827,
21624,
2317,
285,
253,
10618,
3408,
310,
1754,
327,
1442,
292,
25004,
253,
30410,
342,
3081,
13506,
3530,
432,
253,
36827,
50276,
783,
4081,
1332,
310,
7052,
1754,
327,
1182,
31035,
1162,
355,
4765,
11365,
3626,
48960,
6667,
247,
1332,
281,
6635,
327,
38556,
2806,
3364,
48960,
6667,
970,
247,
36827,
253,
4477,
273,
253,
1655,
2929,
6266,
690,
3910,
273,
616,
8137,
3213,
432,
1182,
31035,
1162,
355,
990,
273,
2593,
33251,
533,
275,
619,
4743,
597,
403,
5884,
50276,
783,
2022,
7680,
273,
253,
1655,
2929,
689,
1182,
31035,
1162,
355,
3133,
281,
320,
17524,
253,
48960,
6667,
970,
305,
2188,
285,
970,
731,
281,
1442,
292,
2517,
253,
30410,
436,
275,
619,
4743,
310,
7826,
271,
4722,
2934,
2299,
253,
4477,
513,
417,
921,
4209,
1941,
273,
697,
2323,
5742,
253,
4477,
1750,
281,
5115,
2822,
3962,
4433,
10076,
7200,
342,
8723,
1818,
275,
1071,
873,
7200,
533,
597,
513,
417,
2085,
667,
4278,
24088,
2829,
273,
7200,
2193,
327,
253,
6194,
1071,
285,
48960,
5239,
1078,
285,
846,
253,
1442,
292,
25004,
891,
651,
671,
1902,
281,
923,
271,
28913,
1263,
10941,
253,
4081,
1332,
281,
3365,
1690,
253,
48960,
6667,
1119,
970,
1182,
31035,
1162,
355,
32063,
305,
2188,
13532,
285,
10491,
347,
3081,
3733,
1650,
50276,
66,
2629,
48960,
5684,
2746,
923,
24088,
337,
50276,
30875,
625,
15538,
253,
8103,
273,
253,
4081,
1332,
310,
417,
275,
619,
4743,
2590,
253,
4060,
285,
12002,
6266,
253,
4736,
347,
33146,
247,
30410,
285,
35827,
1891,
4811,
2299,
253,
2529,
1332,
3133,
751,
247,
5684,
1411,
327,
38556,
48960,
2983,
604,
253,
1332,
347,
7558,
7729,
33146,
285,
35827,
253,
30410,
891,
651,
1902,
281,
923,
271,
5520,
7200,
327,
253,
3626,
39709,
1071,
873,
50276,
1439,
816,
327,
253,
5132,
85,
1037,
4561,
48960,
6667,
50276,
783,
3290,
285,
19843,
273,
253,
4028,
476,
320,
5520,
347,
973,
247,
2257,
273,
2317,
310,
18564,
281,
12930,
973,
4304,
3082,
24088,
362,
3348,
305,
2188,
2299,
4619,
1491,
670,
253,
5661,
1543,
403,
5816,
516,
671,
417,
2119,
512,
253,
19186,
2931,
11333,
285,
7424,
2686,
1361,
275,
253,
4685,
24088,
5933,
337,
5150,
374,
690,
273,
253,
15965,
41818,
403,
417,
2629,
50276,
37585,
4385,
253,
5222,
275,
5426,
4562,
310,
247,
3963,
4972,
5222,
298,
19,
285,
417,
247,
4315,
5222,
50276,
936,
26799,
50276,
856,
84,
50276,
47606,
2934,
17524,
327,
38556,
20101,
21473,
731,
285,
840,
970,
731,
281,
3157,
253,
30410,
50276,
5040,
50276,
1987,
2382,
689,
1182,
31035,
1162,
355,
417,
973,
4232,
50276,
968,
86,
2276,
285,
31215,
5661,
1543,
50276,
16691,
3290,
273,
4028,
50276,
1439,
2119,
4588,
789,
285,
4679,
3761,
253,
4767,
8103,
50276,
9188,
40348,
50276,
11183,
1563,
253,
4477,
2380,
891,
29101,
619,
13716,
533,
891,
1335,
1158,
627,
403,
4619,
3374,
342,
253,
2929,
253,
954,
20276,
1127,
275,
619,
4743,
310,
253,
760,
78,
1662,
989,
7756,
327,
253,
1071,
941,
7809,
326,
253,
5125,
3733,
1332,
760,
19132,
253,
2173,
4433,
15216,
2403,
352,
310,
2074,
281,
48960,
3733,
3082,
908,
281,
6351,
48960,
31640,
2299,
253,
12002,
285,
10199,
6492,
326,
253,
2929,
7729,
275,
33146,
275,
18505,
20101,
275,
2087,
534,
891,
1158,
943,
452,
644,
8943,
275,
5520,
1071,
7200,
50276,
18,
1182,
12109,
288,
543,
31524,
1162,
355,
28055,
3505,
74,
6216,
5454,
2727,
875,
31640,
285,
7200,
17857,
1686,
6247,
187,
187,
4118,
18435,
27,
783,
7714,
8631,
247,
1332,
323,
12488,
285,
35827,
30410,
3045,
672,
13301,
403,
7922,
30833,
253,
8137,
310,
1754,
327,
17524,
9162,
4433,
4811,
275,
247,
362,
3348,
21624,
2317,
285,
253,
10618,
3408,
310,
1754,
327,
1442,
292,
25004,
253,
30410,
342,
3081,
13506,
3530,
432,
253,
362,
3348,
50276,
15337,
398,
5821,
326,
253,
7714,
310,
417,
4704,
323,
9311,
253,
2022,
2523,
310,
326,
253,
5125,
3733,
1332,
310,
2074,
281,
48960,
3733,
3082,
908,
281,
6351,
48960,
31640,
253,
1332,
1057,
417,
1361,
275,
33146,
285,
18505,
20101,
275,
2087,
209
] |
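For orientation, the three lists closed above follow the usual causal-language-model layout: the attention_mask is all ones (no padding in the rows shown here) and the labels list mirrors input_ids. Below is a minimal sketch of how such a row is typically produced with a standard Hugging Face tokenizer; the checkpoint named in it is only a placeholder assumption, since the dump does not state which vocabulary generated these ids.

```python
# Hypothetical reconstruction of one row of this dump; the tokenizer
# checkpoint is a guess, since the document never names the vocabulary.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-neox-20b")  # placeholder

prompt_and_review = "Below is a review of a research paper ... ### Review: ..."
summary = "### Summary: ..."

enc = tokenizer(prompt_and_review + "\n" + summary)
row = {
    "input_ids": enc["input_ids"],            # the long list of ids above
    "attention_mask": enc["attention_mask"],  # all ones here, i.e. no padding
    "labels": list(enc["input_ids"]),         # copied from input_ids for LM loss
}
```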
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the authors explore a brand new yet challenging setting that studies adversarial training at with complementary labels cl generally it is significant to involve cl in at since imperfect supervision can be common in real at scenarios however the direct combination of at and cl consistently leads to failure according to extensive empirical observations to explore this issue the authors provide theoretical evidence that there exists inconsistency between complementary risk and ordinary risk of adversarial optimization with limited cls together with the empirical studies of gradients they identify two key challenges including intractable adversarial optimization and lowquality adversarial examples based on the analysis a new attack strategy is introduced a warmup is adopted to ease the difficulty of adversarial optimization with the model prediction as supplementary information the adversarial training gradually involves the pseudo label predicted by the model the authors conduct extensive experiments on different datasets and compare proposed algorithm with various baselines to demonstrate its effectiveness pros in general this paper is wellwritten and easy to follow the motivation is clear the studied setting is both significant and challenging both the theoretical analysis of the inconsistency between empirical risks with the assumption of limited cls and the empirical analysis of the difficulty of adversarial optimization as well as adversarial example generation are intriguing the proposed techniques including warmup and pseudolabel are proposed based on the theoretical and empirical analysis which are simple yet natural the authors provide sufficient evaluation of proposed algorithm the evaluation is conducted on mnist kuzushiji cifar10 and svhn and includes various sota complementary losses for comparison the proposed algorithm consistently achieves better adversarial robustness as well as stability cons in section 43 the authors propose to use model prediction as a strong supplementary information since the model tends to assign high confidence to ordinary label when the epsilon ball is small enough however it is difficult to find empirical evidence of it in experimental section it would be better for the authors to report the accuracy of pseudo labels in some scenarios the author introduces a warmup attack which controls the radius of epsilon ball however the number of attack steps is fixed during warmup it would be better for the authors to conduct more ablation studies of it yes the authors have discussed the limitations and potential negative societal impacts in appendix e docsepthis paper focuses on how to make adversarial trainingat applicable in a new setting where complementary labels cl instead of groundtruth labels are given for at the authors claim that the main obstacles for clbased at are intractable adversarial optimization and lowquality adversarial examples based on this the authors propose to solve the problems with warmup attack and pseudolabel attack experiment results show that the proposed method successfully build robust models in cl setting while many baselines fail to get a robust model strength the proposed method is neat and reasonable the authors first analyze the reasons why at fails in cl setting and then design corresponding solutions to mitigate the problems weakness 1 i am not quite agree with the setting proposed by the authors the at with complementary labels considers if there is no perfect supervision data for training a robust model however the 
authors did not make it clear and reasonable that why we should consider at in such a setting i admit that exploring the performance of at in noised data might be necessary and practical but exploring the performance of at in cl setting seems to make it unnecessarily harder for adversarial training it can be more reasonable if the authors can demonstrate their clbased at performs better than vanilla at when the training data is noised instead of first using cl to make the supervision imperfect and then try to conduct at under imperfect supervision 2 consequently the robustness trained under cl setting does no lead to better robustness mostly because of the imperfect supervision as shown in the experiments for most cases vanilla at under standard setting outperforms the proposed methods in cl setting in table 1 though such unfair setting provides vanilla at advantage i still think the proposed setting is not reasonable to better demonstrate the effectiveness a noised dataset for at may be considered and the authors can compare the performance on such a dataset between vanilla at and their clbased at method the authors have addressed the limitations and potential negative societal impacts docsepthis paper proposes to address a new problem adversarial training with complementary labels cls a naive combination of adversarial training and cls fails to yield good performance the authors identified the problem of this naive combination and propose to use warmup attack and pseudo label attack to address these problems the proposed method yields a performance improvement above the naive combination and simple twostage method strengths 1 the writing is good and easy to follow 2 thorough theoretical analysis is provided 3 the target problem has never been explored before 4 unique challenge of this problem is identified and solved weakness 1 limited novelty in the proposed method the proposed pseudolabel attack is very similar to the simple twostage baseline also the warmup attack has also been explored in the previous work 1 meanwhile the performance improvement above the simple twostage baseline is not significant 1 liu chen et al on the loss landscape of adversarial training identifying challenges and how to overcome them advances in neural information processing systems 33 2020 2147621487 no negative societal impact is found docsepthis paper aims to propose an effective adversarial training at method for the scenario where the imperfect supervision is available more specifically the paper shows that when the complementary label cl ie label for nongroundtruth class are available a direct combination of at and cl will fail to address this limitation two attack approaches warmup attack and pseudolabel attack are proposed the former gradually increases the attack budget over the training epoch and the latter uses the pseudolabel of the model prediction to generate the adversarial example experiments demonstrate the effectiveness of the proposed attack methods under the imperfect supervision scenario strength 1 this work first proposed adversarial training together with complementary labels from my best understanding this is the first work to study this problem 2 the motivation of studying adversarial training in imperfect data scenario is strong and practical 3 algorithm 1 is clear and simple to follow weakness 1 in l24 the author mentioned that the adversarial training at with imperfect supervision has received less attention while this work mainly studies the use of complementary labels there are 
papers like a which studies at with noisy label the author is suggested to compare the proposed method with a which both study the at in imperfect data regime 2 the novelty of the proposed method is a concern of this work the warmup technique has been widely used in deep learning for example b studies the warmup technique used to adjust the learning rate c is somewhat similar to the proposed warmup attack where the attack budget gradually increases see section 5 of c on the other hand the pseudolabel attack mainly stabilizes the adversarial example generation by using the pseudolabel from model prediction this has been widely used for example when applying adversarial training under semisupervised learning d see metaalgorithm 1 in section 4 of d since both proposed attack methods appear in the literature the author is suggested to highlight the novelty of the proposed methods 3 figure 3 only conducts ablation on the log method does the improvement of adding warmup attack and pseudolabel attack applied to other complementary learning baselines a understanding the interaction of adversarial training with noisy labels b a closer look at deep learning heuristics learning rate restarts warmup and distillation c on the loss landscape of adversarial training identifying challenges and how to overcome them d unlabeled data improves adversarial robustness yes the author has provided the checklist and the discuss the broader impact in the supplemental material
### Summary:
|
this paper focuses on a significant and challenging problem adversarial training at with complementary labels a naive combination of at with existing complementary learning techniques fails to achieve good performance the authors conduct both theoretical and empirical analyses of this phenomenon and identified two key challenges including intractable adversarial optimization and lowquality adversarial examples furthermore two attack approaches are proposed accordingly a warmup attack to ease the adversarial optimization and a pseudolabel attack to improve the adversarial example quality all reviewers recognize the effectiveness of the proposed method through experimental evaluations during the discussion the authors also successfully addressed the reviewers questions on the problem settings the novelty of the pseudolabel attack warmup strategies etc based on the positive reviews and thorough discussions we recommend the acceptance of the paper
|
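The review and meta-review above refer to two ingredients: a warmup attack whose perturbation radius grows over training, and a pseudo-label attack that perturbs against the model's own prediction because the true label is unavailable. The fragment below is a generic PGD-style sketch of those two ideas, written here only for illustration; it is not the paper's implementation, and the schedule and constants are assumptions.

```python
# Generic PGD-style sketch of a warmup + pseudo-label attack (illustration only;
# the schedule and hyperparameters are invented, not taken from the paper).
import torch
import torch.nn.functional as F

def warmup_pseudo_label_attack(model, x, epoch, total_epochs,
                               eps_max=8 / 255, steps=10):
    # Warmup attack: the epsilon-ball radius grows with the training epoch.
    eps = eps_max * min(1.0, (epoch + 1) / max(1, total_epochs // 2))
    alpha = eps / 4

    # Pseudo-label attack: perturb against the model's own prediction, since
    # only complementary (non-ground-truth) labels are assumed to be known.
    with torch.no_grad():
        pseudo = model(x).argmax(dim=1)

    x_adv = (x + torch.empty_like(x).uniform_(-eps, eps)).clamp(0.0, 1.0)
    for _ in range(steps):
        x_adv = x_adv.detach().requires_grad_(True)
        loss = F.cross_entropy(model(x_adv), pseudo)
        grad = torch.autograd.grad(loss, x_adv)[0]
        x_adv = x_adv + alpha * grad.sign()
        x_adv = torch.max(torch.min(x_adv, x + eps), x - eps).clamp(0.0, 1.0)
    return x_adv.detach()
```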
[
30003,
310,
1677,
2278,
273,
247,
2561,
2929,
432,
260,
2369,
1793,
6698,
15,
7764,
3630,
247,
6010,
253,
2278,
15,
187,
4118,
8439,
27,
187,
783,
4477,
8338,
247,
7138,
747,
2568,
11132,
4758,
326,
2175,
48960,
3733,
387,
342,
19767,
13301,
502,
3839,
352,
310,
1534,
281,
6388,
502,
275,
387,
1580,
35180,
20446,
476,
320,
1846,
275,
1524,
387,
15216,
2299,
253,
1480,
5019,
273,
387,
285,
502,
12724,
5644,
281,
4433,
2556,
281,
9470,
16774,
7313,
50276,
936,
8338,
436,
2523,
253,
4477,
2085,
10527,
1941,
326,
627,
4961,
43430,
875,
19767,
2495,
285,
9826,
2495,
273,
48960,
13757,
342,
3710,
502,
84,
2366,
342,
253,
16774,
2175,
273,
27935,
597,
4271,
767,
2234,
7881,
1690,
540,
44374,
48960,
13757,
285,
1698,
15177,
48960,
6667,
50276,
3169,
327,
253,
1783,
247,
747,
2983,
5700,
310,
5611,
247,
5890,
484,
310,
8671,
281,
11990,
253,
10183,
273,
48960,
13757,
342,
253,
1566,
10554,
347,
24864,
1491,
253,
48960,
3733,
13237,
8687,
253,
17927,
5203,
8131,
407,
253,
1566,
253,
4477,
2589,
9470,
4679,
327,
1027,
15302,
285,
7277,
4081,
5933,
342,
2710,
1666,
25379,
281,
7568,
697,
12510,
5847,
50275,
249,
2087,
436,
2929,
310,
973,
15720,
285,
3477,
281,
956,
253,
16038,
310,
2590,
253,
5421,
4758,
310,
1097,
1534,
285,
11132,
50276,
15617,
253,
10527,
1783,
273,
253,
43430,
875,
16774,
10502,
342,
253,
9376,
273,
3710,
502,
84,
285,
253,
16774,
1783,
273,
253,
10183,
273,
48960,
13757,
347,
973,
347,
48960,
1650,
5978,
403,
27807,
50276,
783,
4081,
5609,
1690,
5890,
484,
285,
10585,
311,
1492,
403,
4081,
1754,
327,
253,
10527,
285,
16774,
1783,
534,
403,
2969,
2568,
3626,
50276,
783,
4477,
2085,
4209,
7103,
273,
4081,
5933,
253,
7103,
310,
5196,
327,
278,
79,
382,
465,
7958,
2345,
27520,
260,
338,
274,
740,
285,
18504,
13107,
285,
3797,
2710,
256,
5503,
19767,
11655,
323,
5301,
253,
4081,
5933,
12724,
33526,
1805,
48960,
31640,
347,
973,
347,
7882,
50276,
5040,
50275,
249,
2593,
7652,
253,
4477,
12661,
281,
897,
1566,
10554,
347,
247,
2266,
24864,
1491,
1580,
253,
1566,
14280,
281,
9212,
1029,
7162,
281,
9826,
5203,
672,
253,
299,
4277,
4023,
310,
1355,
2217,
2299,
352,
310,
2834,
281,
1089,
16774,
1941,
273,
352,
275,
5661,
2593,
352,
651,
320,
1805,
323,
253,
4477,
281,
1304,
253,
7200,
273,
17927,
13301,
275,
690,
15216,
50276,
783,
2488,
23970,
247,
5890,
484,
2983,
534,
5760,
253,
9941,
273,
299,
4277,
4023,
2299,
253,
1180,
273,
2983,
5018,
310,
4229,
1309,
5890,
484,
352,
651,
320,
1805,
323,
253,
4477,
281,
2589,
625,
28913,
2175,
273,
352,
4754,
253,
4477,
452,
5469,
253,
7364,
285,
2442,
4016,
38058,
16274,
275,
30762,
299,
5474,
33032,
2520,
2929,
16633,
327,
849,
281,
1056,
48960,
3733,
255,
7763,
275,
247,
747,
4758,
835,
19767,
13301,
502,
3185,
273,
3216,
33024,
13301,
403,
1677,
323,
387,
253,
4477,
1750,
326,
253,
2022,
24238,
323,
502,
3169,
387,
403,
540,
44374,
48960,
13757,
285,
1698,
15177,
48960,
6667,
1754,
327,
436,
253,
4477,
12661,
281,
8415,
253,
3237,
342,
5890,
484,
2983,
285,
10585,
311,
1492,
2983,
3368,
1543,
921,
326,
209,
575,
783,
4081,
1332,
8379,
1973,
10237,
3210,
275,
502,
4758,
1223,
1142,
1666,
25379,
1891,
281,
755,
247,
10237,
1566,
50275,
45563,
253,
4081,
1332,
310,
18176,
285,
5272,
253,
4477,
806,
12106,
253,
4606,
2139,
387,
10224,
275,
502,
4758,
285,
840,
2216,
3969,
5482,
281,
29966,
253,
3237,
50275,
20881,
1255,
50275,
18,
891,
717,
417,
3240,
5194,
342,
253,
4758,
4081,
407,
253,
4477,
253,
387,
342,
19767,
13301,
19401,
604,
627,
310,
642,
3962,
20446,
941,
323,
3733,
247,
10237,
1566,
2299,
253,
4477,
858,
417,
1056,
352,
2590,
285,
5272,
326,
2139,
359,
943,
1908,
387,
275,
824,
247,
4758,
891,
11476,
326,
18216,
253,
3045,
273,
387,
275,
642,
1701,
941,
1537,
320,
3309,
285,
8542,
533,
18216,
253,
3045,
273,
387,
275,
502,
4758,
3133,
281,
1056,
352,
48312,
12150,
323,
48960,
3733,
352,
476,
320,
625,
5272,
604,
253,
4477,
476,
7568,
616,
502,
3169,
387,
17923,
1805,
685,
26724,
387,
672,
253,
3733,
941,
310,
642,
1701,
3185,
273,
806,
970,
502,
281,
1056,
253,
20446,
35180,
285,
840,
1611,
281,
2589,
387,
762,
35180,
20446,
374,
17912,
253,
31640,
10166,
762,
502,
4758,
1057,
642,
1421,
281,
1805,
31640,
6571,
984,
273,
253,
35180,
20446,
347,
2011,
275,
253,
4679,
323,
954,
2219,
26724,
387,
762,
2629,
4758,
41731,
13015,
253,
4081,
3082,
275,
502,
4758,
275,
2829,
337,
2167,
824,
16593,
4758,
3400,
26724,
387,
5750,
891,
1335,
1158,
253,
4081,
4758,
310,
417,
5272,
281,
1805,
7568,
253,
12510,
247,
642,
1701,
10895,
323,
387,
778,
320,
2783,
285,
253,
4477,
476,
7277,
253,
3045,
327,
824,
247,
10895,
875,
26724,
387,
285,
616,
502,
3169,
387,
1332,
253,
4477,
452,
9713,
253,
7364,
285,
2442,
4016,
38058,
16274,
5474,
33032,
2520,
2929,
29328,
281,
2953,
247,
747,
1895,
48960,
3733,
342,
19767,
13301,
502,
84,
247,
27785,
5019,
273,
48960,
3733,
285,
502,
84,
10224,
281,
4917,
1175,
3045,
253,
4477,
3636,
253,
1895,
273,
436,
27785,
5019,
285,
12661,
281,
897,
5890,
484,
2983,
285,
17927,
5203,
2983,
281,
2953,
841,
3237,
253,
4081,
1332,
11026,
247,
3045,
7756,
1840,
253,
27785,
5019,
285,
2969,
2500,
493,
486,
1332,
20544,
337,
253,
4028,
310,
1175,
285,
3477,
281,
956,
374,
11080,
10527,
1783,
310,
2530,
495,
253,
2303,
1895,
556,
1620,
644,
14859,
1078,
577,
4451,
5691,
273,
436,
1895,
310,
3636,
285,
14042,
50275,
20881,
1255,
337,
3710,
38135,
275,
253,
4081,
1332,
253,
4081,
10585,
311,
1492,
2983,
310,
1077,
2074,
281,
253,
2969,
2500,
493,
486,
8245,
671,
253,
5890,
484,
2983,
556,
671,
644,
14859,
275,
253,
2045,
789,
337,
26614,
253,
3045,
7756,
1840,
253,
2969,
2500,
493,
486,
8245,
310,
417,
1534,
50276,
18,
632,
86,
260,
864,
1162,
355,
327,
253,
2957,
13016,
273,
48960,
3733,
12488,
7881,
285,
849,
281,
11399,
731,
16424,
275,
11454,
1491,
5162,
2718,
5922,
9169,
25901,
3121,
21866,
2597,
642,
50276,
12373,
38058,
3486,
310,
1119,
5474,
33032,
2520,
2929,
13698,
281,
12661,
271,
3576,
48960,
3733,
387,
1332,
323,
253,
10076,
835,
253,
35180,
20446,
310,
2130,
625,
5742,
253,
2929,
2722,
326,
672,
253,
19767,
5203,
502,
26332,
5203,
323,
295,
543,
4650,
33024,
966,
403,
2130,
247,
1480,
5019,
273,
387,
285,
502,
588,
1891,
281,
2953,
436,
12291,
767,
2983,
7274,
5890,
484,
2983,
285,
10585,
311,
1492,
2983,
403,
4081,
253,
3438,
13237,
5459,
253,
2983,
7563,
689,
253,
3733,
23657,
285,
253,
6158,
4648,
253,
10585,
311,
1492,
273,
253,
1566,
10554,
281,
6635,
253,
48960,
1650,
4679,
7568,
253,
12510,
273,
253,
4081,
2983,
3082,
762,
253,
35180,
20446,
10076,
4757,
50276,
18,
186,
2520,
789,
806,
4081,
48960,
3733,
2366,
342,
19767,
13301,
432,
619,
1682,
4685,
436,
310,
253,
806,
789,
281,
1263,
436,
1895,
50276,
19,
186,
783,
16038,
273,
12392,
48960,
3733,
275,
35180,
941,
10076,
310,
2266,
285,
8542,
50275,
20,
186,
41528,
337,
310,
2590,
285,
2969,
281,
956,
50276,
20881,
1255,
50276,
18,
186,
249,
298,
1348,
253,
2488,
5393,
326,
253,
48960,
3733,
387,
342,
35180,
20446,
556,
2959,
1679,
4116,
1223,
436,
789,
7194,
2175,
253,
897,
273,
19767,
13301,
627,
403,
9380,
751,
247,
534,
2175,
387,
342,
27620,
5203,
253,
2488,
310,
5125,
281,
7277,
253,
4081,
1332,
342,
247,
534,
1097,
1263,
253,
387,
275,
35180,
941,
9459,
50276,
19,
186,
783,
38135,
273,
253,
4081,
1332,
310,
247,
4468,
273,
436,
789,
253,
5890,
484,
5853,
556,
644,
7561,
908,
275,
3676,
4715,
323,
1650,
270,
2175,
253,
5890,
484,
5853,
908,
281,
4575,
253,
4715,
2281,
50276,
68,
310,
8489,
2074,
281,
253,
4081,
5890,
484,
2983,
835,
253,
2983,
7563,
13237,
5459,
923,
2593,
608,
273,
260,
327,
253,
643,
1133,
253,
10585,
311,
1492,
2983,
7194,
10308,
4219,
253,
48960,
1650,
5978,
407,
970,
253,
10585,
311,
1492,
432,
1566,
10554,
436,
556,
644,
7561,
908,
323,
1650,
672,
9433,
48960,
3733,
762,
49863,
29974,
13337,
4715,
277,
923,
11419,
41528,
337,
275,
2593,
577,
273,
277,
1580,
1097,
4081,
2983,
3082,
3176,
275,
253,
6239,
253,
2488,
310,
5125,
281,
6780,
253,
38135,
273,
253,
4081,
3082,
50276,
20,
186,
13206,
495,
760,
2589,
84,
28913,
327,
253,
2412,
1332,
1057,
253,
7756,
273,
6240,
5890,
484,
2983,
285,
10585,
311,
1492,
2983,
3732,
281,
643,
19767,
4715,
1666,
25379,
50274,
66,
4685,
253,
5016,
273,
48960,
3733,
342,
27620,
13301,
270,
247,
8003,
1007,
387,
3676,
4715,
344,
321,
3397,
4715,
2281,
1551,
12863,
5890,
484,
285,
940,
21755,
260,
327,
253,
2957,
13016,
273,
48960,
3733,
12488,
7881,
285,
849,
281,
11399,
731,
277,
440,
22027,
941,
19132,
48960,
31640,
50276,
9820,
253,
2488,
556,
2530,
253,
44282,
285,
253,
2319,
253,
16055,
3486,
275,
253,
25702,
2144,
50276,
187,
187,
4118,
18435,
27,
2520,
2929,
16633,
327,
247,
1534,
285,
11132,
1895,
48960,
3733,
387,
50276,
3113,
19767,
13301,
247,
27785,
5019,
273,
387,
342,
5368,
19767,
4715,
5609,
10224,
281,
5115,
1175,
3045,
253,
4477,
2589,
1097,
10527,
285,
16774,
6260,
273,
436,
11562,
285,
3636,
767,
2234,
7881,
1690,
540,
44374,
48960,
13757,
285,
1698,
15177,
48960,
6667,
33810,
767,
2983,
7274,
403,
4081,
15672,
247,
5890,
484,
2983,
281,
11990,
253,
48960,
13757,
285,
247,
10585,
311,
1492,
2983,
281,
3157,
253,
48960,
1650,
3290,
512,
30628,
9446,
253,
12510,
273,
253,
4081,
1332,
949,
5661,
27163,
50276,
32674,
253,
5955,
253,
4477,
671,
8379,
9713,
253,
30628,
3533,
327,
253,
1895,
7533,
253,
38135,
273,
253,
10585,
311,
1492,
2983,
5890,
484,
8130,
3966,
50276,
3169,
327,
253,
2762,
10123,
285,
11080,
11985,
359,
5583,
253,
14924,
273,
253,
2929,
209
] |
[
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1
] |
[
30003,
310,
1677,
2278,
273,
247,
2561,
2929,
432,
260,
2369,
1793,
6698,
15,
7764,
3630,
247,
6010,
253,
2278,
15,
187,
4118,
8439,
27,
187,
783,
4477,
8338,
247,
7138,
747,
2568,
11132,
4758,
326,
2175,
48960,
3733,
387,
342,
19767,
13301,
502,
3839,
352,
310,
1534,
281,
6388,
502,
275,
387,
1580,
35180,
20446,
476,
320,
1846,
275,
1524,
387,
15216,
2299,
253,
1480,
5019,
273,
387,
285,
502,
12724,
5644,
281,
4433,
2556,
281,
9470,
16774,
7313,
50276,
936,
8338,
436,
2523,
253,
4477,
2085,
10527,
1941,
326,
627,
4961,
43430,
875,
19767,
2495,
285,
9826,
2495,
273,
48960,
13757,
342,
3710,
502,
84,
2366,
342,
253,
16774,
2175,
273,
27935,
597,
4271,
767,
2234,
7881,
1690,
540,
44374,
48960,
13757,
285,
1698,
15177,
48960,
6667,
50276,
3169,
327,
253,
1783,
247,
747,
2983,
5700,
310,
5611,
247,
5890,
484,
310,
8671,
281,
11990,
253,
10183,
273,
48960,
13757,
342,
253,
1566,
10554,
347,
24864,
1491,
253,
48960,
3733,
13237,
8687,
253,
17927,
5203,
8131,
407,
253,
1566,
253,
4477,
2589,
9470,
4679,
327,
1027,
15302,
285,
7277,
4081,
5933,
342,
2710,
1666,
25379,
281,
7568,
697,
12510,
5847,
50275,
249,
2087,
436,
2929,
310,
973,
15720,
285,
3477,
281,
956,
253,
16038,
310,
2590,
253,
5421,
4758,
310,
1097,
1534,
285,
11132,
50276,
15617,
253,
10527,
1783,
273,
253,
43430,
875,
16774,
10502,
342,
253,
9376,
273,
3710,
502,
84,
285,
253,
16774,
1783,
273,
253,
10183,
273,
48960,
13757,
347,
973,
347,
48960,
1650,
5978,
403,
27807,
50276,
783,
4081,
5609,
1690,
5890,
484,
285,
10585,
311,
1492,
403,
4081,
1754,
327,
253,
10527,
285,
16774,
1783,
534,
403,
2969,
2568,
3626,
50276,
783,
4477,
2085,
4209,
7103,
273,
4081,
5933,
253,
7103,
310,
5196,
327,
278,
79,
382,
465,
7958,
2345,
27520,
260,
338,
274,
740,
285,
18504,
13107,
285,
3797,
2710,
256,
5503,
19767,
11655,
323,
5301,
253,
4081,
5933,
12724,
33526,
1805,
48960,
31640,
347,
973,
347,
7882,
50276,
5040,
50275,
249,
2593,
7652,
253,
4477,
12661,
281,
897,
1566,
10554,
347,
247,
2266,
24864,
1491,
1580,
253,
1566,
14280,
281,
9212,
1029,
7162,
281,
9826,
5203,
672,
253,
299,
4277,
4023,
310,
1355,
2217,
2299,
352,
310,
2834,
281,
1089,
16774,
1941,
273,
352,
275,
5661,
2593,
352,
651,
320,
1805,
323,
253,
4477,
281,
1304,
253,
7200,
273,
17927,
13301,
275,
690,
15216,
50276,
783,
2488,
23970,
247,
5890,
484,
2983,
534,
5760,
253,
9941,
273,
299,
4277,
4023,
2299,
253,
1180,
273,
2983,
5018,
310,
4229,
1309,
5890,
484,
352,
651,
320,
1805,
323,
253,
4477,
281,
2589,
625,
28913,
2175,
273,
352,
4754,
253,
4477,
452,
5469,
253,
7364,
285,
2442,
4016,
38058,
16274,
275,
30762,
299,
5474,
33032,
2520,
2929,
16633,
327,
849,
281,
1056,
48960,
3733,
255,
7763,
275,
247,
747,
4758,
835,
19767,
13301,
502,
3185,
273,
3216,
33024,
13301,
403,
1677,
323,
387,
253,
4477,
1750,
326,
253,
2022,
24238,
323,
502,
3169,
387,
403,
540,
44374,
48960,
13757,
285,
1698,
15177,
48960,
6667,
1754,
327,
436,
253,
4477,
12661,
281,
8415,
253,
3237,
342,
5890,
484,
2983,
285,
10585,
311,
1492,
2983,
3368,
1543,
921,
326,
209,
575,
783,
4081,
1332,
8379,
1973,
10237,
3210,
275,
502,
4758,
1223,
1142,
1666,
25379,
1891,
281,
755,
247,
10237,
1566,
50275,
45563,
253,
4081,
1332,
310,
18176,
285,
5272,
253,
4477,
806,
12106,
253,
4606,
2139,
387,
10224,
275,
502,
4758,
285,
840,
2216,
3969,
5482,
281,
29966,
253,
3237,
50275,
20881,
1255,
50275,
18,
891,
717,
417,
3240,
5194,
342,
253,
4758,
4081,
407,
253,
4477,
253,
387,
342,
19767,
13301,
19401,
604,
627,
310,
642,
3962,
20446,
941,
323,
3733,
247,
10237,
1566,
2299,
253,
4477,
858,
417,
1056,
352,
2590,
285,
5272,
326,
2139,
359,
943,
1908,
387,
275,
824,
247,
4758,
891,
11476,
326,
18216,
253,
3045,
273,
387,
275,
642,
1701,
941,
1537,
320,
3309,
285,
8542,
533,
18216,
253,
3045,
273,
387,
275,
502,
4758,
3133,
281,
1056,
352,
48312,
12150,
323,
48960,
3733,
352,
476,
320,
625,
5272,
604,
253,
4477,
476,
7568,
616,
502,
3169,
387,
17923,
1805,
685,
26724,
387,
672,
253,
3733,
941,
310,
642,
1701,
3185,
273,
806,
970,
502,
281,
1056,
253,
20446,
35180,
285,
840,
1611,
281,
2589,
387,
762,
35180,
20446,
374,
17912,
253,
31640,
10166,
762,
502,
4758,
1057,
642,
1421,
281,
1805,
31640,
6571,
984,
273,
253,
35180,
20446,
347,
2011,
275,
253,
4679,
323,
954,
2219,
26724,
387,
762,
2629,
4758,
41731,
13015,
253,
4081,
3082,
275,
502,
4758,
275,
2829,
337,
2167,
824,
16593,
4758,
3400,
26724,
387,
5750,
891,
1335,
1158,
253,
4081,
4758,
310,
417,
5272,
281,
1805,
7568,
253,
12510,
247,
642,
1701,
10895,
323,
387,
778,
320,
2783,
285,
253,
4477,
476,
7277,
253,
3045,
327,
824,
247,
10895,
875,
26724,
387,
285,
616,
502,
3169,
387,
1332,
253,
4477,
452,
9713,
253,
7364,
285,
2442,
4016,
38058,
16274,
5474,
33032,
2520,
2929,
29328,
281,
2953,
247,
747,
1895,
48960,
3733,
342,
19767,
13301,
502,
84,
247,
27785,
5019,
273,
48960,
3733,
285,
502,
84,
10224,
281,
4917,
1175,
3045,
253,
4477,
3636,
253,
1895,
273,
436,
27785,
5019,
285,
12661,
281,
897,
5890,
484,
2983,
285,
17927,
5203,
2983,
281,
2953,
841,
3237,
253,
4081,
1332,
11026,
247,
3045,
7756,
1840,
253,
27785,
5019,
285,
2969,
2500,
493,
486,
1332,
20544,
337,
253,
4028,
310,
1175,
285,
3477,
281,
956,
374,
11080,
10527,
1783,
310,
2530,
495,
253,
2303,
1895,
556,
1620,
644,
14859,
1078,
577,
4451,
5691,
273,
436,
1895,
310,
3636,
285,
14042,
50275,
20881,
1255,
337,
3710,
38135,
275,
253,
4081,
1332,
253,
4081,
10585,
311,
1492,
2983,
310,
1077,
2074,
281,
253,
2969,
2500,
493,
486,
8245,
671,
253,
5890,
484,
2983,
556,
671,
644,
14859,
275,
253,
2045,
789,
337,
26614,
253,
3045,
7756,
1840,
253,
2969,
2500,
493,
486,
8245,
310,
417,
1534,
50276,
18,
632,
86,
260,
864,
1162,
355,
327,
253,
2957,
13016,
273,
48960,
3733,
12488,
7881,
285,
849,
281,
11399,
731,
16424,
275,
11454,
1491,
5162,
2718,
5922,
9169,
25901,
3121,
21866,
2597,
642,
50276,
12373,
38058,
3486,
310,
1119,
5474,
33032,
2520,
2929,
13698,
281,
12661,
271,
3576,
48960,
3733,
387,
1332,
323,
253,
10076,
835,
253,
35180,
20446,
310,
2130,
625,
5742,
253,
2929,
2722,
326,
672,
253,
19767,
5203,
502,
26332,
5203,
323,
295,
543,
4650,
33024,
966,
403,
2130,
247,
1480,
5019,
273,
387,
285,
502,
588,
1891,
281,
2953,
436,
12291,
767,
2983,
7274,
5890,
484,
2983,
285,
10585,
311,
1492,
2983,
403,
4081,
253,
3438,
13237,
5459,
253,
2983,
7563,
689,
253,
3733,
23657,
285,
253,
6158,
4648,
253,
10585,
311,
1492,
273,
253,
1566,
10554,
281,
6635,
253,
48960,
1650,
4679,
7568,
253,
12510,
273,
253,
4081,
2983,
3082,
762,
253,
35180,
20446,
10076,
4757,
50276,
18,
186,
2520,
789,
806,
4081,
48960,
3733,
2366,
342,
19767,
13301,
432,
619,
1682,
4685,
436,
310,
253,
806,
789,
281,
1263,
436,
1895,
50276,
19,
186,
783,
16038,
273,
12392,
48960,
3733,
275,
35180,
941,
10076,
310,
2266,
285,
8542,
50275,
20,
186,
41528,
337,
310,
2590,
285,
2969,
281,
956,
50276,
20881,
1255,
50276,
18,
186,
249,
298,
1348,
253,
2488,
5393,
326,
253,
48960,
3733,
387,
342,
35180,
20446,
556,
2959,
1679,
4116,
1223,
436,
789,
7194,
2175,
253,
897,
273,
19767,
13301,
627,
403,
9380,
751,
247,
534,
2175,
387,
342,
27620,
5203,
253,
2488,
310,
5125,
281,
7277,
253,
4081,
1332,
342,
247,
534,
1097,
1263,
253,
387,
275,
35180,
941,
9459,
50276,
19,
186,
783,
38135,
273,
253,
4081,
1332,
310,
247,
4468,
273,
436,
789,
253,
5890,
484,
5853,
556,
644,
7561,
908,
275,
3676,
4715,
323,
1650,
270,
2175,
253,
5890,
484,
5853,
908,
281,
4575,
253,
4715,
2281,
50276,
68,
310,
8489,
2074,
281,
253,
4081,
5890,
484,
2983,
835,
253,
2983,
7563,
13237,
5459,
923,
2593,
608,
273,
260,
327,
253,
643,
1133,
253,
10585,
311,
1492,
2983,
7194,
10308,
4219,
253,
48960,
1650,
5978,
407,
970,
253,
10585,
311,
1492,
432,
1566,
10554,
436,
556,
644,
7561,
908,
323,
1650,
672,
9433,
48960,
3733,
762,
49863,
29974,
13337,
4715,
277,
923,
11419,
41528,
337,
275,
2593,
577,
273,
277,
1580,
1097,
4081,
2983,
3082,
3176,
275,
253,
6239,
253,
2488,
310,
5125,
281,
6780,
253,
38135,
273,
253,
4081,
3082,
50276,
20,
186,
13206,
495,
760,
2589,
84,
28913,
327,
253,
2412,
1332,
1057,
253,
7756,
273,
6240,
5890,
484,
2983,
285,
10585,
311,
1492,
2983,
3732,
281,
643,
19767,
4715,
1666,
25379,
50274,
66,
4685,
253,
5016,
273,
48960,
3733,
342,
27620,
13301,
270,
247,
8003,
1007,
387,
3676,
4715,
344,
321,
3397,
4715,
2281,
1551,
12863,
5890,
484,
285,
940,
21755,
260,
327,
253,
2957,
13016,
273,
48960,
3733,
12488,
7881,
285,
849,
281,
11399,
731,
277,
440,
22027,
941,
19132,
48960,
31640,
50276,
9820,
253,
2488,
556,
2530,
253,
44282,
285,
253,
2319,
253,
16055,
3486,
275,
253,
25702,
2144,
50276,
187,
187,
4118,
18435,
27,
2520,
2929,
16633,
327,
247,
1534,
285,
11132,
1895,
48960,
3733,
387,
50276,
3113,
19767,
13301,
247,
27785,
5019,
273,
387,
342,
5368,
19767,
4715,
5609,
10224,
281,
5115,
1175,
3045,
253,
4477,
2589,
1097,
10527,
285,
16774,
6260,
273,
436,
11562,
285,
3636,
767,
2234,
7881,
1690,
540,
44374,
48960,
13757,
285,
1698,
15177,
48960,
6667,
33810,
767,
2983,
7274,
403,
4081,
15672,
247,
5890,
484,
2983,
281,
11990,
253,
48960,
13757,
285,
247,
10585,
311,
1492,
2983,
281,
3157,
253,
48960,
1650,
3290,
512,
30628,
9446,
253,
12510,
273,
253,
4081,
1332,
949,
5661,
27163,
50276,
32674,
253,
5955,
253,
4477,
671,
8379,
9713,
253,
30628,
3533,
327,
253,
1895,
7533,
253,
38135,
273,
253,
10585,
311,
1492,
2983,
5890,
484,
8130,
3966,
50276,
3169,
327,
253,
2762,
10123,
285,
11080,
11985,
359,
5583,
253,
14924,
273,
253,
2929,
209
] |
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
important interesting problem and innovative solution concept good theoretical and experimental justification some unclarity on technical and theoretical matters the submission studies the problem of determining the highest-value atomic intervention given coarse-grained causal background knowledge allowing for a degree of unobserved confounding the authors propose a causal bandit approach and study its regret profile showing theoretically and experimentally that it compares favorably with the state of the art a rigorous and innovative approach to a difficult and important problem i have two questions that i would like addressed 1 i am not clear to what degree we are really allowing for unobserved confounding here in the third paragraph of section 1 the authors write that they are studying causal graphs with unobserved variables that are parents of at least two observable variables but in section 1.2 we seem to assume that every unobserved variable has exactly two children have we really not lost any generality here more importantly the authors assume that the effect of do(x) can be consistently estimated from the observational data alone according to the result of tian and pearl appealed to by the authors this effect is identifiable if there is no bidirected path connecting x to any of its children but x and y have no bidirected path between them iff they have no unobserved parent so havent we assumed away the relevant kind of confounding by assuming identifiability havent we just assumed that the intervention variable and the target variable have no unobserved common cause what am i missing 2 i do not quite see whether the authors are modeling any observation-intervention tradeoff the algorithm always does the same number of pulls of the observational and interventional arms so i suppose this tradeoff is unmodeled can the authors discuss how the approach might be extended to handle this in a subtler way 3 am i right that the experiments compare the proposed algorithm only with algos that use no causal information at all is this really the relevant comparison shouldnt they be compared with other causal bandit algos for example those that dont account for unobserved confounders minor issues i dont think the example in figure 1 is very good since it is very implausible to implement a surgical intervention in this setting there is no way to break the edge from socioeconomic variables to working from home care workers etc cannot work from home i could use a reminder in the text on the difference between big-O vs big-Omega as written the comparisons in eg the second paragraph of section 1.1 dont look apples-to-apples docsepthe paper advances the use of causal models for the bandits problem considering unobserved confounders not considered in previous work it is well presented including extensive additional material with demonstrations and additional experiments one issue is why confounders are not considered for the case of cumulative regret the explanation given is not convincing clarify and extend the explanation for not considering confounders in the case of cumulative regret discuss the condition of mc n what does this imply is it common in practice docsepthe two algorithms with better performance rigorous regret analysis and corroborating empirical results
the result seems useful in understanding a very specific type of causal graph the paper's topic is highly relevant the assumption that every intervenable variable has no bidirected path to any of its children is very restrictive in that you can always obtain every arm's reward distribution p(y | do(x)) from p(v) in fact the assumption is even stronger you can identify the whole distribution p(v \ x | do(x)) the wording "with unobserved confounders" doesnt seem sufficient
the significance of the result is a bit limited due to the type of graph considered. correctness: the paper appears to be correct where causal inference is used; i skipped the regret analysis part, which is not my expertise. literature review: some of the recent work on causal bandits and differences between this work and existing work are well discussed. clarity: the paper is generally clearly written
i have reviewed this paper before and almost all of my concerns are now addressed in section 5 every variable is now observable where the backdoor criterion can be used i wonder whether the backdoor criterion with parents as admissible set is the most desirable choice there may be a different adjustment set that leads to lower variance in causal effect estimation than the parents by the way you dont have to assume that every variable is observable but you can just say every p(y | do(x_i)) is identified by adjusting for its parents which is less restrictive again if i am not mistaken atomic intervention represents hard intervention do() whereas its opposite is soft intervention conditional intervention etc singleton intervention would be a reasonable choice missing period: footnotes 2, 3, 6, 9 disclaimer ive reviewed the previous version of this paper docsepthe discussion of causal approaches to bandit problems is wellmotivated and the proposed algorithms are thoroughly described though i did not go through the proofs in close detail the results appear sound it was not entirely clear to me how these algorithms go beyond existing work this line in particular struck me as problematic: "to the best of our knowledge this is the first work that analyses the regret of causal bandit algorithms when input causal graphs contains ucs unobserved confounders" elias bareinboim has been publishing on this topic for years including in several papers cited in this manuscript's bibliography eg the 2015 neurips paper bandits with unobserved confounders and the 2018 neurips paper structural causal bandits where to intervene how if at all do srmalg or crmalg improve upon these methods the experiments section is brief and unsatisfying as far as i can tell it includes no other causal bandit benchmarks even though several such methods exist and are discussed in the manuscript it is possible i have simply misunderstood something about the proposal here that distinguishes it from existing methods but if so then this should be clearly stated in the manuscript further experiments could also help drive home the unique selling points of this method the technical discussion appears well executed though it is hard for me to evaluate this as i am not an expert on bandit algorithms that said i am aware of other work in this area that aims at similar goals some of which is explicitly mentioned in the manuscript the text could do a better job distinguishing the present proposal from those works both through theory and experiments as it stands the claim that no other causal bandit method considers the case of unobserved confounding is simply false minor comments i would recommend adding a note to sect 2 notation on the use of yt to denote response at round t as this is often denoted with subscripts ie y_t my first thought seeing yt was potential outcomes notation ie shorthand for y dot t which obviously makes no sense in this context the dash in arg max is somewhat jarring these should just be argmax
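as a quick reference for the adjustment-set discussion above, a minimal sketch of the parent-based backdoor adjustment the reviewers mention, assuming the parents pa(x) of the intervened variable are fully observed; the notation here is illustrative and not taken from the submission:

```latex
% backdoor adjustment using the parents of x as the admissible set
p\bigl(y \mid \mathrm{do}(x)\bigr)
  \;=\; \sum_{z \in \mathrm{val}(\mathrm{pa}(x))}
        p\bigl(y \mid x,\, \mathrm{pa}(x)=z\bigr)\,
        p\bigl(\mathrm{pa}(x)=z\bigr)
```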
### Summary:
|
meta review thank you for your submission to uai and the authors' response to reviewers' concerns this paper demonstrates a causal bandit method for determining the best atomic intervention given knowledge of the causal graph governing a system the paper presents algorithms for simple regret minimization with confounders with both theoretical analysis and experimental validation reviewers appreciated the work's innovativeness and potential broader applications such as in reinforcement learning
|
[
30003,
310,
1677,
2278,
273,
247,
2561,
2929,
432,
260,
2369,
1793,
6698,
15,
7764,
3630,
247,
6010,
253,
2278,
15,
187,
4118,
8439,
27,
187,
1774,
4722,
1895,
285,
16694,
2900,
4473,
50276,
12311,
10527,
285,
5661,
22861,
50274,
8826,
440,
498,
15752,
327,
7681,
285,
10527,
8213,
253,
19529,
2175,
253,
1895,
273,
8925,
253,
4585,
2877,
13805,
7268,
1677,
25319,
72,
11273,
19349,
4114,
3640,
6941,
323,
247,
4248,
273,
440,
45912,
34541,
253,
4477,
12661,
247,
19349,
3961,
262,
2746,
285,
1263,
697,
14938,
6222,
4645,
28055,
285,
21657,
326,
352,
26662,
49148,
342,
253,
1375,
273,
253,
1445,
247,
26565,
285,
16694,
2746,
281,
247,
2834,
285,
1774,
1895,
891,
452,
767,
3533,
326,
891,
651,
751,
9713,
50275,
18,
891,
717,
417,
2590,
281,
752,
4248,
359,
403,
1663,
6941,
323,
440,
45912,
34541,
1060,
275,
253,
2626,
7098,
71,
273,
2593,
337,
253,
4477,
3630,
326,
597,
403,
12392,
19349,
14580,
342,
440,
45912,
4903,
326,
403,
4651,
273,
387,
1878,
767,
24802,
4903,
533,
275,
2593,
1249,
359,
1646,
281,
5467,
326,
1046,
440,
45912,
4778,
556,
4555,
767,
2151,
452,
359,
1663,
417,
3663,
667,
31376,
1060,
50275,
3062,
15538,
253,
4477,
5467,
326,
253,
1055,
273,
513,
5260,
476,
320,
12724,
5998,
432,
253,
21899,
941,
3815,
2556,
281,
253,
906,
273,
246,
757,
285,
27887,
77,
20652,
281,
407,
253,
4477,
436,
1055,
310,
253,
1083,
604,
627,
310,
642,
12246,
17799,
1854,
12873,
1269,
281,
667,
273,
697,
2151,
533,
1269,
285,
340,
452,
642,
12246,
17799,
1854,
875,
731,
36714,
597,
452,
642,
440,
45912,
2885,
594,
419,
2254,
359,
8025,
1977,
253,
4623,
2238,
273,
34541,
407,
7384,
1548,
18279,
1430,
419,
2254,
359,
816,
8025,
326,
253,
7268,
4778,
285,
253,
2303,
4778,
452,
642,
440,
45912,
1846,
2847,
752,
717,
891,
5816,
50275,
19,
891,
513,
417,
3240,
923,
1880,
253,
4477,
403,
14053,
667,
8310,
43198,
5454,
2727,
253,
5933,
1900,
1057,
253,
1072,
1180,
273,
25612,
273,
253,
21899,
285,
7268,
267,
6174,
594,
891,
9428,
436,
5454,
2727,
310,
440,
7645,
264,
476,
253,
4477,
2319,
849,
253,
2746,
1537,
320,
6508,
281,
6016,
436,
275,
247,
8482,
2146,
1039,
50275,
20,
717,
891,
987,
326,
253,
4679,
7277,
253,
4081,
5933,
760,
342,
20320,
375,
326,
897,
642,
19349,
1491,
387,
512,
310,
436,
1663,
253,
4623,
5301,
943,
2649,
597,
320,
2429,
342,
643,
19349,
3961,
262,
20320,
375,
323,
1650,
1110,
326,
13414,
2395,
323,
440,
45912,
44667,
398,
50275,
37585,
3374,
50274,
74,
13414,
1158,
253,
1650,
275,
4677,
337,
310,
1077,
1175,
1580,
352,
310,
1077,
3898,
666,
917,
281,
3359,
247,
9205,
7268,
275,
436,
4758,
627,
310,
642,
1039,
281,
2740,
253,
5024,
432,
2675,
17989,
4903,
281,
2444,
432,
1728,
1557,
5820,
3966,
2550,
789,
432,
1728,
50275,
74,
812,
897,
247,
24388,
275,
253,
2505,
327,
253,
3064,
875,
1943,
80,
4632,
1943,
3151,
347,
3542,
253,
14023,
275,
24088,
253,
1273,
7098,
71,
327,
2593,
1903,
13414,
1007,
19126,
936,
19934,
50276,
7152,
339,
431,
248,
2929,
16424,
275,
253,
897,
273,
19349,
3210,
323,
3961,
953,
1895,
7296,
440,
45912,
44667,
398,
417,
2783,
275,
2045,
789,
352,
310,
973,
3559,
1690,
271,
9470,
3081,
2144,
342,
1471,
493,
1589,
285,
3081,
4679,
581,
2523,
310,
2139,
417,
2783,
44667,
398,
323,
253,
1083,
273,
18849,
14938,
253,
8813,
1677,
310,
417,
21414,
19148,
9017,
253,
8813,
323,
417,
7296,
44667,
398,
275,
253,
1083,
273,
7358,
267,
800,
14938,
2319,
253,
1617,
273,
278,
68,
50276,
79,
752,
1057,
436,
16084,
352,
310,
1846,
275,
3946,
5474,
339,
431,
248,
767,
11333,
342,
1805,
3045,
26565,
14938,
1783,
285,
25092,
839,
16774,
1543,
40702,
40702,
783,
906,
3133,
4217,
275,
4685,
247,
1077,
2173,
1511,
273,
19349,
4216,
50275,
783,
9380,
9400,
310,
4122,
4623,
253,
9376,
326,
1046,
16070,
494,
4778,
1907,
642,
12246,
17799,
1854,
281,
512,
273,
697,
2151,
310,
1077,
29190,
275,
326,
368,
476,
1900,
4044,
1046,
6174,
10921,
3268,
7239,
69,
1004,
432,
268,
87,
275,
958,
253,
9376,
310,
1014,
10046,
368,
476,
4271,
253,
2644,
3268,
268,
87,
17176,
1004,
253,
41066,
326,
342,
440,
45912,
44667,
398,
36908,
1646,
4209,
40702,
40702,
50276,
783,
8453,
273,
253,
906,
310,
247,
2372,
3710,
1955,
281,
253,
1511,
273,
4216,
2783,
36594,
253,
2929,
4620,
281,
320,
3451,
835,
19349,
17032,
310,
908,
891,
37001,
14938,
1783,
629,
534,
310,
417,
619,
15040,
50276,
22478,
1177,
2278,
690,
273,
253,
3332,
789,
327,
19349,
3961,
953,
285,
3910,
875,
436,
789,
285,
5368,
789,
403,
973,
5469,
19843,
253,
2929,
310,
3839,
4518,
3542,
40702,
40702,
891,
452,
9814,
436,
2929,
1078,
285,
2761,
512,
273,
619,
7350,
403,
1024,
9713,
50276,
249,
2593,
608,
1046,
4778,
310,
1024,
24802,
835,
896,
11806,
17705,
476,
320,
908,
891,
4282,
1880,
253,
896,
11806,
17705,
342,
4651,
347,
22961,
873,
310,
253,
954,
11408,
4327,
627,
778,
320,
247,
1027,
14000,
873,
326,
5644,
281,
2406,
11041,
275,
19349,
1055,
13418,
685,
253,
4651,
407,
253,
1039,
368,
13414,
452,
281,
5467,
326,
1046,
4778,
310,
24802,
533,
368,
476,
816,
1333,
1046,
7239,
69,
1004,
74,
310,
3636,
407,
19427,
323,
697,
4651,
534,
310,
1679,
34617,
50276,
16245,
604,
891,
717,
417,
20854,
13805,
7268,
6125,
1892,
7268,
513,
835,
697,
7285,
310,
2602,
7268,
17697,
7268,
3966,
47736,
7268,
651,
320,
247,
5272,
4327,
50276,
33722,
2180,
43302,
374,
495,
721,
898,
50276,
3431,
34837,
209,
422,
9814,
253,
2045,
2715,
273,
436,
2929,
5474,
339,
431,
248,
5955,
273,
19349,
7274,
281,
3961,
262,
3237,
310,
973,
24013,
8550,
285,
253,
4081,
11333,
403,
16575,
2529,
2167,
891,
858,
417,
564,
949,
253,
27947,
275,
2810,
2508,
253,
1543,
3176,
3590,
50276,
262,
369,
417,
7094,
2590,
281,
479,
849,
841,
11333,
564,
4457,
5368,
789,
436,
1386,
275,
1798,
10903,
479,
347,
20276,
281,
253,
1682,
273,
776,
3640,
436,
310,
253,
806,
789,
326,
6260,
253,
14938,
273,
19349,
3961,
262,
11333,
672,
3280,
19349,
14580,
4428,
44274,
84,
440,
45912,
44667,
398,
1045,
6358,
8050,
249,
2399,
303,
556,
644,
18051,
327,
436,
9400,
323,
1107,
1690,
275,
2067,
9380,
11106,
275,
436,
40336,
20314,
20561,
24088,
253,
4104,
5723,
2824,
2929,
3961,
953,
342,
440,
45912,
44667,
398,
285,
253,
4765,
5723,
2824,
2929,
8350,
19349,
3961,
953,
835,
281,
32014,
849,
604,
387,
512,
513,
256,
1109,
13256,
390,
260,
1109,
13256,
3157,
2220,
841,
3082,
50276,
783,
4679,
2593,
310,
4864,
285,
43288,
3184,
347,
2080,
347,
891,
476,
2028,
352,
3797,
642,
643,
19349,
3961,
262,
49602,
1014,
2167,
2067,
824,
3082,
2226,
285,
403,
5469,
275,
253,
7714,
50276,
262,
310,
1896,
891,
452,
3365,
46485,
1633,
670,
253,
10419,
1060,
326,
44587,
352,
432,
5368,
3082,
533,
604,
594,
840,
436,
943,
320,
4518,
4767,
275,
253,
7714,
2007,
4679,
812,
671,
1361,
4446,
1728,
253,
4451,
10156,
2792,
273,
436,
1332,
253,
7681,
5955,
4620,
973,
11407,
2167,
352,
310,
1892,
323,
479,
281,
7472,
436,
347,
891,
717,
417,
271,
6485,
327,
3961,
262,
11333,
326,
753,
891,
717,
6600,
273,
643,
789,
275,
436,
2170,
326,
13698,
387,
2074,
7342,
690,
273,
534,
310,
11120,
5393,
275,
253,
7714,
253,
2505,
812,
513,
247,
1805,
2628,
32495,
253,
1246,
10419,
432,
1110,
2987,
1097,
949,
3762,
285,
4679,
347,
352,
9572,
253,
1750,
326,
642,
643,
19349,
3961,
262,
1332,
19401,
253,
1083,
273,
440,
45912,
34541,
310,
3365,
3221,
50276,
37585,
5701,
891,
651,
5583,
6240,
247,
3877,
281,
25102,
374,
14951,
327,
253,
897,
273,
340,
85,
281,
9173,
2380,
387,
3790,
246,
347,
436,
310,
2223,
17007,
342,
749,
26804,
26332,
340,
85,
619,
806,
1869,
6523,
340,
85,
369,
2442,
6973,
14951,
26332,
46719,
395,
323,
340,
50276,
5256,
50276,
85,
534,
9090,
2789,
642,
3282,
275,
436,
3634,
50276,
783,
20134,
275,
1736,
50276,
4090,
310,
8489,
21152,
804,
841,
943,
816,
320,
1736,
4090,
50275,
187,
187,
4118,
18435,
27,
13518,
2278,
5717,
368,
323,
634,
19529,
281,
1484,
2284,
285,
253,
4477,
2380,
281,
30628,
7350,
50276,
2520,
2929,
14371,
247,
19349,
3961,
262,
1332,
323,
8925,
253,
1682,
13805,
7268,
1677,
3640,
273,
253,
19349,
4216,
13200,
247,
985,
50276,
783,
2929,
10262,
11333,
323,
2969,
14938,
41458,
342,
44667,
398,
342,
1097,
10527,
1783,
285,
5661,
12820,
50276,
15337,
398,
14109,
253,
2987,
8434,
255,
6460,
285,
2442,
16055,
4893,
824,
347,
275,
35221,
4715
] |
[
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1
] |
[
30003,
310,
1677,
2278,
273,
247,
2561,
2929,
432,
260,
2369,
1793,
6698,
15,
7764,
3630,
247,
6010,
253,
2278,
15,
187,
4118,
8439,
27,
187,
1774,
4722,
1895,
285,
16694,
2900,
4473,
50276,
12311,
10527,
285,
5661,
22861,
50274,
8826,
440,
498,
15752,
327,
7681,
285,
10527,
8213,
253,
19529,
2175,
253,
1895,
273,
8925,
253,
4585,
2877,
13805,
7268,
1677,
25319,
72,
11273,
19349,
4114,
3640,
6941,
323,
247,
4248,
273,
440,
45912,
34541,
253,
4477,
12661,
247,
19349,
3961,
262,
2746,
285,
1263,
697,
14938,
6222,
4645,
28055,
285,
21657,
326,
352,
26662,
49148,
342,
253,
1375,
273,
253,
1445,
247,
26565,
285,
16694,
2746,
281,
247,
2834,
285,
1774,
1895,
891,
452,
767,
3533,
326,
891,
651,
751,
9713,
50275,
18,
891,
717,
417,
2590,
281,
752,
4248,
359,
403,
1663,
6941,
323,
440,
45912,
34541,
1060,
275,
253,
2626,
7098,
71,
273,
2593,
337,
253,
4477,
3630,
326,
597,
403,
12392,
19349,
14580,
342,
440,
45912,
4903,
326,
403,
4651,
273,
387,
1878,
767,
24802,
4903,
533,
275,
2593,
1249,
359,
1646,
281,
5467,
326,
1046,
440,
45912,
4778,
556,
4555,
767,
2151,
452,
359,
1663,
417,
3663,
667,
31376,
1060,
50275,
3062,
15538,
253,
4477,
5467,
326,
253,
1055,
273,
513,
5260,
476,
320,
12724,
5998,
432,
253,
21899,
941,
3815,
2556,
281,
253,
906,
273,
246,
757,
285,
27887,
77,
20652,
281,
407,
253,
4477,
436,
1055,
310,
253,
1083,
604,
627,
310,
642,
12246,
17799,
1854,
12873,
1269,
281,
667,
273,
697,
2151,
533,
1269,
285,
340,
452,
642,
12246,
17799,
1854,
875,
731,
36714,
597,
452,
642,
440,
45912,
2885,
594,
419,
2254,
359,
8025,
1977,
253,
4623,
2238,
273,
34541,
407,
7384,
1548,
18279,
1430,
419,
2254,
359,
816,
8025,
326,
253,
7268,
4778,
285,
253,
2303,
4778,
452,
642,
440,
45912,
1846,
2847,
752,
717,
891,
5816,
50275,
19,
891,
513,
417,
3240,
923,
1880,
253,
4477,
403,
14053,
667,
8310,
43198,
5454,
2727,
253,
5933,
1900,
1057,
253,
1072,
1180,
273,
25612,
273,
253,
21899,
285,
7268,
267,
6174,
594,
891,
9428,
436,
5454,
2727,
310,
440,
7645,
264,
476,
253,
4477,
2319,
849,
253,
2746,
1537,
320,
6508,
281,
6016,
436,
275,
247,
8482,
2146,
1039,
50275,
20,
717,
891,
987,
326,
253,
4679,
7277,
253,
4081,
5933,
760,
342,
20320,
375,
326,
897,
642,
19349,
1491,
387,
512,
310,
436,
1663,
253,
4623,
5301,
943,
2649,
597,
320,
2429,
342,
643,
19349,
3961,
262,
20320,
375,
323,
1650,
1110,
326,
13414,
2395,
323,
440,
45912,
44667,
398,
50275,
37585,
3374,
50274,
74,
13414,
1158,
253,
1650,
275,
4677,
337,
310,
1077,
1175,
1580,
352,
310,
1077,
3898,
666,
917,
281,
3359,
247,
9205,
7268,
275,
436,
4758,
627,
310,
642,
1039,
281,
2740,
253,
5024,
432,
2675,
17989,
4903,
281,
2444,
432,
1728,
1557,
5820,
3966,
2550,
789,
432,
1728,
50275,
74,
812,
897,
247,
24388,
275,
253,
2505,
327,
253,
3064,
875,
1943,
80,
4632,
1943,
3151,
347,
3542,
253,
14023,
275,
24088,
253,
1273,
7098,
71,
327,
2593,
1903,
13414,
1007,
19126,
936,
19934,
50276,
7152,
339,
431,
248,
2929,
16424,
275,
253,
897,
273,
19349,
3210,
323,
3961,
953,
1895,
7296,
440,
45912,
44667,
398,
417,
2783,
275,
2045,
789,
352,
310,
973,
3559,
1690,
271,
9470,
3081,
2144,
342,
1471,
493,
1589,
285,
3081,
4679,
581,
2523,
310,
2139,
417,
2783,
44667,
398,
323,
253,
1083,
273,
18849,
14938,
253,
8813,
1677,
310,
417,
21414,
19148,
9017,
253,
8813,
323,
417,
7296,
44667,
398,
275,
253,
1083,
273,
7358,
267,
800,
14938,
2319,
253,
1617,
273,
278,
68,
50276,
79,
752,
1057,
436,
16084,
352,
310,
1846,
275,
3946,
5474,
339,
431,
248,
767,
11333,
342,
1805,
3045,
26565,
14938,
1783,
285,
25092,
839,
16774,
1543,
40702,
40702,
783,
906,
3133,
4217,
275,
4685,
247,
1077,
2173,
1511,
273,
19349,
4216,
50275,
783,
9380,
9400,
310,
4122,
4623,
253,
9376,
326,
1046,
16070,
494,
4778,
1907,
642,
12246,
17799,
1854,
281,
512,
273,
697,
2151,
310,
1077,
29190,
275,
326,
368,
476,
1900,
4044,
1046,
6174,
10921,
3268,
7239,
69,
1004,
432,
268,
87,
275,
958,
253,
9376,
310,
1014,
10046,
368,
476,
4271,
253,
2644,
3268,
268,
87,
17176,
1004,
253,
41066,
326,
342,
440,
45912,
44667,
398,
36908,
1646,
4209,
40702,
40702,
50276,
783,
8453,
273,
253,
906,
310,
247,
2372,
3710,
1955,
281,
253,
1511,
273,
4216,
2783,
36594,
253,
2929,
4620,
281,
320,
3451,
835,
19349,
17032,
310,
908,
891,
37001,
14938,
1783,
629,
534,
310,
417,
619,
15040,
50276,
22478,
1177,
2278,
690,
273,
253,
3332,
789,
327,
19349,
3961,
953,
285,
3910,
875,
436,
789,
285,
5368,
789,
403,
973,
5469,
19843,
253,
2929,
310,
3839,
4518,
3542,
40702,
40702,
891,
452,
9814,
436,
2929,
1078,
285,
2761,
512,
273,
619,
7350,
403,
1024,
9713,
50276,
249,
2593,
608,
1046,
4778,
310,
1024,
24802,
835,
896,
11806,
17705,
476,
320,
908,
891,
4282,
1880,
253,
896,
11806,
17705,
342,
4651,
347,
22961,
873,
310,
253,
954,
11408,
4327,
627,
778,
320,
247,
1027,
14000,
873,
326,
5644,
281,
2406,
11041,
275,
19349,
1055,
13418,
685,
253,
4651,
407,
253,
1039,
368,
13414,
452,
281,
5467,
326,
1046,
4778,
310,
24802,
533,
368,
476,
816,
1333,
1046,
7239,
69,
1004,
74,
310,
3636,
407,
19427,
323,
697,
4651,
534,
310,
1679,
34617,
50276,
16245,
604,
891,
717,
417,
20854,
13805,
7268,
6125,
1892,
7268,
513,
835,
697,
7285,
310,
2602,
7268,
17697,
7268,
3966,
47736,
7268,
651,
320,
247,
5272,
4327,
50276,
33722,
2180,
43302,
374,
495,
721,
898,
50276,
3431,
34837,
209,
422,
9814,
253,
2045,
2715,
273,
436,
2929,
5474,
339,
431,
248,
5955,
273,
19349,
7274,
281,
3961,
262,
3237,
310,
973,
24013,
8550,
285,
253,
4081,
11333,
403,
16575,
2529,
2167,
891,
858,
417,
564,
949,
253,
27947,
275,
2810,
2508,
253,
1543,
3176,
3590,
50276,
262,
369,
417,
7094,
2590,
281,
479,
849,
841,
11333,
564,
4457,
5368,
789,
436,
1386,
275,
1798,
10903,
479,
347,
20276,
281,
253,
1682,
273,
776,
3640,
436,
310,
253,
806,
789,
326,
6260,
253,
14938,
273,
19349,
3961,
262,
11333,
672,
3280,
19349,
14580,
4428,
44274,
84,
440,
45912,
44667,
398,
1045,
6358,
8050,
249,
2399,
303,
556,
644,
18051,
327,
436,
9400,
323,
1107,
1690,
275,
2067,
9380,
11106,
275,
436,
40336,
20314,
20561,
24088,
253,
4104,
5723,
2824,
2929,
3961,
953,
342,
440,
45912,
44667,
398,
285,
253,
4765,
5723,
2824,
2929,
8350,
19349,
3961,
953,
835,
281,
32014,
849,
604,
387,
512,
513,
256,
1109,
13256,
390,
260,
1109,
13256,
3157,
2220,
841,
3082,
50276,
783,
4679,
2593,
310,
4864,
285,
43288,
3184,
347,
2080,
347,
891,
476,
2028,
352,
3797,
642,
643,
19349,
3961,
262,
49602,
1014,
2167,
2067,
824,
3082,
2226,
285,
403,
5469,
275,
253,
7714,
50276,
262,
310,
1896,
891,
452,
3365,
46485,
1633,
670,
253,
10419,
1060,
326,
44587,
352,
432,
5368,
3082,
533,
604,
594,
840,
436,
943,
320,
4518,
4767,
275,
253,
7714,
2007,
4679,
812,
671,
1361,
4446,
1728,
253,
4451,
10156,
2792,
273,
436,
1332,
253,
7681,
5955,
4620,
973,
11407,
2167,
352,
310,
1892,
323,
479,
281,
7472,
436,
347,
891,
717,
417,
271,
6485,
327,
3961,
262,
11333,
326,
753,
891,
717,
6600,
273,
643,
789,
275,
436,
2170,
326,
13698,
387,
2074,
7342,
690,
273,
534,
310,
11120,
5393,
275,
253,
7714,
253,
2505,
812,
513,
247,
1805,
2628,
32495,
253,
1246,
10419,
432,
1110,
2987,
1097,
949,
3762,
285,
4679,
347,
352,
9572,
253,
1750,
326,
642,
643,
19349,
3961,
262,
1332,
19401,
253,
1083,
273,
440,
45912,
34541,
310,
3365,
3221,
50276,
37585,
5701,
891,
651,
5583,
6240,
247,
3877,
281,
25102,
374,
14951,
327,
253,
897,
273,
340,
85,
281,
9173,
2380,
387,
3790,
246,
347,
436,
310,
2223,
17007,
342,
749,
26804,
26332,
340,
85,
619,
806,
1869,
6523,
340,
85,
369,
2442,
6973,
14951,
26332,
46719,
395,
323,
340,
50276,
5256,
50276,
85,
534,
9090,
2789,
642,
3282,
275,
436,
3634,
50276,
783,
20134,
275,
1736,
50276,
4090,
310,
8489,
21152,
804,
841,
943,
816,
320,
1736,
4090,
50275,
187,
187,
4118,
18435,
27,
13518,
2278,
5717,
368,
323,
634,
19529,
281,
1484,
2284,
285,
253,
4477,
2380,
281,
30628,
7350,
50276,
2520,
2929,
14371,
247,
19349,
3961,
262,
1332,
323,
8925,
253,
1682,
13805,
7268,
1677,
3640,
273,
253,
19349,
4216,
13200,
247,
985,
50276,
783,
2929,
10262,
11333,
323,
2969,
14938,
41458,
342,
44667,
398,
342,
1097,
10527,
1783,
285,
5661,
12820,
50276,
15337,
398,
14109,
253,
2987,
8434,
255,
6460,
285,
2442,
16055,
4893,
824,
347,
275,
35221,
4715
] |
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the authors present a goalconditioned rl method which performs relabelling given a set of demonstrations the main claim is that demonstrations will help guide exploration a lot more efficiently this method is inspired by her and lfd methods the agent starts with a database of demonstrations at training time the agent samples a goal from this database which it updates with successfully reached goal states in the hindsight relabelling process the goals are sampled from the trajectory or from the demonstration database this approach is evaluated on a set of bimanual cable insertion tasks where it outperforms prior approaches such as her and is able to learn with fewer demonstrations than rl + lfd strengths this paper presents a, to my knowledge, novel goalconditioned rl method this approach improves upon hindsight experience replay by incorporating demonstrations and efficiently uses them to set goals i think this way of using goals will be quite useful for goalconditioned rl in general the method is well presented and easy to understand the results are also well analyzed and hindrl shows strong performance when compared to her and other rl + lfd approaches i think the ablations especially looking at the number of demonstrations needed for each method are very detailed and provide good intuition about hindrl weaknesses i think the main weakness of this paper is the lack of diversity in the empirical evaluations it would be good to see results on more complex continuous control tasks in different settings than the bimanual cable insertion for example the tasks from the her paper andrychowicz et al 2017 i am willing to increase my recommendation score if the authors can address this issue the method is novel useful and well presented but the paper lacks diversity in the domains evaluated docsepthis paper studies longhorizon manipulation tasks given sparse rewards and a few demonstrations from a hindsight relabeling-based approach specifically it leverages the demonstration states as relevant goals in the goal relabeling process to guide taskspecific exploration when evaluating the policy the online goals are sampled from a goal dataset composed of successful states and final states from the demonstrations when relabeling the goals for training the goals are sampled from the set of states visited by the demonstrations rather than operating over the raw state observations this method utilizes an encoder which can be either engineered by an expert or learned through selfsupervision to extract a latent representation then the rewards are relabeled in hindsight based on a thresholded distance in the latent space strengths the paper studies a difficult problem in robotic manipulation and shows results on tasks of varying difficulty leveraging demonstrations as taskrelevant goals is intuitive and straightforward the experiments are thorough they study the importance of the goal selection strategy sensitivity to different numbers of demonstrations and how quickly different learned policies can solve the task weaknesses in the hindsight goal selection relabeling the rewards for goals not part of the rollout trajectory necessitates a different reward function separate from the environment reward the separate reward function seems to need to follow a sparse-reward structure to learn a qvalue function from both reward functions this requires choosing a thresholding value epsilon although appendix a8 presents a seemingly general way to define what this should be it ultimately still depends on taskspecific knowledge of how
large the window should be from which epsilon is computed it would be nice to include a sensitivity analysis of this hyperparameter relabeling the goals with just demonstration states can fail to make the sparse-reward problem easier as intended by her if the resulting goal distribution is still relatively narrow for example we could imagine tasks with a wide initial state distribution and a large state space that the demonstrations only sparsely cover and even reaching demo states can be difficult it seems like mixing the demo-driven goal distribution and the distribution over reached states could lead to some benefits as shown in appendix a4 the choice of encoder seems to be quite important with the tcc results being the closest to those with the handengineered features across the four tasks finally some of the implementation/experimental details are currently a bit unclear to me my questions are below which encoder is used for the results in table 1 do the other methods in table 1 operate over the same latent representation as hindrl is the strong performance due to the proposed relabeling scheme or its combination with the extracted representation what is p(z_g) in the definition of r_g is this derived from the demonstration data eg a uniform distribution over the set of demonstration states what does the number of hindsight goal samples hyperparameter refer to is it the total number of hindsight goals ie r_g in figure 7 why is there a dip in success rate for the different methods at around 60k environment steps is this because the bc term in the actor's loss is annealed away minor comments the placement of the figures is a bit awkward for example figure 8 corresponds to sec 5.4 but is placed in sec 5.5 table 6 corresponds to sec 5.2 but is placed in sec 5.3 some of the figures also appear in the paper out of order itd be nice to have a consistent legend across the 4 plots in figure 12 the paper tackles an important and relevant problem solving longhorizon tasks given a few demonstrations the experiments are quite thorough but there are some missing details that seem critical to evaluating the performance of the method compared to prior work also it would be good to include a discussion of scenarios in which this goal relabeling method might fail ie when reaching the demonstration states is difficult and so relabeling still doesnt provide additional learning signal docsepthis paper contributes a method called hindrl which tackles the important problem of sparse reward robotic rl tasks when supplied with demonstrations hindrl seeks to build off of hindsight relabeling to learn a goalconditioned policy which is trained in a selfsupervised manner specifically by relabeling transitions with taskspecific goal candidates from expert demonstrations the agent will be given more taskrelevant feedback by combining the goal selection strategy with goalconditioned rewards which are either engineered or learned through temporal timeconsistency hindrl is demonstrated on simulated sparse-reward robotics tasks where it achieves improved final performance and demo efficiency compared to her the strong points of this paper are a more comprehensive ablation study of topics such as encoder quality effect of number of demonstrations on method performance and goal-distribution sampling which are often important but unclear aspects of implementing these types of goalconditioned methods the use of tcc to learn the latent observation representations seems quite promising as shown in figure 4 and useful for the reward computation
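for concreteness, one compact way to write the thresholded latent-distance reward discussed in the reviews above, assuming an encoder phi and a threshold epsilon; both symbols are illustrative and not notation taken from the paper:

```latex
% sparse goal-conditioned reward from a thresholded distance in latent space
r(s, g) \;=\; \mathbb{1}\!\left[\, \bigl\lVert \phi(s) - \phi(g) \bigr\rVert_2 \le \epsilon \,\right]
```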
the bimanual manipulation task on which the method is evaluated appears quite challenging and the method achieves significant gains on the shown tasks the weak points of this paper are im not sure if the claim made in the related literature section ("however in all those works hindsight goals were always chosen directly from the agents own trajectory") is accurate nair 2018a also selects goals directly from demonstration states using demonstration states as relabeled rewards in itself does not seem novel the engineered encoder representation which is used for the method evaluation seems to essentially provide a more dense reward for the agent which is the l2 distance between the engineered features in this case i would argue that comparisons should be made to her and other relabeling strategies which are given access to this engineered reward function this may already be the case but it was not clear to me when reading the text when moving to use the tcc-based encoder and reward function the performance on some tasks drops significantly which leads me to believe that the performance improvements could largely be due to essentially giving more signal from the human expert additional baselines would make the empirical results more convincing for example reinforcement learning with imagined goals (rig) additional questions how is the number of relabeled goals for each task selected why is it that in figure 9 the performance of the green line for full insertion is around 0.2 whereas it is much higher in table 1 are these the same method or is the difference in number of training steps a few recommendations for the manuscript the reference to table 6 in the text results should be to table 1 instead there seems to be no table 6 section 4.2 was rather hard to parse for example i dont quite understand how "in contrast hindsight goal selection is the process of sampling candidate goal" is different from "instead we sample goals for each time step from trajectory using some candidate goal distribution p(z_g)" my recommendation currently is to reject the paper because i feel that the experimental results seem to indicate that the engineered features are contributing to the difference in performance and thus creating an unfair comparison and the novelty of the goal selection method is not currently apparent to me if the authors would be able to clarify either or both of these points it would help immensely if the methods which are compared to are not being currently given the engineered encodings comparisons to versions which are given that information as well as additional baselines would be helpful docsepthis paper aims to improve the learning effectiveness and data efficiency for longhorizon control tasks under sparse rewards with demonstrations building upon prior works on dpgfd and her the authors proposed a new hindsight goal relabeling technique which is referred to as taskconstrained goalconditioned rl to differentiate against general goalconditioned rl as in prior works specifically instead of selecting goals from the learned agent's rollouts the new technique only selects goals from the successful rollouts and demonstrations rigorous experiments demonstrated that the proposed algorithm outperforms prior works on a very challenging bidextrous peg insertion task in simulation in the aspects of the learned agent's final performance eg task completion rate and data efficiency eg number of demonstrations strong points the final task used in the paper bimanual insertion is challenging enough to convince readers of the
authors' claim that the proposed algorithm is a meaningful improvement on learning longhorizon control tasks the authors breaking it down into several tasks by complexity is very helpful the empirical results are comprehensive and convincing the difference between the proposed algorithm and prior works is big the paper is well motivated the overall storyline is clear and makes sense weak points the amount of novelty is mediocre the main innovation seems to be the new technique for hindsight relabeling technical clarity can be improved especially in the way prior works are introduced in the methodology section which seems a bit highlevel and requires readers to have substantial knowledge in the prior works to understand the full picture additional questions im curious to learn from the authors why her (both the final and future variants) underperforms dpgfd in the bring near and bring near orient tasks as shown in table 1 overall i enjoyed reading this paper while the novelty is not stellar the experimental results are convincing and impressive i recommend acceptance for broader dissemination
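to make the demonstration-driven relabeling scheme described in these reviews concrete, here is a minimal python sketch, assuming an encoder phi, a distance threshold eps, and goal candidates drawn from demonstration states; the function and variable names are illustrative assumptions, not the paper's actual implementation:

```python
import random
import numpy as np

def latent_reward(phi, state, goal, eps=0.05):
    # sparse reward: 1 if the encoded state lies within eps of the encoded goal
    return float(np.linalg.norm(phi(state) - phi(goal)) <= eps)

def relabel_with_demo_goals(trajectory, demo_states, phi, eps=0.05, n_goals=4):
    """Relabel one rollout with hindsight goals drawn from demonstration states.

    trajectory: list of (state, action, next_state) tuples from the agent's rollout
    demo_states: list of states visited by expert demonstrations (goal candidates)
    Returns goal-conditioned transitions (state, action, goal, reward, next_state).
    """
    relabeled = []
    for (state, action, next_state) in trajectory:
        # sample candidate goals from the demonstrations rather than the rollout itself
        goals = random.sample(demo_states, k=min(n_goals, len(demo_states)))
        for goal in goals:
            reward = latent_reward(phi, next_state, goal, eps)
            relabeled.append((state, action, goal, reward, next_state))
    return relabeled

if __name__ == "__main__":
    # toy example: 2-d states with an identity encoder
    phi = lambda s: np.asarray(s, dtype=float)
    demo_states = [np.array([1.0, 1.0]), np.array([2.0, 2.0])]
    rollout = [(np.array([0.0, 0.0]), 0, np.array([0.9, 1.0])),
               (np.array([0.9, 1.0]), 1, np.array([2.0, 2.0]))]
    transitions = relabel_with_demo_goals(rollout, demo_states, phi, eps=0.2)
    print(len(transitions), "relabeled transitions")
```

in practice the relabeled transitions would be added to an off-policy replay buffer alongside the transitions labeled with the original environment reward.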
### Summary:
|
this paper proposes a method to improve the sample efficiency of the her algorithm by sampling goals from a distribution that is learned from human demonstrations empirical results on a simulated robotic insertion task show that the proposed method enjoys a better sample efficiency compared to her the reviewers find the paper wellwritten overall and the proposed idea reasonable however there are concerns regarding the limited novelty of the proposed method which seems incremental also the empirical evaluation suffers from a lack of diversity the considered tasks are virtually all equivalent to an insertion task the paper would benefit from further empirical evaluations that include tasks such as those considered in the original her paper
|
[
275,
2087,
253,
1332,
310,
973,
3559,
285,
3477,
281,
2096,
253,
1543,
403,
671,
973,
5867,
285,
17134,
8435,
2722,
2266,
3045,
672,
2429,
281,
617,
285,
643,
391,
77,
50276,
77,
9194,
7274,
891,
1158,
253,
490,
77,
569,
3340,
2819,
387,
253,
1180,
273,
32367,
3058,
323,
1016,
1332,
403,
1077,
7000,
285,
2085,
1175,
30328,
670,
17134,
8435,
50275,
20881,
1255,
265,
50275,
74,
1158,
253,
2022,
14855,
273,
436,
2929,
310,
253,
3480,
273,
9991,
275,
253,
13974,
1179,
27163,
352,
651,
320,
1175,
281,
923,
1543,
327,
625,
2570,
5415,
1453,
8892,
275,
1027,
7533,
685,
253,
270,
25884,
780,
10856,
16941,
323,
1650,
253,
8892,
432,
253,
617,
2929,
285,
610,
348,
319,
25928,
1162,
355,
4240,
891,
717,
7378,
281,
2572,
619,
17401,
4868,
604,
253,
4477,
476,
2953,
436,
2523,
50274,
783,
1332,
310,
4460,
4217,
285,
973,
3559,
533,
253,
2929,
19756,
9991,
275,
253,
10625,
6760,
50276,
7152,
33032,
2520,
2929,
2175,
1048,
1688,
21148,
19763,
8892,
1677,
23507,
23267,
285,
247,
1643,
32367,
432,
247,
17134,
18347,
774,
1492,
272,
3169,
2746,
5742,
352,
19732,
1131,
253,
20028,
3054,
347,
4623,
7342,
275,
253,
4736,
774,
1492,
272,
1232,
281,
7102,
8892,
29765,
17947,
50276,
9453,
16344,
253,
3646,
253,
3909,
7342,
403,
19958,
432,
247,
4736,
10895,
9924,
273,
5547,
3054,
285,
2457,
3054,
432,
253,
32367,
672,
774,
1492,
272,
253,
7342,
323,
3733,
253,
7342,
403,
19958,
432,
253,
873,
273,
3054,
11580,
407,
253,
32367,
2581,
685,
6498,
689,
253,
9305,
1375,
7313,
436,
1332,
29820,
271,
32049,
534,
476,
320,
2057,
28136,
407,
271,
6485,
390,
6311,
949,
1881,
12185,
4694,
281,
4908,
247,
21624,
6779,
840,
253,
23267,
403,
774,
1492,
264,
275,
17134,
18347,
1754,
327,
247,
7887,
264,
4181,
275,
253,
21624,
2317,
20544,
50276,
783,
2929,
2175,
247,
2834,
1895,
275,
35121,
19763,
285,
2722,
1543,
327,
8892,
273,
11962,
10183,
19732,
2977,
32367,
347,
4836,
15477,
7342,
310,
27350,
285,
15246,
50275,
783,
4679,
403,
11080,
597,
1263,
253,
6349,
273,
253,
4736,
5438,
5700,
7340,
281,
1027,
3904,
273,
32367,
285,
849,
4541,
1027,
6311,
7823,
476,
8415,
253,
4836,
50276,
20881,
1255,
265,
50276,
249,
253,
17134,
18347,
4736,
5438,
774,
1492,
272,
253,
23267,
323,
7342,
417,
629,
273,
253,
4533,
483,
18974,
2436,
36269,
247,
1027,
10921,
1159,
4858,
432,
253,
3126,
10921,
253,
4858,
10921,
1159,
3133,
281,
878,
281,
956,
247,
23507,
250,
1034,
2605,
281,
3037,
247,
2805,
2877,
1159,
432,
1097,
10921,
3470,
436,
4419,
13887,
247,
7887,
272,
1318,
299,
4277,
3738,
30762,
247,
25,
10262,
247,
16907,
2087,
1039,
281,
4853,
752,
436,
943,
320,
352,
9142,
1335,
7024,
327,
8892,
29765,
3640,
273,
849,
1781,
253,
3497,
943,
320,
432,
534,
299,
4277,
310,
10302,
352,
651,
320,
5322,
281,
2486,
247,
7340,
1783,
273,
436,
4373,
19484,
50275,
1661,
1492,
272,
253,
7342,
342,
816,
20028,
3054,
476,
1891,
281,
1056,
253,
23507,
250,
1034,
1895,
6927,
347,
6034,
407,
617,
604,
253,
4795,
4736,
3268,
310,
1335,
4942,
6891,
323,
1650,
359,
812,
8564,
8892,
342,
247,
4618,
3302,
1375,
3268,
285,
247,
1781,
1375,
2317,
326,
253,
32367,
760,
37139,
600,
3835,
285,
1014,
10922,
22020,
3054,
476,
320,
2834,
352,
3133,
751,
12480,
253,
1471,
351,
1069,
257,
4736,
3268,
285,
3268,
689,
4925,
3054,
812,
1421,
281,
690,
5373,
347,
2011,
275,
30762,
247,
21,
50275,
783,
4327,
273,
32049,
3133,
281,
320,
3240,
1774,
342,
253,
246,
550,
1543,
1146,
253,
8642,
281,
1110,
342,
253,
1133,
15179,
2122,
3386,
2439,
253,
1740,
8892,
50276,
71,
3341,
690,
273,
253,
7092,
49363,
4278,
403,
4390,
247,
2372,
12744,
281,
479,
619,
3533,
403,
2708,
50275,
4609,
32049,
310,
908,
323,
253,
1543,
275,
2829,
337,
513,
253,
643,
3082,
275,
2829,
337,
10196,
689,
253,
1072,
21624,
6779,
347,
17134,
8435,
310,
253,
2266,
3045,
1955,
281,
253,
4081,
774,
1492,
272,
6974,
390,
697,
5019,
342,
253,
10375,
6779,
50275,
5371,
310,
268,
42148,
275,
253,
5426,
273,
28937,
310,
436,
6012,
432,
253,
20028,
941,
24088,
247,
6447,
3268,
689,
253,
873,
273,
20028,
3054,
50275,
5371,
1057,
253,
1180,
273,
17134,
18347,
4736,
3530,
4373,
19484,
3730,
281,
310,
352,
253,
2264,
1180,
273,
17134,
18347,
7342,
26332,
28937,
50275,
249,
4677,
818,
2139,
310,
627,
247,
12539,
275,
2323,
2281,
323,
253,
1027,
3082,
387,
1475,
3925,
76,
3126,
5018,
310,
436,
984,
253,
49501,
1307,
275,
253,
14142,
2957,
310,
27175,
3256,
1977,
50276,
37585,
5701,
50276,
783,
14663,
273,
253,
8442,
310,
247,
2372,
19328,
323,
1650,
4677,
854,
10140,
281,
4706,
8255,
533,
310,
4845,
275,
4706,
7288,
2829,
721,
10140,
281,
4706,
8073,
533,
310,
4845,
275,
4706,
8676,
690,
273,
253,
8442,
671,
3176,
275,
253,
2929,
562,
273,
1340,
50275,
262,
69,
320,
5322,
281,
452,
247,
5185,
13691,
2439,
253,
577,
14777,
275,
4677,
1249,
253,
2929,
39223,
271,
1774,
285,
4623,
1895,
16161,
1048,
1688,
21148,
8892,
1677,
247,
1643,
32367,
253,
4679,
403,
3240,
11080,
533,
627,
403,
690,
5816,
4278,
326,
1646,
4619,
281,
16344,
253,
3045,
273,
253,
1332,
2429,
281,
2720,
789,
671,
352,
651,
320,
1175,
281,
2486,
247,
5955,
273,
15216,
275,
534,
436,
4736,
774,
1492,
272,
1332,
1537,
1891,
26332,
672,
10922,
253,
20028,
3054,
310,
2834,
285,
594,
774,
1492,
272,
1335,
36908,
2085,
3081,
4715,
2625,
5474,
33032,
2520,
2929,
17904,
247,
1332,
1925,
17134,
8435,
534,
39223,
253,
1774,
1895,
273,
23507,
10921,
35121,
391,
77,
8892,
672,
12164,
342,
32367,
17134,
8435,
14993,
281,
1973,
745,
273,
17134,
18347,
774,
1492,
272,
281,
3037,
247,
4736,
44321,
3646,
534,
310,
10166,
275,
247,
1881,
35421,
5133,
5742,
407,
774,
1492,
272,
16307,
342,
8892,
29765,
4736,
9183,
432,
6485,
32367,
253,
5570,
588,
320,
1677,
625,
4836,
15477,
8680,
407,
16248,
253,
4736,
5438,
5700,
342,
4736,
44321,
23267,
534,
403,
2057,
28136,
390,
6311,
949,
11935,
673,
46540,
1371,
17134,
8435,
310,
5183,
327,
15524,
23507,
250,
1034,
15688,
982,
8892,
835,
352,
33526,
5520,
2457,
3045,
285,
22020,
6733,
2429,
281,
617,
50275,
783,
2266,
2792,
273,
436,
2929,
403,
50276,
66,
625,
11088,
28913,
1263,
273,
12989,
824,
347,
32049,
3290,
1055,
273,
1180,
273,
32367,
327,
1332,
3045,
285,
4736,
35360,
10491,
534,
403,
2223,
1774,
533,
12744,
7794,
273,
16994,
841,
3510,
273,
4736,
44321,
3082,
50276,
783,
897,
273,
246,
550,
281,
3037,
253,
21624,
8310,
14237,
3133,
3240,
12532,
347,
2011,
275,
4677,
577,
285,
4217,
323,
253,
10921,
13782,
50276,
783,
270,
25884,
780,
19763,
4836,
534,
310,
6760,
327,
4620,
3240,
11132,
285,
253,
1332,
33526,
1534,
15988,
327,
253,
2011,
8892,
50276,
783,
5075,
2792,
273,
436,
2929,
403,
50276,
303,
417,
2119,
604,
253,
1750,
1160,
275,
253,
2905,
6239,
2593,
2299,
275,
512,
1110,
2987,
17134,
18347,
7342,
497,
1900,
6777,
3587,
432,
253,
6083,
1211,
18974,
310,
7899,
50276,
79,
1094,
4765,
66,
671,
34899,
7342,
3587,
432,
20028,
3054,
970,
20028,
3054,
347,
774,
1492,
264,
23267,
275,
3139,
1057,
417,
1646,
4460,
50276,
783,
28136,
32049,
6779,
534,
310,
908,
323,
253,
1332,
7103,
3133,
281,
9093,
2085,
247,
625,
14086,
10921,
323,
253,
5570,
534,
310,
253,
11591,
19,
4181,
875,
253,
28136,
3386,
275,
436,
1083,
891,
651,
9059,
326,
14023,
943,
320,
1160,
281,
617,
285,
643,
774,
1492,
272,
8130,
534,
403,
1677,
2289,
281,
436,
28136,
10921,
1159,
436,
778,
2168,
320,
253,
1083,
533,
352,
369,
417,
2590,
281,
479,
672,
4361,
253,
2505,
672,
4886,
281,
897,
253,
246,
550,
3169,
32049,
285,
10921,
1159,
253,
3045,
327,
690,
8892,
15323,
3012,
534,
5644,
479,
281,
2868,
326,
253,
3045,
11701,
812,
8127,
320,
1955,
281,
9093,
4933,
625,
2625,
432,
253,
1966,
6485,
50275,
38092,
1666,
25379,
651,
1056,
253,
16774,
1543,
625,
21414,
323,
1650,
281,
35221,
4715,
342,
18998,
7342,
8132,
50275,
38092,
3533,
50276,
5430,
403,
253,
1180,
273,
774,
1492,
264,
7342,
323,
1016,
4836,
4236,
50276,
22309,
310,
352,
326,
275,
4677,
898,
253,
3045,
273,
253,
4759,
1386,
323,
2120,
16941,
310,
1475,
16261,
5727,
352,
310,
1199,
2169,
275,
2829,
337,
403,
841,
253,
1072,
1332,
390,
310,
253,
3064,
275,
1180,
273,
3733,
5018,
50276,
66,
1643,
12645,
323,
253,
7714,
50276,
783,
3806,
281,
2829,
721,
275,
253,
2505,
1543,
943,
320,
281,
2829,
337,
3185,
627,
3133,
281,
320,
642,
2829,
721,
50276,
4674,
5976,
369,
2581,
] |
[ 1, 1, ..., 1 ] |
[ ... ] |
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the paper proposes sagemix a data augmentation technique for point clouds similar to the mixup family of data augmentations sagemix mixes two point clouds it tries to mix point clouds in a saliencyguided way to preserve the salient local structures experiments have been conducted to show the efficacy of the method strengths the paper is wellwritten and easy to follow it is nice that this data augmentation can lead to more robust networks as shown in table 3 weakness a critical weakness is that experimental findings do not appear to be very significant the performance difference between various techniques is not very large and no error margin has been reported table 2 it would be nice if the paper could provide some mean std measures this could be done by running the same experiments multiple times with random initialization and reporting the mean and variance this is particularly important as pointbased benchmark methods can have significant variations across runs most experiments in the paper are limited to point cloud classification the experiment on partsegmentation has not been described in detail it would be nice to see experiments similar to table 2 for segmentation as well this would help in showing that the technique can be used beyond classification na docsepthis paper presents a method for the data augmentation of 3d point clouds the proposed saliencyguided mixup for point clouds sagemix preserves discriminative local structures and generates continuous samples with smoothly varying mixing ratios here saliency is computed based on the impact to the corresponding task measured through the gradients of the loss experimental results show that sagemix brings consistent and significant improvements over stateoftheart mixup methods strengths 1 the saliencyguided sequential sampling is technically novel 2 there are some ablation studies to demonstrate the effect of the proposed method 3 overall the paper is well organised weaknesses 1 the difference between 2d and 3d mixupbased methods is not insightfully analysed in the introduction 2 the shapepreserving continuous mixup component just follows the mainstream method and thus the novelty is limited 3 the experimental results is not extensive the proposed method is only quantitatively compared with pointmixup and rsmix and there is also a lack of qualitative results na docsepthis paper proposes a novel saliency guided mixup method for point clouds it first utilizes saliency to find a query point for each of the two point clouds then it uses an rbf kernel around the query point to calculate the blending weights for each of the points this method generalizes pointmixup 2 and shows superior performance against existing ones strengths the idea of introducing saliency guidance to the mixup method in point clouds sounds reasonable as it has been proven to be useful in imagebased methods 11 12 26 the design of sequentially sample two remotely located query salient points is delicate it avoids the overlapping problem while preserving the important local structure of the two point clouds the experiments seem sufficient weaknesses i didnt see any major weaknesses in this paper one potential improvement to the method could be rotating and translating the whole point clouds to move the two query points far away from each other in my opinion its a more elegant way nevertheless i think its fine to leave it as a future work na
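The blending step the reviewer describes — an RBF kernel centred on a saliency-picked query point that gives each point a mixing weight — can be made concrete with a small sketch. This is only an illustration of that idea, not the paper's implementation: the function and variable names, the bandwidth value, and the weight normalisation are invented here, and the sketch assumes the two clouds are already index-aligned, which the actual method need not require.

```python
import numpy as np

def rbf_blend_weights(points, query, bandwidth=0.3):
    # Gaussian/RBF falloff of each point's weight with its squared distance
    # to the salient query point: points near the query keep more of their
    # own cloud, far-away points are dominated by the other cloud.
    d2 = np.sum((points - query) ** 2, axis=1)
    return np.exp(-d2 / (2.0 * bandwidth ** 2))

# Toy usage on two random clouds; qa/qb stand in for saliency-picked queries.
rng = np.random.default_rng(0)
cloud_a = rng.normal(size=(1024, 3))
cloud_b = rng.normal(size=(1024, 3))
qa, qb = cloud_a[0], cloud_b[0]
wa = rbf_blend_weights(cloud_a, qa)
wb = rbf_blend_weights(cloud_b, qb)
lam = wa / (wa + wb + 1e-12)          # per-point mixing ratio in (0, 1)
mixed = lam[:, None] * cloud_a + (1.0 - lam)[:, None] * cloud_b
```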
### Summary:
|
this paper studies the point cloud data mixup with the saliency guidance the proposed sagemix focus on the mixup over the local regions to preserve salient structures which are more informative for downstream tasks the whole paper is well organized with clear logic to follow the proposed method is simple but effective moreover there are solid experiments in various tasks including object classification parts segmentation and calibration to comprehensively evaluate proposed methods one of the major concerns is the limited improvements over the standard mixup reviewer vlst on pointnet and the discussion of 2d and 3d mixup can be enriched in the aspects of technical challenges and novelties reviewer ygrl this paper includes five different tasks and four benchmarks in experimental studies that strongly address the third major concern in the limited evaluation of reviewer ygrl who however has not provided any feedback after the authors rebuttal considering the overall contributions in methods and solid evaluation this submission is slightly above the bar of acceptance
|
[ ... ] |
[ 1, 1, ..., 1 ] |
[ ... ] |
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper studies the problem of sampling from nonsymmetrical determinantal point processes ndpps in particular this paper focuses on exact sampling for lowrank ndpps the main contributions of this paper are 1 this paper proposes to adapt the choleskydecompositionbased sampler for dpps to a lineartime sampler for lowrank ndpps rank of ndpp size of the ground set 2 this paper proposes to use rejection sampling to implement a sublineartime assuming that the of rejections are bounded by a small constant sampler for a subclass of ndpps called orthogonal ndpps ondpps by leveraging an existing sublineartime sampler for dpps 3 this paper shows that empirically in terms of modeling realworld datasets ondpps are as effective as the general ndpps and more importantly we can efficiently learn ondpps in a way such that when it comes to rejection sampling the of rejections is bounded by a small number this paper is overall wellwritten easy to understand and both the theoretical and empirical results presented in this paper are very exciting compared to the lineartime choleskydecompositionbased sampler the rejectionsamplingbased method is definitely the main focus of the paper though the rejectionsampling method is not strictly sublineartime in general the theoretical result presented in this paper theorem 2 directly implies that in practice we can easily allow efficient sampling by bounding the expected of rejections in learning ndpps ondpps more importantly the authors conducted comprehensive experiments to show that in learning ndpps by imposing the constraints orthogonality extra regularization term we need for efficient sampling we are not losing the expressive power of ndpps experiments also show that compared to the lineartime sampling algorithm the rejectionsampling method scales much better on both synthetic and realworld datasets ndpps are a strictly more expressive class compared to dpps but the absence of efficient sampling algorithms puts a major barrier from replacing dpps by ndpps in largescale applications by proposing an efficient sublineartime sampling algorithm for ndpps this paper makes substantial progress in scaling up ndpps and opens up new avenues for applying ndpps to various realworld scenarios cons 1 compared to the rejectionsampling method the lineartime algorithm is not as interesting and novel and somewhat deviating from the main story authors might want to consider presenting the lineartime algorithm without getting into the technical details of course the experiments that compares the rejectionsampling method against the lineartime algorithm is still important 2 section 42 is a little bit difficult to follow for those who are not familiar with the treebased sampling algorithm for dpps authors might want to expand this section a little bit by proposing an efficient sublineartime under certain assumptions exact sampling algorithm for ndpps this work makes substantial progress in scaling up ndpps to realworld applications this paper is overall wellwritten and easy to follow and it also demonstrates a decent amount theoretical novelties docsepthe paper provides efficient linear time in item set size algorithms for exact sampling from nonsymmetric determinantal point processes ndpps using a new ndpp decomposition which was introduced recently in an iclr 2021 oral gartrell etal and also a learning algorithm to learn ndpp kernels which are more amenable to some of their sampling algorithms they also provide empirical results comparing their learned kernels with previous works 
and compare times for their various sampling algorithms the paper is well written clear and interesting to read they are the first to give efficient algorithms for exact sampling from ndpps the previous sampling algorithm takes time cubic in ground set size which is impractical for realworld data their work can lead to more applications of ndpps in practical settings it is also timely and adds to the growing literature on ndpps some comments for equations 2 and 3 and similar probabilities elsewhere in the paper which only involve singleton sets it might help the reader if you use pri in y j in y rather than pri subseteq y j subseteq y also for these specific equations you mention that poulson 2019 shows them via the cholesky decomposition but that seems overkill for these specific equations 2 and 3 you can just derive equation 2 by computing fracprij subseteq yprj subseteq y fracdet kijdetkj and equation 3 also by fracpri in y j notin yprj in y fracpri in y pr ij subseteq yprj in y fracdet ki det kijdetkj it seems like poulson 2019 uses lu decomposition in proposition 3 and 5 because their propositions are more general and apply to disjoint subsets possibly having more than one element also im not very familiar with these factorizations but seems like lu decomposition is different from cholesky decomposition at least from my understanding reading the wiki for cholesky decomposition experts might be irritated by this or maybe its fine in which case please let me know overall i like the paper and recommend acceptance this is the first paper to give efficient algorithms for exact sampling from ndpps which can be used in a variety of realworld applications like recommender systems etc and thus this paper can have a good impact docsepthis paper studies scalable sampling methods for ndpps by assuming lowrank structure of the kernel the paper proposes a lineartime sampling method which is faster than previous sampling algorithm with cubic runtime for general kernel furthermore this paper develops a scalable sublineartime rejection sampling algorithm for a subclass of lowrank ndpps that are called ondpps through experiments it is shown that the predictive performance of ondpps is not degraded compared to ndpps by adding a regularization term in the optimization the rejection probability is greatly reduced this paper proposes two scalable sampling algorithms for ndpps one for lowrank kernel and the other for low rank orthogonal kernel the efficiency of the proposed algorithms are verified through experiments and the predictive performance is not degrade this paper is well written and the studies could benefit relevant researches on ndpps i have some minor comments which should be easy to fix or answer for equ 4 and equ 5 please give a definition of bf zj in section 41 the eigendecomposition of bf vbf vt is not used in algorithm 2 could we remove it to avoid the confusion in section 41 the dimensions of bf z and bf x do not match when k is odd in section 5 could you elaborate on why if bf vnotperpbf b bf l will have rank 2k what happens if bf vbf v1bf b where bf v1perpbf b overall i think this is a well written paper my only concern is that the more scalable sampling algorithm is achieved by imposing more restrictions on the kernel which might limit its usage however this might not be a big issue because it is shown in the experiments that its predictive performance is not degraded docsepthis paper gives an exact sampling algorithm to sample from dpps based on nonsymmetrical lowrank kernel matrices 
of the form lvvtop bddtopbtop where v and b are of size m by k and d is square nonsymmetric and of size k by k the marginal kernel of such a dpp may be written in the form kzwztop with z a m by 2k matrix and w a square matrix of size 2k that is not symmetric the authors then discuss in section 2 how poulsons algorithm is adapted to sample from such ndpps in section 3 show that as l is lowrank the generic om3 cost of poulsons algorithm can be reduced to omk2 as updates of the marginal kernel can be efficiently done by only updating the inner w matrix in section 4 give a sublineartime rejectionsampling based algorithm adapting the treebased algorithm of gillenwater et al in this section they propose a welladapted dpp based on a symmetric kernel hatl that is easytosample it is a dpp based on a symmetric psd kernel and can be sampled from many different fast existing algorithms and for which any set y verifies detlyleq dethatly where the upperbound is actually attained thm 1 in order to exactly control the rejection rate of their proposed algorithm that is equal to dethatli detli the authors simplify the model by supposing that v and b are orthogonal to each other thm 2 in passing the authors observe that the complexity of gillenwater et als algorithm can in fact be easily improved by a factor k where k is the size of the sampled set in sections 5 and 6 experiments are provided comparing their samping method with the stateoftheart the paper is wellwritten even though it is a somewhat straightforward mix of ideas from gartrell et al motivating such ndpps gillenwater et al sublinear sampling algorithm of symmetric dpps and poulson choleskybased sampling of dpps and ndpps i believe that the results are interesting enough for acceptance the main contribution of the paper is to my eyes to have found a symmetric dpp that is wellsuited for rejection sampling the proof that forall y detlyleq dethatly essential for the rejection sampling framework to work is an interesting contribution and could be used in other works on dpps as well in passing i find the proof hard to follow and suggest to the authors to do their best to find ways of simplifying it as much as possible for the cameraready version the simpler the proof the more easily it could be transferred to other scenarios increasing the potential impact of the paper as for theorem 2 the orthogonality constraint is frustrating and i would imagine that further efforts could lift this constraint and obtain a bound that depends on how orthogonal both subspaces are and recovering the current bound when they are indeed orthogonal on the other hand ondpps can indeed be motivated for learning as they indeed ensure that the full available rank is put to contribution the empirical results tend to validate this intuitive argument all in all the small lack of true originality is compensated by two useful theorems that can prove useful to the community
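The inline derivation in this review has lost its LaTeX markup ("fracprij subseteq yprj subseteq y fracdet kijdetkj"). Read with the standard convention that a DPP with marginal kernel K satisfies Pr[S ⊆ Y] = det K_S, the identities the reviewer appears to be invoking are the usual conditional marginals written below; whether these coincide exactly with the paper's equations 2 and 3 is an assumption here.

```latex
\Pr[i \in Y \mid j \in Y]
  = \frac{\Pr[\{i,j\} \subseteq Y]}{\Pr[\{j\} \subseteq Y]}
  = \frac{\det K_{\{i,j\}}}{\det K_{\{j\}}},
\qquad
\Pr[i \in Y \mid j \notin Y]
  = \frac{\Pr[\{i\} \subseteq Y] - \Pr[\{i,j\} \subseteq Y]}{1 - \Pr[\{j\} \subseteq Y]}
  = \frac{\det K_{\{i\}} - \det K_{\{i,j\}}}{1 - \det K_{\{j\}}}.
```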
### Summary:
|
this is an exciting paper that provide the efficient algorithms for exact sampling from ndpps along with theoretical results that are very pertinent in and out themselves the ac agree with the reviewers that the authors satisfactorily addressed the concerns raised in the reviews and is convinced that the revised version will be greatly appreciated by the community we very much encourage the authors to pursue this line of work and in particular to overcome the practical restriction to the ondpp subclass
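As a reading aid for the rejection-sampling idea discussed in the reviews above, the skeleton it relies on is generic: draw a candidate set from the easy, symmetric proposal DPP and accept it with probability det L_Y / det Lhat_Y, which the paper's bound keeps at most 1. The sketch below is hypothetical — the names are invented, and a real implementation would draw the proposal with the tree-based sampler and compute the determinant ratios from the low-rank factors.

```python
import numpy as np

def rejection_sample(sample_proposal, accept_prob, rng=None, max_tries=10_000):
    # Generic rejection sampler: keep drawing candidate sets Y from the
    # proposal until one is accepted with probability
    # accept_prob(Y) = det(L_Y) / det(Lhat_Y) <= 1.
    rng = rng or np.random.default_rng()
    for _ in range(max_tries):
        y = sample_proposal()
        if rng.random() < accept_prob(y):
            return y
    raise RuntimeError("no candidate accepted; the proposal may be too loose")
```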
|
[ ... ] |
[
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1
] |
[
273,
40515,
377,
50276,
3281,
273,
253,
3216,
873,
374,
436,
2929,
29328,
281,
897,
18235,
10491,
281,
3359,
247,
749,
1282,
435,
553,
7384,
326,
253,
50276,
1171,
294,
11817,
403,
11542,
407,
247,
1355,
3638,
1775,
17407,
323,
247,
35851,
273,
40515,
44361,
1925,
19627,
40515,
44361,
327,
69,
44361,
407,
19732,
2977,
271,
5368,
749,
1282,
435,
553,
1775,
17407,
323,
277,
44361,
495,
436,
2929,
2722,
326,
45190,
275,
2426,
273,
14053,
1524,
10186,
15302,
327,
69,
44361,
403,
347,
3576,
347,
253,
2087,
40515,
44361,
285,
625,
15538,
359,
476,
14556,
3037,
327,
69,
44361,
275,
247,
1039,
824,
326,
672,
352,
3249,
281,
18235,
10491,
253,
50276,
1171,
294,
11817,
310,
11542,
407,
247,
1355,
1180,
436,
2929,
310,
4583,
973,
15720,
3477,
281,
2096,
285,
1097,
253,
10527,
285,
16774,
1543,
3559,
275,
436,
2929,
403,
1077,
12302,
50276,
3118,
1096,
281,
253,
1386,
435,
553,
448,
3841,
4742,
615,
42190,
3169,
1775,
17407,
253,
294,
11817,
312,
4906,
3169,
1332,
310,
7964,
253,
2022,
2770,
273,
253,
2929,
2167,
253,
294,
11817,
312,
4906,
1332,
310,
417,
13714,
749,
1282,
435,
553,
275,
2087,
253,
10527,
906,
3559,
275,
436,
2929,
10012,
374,
3587,
8018,
326,
275,
3946,
359,
476,
4354,
1581,
5919,
10491,
407,
41113,
253,
3264,
50276,
1171,
294,
11817,
275,
4715,
40515,
44361,
327,
69,
44361,
625,
15538,
253,
4477,
5196,
11088,
4679,
281,
921,
326,
275,
4715,
40515,
44361,
407,
23254,
253,
10806,
9373,
38931,
1319,
50276,
24124,
37820,
1307,
359,
878,
323,
5919,
10491,
359,
403,
417,
10305,
253,
43541,
1612,
273,
40515,
44361,
4679,
671,
921,
326,
2429,
281,
253,
1386,
435,
553,
10491,
5933,
253,
294,
11817,
312,
4906,
1332,
11498,
1199,
1805,
327,
1097,
13506,
285,
1524,
10186,
15302,
50276,
2109,
44361,
403,
247,
13714,
625,
43541,
966,
2429,
281,
277,
44361,
533,
253,
5928,
273,
5919,
10491,
11333,
12516,
247,
2201,
11394,
432,
15706,
277,
44361,
407,
40515,
44361,
275,
1236,
2510,
25912,
4893,
407,
36636,
271,
5919,
749,
1282,
435,
553,
10491,
5933,
323,
40515,
44361,
436,
2929,
2789,
6832,
4780,
275,
13642,
598,
40515,
44361,
285,
13279,
598,
747,
44201,
323,
9433,
40515,
44361,
281,
2710,
1524,
10186,
15216,
50275,
5040,
337,
2429,
281,
253,
294,
11817,
312,
4906,
1332,
253,
1386,
435,
553,
5933,
310,
417,
347,
4722,
285,
4460,
285,
8489,
1474,
15544,
432,
253,
2022,
2926,
4477,
1537,
971,
281,
1908,
15250,
253,
1386,
435,
553,
5933,
1293,
2970,
715,
253,
7681,
4278,
273,
2282,
253,
4679,
326,
26662,
253,
294,
11817,
312,
4906,
1332,
1411,
253,
1386,
435,
553,
5933,
310,
1335,
1774,
50276,
19,
2593,
5976,
310,
247,
1652,
2372,
2834,
281,
956,
323,
1110,
665,
403,
417,
7615,
342,
253,
5202,
3169,
10491,
5933,
323,
277,
44361,
4477,
1537,
971,
281,
5645,
436,
2593,
247,
1652,
2372,
50276,
1615,
36636,
271,
5919,
749,
1282,
435,
553,
762,
2176,
13260,
3242,
10491,
5933,
323,
40515,
44361,
436,
789,
2789,
6832,
4780,
275,
13642,
598,
40515,
44361,
281,
1524,
10186,
4893,
436,
2929,
310,
4583,
973,
15720,
285,
3477,
281,
956,
285,
352,
671,
14371,
247,
12524,
2408,
10527,
4460,
2890,
5474,
339,
431,
248,
2929,
3400,
5919,
4872,
673,
275,
5382,
873,
1979,
11333,
323,
3242,
10491,
432,
14122,
25562,
27152,
267,
1127,
4870,
40515,
44361,
970,
247,
747,
40515,
377,
14717,
534,
369,
5611,
4102,
275,
271,
17857,
32888,
43425,
7946,
305,
435,
11436,
1162,
267,
285,
671,
247,
4715,
5933,
281,
3037,
40515,
377,
34501,
534,
403,
625,
42133,
281,
690,
273,
616,
10491,
11333,
597,
671,
2085,
16774,
1543,
10941,
616,
6311,
34501,
342,
2045,
2987,
285,
7277,
2069,
323,
616,
2710,
10491,
11333,
253,
2929,
310,
973,
3542,
2590,
285,
4722,
281,
1239,
597,
403,
253,
806,
281,
1918,
5919,
11333,
323,
3242,
10491,
432,
40515,
44361,
253,
2045,
10491,
5933,
3936,
673,
23664,
275,
3216,
873,
1979,
534,
310,
45783,
323,
1524,
10186,
941,
616,
789,
476,
1421,
281,
625,
4893,
273,
40515,
44361,
275,
8542,
7533,
352,
310,
671,
14793,
285,
11323,
281,
253,
5675,
6239,
327,
40515,
44361,
50276,
8826,
5701,
323,
7424,
374,
285,
495,
285,
2074,
20552,
11358,
275,
253,
2929,
534,
760,
6388,
47736,
5239,
352,
1537,
1361,
253,
9414,
604,
368,
897,
2235,
275,
340,
50276,
75,
275,
340,
2581,
685,
2235,
8578,
2574,
340,
50276,
75,
8578,
2574,
340,
671,
323,
841,
2173,
7424,
368,
3748,
326,
268,
3941,
1665,
6247,
2722,
731,
3066,
253,
448,
3841,
4742,
14717,
533,
326,
3133,
689,
24212,
323,
841,
2173,
7424,
374,
285,
495,
368,
476,
816,
15313,
5150,
374,
407,
12672,
1315,
317,
3225,
75,
8578,
2574,
340,
1087,
75,
8578,
2574,
340,
50276,
1124,
5992,
465,
1944,
5992,
31169,
285,
5150,
495,
671,
407,
1315,
317,
3225,
275,
340,
480,
417,
249,
340,
1087,
75,
275,
340,
50276,
1124,
3225,
275,
340,
50276,
1087,
891,
75,
8578,
2574,
340,
1087,
75,
275,
340,
50276,
1124,
5992,
25130,
50276,
5992,
465,
1944,
5992,
31169,
352,
3133,
751,
268,
3941,
1665,
6247,
4648,
26535,
14717,
275,
13989,
495,
285,
608,
984,
616,
39325,
403,
625,
2087,
285,
4647,
281,
28465,
20077,
6830,
1907,
625,
685,
581,
3284,
671,
516,
417,
1077,
7615,
342,
841,
2803,
5904,
533,
3133,
751,
26535,
14717,
310,
1027,
432,
448,
3841,
4742,
14717,
387,
1878,
432,
619,
4685,
4361,
253,
35372,
323,
448,
3841,
4742,
14717,
10071,
1537,
320,
17339,
456,
407,
436,
390,
5046,
697,
4030,
275,
534,
1083,
4496,
1339,
479,
871,
4583,
891,
751,
253,
2929,
285,
5583,
14924,
436,
310,
253,
806,
2929,
281,
1918,
5919,
11333,
323,
3242,
10491,
432,
40515,
44361,
534,
476,
320,
908,
275,
247,
5235,
273,
1524,
10186,
4893,
751,
3818,
3109,
2718,
3966,
285,
3021,
436,
2929,
476,
452,
247,
1175,
3486,
50276,
7152,
33032,
2520,
2929,
2175,
44755,
10491,
3082,
323,
40515,
44361,
50276,
1615,
7384,
1698,
14714,
2605,
273,
253,
10295,
253,
2929,
29328,
247,
1386,
435,
553,
10491,
1332,
534,
310,
7938,
685,
2045,
10491,
5933,
342,
23664,
20243,
323,
2087,
10295,
50276,
44295,
3062,
436,
2929,
24357,
247,
44755,
749,
1282,
435,
553,
18235,
10491,
5933,
323,
247,
35851,
273,
1698,
14714,
40515,
44361,
326,
403,
1925,
327,
69,
44361,
50276,
10489,
4679,
352,
310,
2011,
326,
253,
15970,
3045,
273,
327,
69,
44361,
310,
417,
30853,
2429,
281,
40515,
44361,
407,
6240,
247,
37820,
1307,
275,
253,
13757,
253,
18235,
5912,
310,
10260,
3777,
436,
2929,
29328,
767,
44755,
10491,
11333,
323,
40515,
44361,
581,
323,
1698,
14714,
10295,
285,
253,
643,
323,
1698,
5958,
19627,
10295,
253,
6733,
273,
253,
4081,
11333,
403,
16058,
949,
4679,
285,
253,
15970,
3045,
310,
417,
40195,
436,
2929,
310,
973,
3542,
285,
253,
2175,
812,
5649,
4623,
29905,
2706,
327,
40515,
44361,
50276,
74,
452,
690,
5884,
5701,
534,
943,
320,
3477,
281,
4993,
390,
3662,
50276,
1542,
1298,
577,
285,
1298,
608,
4496,
1918,
247,
5426,
273,
270,
71,
1182,
75,
50276,
249,
2593,
7609,
253,
299,
304,
9747,
42190,
273,
270,
71,
362,
3342,
362,
85,
310,
417,
908,
275,
5933,
374,
812,
359,
5386,
352,
281,
3693,
253,
13775,
50276,
249,
2593,
7609,
253,
10103,
273,
270,
71,
1182,
285,
270,
71,
1269,
513,
417,
3761,
672,
465,
310,
8909,
50276,
249,
2593,
608,
812,
368,
21184,
327,
2139,
604,
270,
71,
362,
1439,
14715,
3342,
270,
270,
71,
298,
588,
452,
5958,
374,
76,
752,
6569,
604,
270,
71,
362,
3342,
362,
18,
3342,
270,
835,
270,
71,
362,
18,
14715,
3342,
270,
50276,
1189,
455,
891,
1158,
436,
310,
247,
973,
3542,
2929,
619,
760,
4468,
310,
326,
253,
625,
44755,
10491,
5933,
310,
6786,
407,
23254,
625,
13133,
327,
253,
10295,
534,
1537,
2701,
697,
10393,
2299,
436,
1537,
417,
320,
247,
1943,
2523,
984,
352,
310,
2011,
275,
253,
4679,
326,
697,
15970,
3045,
310,
417,
30853,
5474,
33032,
2520,
2929,
4245,
271,
3242,
10491,
5933,
281,
3410,
432,
277,
44361,
1754,
327,
14122,
1105,
3899,
5526,
1698,
14714,
10295,
12624,
273,
253,
830,
298,
24260,
3956,
50276,
67,
1678,
3956,
2612,
412,
835,
362,
285,
270,
403,
273,
1979,
278,
407,
465,
285,
277,
310,
6278,
14122,
25562,
285,
273,
1979,
465,
407,
465,
253,
16888,
10295,
273,
824,
247,
277,
377,
778,
320,
3542,
275,
253,
830,
465,
40965,
91,
3956,
342,
1182,
247,
278,
407,
374,
76,
4315,
285,
259,
247,
6278,
4315,
273,
1979,
374,
76,
326,
310,
417,
13123,
253,
4477,
840,
50276,
35844,
275,
2593,
374,
849,
268,
3941,
84,
790,
5933,
310,
12956,
281,
3410,
432,
824,
40515,
44361,
50276,
249,
2593,
495,
921,
326,
347,
298,
310,
1698,
14714,
253,
12314,
7005,
20,
2105,
273,
268,
3941,
84,
790,
5933,
476,
320,
3777,
281,
7005,
76,
19,
347,
11269,
273,
253,
16888,
10295,
476,
320,
14556,
2218,
407,
760,
22753,
253,
6703,
259,
4315,
50276,
249,
2593,
577,
1918,
247,
749,
1282,
435,
553,
294,
11817,
312,
4906,
1754,
5933,
42174,
253,
5202,
3169,
5933,
273,
305,
408,
257,
7779,
1162,
355,
275,
436,
2593,
597,
28910,
12661,
247,
973,
26672,
264,
277,
377,
1754,
327,
247,
13123,
10295,
7856,
77,
326,
310,
1842,
1767,
375,
4636,
352,
310,
247,
277,
377,
1754,
327,
247,
13123,
3714,
69,
10295,
285,
476,
320,
19958,
432,
1142,
1027,
3809,
5368,
11333,
285,
323,
534,
667,
873,
340,
2336,
7790,
843,
314,
3040,
277,
678,
255,
314,
835,
253,
5170,
9458,
310,
2686,
26553,
289,
78,
337,
275,
1340,
281,
4555,
1453,
253,
18235,
2281,
273,
616,
4081,
5933,
326,
310,
4503,
281,
277,
678,
255,
965,
50276,
5992,
965,
253,
4477,
25636,
253,
1566,
407,
915,
5555,
326,
362,
285,
270,
403,
19627,
281,
1016,
643,
289,
78,
374,
28910,
275,
8136,
253,
4477,
10018,
326,
253,
10454,
273,
305,
408,
257,
7779,
1162,
14350,
5933,
476,
275,
958,
320,
4354,
5520,
407,
247,
2803,
465,
835,
465,
310,
253,
1979,
273,
253,
19958,
873,
50275,
249,
7118,
608,
285,
721,
4679,
403,
2530,
10941,
616,
256,
42956,
1332,
342,
253,
1375,
23037,
14387,
253,
2929,
310,
973,
15720,
1014,
2167,
352,
310,
247,
8489,
15246,
5878,
273,
5697,
432,
305,
435,
11436,
1162,
355,
15265,
839,
824,
40515,
44361,
305,
408,
257,
7779,
1162,
355,
749,
8172,
10491,
5933,
273,
13123,
277,
44361,
285,
268,
3941,
1665,
448,
3841,
4742,
3169,
10491,
273,
277,
44361,
285,
40515,
44361,
891,
2868,
326,
253,
1543,
403,
4722,
2217,
323,
14924,
253,
2022,
7680,
273,
253,
2929,
310,
281,
619,
2927,
281,
452,
1119,
247,
13123,
277,
377,
326,
310,
973,
3467,
959,
323,
18235,
10491,
253,
4737,
326,
323,
455,
340,
843,
314,
3040,
277,
678,
255,
314,
5667,
323,
253,
18235,
10491,
7792,
281,
789,
310,
271,
4722,
7680,
285,
812,
320,
908,
275,
643,
2987,
327,
277,
44361,
347,
973,
275,
8136,
891,
1089,
253,
4737,
1892,
281,
956,
285,
1804,
281,
253,
4477,
281,
513,
616,
1682,
281,
1089,
4088,
273,
8077,
5411,
352,
347,
1199,
347,
1896,
323,
253,
4049,
254,
609,
5102,
2715,
253,
19554,
253,
4737,
253,
625,
4354,
352,
812,
320,
9495,
281,
643,
15216,
3629,
253,
2442,
3486,
273,
253,
2929,
50276,
284,
323,
10012,
374,
253,
9373,
38931,
1319,
7658,
310,
29125,
285,
891,
651,
8564,
326,
2007,
6031,
812,
8488,
436,
7658,
285,
4044,
247,
3033,
326,
7024,
327,
849,
19627,
1097,
749,
31748,
403,
285,
27930,
253,
1655,
3033,
672,
597,
403,
6296,
19627,
327,
253,
643,
1133,
327,
69,
44361,
476,
6296,
320,
17194,
323,
4715,
347,
597,
6296,
5416,
326,
253,
2120,
2130,
5958,
310,
1691,
281,
7680,
253,
16774,
1543,
5257,
281,
17813,
436,
27350,
4154,
512,
275,
512,
253,
1355,
3480,
273,
2032,
3236,
414,
310,
31745,
407,
767,
4217,
39383,
326,
476,
5276,
4217,
281,
253,
3114,
2490,
187,
4118,
18435,
27,
2520,
310,
271,
12302,
2929,
326,
2085,
253,
5919,
11333,
323,
3242,
10491,
432,
40515,
44361,
2112,
342,
10527,
1543,
326,
403,
1077,
21452,
275,
285,
562,
3746,
253,
913,
5194,
342,
253,
30628,
326,
253,
4477,
3449,
5906,
1031,
9713,
253,
7350,
5439,
275,
253,
10123,
285,
310,
13762,
326,
253,
17265,
2715,
588,
320,
10260,
14109,
407,
253,
3114,
359,
1077,
1199,
11907,
253,
4477,
281,
15142,
436,
1386,
273,
789,
285,
275,
1798,
281,
11399,
253,
8542,
12400,
281,
253,
327,
69,
377,
35851
] |
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the submission introduces new pixel level and segment level semantic annotations on top of the existing epickitchens100 videos annotations include temporal actions over short and longer time spans eg flour dough pixel level segmentation of active objects relationships between hands and active objects and a new type of annotation indicating where a given object came from some statistics about the data are provided data is annotated via a novel proposed pipeline done in three steps i identifying active objects ii pixel based annotations on the selected objects helped by an aipowered annotation tool and iii objecthand relations these sparse annotations are densified later with the help of models and the dense annotations not manual are also provided the submission also proposes benchmarks on three tasks video object segmentation handobject segmentation and relations with objects and a novel where did this come from task model baselines are provided for the two first tasks and an oraclebased baseline is provided for the last task the data is released in a clear format and is easy to use the dataset provides annotations on transformations eg flour dough which is an interesting concept still to be understood well by vision models the dataset provides short and long temporal actions the amount of annotated frames is great the proposed set of tasks in the benchmark has both more traditional tasks and novel attempts of making vision models able to do humanlike tasks eg wdtcf building on top of epic kitchens enriches the set of annotations and contributes to broader research data is easily accessible in the format provided different strategies to improve annotation quality were used in summary the dataset is well built accessible and definitely contributes to the broader research community by both providing more data for existing tasks and also proposing new tasks that can push the community towards solving perception i would have liked to be provided with code to visualise annotations rather than just images in the submission statistics on annotation errors would be a plus eg what proportion of manually verified annotations needed correction reporting more diversity metrics on the annotators would help docsepthe paper proposes visor benchmark a dataset with segmented kitchen videos along with three tasks semisupervised video object segmentation handobject segmentation relations where did this come from back in time segment tracking tasks are accompanied with metrics and baseline models data is under permissive license cc bync 40 1 data labelling pipepline is described in detail that may be beneficial for other researchers 2 wellknown methods are used for baselines stm and pointrend along with standard metrics 3 permissive license and no ethical concerns 1 while benchmark labelling tasks and baselines are well written motivation for creating benchmark is described poorly why these three challenges are selected what kitchen applications are possible if a method shows good quality on one of the tasks 2 introduction compares visor with previous work very briefly i believe that adding separate related work section would benefit the paper docsepthis paper introduced a richly annotated dataset for video object segmentation compared with previous datasets the new visor dataset has more frames more annotated masks and more action and entity classes the authors also defined a new task where did this come from wdtcf to show that the annotated longterm videos may enable more complex reasoning problems i see 
both the dataset and the task as major contributions this is a strong submission a richly annotated video dataset is always welcome the annotations seem to be of very high quality in addition the authors have introduced an interesting new video reasoning problem i actually dont see many weaknesses in this submission i have two minor comments 1 the discussion of related work is a bit thin in particular with 33 it would be great to better explain and motivate why visor is needed and what problems it may enable us to solve with eg longterm videos the wdtcf task is a nice demo but it can be motivated to highlight why such a task is needed 2 the authors have not provided any quantitative evaluation of annotation quality from the demos the annotation quality seems very high just to make the dataset contribution more solid i wonder if there is a way that this can be systematically calibrated eg wrt with some ground truth from mocap id be happy to increase my score if the authors may address my concerns especially 2 docsepthis paper extends epickitchens dataset for pixel level annotation in addition to both hands active entity is pixel level annotated forming large scale pixel and action level annotation the annotation was conducted on sparse frames and dense frame data is interpolated from neighboring sparse frames this benchmark can be used for three tasks vos hos and wdtcf the dataset is relatively larger than other datasets of the same style the dataset is built with proper use of aipowered segmentation tool solid and thorough inspection is performed for the dataset i dont see notable weakness in this paper docsepthis study is dedicated to a new semantic segmentation benchmark for segmenting and tracking hands and active objects in egocentric videos called visor it is created on the top of epickitchens video dataset yet introduces new annotations and three completely new challenges video object segmentation vos interaction understanding hand object segmentation hos and longterm reasoning where did this come from or wdtcf the labeling is performed semiautomatically the semantic masks are labeled via a aipowered tool providing user guidance moreover trainable stm methods are leveraged to extrapolate manual pixellevel semantic segmentation masks to the neighbor frames giving dense video annotations good paper overall 1 a large amount of work has been obviously done that includes developing labeling tools creating verification procedures formulating object categories designing challenges and setting up baselines 2 the challenges are fun and inventive despite i have some doubts on the third one i find the first two quite meaningful 3 the appendix provides a detailed and comprehensive description of all procedures performed while creating the benchmark i personally appreciate a rational and thoughtful approach to frame selection and labeling based on a common sense and supported with the statistical evidences 4 this benchmark has a practical value it could be used to facilitate creation of an aipowered home assistant providing smart guidance during a meal preparation eg if connected with a camera facing downwards placed underneath a cupboard 1 well i personally dealed with the tasks implying recognising actions and objects in the kitchen however for a reader without such an experience the motivation of creating this benchmark might be nonobvious i would recommend to articulate possible usage scenarios and potential applications more clearly 2 the test set contains 20 of the masks and 14 of the 
frames does this mean that the maskperframe distribution for the test set differs from the one for the remaining data in other words how can we be sure that the test subset is not biased wrt the train subset 3 wdtcf where did this come from challenge seems to be fun but a problem statement is a bit controversial to me the fridge and jar represent different categories of containers that are not mutually exclusive and cannot be opposed in a real life eg a jam might come from a jar taken out of the fridge so various combinations are possible 5 among the annotated objects there are watch and candle i am not really sure whether they are significant in the meal preparation context and should be annotated 6 the visor annotations include such an object as foot i am terribly curious about how a foot can be used while preparing a meal not a weakness though but i would be grateful for an answer
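The reviews above describe visor's sparse-to-dense annotation step, where manual masks on sparse frames are extrapolated to neighbouring frames with trainable STM-style models. Purely as an illustrative aid (not part of any review), the sketch below shows the overall shape of such a densification loop; the nearest-frame copy inside `propagate_mask` is only a hedged stand-in for the learned memory network, and all names, shapes and values are assumptions rather than the benchmark's actual API.

```python
import numpy as np

def propagate_mask(sparse_masks, frame_idx):
    # stand-in for a learned STM-style propagation model: a real pipeline
    # would query a space-time memory network conditioned on the sparse
    # manual masks; here we simply reuse the nearest annotated mask
    nearest = min(sparse_masks, key=lambda k: abs(k - frame_idx))
    return sparse_masks[nearest].copy()

def densify(sparse_masks, num_frames):
    # keep manual masks untouched and fill every unannotated frame in between
    dense = {}
    for t in range(num_frames):
        dense[t] = sparse_masks[t] if t in sparse_masks else propagate_mask(sparse_masks, t)
    return dense

# toy example: manual masks on frames 0 and 4 of a 5-frame clip
h, w = 4, 4
manual = {0: np.zeros((h, w), dtype=np.uint8), 4: np.ones((h, w), dtype=np.uint8)}
dense = densify(manual, num_frames=5)
print(sorted(dense))  # [0, 1, 2, 3, 4]; every frame now carries a mask
```

In the described pipeline the dense masks are model outputs kept separate from the manual labels, which matches the distinction the first review draws between sparse manual and dense interpolated annotations.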
### Summary:
this paper introduces new annotations on top of the epickitchen 100 videos which contain pixellevel instance and semantic annotations on objects on top of the valuable instance mask annotations such as handobject interaction the annotation volume presented is significantly larger than prior works in multiple statistics based on the added annotations the author also propose a new task where did this come from wdtcf to demonstrate the values of added annotations for more complex semantic tasks this is an important area to the community and the dataset enables both studies in important classical problems eg vos and newly proposed tasks all the reviewers have suggested acceptance without major concerns to the paper meanwhile the authors of the paper considered the reviewers suggestions and improved the paper to include 1 more discussion separate section on related works t9qp xav1 2 visualization tools sj6u 3 baseline code sj6u 4 expanded discussion of the motivation for the work and proposed task cefw i believe that the authors responses and updated materials address most of the issues raised by reviewers i think this benchmark will be a good contribution to the computer vision community
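To make the newly proposed wdtcf task easier to picture, here is a small hypothetical sketch of how a where-did-this-come-from query and its scoring could be represented. The field names, the example video id and the accuracy-style metric are illustrative assumptions only and are not taken from the benchmark.

```python
from dataclasses import dataclass

@dataclass
class WDTCFQuery:
    video_id: str        # which video the query refers to
    query_frame: int     # frame index where the object of interest is visible
    query_object: str    # e.g. "milk"
    source_object: str   # annotated answer, e.g. "fridge" or "cupboard"

def wdtcf_accuracy(predictions, queries):
    # predictions maps (video_id, query_frame) -> predicted source object;
    # the score is the fraction of queries whose prediction matches the annotation
    correct = sum(
        1 for q in queries
        if predictions.get((q.video_id, q.query_frame)) == q.source_object
    )
    return correct / max(len(queries), 1)

queries = [WDTCFQuery("vid_0001", 1500, "milk", "fridge")]
predictions = {("vid_0001", 1500): "fridge"}
print(wdtcf_accuracy(predictions, queries))  # 1.0
```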
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
updated review summary this work proposes an approach for modelbased optimization based on learning a density function through an approximation of the normalized maximum likelihood nml this is done by discretizing the space and fitting distinct model parameters for each value to lower the computational cost the authors propose optimizing the candidates concurrently with the model parameters each models distribution is encoded as a neural net outputting a scalar which is then encoded using a thermometer approach using a series of shifted sigmoid candidates are optimized based on the average value of the scalar of each model evaluated using parameters obtained from an exponentially weighted average of its most recent parameters reason for score this work proposes a reasonable approximation to an interesting estimator and demonstrate it is capable of achieving good consistent performance this is likely to be of interest to the community and as far as im aware is sufficiently novel given that i see no noteworthy issues and all of my major concerns have been addressed i dont see any reason for rejection i strongly support acceptance pros using estimates of the nml for modelbased optimization is an interesting idea this work shows that the nml can be successfully approximated with a relatively coarse discretization and that both the optimization of the candidate and the various model parameters can be optimized in tandem this suggests that this type of approach is viable and possibly warrants further investigation initial review summary this work proposes an approach for modelbased optimization based on learning a density function through an approximation of the normalized maximum likelihood nml this is done by discretizing the space and fitting distinct model parameters for each value to lower the computational cost the authors propose optimizing the candidates concurrently with the model parameters each models distribution is encoded as a neural net outputting a scalar which is then encoded using a thermometer approach using a series of shifted sigmoid candidates are optimized based on the average value of the scalar of each model evaluated using parameters obtained from an exponentially weighted average of its most recent parameters reason for score there are a lot of typos and issues with the notation which make this paper unfit for publication in its current state otherwise the work seems interesting but i find the experiments dont provide much insight though the comparison with the selected method is favorable its hard to consider them significant when there is a notable difference in only one of the three datasets of an unpublished benchmark additionally the fact that the reported results only cover half the datasets from this benchmark raises some questions despite the negative tone of this paragraph i want to note that the severity of some of these issues is subjective while others can be easily fixed my mind isnt made up and i hope the authors can clarify anything i might have missed pros using estimates of the nml for modelbased optimization is an interesting idea this work shows that the nml can be successfully approximated with a relatively coarse discretization and that both the optimization of the candidate and the various model parameters can be optimized in tandem this suggests that this type of approach is viable and possibly warrants further investigation cons the current notation is often confusing and even ambiguous at times this will probably transpire in some of my other 
comments but i do consider these issues to be superficial and easily fixed ive provided some suggestions below for how to improve the notation i understand that notation preference is a very subjective thing so the authors should feel free to opt for something different but i do think the notation needs to be improved the thermometer encoding of the output seems like an odd choice especially given how it is done here if i understood the approach correctly this seems to be a hacky way of using the output of the nn oint as parameters to a logistic distributions why not treat it this way directly my interpretation of this approach is that the okints are parameters for logistic distributions and the mean is optimized through the unnormalized probs by optimizing each okint directly would this be a fair description of the approach i was expecting the discretization to be used directly to approximate the integral in eq 2 the experimental results arent very conclusive only showing a clear benefit in a single case without additional results it is difficult to say much about the behavior and properties of this method a visualization of the distribution of the value of the candidates might help convey some additional information in favor of this method its possible that i am missing some context to appreciate these results if that is the case it would help if the authors could provide the context i need to appreciate these results questions i dont think i understand what makes the appendix results an ablation study from what i understand these results only compare with the case where there is no learning of the model parameters what are the models initialized to where does the data come into play have the authors considered using points selected from some fixed quadrature method instead of a uniform grid a common theme for the comparison methods is the idea of not diverging too much from the data was the validity of the outputs of this method evaluated in any way how can i know that the method isnt just exploiting some quirk in the learned models used to evaluate it while some of the other methods avoid doing this were comparisons with a simple approaches bayesian optimization tried eg gaussian process what happens when doing more iterations on the log likelihood before updating x how much do we lose when only doing a single update does using a more accurate approximation of the nlm improveworsen performance i was hoping this would be part of the ablation study how do the run times compare were the other datasets from the designbench benchmark tried misc and typos page 2 in that it has shown to be missing word page 2 to discuss how approximate this intractable missing word page 3 when pnml is formally written the meaning of y is ambiguous since it is on the lhs and also being redefined by maxy in the rhs page 3 the notation d cup x y refers to an augmented dataset d cup xn1 yn1 this wasnt very informative and felt a bit tautological i would recommend sticking with one of the two notations page 3 where d is fixed to the given offline dataset and thetad cup xy this sentence is a bit confusing as a whole the start of the sentence talks only about d so the theta mention is unexpected when reading page 3 right after eq 2 x y is on the lhs but then also part of the expectation on the rhs what is the expectation being taken over are x and y being redefined page 4 for y in 1 k do is y i dont believe that y is assumed to be an integer page 4 algorithm 4 page 4 this would produce a distribution over output 
values pyx this doesnt type check for me if i understood correctly p or pbullet x represent distributions and y are output values also it might be good to reuse the hat pnml notation to get the following sentence this would produce a conditional distribution hat pnmlbullet xt over output values page 4 we can optimize xt with respect to some evaluation function gy of this distribution such as the mean this is confusing since algorithm 1 has mathbbe gy what does it mean for the evaluation function g to be the mean in this context how should i interpret the expectation of meany also i assumed that what was meant is that g is the evaluation function not gy or is g meant to return a function given a y page 6 a straightforward choice for the evaluation function g is the identity function gx x it might be best to stick to a consistent variable name eg gy y to avoid confusion about what the domain of g is proof of thm 41 equation under using these two facts we can show q should be replaced with pnml in the rhs proof of thm 41 going from tv to kl looking up the bound returns a 12 factor inside the root rather than a 2 i could have missed a detail but thought worth mentioning in case i havent proof of thm 41 missing a when bounding the kl divergence algorithm 2 there is some undefined superscript k and y in the loop over xmt notation suggestions when writing expectations i would strongly recommend making it explicit over which variables they are eg mathbbey sim pbullet x theta introducing some shorthand notation might help make this more concise eg px thetay py x theta when referring to a function only mention its namesymbol and reserve the form that includes inputs to refer to the output of the function given those inputs eg a function g an evaluation score gy avoid relying on the readers pattern matching abilities for assigning meaning to variables and make sure variables eg mathbfx and y are always explicitly defined by explicitly defined i include defining y compactly with something like maxy in mathcaly gy for example there is not need to be overly verbose but it should never leave room for interpretation this is related to the point about expectation notation i usually prefer explicit domains eg sumy in mathcaly instead of sumy but i consider it fine to omit it if variables names are always reserved to the same domain when reused this was not the case for x in this paper when writing pseudocode either loop over integer indices or over elements of a set it is confusing to use for y in 1 k when y isnt an integer additionally using both makes it difficult to tell that k is associated with y inside a loopdocsep summary this paper proposed a method based on nml and provided a principled approach to estimate the uncertainty for ood data pros 1 the method proposed in this work is a principled way to handle uncertainty for novel points out of the original dataset compared with for example deep ensemble 2 one clear advantage of ths proposed approach is this method can scale to large dataset compared with gp which scales cubically cons 1 the authors claim using a supervised approach is brittle and prone to failure as the uncertainty of inputs outside of training data is uncontroled however this is not true and uncertainty can be well controlled depending on the model which can be nonparametric or parametric distributionfree or distributiondependent for example to measure uncertainty on novel point gp could be viewed as the ground truth under some conditions my question is why not directly compare your 
approach with a gp approach then combine the posterior with an acquisition function such as ei the offline mbo problem presented in this work is similiar to an online mbo except we have only one online sample and we are tying to optimize this point comparing eq1 of this paper with the formula of ei it is easy to see eq1 is exactly ei if we assume there exists xy such that y is larger than objective values in the training data set thus the problem presented in this work can be effectively solved through one step of conventional bo given the datasets used in the experiments of this work is are not of large scale i think comparing with a gpbased bo is necessary 2 in figure 3 although not a major concern i dont think the comparison with the ensemble is fair although this work uses bootstrap and ensemble mse cannot capture uncertainty thus it is not an ideal metric in this setting for example to obtain a similar uncertainty estimation compared with nml middle column we can use a deep ensemble which optimizes nll instead of mse 3 the experimental results in my opinion are not sufficient and there is only one table presenting empirical results i dont want to judge sufficiency by the quantity of tables or figures but considering the theoretical analysis is not strong enough i think more empirical study should be performed 4 the uncertainty estimation seems too conservative and this could make the estimated uncertainty less useful especially for highdimensional problems questions the approach proposed in this paper seems very similiar with conformal prediction in conformal prediction the target value y for a novel point x adversarial input is chosen so that y is compatible with the original dataset as i am not familiar with the evaluation protocol in brookles2019 and fannjiang2020 the metric used in table 1 is not clear to me can the authors say more about that update overall speaking the added gpbo results address my concerns and ive updated the score from 56 a final update will be given laterdocsepsummary the paper proposes an approximation method called nemo normalized maximum likelihood estimation for modelbased optimization to compute the conditional normalized maximum loglikelihood of a query data point as a way to quantify the uncertainty in a forward prediction model in offline modelbased optimization problems the main idea is to construct a conditional nml cnml distribution that maps the highdimensional inputs to a distribution over output variables in addition the paper provides a theoretical motivation that estimating the true function with the cnml is close to the best possible expert even if the test label is chosen adversarially which is a great challenge for an optimizer to exploit the model by using this cnml on three offline optimization benchmark datasets superconductor gfp moleculeactivity with gradient ascentbased optimization the nemo outputs all the other four baselines on the superconductor dataset by almost 14x to 17x the generate comparable results as the other four baselines method on the gfp and moleculeactivity datasets typo in section 4 we outline the highlevel pseudocode in algorithm 4 in algorithm 1 questions 1 when sampling a batch of data points at each step of algorithm 1 is the sampling performed with or without replacements 2 whats the variance across the 16 random runs is the score of the best algorithm in the average performance across 16 random runs significantly different from the baseline algorithms 3 when estimating the cnml what is the number of models in 
the experiments does the number of models differ from dataset to dataset how to choose the number of models in practice 4 since the output y needs to be discretized in the nemo algorithm how difficult is it for the algorithm to scale when y is multivariate update i think the authors did a great job of addressing my concerns im happy to increase my score to 6
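Since several of the questions above (the number of models, the discretization of y, scaling) hinge on what the conditional NML estimator actually is, here is the discretized form the reviews appear to be describing, assembled from their wording rather than taken from the paper itself; the symbols K and theta_k are my own labels, with the output range cut into K bins y_1, ..., y_K and one parameter vector maintained per bin:

```latex
\hat{p}_{\mathrm{NML}}(y_k \mid x, \mathcal{D})
  \;=\;
  \frac{p\left(y_k \mid x, \hat{\theta}_k\right)}
       {\sum_{j=1}^{K} p\left(y_j \mid x, \hat{\theta}_j\right)},
\qquad
\hat{\theta}_k \;=\; \arg\max_{\theta}\;
  \log p\left(\mathcal{D} \cup \{(x, y_k)\} \mid \theta\right).
```

On this reading, the candidate x is moved by gradient ascent on an evaluation of this distribution such as its mean \sum_k y_k \, \hat{p}_{\mathrm{NML}}(y_k \mid x, \mathcal{D}), while the K parameter sets are fitted concurrently, each evaluated through an exponentially weighted average of its recent iterates, which is the scheme the first review summarizes.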
### Summary:
this work proposes a modelbased optimization using an approximated normalized maximum likelihood nml it is an interesting idea and has the advantage of scaling to large datasets the reviewers are generally positive and are satisfied with authors response
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the paper proposes methods to construct 2hop spanners to approximate similarity graph in extremely large data the idea is to use hashing functions to reduce the neighborhood searching load i think the paper can be a significant contribution to the problem of similarity graph construction via using 2hop spanners which increase the sparsity of the graphs i could not check the maths in detail to approve the analysis what the papers missing is to validate the benefit of 2hop spanners compared to a similarity graphs as the twohop spanners contain a similarity graph with higher threshold the paper has typos in many places na docsepthis work proposes an algorithm for efficiently constructing similarity graphs for extremely large graphs this work introduces the idea of using twohop spanners derived from concept of metric spanners in which all points with similarity greater than some threshold are guaranteed to be less than or equal to 2 hops apart and no edge with some small similarity level have an edge between them with the relaxed goal of similar points being within two hops as opposed to one the algorithm is able to make significant gains in overall efficiency they give an algorithm for constructing the twohop spanner threshold graph using locality sensitive hashing lsh as well as an algorithm using sorted lsh for approximating knearest neighbors they give algorithms and theoretical guarantees that their algorithm approximates nearest neighbors using angle or jaccard similarity measures as well as theoretical guarantees on efficiency they confirm the theoretical results with experiments on a combination of real and synthetic datasets the experiments include results on the number of comparisons made for various algorithms how close the number of sketches hyperparameter affects approximation to knn in terms of recall as well as the coverage of the approximation there is also an experiment showing the performance on clustering of various versions of the algorithm and the allpairs baseline originality somewhat original builds off techniques from previous similarity clustering algorithms as well as using techniques of locality sensitive hashing and twohop spanners not clear to me how this compares to other approximate k nearest neighbor and similarity graph construction algorithms eg eirasfranco carlos et al fast distributed knn graph construction using autotuned localitysensitive hashing acm transactions on intelligent systems and technology tist 11 2020 1 18 zhang yanming et al fast knn graph construction with locality sensitive hashing ecmlpkdd 2013 wang g et al learning to prune general and efficient approximate nearest neighbor search with direction navigating graph 2022 the 5th international conference on data storage and data engineering 2022 n pag quality this is a complete high quality work while i have not confirmed all the proofs in the appendix the theorems seem to be soundly derived the experiments show the effectiveness of the algorithm in terms of efficiency and approximation it would be nice to have some more experiments showing the usefulness on downstream tasks also no comparisons to similar works clarity while math heavy the paper remains clear as a nice benefit the paper would still read pretty clearly even for a light read as many readers might be doing significance it seems like this would be of interest to some readers it seems certainly on the application side this would be of interest it would be nice to have more examples of downstream use cases the clustering experiment 
is helpful in that regard but not super clearcut it is spellinggramar line 172 twphop twohop adequately done docsepthis paper studies the similarity graph construction problem which is useful for downstream tasks such as clustering and graph learning it proposes a scheme named stars to construct twohop spanners as a relaxation of the similarity graph by selecting leader points and generate starshape edges it claims it is superior because it requires at most on1oepsilon complexity in construction and the output twohop spanners are sparser than normal similarity graphs strengths s1 this paper investigates a novel problem for graph construction on twohop spanner graphs it claims this type of graph has fewer edges and is hence more favorable for downstream tasks on largescale data it is of certain interest for the research in building sparse graphs from similarity representation especially for largescale data and distributed computation s2 this paper proposes a simple yet effective scheme stars which can efficiently generate twohop spanners based on similarity measures ie locality sensitive hash lsh families it provides theoretical analysis to show that stars can construct approximate near neighbor ann graphs under given error guarantees and gives the time complexity bound weaknesses w1 given that twohop spanner is a generalized type of similarity graphs it is natural that algorithms designed on this graph type have better complexity than baselines so whether the performance improvement comes from the stars algorithm itself or is the merit of the graph relaxation is unclear it is also not presented in the paper whether the difference in similarity graphs will result in performance drop in downstream tasks such as clustering see q1 w2 as epsilon ranges from 0 to 1 i doubt whether the on1oepsilon complexity can be described as nearlylinear in the paper it claims that the empirical time is significantly better than the theoretical bound but this is not well explained see q3 w3 for both theoretical and experimental evaluations there are insufficient baselines included in this paper w31 for theoretical complexity analysis it only presents the complexity of stars without comparison to other approaches w32 for the graph construction experiment it only compares brute force allpairs and a nonstars algorithm what exactly is the nonstars algorithm is unknown similar works like 21 or r1 are also not evaluated w33 for the clustering experiment there should be even more available baselines i think it is even necessary to compare with nonlshbased and nonsimilaritybased approaches w4 the experiment designs are still too simple beside the baseline issue sec 5 empirical study lacks critical information and is poorly written w41 it mainly evaluates the algorithm by changing the parameter r of number of sketches however only having three points r 25 100 400 is not enough it is hard to observe the relation between complexity num comparisons num edges and sketches w42 as complexity improvement is an important contribution in the paper there should be experiments and analyses explicitly studying that the evaluation of complexity on different scales of data is expected rather than solely on two points of 1b and 10b data w43 section 5 lacks necessary analysis and conclusion for experiments for example in fig 2 what is the meaning of the recall metric why the performance of lsh exact is different from others on mnist in fig 4 amazon2m why lshstarslearn and sortinglshstarslearn seem to have lower performance than 
nonstars methods minor issues i1 line 125 sentitive sensitive i2 line 297305 whether random1brandom10b dataset names are capitalized needs to be aligned i3 the log y axes in some figures eg fig 1 wikipedia fig 3 mnist are too rough to assess the results some data points fig 1 allpair fig 2 lshstars exact fig 3 lshstarsrelaxed are hard to observe r1 zhang yanming et al fast knn graph construction with locality sensitive hashing joint european conference on machine learning and knowledge discovery in databases springer berlin heidelberg 2013 see w1 to w4 docsepgiven a set of points x endowed with a measure of pairwise similarity a similarity graph gxe on x is supposed loosely speaking to draw edges between similar points and avoid edges between dissimilar points common variants define that a pair of points xy should be neighbors if their similarity is above a fixed threshold or if y is among the k most similar points to y for some fixed k these graphs are commonly used in many applications the computational challenges they pose are a that constructing them can be very costly especially for very large datasets where exact construction is infeasible and b that they need to be reasonably sparse since this affects their usability in the downstreams applications they are constructed for this paper advocates constructing these graphs such that similar points are not necessarily connected by an edge but with a path of length 2 this addresses challenge b since this relaxed requirement allows the graph to be more sparse and addresses challenge a since it lends itself to a very simple construction bucketing the points in a way that roughly preserves similarity there is a vast literature on efficiently generating such bucketing for large highdimensional point sets notably by lsh and then choosing a representative point from each bucket and drawing an edge between the representative and the other point in the bucket ie putting a star graph on the bucket the algorithms in this paper repeat this with several random partitions to generate the similarity graph this approach to constructing similaritypreserving graphs has been suggested and studied before but this paper applies it more specifically in the context of nearneighbor preserving graph construction shows it can be implemented in practice on a very large scale and includes an empirical evaluation this adds a different angle to the existing literature previous work that i am aware of has been strictly theoretical and focused on other flavors of similarity preserving graph this is a good quality paper that seems to me like a solid contribution to the literature on largescale nns graph construction is written well and clearly up to some technical points noted below and to me clears the bar for acceptance the analytic novelty is perhaps not on the higher side as the analysis directly utilizes classical and existing lsh techniques however the application to similarity graphs especially in practice and in demonstrably large scale is interesting and significant and wellexecuted in this paper this may advance the way we use lsh in largescale settings and impact the way we construct largescale similarity graphs there is perhaps some room for more discussion and exploration of the usability and limitations of this approach see questions below the technical writing could be more careful with some undefined on not clearly defined notation and some lax writing in parts of the proofs though there is no concern about correctness these glitches can be corrected 
from context or if one is already familiar with the background literature but may still impede readability for newcomers which is a bit unfortunate since the paper is otherwise very well written for example taukp not explicitly defined is line 199 supposed to be implicitly defining it as the similarity to the kth most similar point incompatible neighborhood notation between lines 149 and 261 the proof of proposition 33 makes use of properties of simhash and minhash that are never stated or referenced the connection between the collision probability to the similarity and some more i cannot list all of these but making the technical parts clearer and selfcontained would likely help make the paper accessible to a wider audience small corrections 1 missing subscripts on the pis in line 220 2 did you mean dpq instead of mupq on line 251 na
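The last review describes the construction concretely enough to sketch: hash the points with an LSH family so that similar points collide, put a star graph on each bucket by wiring every member to one leader, and repeat over a few independent random partitions so that any similar pair shares a bucket at least once and is therefore at most two hops apart. The snippet below only illustrates that description under my own assumptions (a SimHash-style family for angular similarity, a uniformly random leader per bucket, placeholder names and parameters); it is not the paper's STARS algorithm.

```python
import numpy as np
from collections import defaultdict


def make_simhash(dim, num_bits=8, seed=None):
    # Random-hyperplane (SimHash-style) hash: the sign pattern of a few random
    # projections is the bucket key, so directionally close points tend to collide.
    rng = np.random.default_rng(seed)
    planes = rng.normal(size=(num_bits, dim))
    return lambda x: tuple(bool(v) for v in (planes @ x > 0))


def two_hop_spanner(vectors, num_repetitions=10, num_bits=8, seed=0):
    # Bucket the points with an LSH partition, then connect every bucket member
    # to one leader (a star). Repeating with independent partitions makes similar
    # points share a bucket at least once with good probability, leaving them
    # within two hops of each other through that bucket's leader.
    n, dim = vectors.shape
    rng = np.random.default_rng(seed)
    edges = set()
    for rep in range(num_repetitions):
        h = make_simhash(dim, num_bits, seed=seed + rep + 1)
        buckets = defaultdict(list)
        for i in range(n):
            buckets[h(vectors[i])].append(i)
        for bucket in buckets.values():
            if len(bucket) < 2:
                continue
            leader = bucket[int(rng.integers(len(bucket)))]
            for i in bucket:
                if i != leader:
                    edges.add((min(i, leader), max(i, leader)))
    return edges


# Toy usage: the edge count grows roughly like n * num_repetitions rather than
# the n^2 of an all-pairs similarity graph.
points = np.random.default_rng(1).normal(size=(1000, 64))
graph_edges = two_hop_spanner(points, num_repetitions=5)
```

For Jaccard similarity one would swap the SimHash family for MinHash signatures; these are the two collision-probability properties the reviewer notes are used (but not restated) in the proof of proposition 3.3.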
### Summary:
all reviews for this paper were positive albeit with a varying level of enthusiasm reviewers found the problem large scale similarity search to be important and the authors contribution to it faster search using 2hops search graph constructed via lsh significant there were some concerns about the baselines in experimental evaluation and some relatively minor presentation issues ultimately the positives significantly outweighed the negatives
455,
10123,
323,
436,
2929,
497,
2762,
23447,
342,
247,
11962,
1268,
273,
23027,
30628,
1119,
253,
1895,
1781,
4311,
14259,
3186,
281,
320,
1774,
285,
253,
4477,
7680,
281,
352,
7938,
3186,
970,
374,
26001,
3186,
4216,
8818,
3066,
50276,
77,
1200,
1534,
627,
497,
690,
7350,
670,
253,
1666,
25379,
275,
5661,
7103,
285,
690,
4942,
5884,
9759,
3374,
9142,
253,
37865,
3012,
32180,
18201,
253,
2297,
3993
] |
[ … (a long comma-separated list consisting entirely of 1s has been omitted here) … ] |
[ … (another long comma-separated integer list has been omitted here) … ] |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the paper focuses on the problem of aggregating transcriptions of speech signals from crowdsourced workers in practice multiple workers are asked to transcribe the same speech sentence an aggregation algorithm should try to retrieve a more reliable transcription hopefully closer to the real ground truth by combining users transcriptions the authors collected a dataset of transcriptions called crowdspeech that is based on the manual annotation of the dev and test subsets of librispeech this dataset can be used as a benchmark to evaluate aggregation algorithms the authors also created a dataset based on a synthesized speech called rusnews which is released with the corresponding annotations from the workers under the name vox diy finally the authors evaluate some aggregation methods ie rover rasa hrrasa on the proposed dataset this paper releases a dataset of speech transcriptions that can foster research on aggregation methods for audio transcriptions it provides a common benchmark for evaluating such algorithms the paper provides useful guidelines on how the current aggregation methods work and more in general on how a highquality crowdsourced data collection should be performed the paper releases a dataset potentially valuable for part of the community but it focuses on a niche problem ie aggregation of speech transcriptions from crowdsourced workers that might be of limited research interest as a matter of fact the best aggregation is still rover an old method based on dynamic programming the author mentioned that crowdsourced data might be useful to finetune a speech recognizer to adapt it to new conditions like new accents etc in the paper i would thus have expected some real speech recognition experiments where the different aggregation algorithms are compared using the final speech recognition performance achieved after finetuning the model the author claims the main obstacle towards designing advanced aggregation methods is the absence of training data and in this work we focus on bridging this gap in speech recognition if im not misinterpreting this sentence it looks like the authors suggest that crowdspeech can be used to train an aggregation method i would thus have expected some initial attempts in this direction eg a seq2seq neural network that takes in input the multiple transcriptions and outputs the combined one crowdspeech is based on librispeech which is a very specific dataset based on old audiobooks it looks like this task is very challenging for workers as witnessed by the high wer results achieved with all the considered aggregation methods im not really sure if the results achieved with crowdspeech actually hold in a more realistic crowdsourcing scenario as mentioned by the authors im very skeptical as well on the results achieved with synthetic speech imo these results dont add too much to the paper and instead raise some doubts on their real significance i would even consider removing them it could be great to add an error analysis to highlights where most of the errors done by the users and by the aggregation methods come from vox populi vox diy benchmark dataset for crowdsourced audio transcription i dont understand why the title mentions vox populi this name is not reported in any other place of the article moreover voxpopuli is the name of another popular dataset recently released httpsarxivorgabs210100390 please explain this docsepthe paper presents the annotation of two existing datasets in terms of audio transcription by 7 workers the first dataset for the 
english language corresponds to more than 20 hours of audiobooks librispeech that crowd workers transcribe to text the second dataset for the russian language starts with existing news texts those are automatically transcribed to audio and then crowd workers transcribe it again to text the related work section supports the need for new larger datasets and the discussion on the different aggregation methods for different crowdsourced replies also previous datasets are described in this section one of those crowdcsa2019 is used later for comparison with the new annotated datasets the annotation is described in detail and exploratory analysis of the collected datasets proves the quality of the annotated data for instance krippendorffs alpha between annotators proved agreement with a value around 08 for evaluation the paper compares the existing methods for aggregation of noisy texts on the new datasets rover rasa hrrasa and oracle are computed when compared with the previous datasets the new annotated material seems to provide better quality for all metrics the provided datasets enrich the field of audio to text transcriptions both code and data are available and there seem to be no ethical questions a method on how to extend audio data for underrepresented domains is discussed for russian texts are used and automatically converted to audio this method is limited and not representative of human diversity when reading or speaking for this reason those data are not comparable to the remaining datasets the work does not provide results on benchmarking machine learning tasks docsepthis is a sold and wellwritten though not groundbreaking work on creating benchmarks for measuring the quality of crowdsourced audio transcriptions the contributions of this papers are as follows crowdspeech a large dataset for crowdsourced audio annotations english voxdiy a similar dataset for russian though based on synthesized not real audio recordings evaluation of several baseline methods for aggregation of crowdsourced transcriptions methodology and an opensourced pipeline for preparing such datasets the datasets presented in the paper are filling some important gaps with these datasets one can check new more advanced methods for transcription aggregation its good that the authors compared their datasets to the crowdcsa2019 data for machine translation it was instructive to see the similarities and differences between them the authors present some interesting insights eg the one about the length of crowdsourced annotations 280 the main weakness as the authors themselves notice is that the voxdiy is too easy it should be definitely augmented with realistic noise and in general the fact that the data set was synthesized is somewhat problematic it would be interesting to see more baselines even simple ones eg the longest sentence the most frequent sentence and switch to random if no sentence is repeated actually it would be interesting to see statistics for sentence repetitions though this is not really an important weakness
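The reviewer's suggested trivial baselines (most frequent transcription, longest transcription, and a random fallback when nothing repeats) can be sketched in a few lines of Python; the function names, the light normalization, and the toy worker data below are assumptions of this sketch, not material from the paper or dataset under review.

```python
from collections import Counter
import random

def majority_transcription(transcriptions, rng=None):
    # Most-frequent-answer baseline: keep the transcription that the largest
    # number of workers produced verbatim (after light normalization); if
    # every transcription is unique, fall back to a random one.
    rng = rng or random.Random(0)
    normalized = [t.strip().lower() for t in transcriptions]
    text, count = Counter(normalized).most_common(1)[0]
    return text if count > 1 else rng.choice(normalized)

def longest_transcription(transcriptions):
    # The other trivial baseline mentioned by the reviewer: the longest answer.
    return max(transcriptions, key=len).strip().lower()

# Toy usage with hypothetical transcriptions of one recording from 7 workers.
workers = [
    "the cat sat on the mat", "the cat sat on the mat",
    "the cat sat on a mat", "a cat sat on the mat",
    "the cat sat on the mat", "the cats sat on the mat",
    "the cat sat on mat",
]
print(majority_transcription(workers))  # -> "the cat sat on the mat"
print(longest_transcription(workers))   # -> "the cats sat on the mat"
```

Such baselines only reward exact agreement between workers; alignment-based methods in the ROVER family instead vote word by word, which is how they can recover a sentence that no single worker transcribed perfectly.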
### Summary:
|
this paper makes two contributions 1 collecting crowdsourced transcripts for existing speech data librispeech and synthesized speech data rusnews the novel aspect is that each audio is annotated by 7 different crowd workers providing the largest publicly available dataset for crowdsourced texts aggregation 99k annotations of 11k recording by 3k unique workers 2 evaluating different methods rasa hrrasa t5 for aggregating sequence level annotation from crowd workers the data collection method follows standard practices and shows high interannotator agreement table 3 and will be useful for the research community studying imperfect annotations the comparison between transcript for real speech librispeech and synthesized speech data rusnews is also interesting overall the paper is well written and the newly added t5 baseline results outperforming other aggregation methods are interesting
|
[ … (a long comma-separated integer list for this row has been omitted here) … ] |
[ … (a long comma-separated list consisting entirely of 1s has been omitted here) … ] |
[ … (another long comma-separated integer list has been omitted here) … ] |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper considers the welldefined feature selection problem the authors pointed out that the redundancy issue of the weightbased feature selection methods to tackle this the authors explored the secondorder feature covariance matrix and proposed a twostage framework including feature correlation matrix via knowledge contrastive distillation and feature selection on the masked correlation matrix via graph segmentation extensive experiments demonstrated the effectiveness of the proposed methods pros 1 the motivation is well introduced with illustrative examples in figure 1 2 the graph segmentationbased framework is novel and interesting to me which is different from the mainstream weightbased feature selection 3 the philosophy of knowledge contrastive distillation is logically sound i have a minor question that will be posted in the cons section 4 the experimental results are extensive the authors compared with 10 methods including several recent deep methods on 12 datasets and demonstrated the significant improvements 5 the indepth exploration is a plus which helps understand the proposed algorithm cons 1 the authors employed the knowledge contrastive distillation to learn the feature correlation matrix the traditional unsupervised feature selection methods based on the firstorder feature matrix usually employs a clustering method to purse pseudo labels for feature selection it is suggested that the authors would like to do another ablation study without knowledge contrastive distillation 2 the motivation of this paper is to address the redundancy issue of selected features although the authors provided the illustrative examples in figure 1 it would be better to demonstrate the proposed method does not suffer from this issue on the same datasets in the experimental part this paper considers the welldefined feature selection problem the authors pointed out that the redundancy issue of the weightbased feature selection methods to tackle this the authors explored the secondorder feature covariance matrix and proposed a twostage framework including feature correlation matrix via knowledge contrastive distillation and feature selection on the masked correlation matrix via graph segmentation extensive experiments demonstrated the effectiveness of the proposed methods the motivation is well introduced with illustrative examples the graph segmentationbased framework is novel and interesting to me which is different from the mainstream weightbased feature selection the philosophy of knowledge contrastive distillation is logically sound it is suggested that the authors would like to do another ablation study without knowledge contrastive distillation although the authors provided the illustrative examples in figure 1 it would be better to demonstrate the proposed method does not suffer from this issue on the same datasets in the experimental part docsepthis paper proposes a twostage secondorder unsupervised feature selection via knowledge contrastive distillation model that incorporates the secondorder covariance matrix with the firstorder data matrix for unsupervised feature selection a sparse attention matrix that can represent secondorder relations between features is learned in the first stage a relational graph based on the learned attention matrix is learned to perform graph segmentation for feature selection strengths 1 the proposed method is interesting 2 experimental results are good weaknesses 1 some presentations are not clear enough 11 the first line of page 4 why thetam is forced to 
be symmetric 12 why the masked matrix can be defined by eq2 2 since the gcn training process relies on pseudo labels how to ensure the reliability of the clutering results obtained by pca and kmeans interesting idea but some unclear presentations and motivations docsepfeature selection reduces highdimensional data by identifying comprehensible informative features this paper proposed an unsupervised feature selection method soft by combining the information from the secondorder covariance matrix with the firstorder data matrix by empirical experiments the authors demonstrated the effectiveness of the proposed method strengths this paper proposed an interesting feature selection algorithm which leveraged the information from the secondorder covariance matrix with the firstorder data matrix on empirical experiments the proposed method showed superior performance over several baselines to some extend weaknesses here are some of my commentsquestions i would appreciate it if the authors could give a response if i am wrong please correct me thanks the authors stated that most of these methods apply the linear feature selection matrices and select the representative features by ranking their feature weight vector such operations treat the feature set independently and fail to tackle the complex highorder relationship while as far as i know the model in 3 actually can globally exploit the complex feature relationships and the models in 1 and 3 are both flexible to nonlinear feature selection so what are the limitations of existing methods that the paper was trying to address the authors demonstrated the effectiveness of the proposed method through empirical experiments currently there are many new developments in feature selection for example 1 and 2 i noticed the authors cited 1 but why not compare soft to more nonlinear feature selection models such as aefs if so the effectiveness of the proposed method will be relatively convincing comparing table 2 with figure 3 it is observed that the performance of soft is not so stable as other methods not only on lcancer as the authors pointed out so i encourage the authors might compare the stability of selected features from soft with baseline comparisons for example do stability analysis for selected features from different methods before this paper is published in influential conferences or journals from table 4 in supplementary material it seems that soft has no obvious advantage over other baselines in terms of efficiency doesnt it if so efficiency has no advantage and performance is not too stable then what is the purpose of soft could the authors clarify or explain the motivation more i look forward to hearing that thanks additionally this paper is generally well written but some places some issues in this paper should be further clarifiedfixed the font for the legend in figure 3 is too small the same full name and abbreviation appeared several times in the paper about the architecture of used baselines the authors stated that for tsfs cae and inffs we use default settings provided in their opensource codes however it is not so clear for example for cae both linear and nonlinear structures are discussed in 3 which one did the authors of this paper use 1 han et al autoencoder inspired unsupervised feature selection in international conference on acoustics speech and signal processing 2018 2 doquet et al agnostic feature selection in joint european conference on machine learning and knowledge discovery in databases 2019 3 muhammed fatih baln abubakar 
abid and james zou concrete autoencoders differentiable feature selection and reconstruction in international conference on machine learning 2019 in general i think this paper is relatively interesting and i was initially interested by the title of this paper i would like to increase my score if the authors could give convincing responses to the previous commentsquestions in the weaknesses part thanks docsepfor unsupervised feature selection this paper proposes a twostage secondorder method knowledge contrastive distillation is also incorporated for feature learning experiments on various datasets validate the effectiveness for unsupervised feature selection this paper proposes a twostage secondorder method knowledge contrastive distillation is also incorporated for feature learning experiments on various datasets validate the effectiveness pros 1 the overall presentation and organization are good 2 the presented results are good on these related datasets cons 1 my main concern lies in the experiments ndfs achieves the best results among the compared baselines however ndfs is published in aaai 2012 which is 9 years ago the improvement of the proposed method over ndfs is also marginal on several datasets besides the difference among these baselines is also marginal though the authors also compare with two recent methods including cae and inffs their performance is even lower than ndfs since they adopted different datasets in their original papers therefore the experimental results are not convincing 2 the novelty is not satisfying all these modules including the secondorder information have been well studied in the area of unsupervised learning the proposed method is a combination of these methods the perspective for unsupervised feature selection by feature relationship learning and graph segmentation is not new from my point of view 3 for the clustering task many methods have already achieved much better results on coil20 orl and even larger datasets including cifar10 and cifar100 4 the computational complexity is very expensive which is much higher than many compared methods 5 this paper is rejected by neurips 2021 and i happened to review this paper several months ago the authors do not incorporate any suggestions in this resubmitted manuscript the main body is totally the same and only one dataset for evaluation is changed this paper proposes a rather complex and expensive new method for unsupervised feature selection that performs marginally better than the baselines many quite old on the set of datasets they have chosen to present in the paper the exposition could also be clearer but the idea seems more complex than it is interesting in all the paper does not rise to the standards of significance and novelty of iclr docsepthe authors propose a twostep unsupervised feature selection algorithm contraru to classic fs approaches the authors try to infer important relations between pairs of features in the final step a graph segmentation approach is used to select the most important features experimental results show promising results overall i think it is an interesting approach to unsupervised feature selection although the computational cost both spatial and temporal is very high a correlation matrix is used the idea of searching for strong feature correlations is interesting however i have some concerns regarding the contribution 1 some of the decisions are not clearly stated for instance in the graph construction why the authors delete the las 10 features why they set to zero all 
values smaller than the median in my opinion the authors should include an ablation study that would shed some light on all those decisions 2 the same idea should be applied to the pseudo label generation and the evaluation metrics i expect to see how this algorithm behaves when using a more complex approach than kmeans like dec 1 or imsat 2 for instance 3 regarding the problem of the accuracy drop whenever they increase the selected features 20 or higher i wonder if there could be a different graph segmentation algorithm that could prevent it 1 xie j girshick r farhadi a 2016 june unsupervised deep embedding for clustering analysis in international conference on machine learning pp 478487 pmlr 2 hu w miyato t tokui s matsumoto e sugiyama m 2017 july learning discrete representations via information maximizing selfaugmented training in international conference on machine learning pp 15581567 pmlr overall i think it is an interesting idea with promising results but more experiments have to be done to clearly state the performance of the algorithm as well as a clear reasoning behind all the decisions the authors made in it
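The reviews above keep coming back to pseudo labels obtained with PCA and k-means and to judging a selected feature subset by clustering quality. The snippet below is a minimal sketch of that generic pipeline only; it is not the paper's implementation, and the synthetic data, cluster count, and the `selected` index set are placeholder assumptions.

```python
# Minimal sketch (not the paper's code): PCA + k-means pseudo labels, then
# scoring a selected feature subset by clustering agreement (NMI).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans
from sklearn.metrics import normalized_mutual_info_score

def pca_kmeans_pseudo_labels(X, n_clusters, n_components=32, seed=0):
    """Reduce with PCA, then cluster with k-means to obtain pseudo labels."""
    Z = PCA(n_components=min(n_components, X.shape[1]),
            random_state=seed).fit_transform(X)
    return KMeans(n_clusters=n_clusters, n_init=10,
                  random_state=seed).fit_predict(Z)

def clustering_score_of_subset(X, selected_idx, reference_labels,
                               n_clusters, seed=0):
    """Cluster only the selected features and compare against reference labels."""
    pred = KMeans(n_clusters=n_clusters, n_init=10,
                  random_state=seed).fit_predict(X[:, selected_idx])
    return normalized_mutual_info_score(reference_labels, pred)

# usage with synthetic data; `selected` would come from the selection method
X = np.random.randn(200, 50)
pseudo = pca_kmeans_pseudo_labels(X, n_clusters=5)
selected = np.arange(10)  # hypothetical subset of 10 selected features
print(clustering_score_of_subset(X, selected, pseudo, n_clusters=5))
```

A more involved evaluation, as one review suggests, would swap the k-means step for DEC or IMSAT style clustering, but the overall scoring scheme stays the same.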
### Summary:
|
this paper proposes a new two stage secondorder unsupervised feature selection method via knowledge contrastive distillation in the first stage a sparse attention matrix that represents second order statistics is learned in the second stage a relational graph based on the learned attention matrix is constructed to perform graph segmentation for feature selection this proposed method contains some new and interesting ideas and is novel in the unsupervised feature selection setting though some components such as the second order affinity matrix are not totally new the proposed method is technically sound the authors compared their method with 10 methods including several recent deep methods on 12 datasets and demonstrated consistent improvements however there are some concerns from the reviewers even after the discussion phase 1 the computational efficiency of the proposed method seems to be low since one goal of feature selection is to speed up downstream tasks the efficiency of feature selection itself should also be considered i suggest the authors analyze the computational bottleneck of the proposed method and improve the efficiency 2 more ablation studies can be added to illustrate how the proposed method removes the redundancy issues of the selected features 3 some metrics like supervised classification accuracy can be potentially used as a metric though supervised classification is impossible in the unsupervised learning setting running the experiments on some datasets that have labels by pretending having no labels is one way to evaluate the method overall the paper provides some new and interesting ideas however given the above concerns the novelty and significance of the paper will degenerate although we think the paper is not ready for iclr in this round we believe that the paper would be a strong one if the concerns can be well addressed
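As a rough illustration of the recipe summarized above (a second-order feature affinity followed by graph segmentation, keeping one representative feature per segment), the sketch below uses plain correlation as the affinity and off-the-shelf spectral clustering; the learned sparse attention matrix, the knowledge contrastive distillation, and all hyperparameters of the actual method are not reproduced here and the data is a placeholder.

```python
# Hedged sketch of graph-segmentation-based feature selection: build a
# second-order (feature x feature) affinity, partition the feature graph,
# and keep one representative feature per segment.
import numpy as np
from sklearn.cluster import SpectralClustering

def select_by_graph_segmentation(X, n_select, seed=0):
    # second-order statistics: absolute feature-feature correlation as affinity
    A = np.abs(np.corrcoef(X, rowvar=False))
    np.fill_diagonal(A, 0.0)
    # sparsify: zero out entries below the median, as the reviews describe
    A[A < np.median(A)] = 0.0
    labels = SpectralClustering(n_clusters=n_select, affinity="precomputed",
                                random_state=seed).fit_predict(A)
    selected = []
    for c in range(n_select):
        idx = np.where(labels == c)[0]
        if idx.size == 0:
            continue
        # representative: feature with the largest within-segment degree
        degree = A[np.ix_(idx, idx)].sum(axis=1)
        selected.append(idx[int(np.argmax(degree))])
    return np.array(sorted(selected))

X = np.random.randn(300, 40)  # placeholder data
print(select_by_graph_segmentation(X, n_select=8))
```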
|
[
39296,
2523,
273,
253,
2801,
3169,
4735,
5438,
3082,
281,
18915,
436,
253,
4477,
14859,
253,
1273,
2621,
4735,
26677,
4315,
285,
4081,
247,
2500,
493,
486,
7792,
1690,
4735,
5921,
4315,
3066,
3640,
4499,
422,
940,
21755,
285,
4735,
5438,
327,
253,
34741,
5921,
4315,
3066,
4216,
26405,
9470,
4679,
5183,
253,
12510,
273,
253,
4081,
3082,
50276,
783,
16038,
310,
973,
5611,
342,
47386,
6667,
253,
4216,
26405,
3169,
7792,
310,
4460,
285,
4722,
281,
479,
534,
310,
1027,
432,
253,
17068,
2801,
3169,
4735,
5438,
253,
11727,
273,
3640,
4499,
422,
940,
21755,
310,
40452,
3590,
50276,
262,
310,
5125,
326,
253,
4477,
651,
751,
281,
513,
1529,
28913,
1263,
1293,
3640,
4499,
422,
940,
21755,
3738,
253,
4477,
2530,
253,
47386,
6667,
275,
4677,
337,
352,
651,
320,
1805,
281,
7568,
253,
4081,
1332,
1057,
417,
11089,
432,
436,
2523,
327,
253,
1072,
15302,
275,
253,
5661,
629,
50276,
7152,
33032,
2520,
2929,
29328,
247,
2500,
493,
486,
1273,
2621,
440,
35421,
4735,
5438,
3066,
3640,
4499,
422,
940,
21755,
1566,
326,
31167,
253,
1273,
2621,
26677,
4315,
342,
253,
806,
2621,
941,
4315,
323,
440,
35421,
4735,
5438,
247,
23507,
4116,
4315,
326,
476,
1957,
1273,
2621,
2493,
875,
3386,
310,
6311,
275,
253,
806,
3924,
247,
38524,
4216,
1754,
327,
253,
6311,
4116,
4315,
310,
6311,
281,
1347,
4216,
26405,
323,
4735,
5438,
20544,
337,
253,
4081,
1332,
310,
4722,
374,
5661,
1543,
403,
1175,
32213,
337,
690,
27228,
403,
417,
2590,
2217,
1903,
253,
806,
1386,
273,
3239,
577,
2139,
253,
85,
312,
310,
6726,
281,
320,
13123,
1249,
2139,
253,
34741,
4315,
476,
320,
2931,
407,
16186,
19,
374,
1580,
253,
305,
14340,
3733,
1232,
15771,
327,
17927,
13301,
849,
281,
5416,
253,
13367,
273,
253,
26986,
2158,
1543,
2797,
407,
268,
6357,
285,
465,
30799,
4722,
2934,
533,
690,
12744,
27228,
285,
42852,
5474,
33032,
24594,
5438,
11355,
1029,
6967,
941,
407,
12488,
28535,
6286,
27096,
3386,
436,
2929,
4081,
271,
440,
35421,
4735,
5438,
1332,
2602,
407,
16248,
253,
1491,
432,
253,
1273,
2621,
26677,
4315,
342,
253,
806,
2621,
941,
4315,
407,
16774,
4679,
253,
4477,
5183,
253,
12510,
273,
253,
4081,
1332,
50276,
296,
3755,
20556,
50276,
2520,
2929,
4081,
271,
4722,
4735,
5438,
5933,
534,
19732,
2961,
253,
1491,
432,
253,
1273,
2621,
26677,
4315,
342,
253,
806,
2621,
941,
4315,
327,
16774,
4679,
253,
4081,
1332,
2692,
8936,
3045,
689,
2067,
1666,
25379,
281,
690,
9017,
50275,
20881,
1255,
265,
50276,
1568,
403,
690,
273,
619,
5701,
34974,
891,
651,
11435,
352,
604,
253,
4477,
812,
1918,
247,
2380,
604,
891,
717,
3430,
4496,
3451,
479,
6701,
50275,
783,
4477,
4767,
326,
954,
273,
841,
3082,
4647,
253,
4872,
4735,
5438,
12624,
285,
3609,
253,
8612,
3386,
407,
19947,
616,
4735,
2801,
4972,
824,
5871,
1555,
253,
4735,
873,
10939,
285,
1891,
281,
18915,
253,
2570,
1029,
2621,
2954,
1223,
347,
2080,
347,
891,
871,
253,
1566,
275,
495,
2686,
476,
21349,
22059,
253,
2570,
4735,
7688,
285,
253,
3210,
275,
337,
285,
495,
403,
1097,
12112,
281,
14561,
4735,
5438,
594,
752,
403,
253,
7364,
273,
5368,
3082,
326,
253,
2929,
369,
2820,
281,
2953,
50275,
783,
4477,
5183,
253,
12510,
273,
253,
4081,
1332,
949,
16774,
4679,
4390,
627,
403,
1142,
747,
16936,
275,
4735,
5438,
323,
1650,
337,
285,
374,
891,
8344,
253,
4477,
11106,
337,
533,
2139,
417,
7277,
2602,
281,
625,
14561,
4735,
5438,
3210,
824,
347,
247,
23172,
604,
594,
253,
12510,
273,
253,
4081,
1332,
588,
320,
4942,
21414,
50275,
681,
48434,
2829,
374,
342,
4677,
495,
352,
310,
2540,
326,
253,
3045,
273,
2602,
310,
417,
594,
6474,
347,
643,
3082,
417,
760,
327,
298,
30366,
347,
253,
4477,
8042,
562,
594,
891,
11907,
253,
4477,
1537,
7277,
253,
7882,
273,
4236,
3386,
432,
2602,
342,
8245,
14023,
323,
1650,
513,
7882,
1783,
323,
4236,
3386,
432,
1027,
3082,
1078,
436,
2929,
310,
3863,
275,
20803,
27691,
390,
24331,
50275,
4064,
2829,
577,
275,
24864,
2144,
50276,
262,
3133,
326,
2602,
556,
642,
4755,
5750,
689,
643,
1666,
25379,
275,
2426,
273,
6733,
36908,
352,
604,
594,
6733,
556,
642,
5750,
285,
3045,
310,
417,
1512,
6474,
840,
752,
310,
253,
4096,
273,
2602,
812,
253,
4477,
19148,
390,
5513,
253,
16038,
625,
891,
1007,
3579,
281,
4854,
326,
6701,
50275,
29483,
595,
436,
2929,
310,
3839,
973,
3542,
533,
690,
5053,
690,
3374,
275,
436,
2929,
943,
320,
2007,
31637,
20188,
50272,
783,
8266,
323,
253,
13691,
275,
4677,
495,
310,
1512,
1355,
50272,
783,
1072,
2120,
1416,
285,
31931,
2492,
5420,
2067,
2069,
275,
253,
2929,
50272,
10383,
253,
10336,
273,
908,
1666,
25379,
253,
4477,
4767,
326,
323,
28669,
3671,
260,
3348,
285,
275,
567,
84,
359,
897,
4284,
7533,
2530,
275,
616,
13279,
1505,
11646,
2299,
352,
310,
417,
594,
2590,
323,
1650,
323,
260,
3348,
1097,
4872,
285,
14561,
5289,
403,
5469,
275,
495,
534,
581,
858,
253,
4477,
273,
436,
2929,
897,
50266,
18,
15761,
1162,
355,
6753,
36465,
11797,
440,
35421,
4735,
5438,
275,
5213,
8059,
327,
913,
26202,
982,
6519,
285,
2625,
5162,
4765,
50272,
19,
513,
21118,
1162,
355,
639,
79,
6932,
4735,
5438,
275,
6036,
19454,
266,
8059,
327,
5145,
4715,
285,
3640,
8900,
275,
16634,
6247,
50270,
20,
12910,
3964,
1314,
4688,
6356,
4273,
79,
490,
538,
518,
274,
490,
301,
285,
480,
1443,
1182,
276,
11859,
6753,
2083,
351,
398,
46350,
4735,
5438,
285,
14433,
275,
5213,
8059,
327,
5145,
4715,
6247,
275,
2087,
891,
1158,
436,
2929,
310,
4942,
4722,
285,
891,
369,
8523,
6110,
407,
253,
4060,
273,
436,
2929,
891,
651,
751,
281,
2572,
619,
4868,
604,
253,
4477,
812,
1918,
21414,
6128,
281,
253,
2045,
5701,
34974,
275,
253,
32213,
629,
6701,
5474,
33032,
1542,
440,
35421,
4735,
5438,
436,
2929,
29328,
247,
2500,
493,
486,
1273,
2621,
1332,
3640,
4499,
422,
940,
21755,
310,
671,
11217,
323,
4735,
4715,
4679,
327,
2710,
15302,
17813,
253,
12510,
50275,
1542,
440,
35421,
4735,
5438,
436,
2929,
29328,
247,
2500,
493,
486,
1273,
2621,
1332,
3640,
4499,
422,
940,
21755,
310,
671,
11217,
323,
4735,
4715,
4679,
327,
2710,
15302,
17813,
253,
12510,
50276,
856,
84,
50276,
18,
253,
4583,
9759,
285,
6003,
403,
1175,
50276,
19,
253,
3559,
1543,
403,
1175,
327,
841,
2905,
15302,
50276,
5040,
50276,
18,
619,
2022,
4468,
8696,
275,
253,
4679,
40515,
3671,
33526,
253,
1682,
1543,
2190,
253,
2429,
1666,
25379,
2299,
40515,
3671,
310,
3863,
275,
39951,
2284,
4050,
534,
310,
898,
1107,
3622,
253,
7756,
273,
253,
4081,
1332,
689,
40515,
3671,
310,
671,
16888,
327,
2067,
15302,
16280,
253,
3064,
2190,
841,
1666,
25379,
310,
671,
16888,
2167,
253,
4477,
671,
7277,
342,
767,
3332,
3082,
1690,
260,
3348,
285,
275,
567,
84,
616,
3045,
310,
1014,
2406,
685,
40515,
3671,
1580,
597,
8671,
1027,
15302,
275,
616,
3236,
9380,
50276,
45230,
253,
5661,
1543,
403,
417,
21414,
50275,
19,
253,
38135,
310,
417,
14127,
512,
841,
11911,
1690,
253,
1273,
2621,
1491,
452,
644,
973,
5421,
275,
253,
2170,
273,
440,
35421,
4715,
253,
4081,
1332,
310,
247,
5019,
273,
841,
3082,
253,
8668,
323,
440,
35421,
4735,
5438,
407,
4735,
2954,
4715,
285,
4216,
26405,
310,
417,
747,
432,
619,
1127,
273,
1859,
50276,
20,
323,
253,
17524,
4836,
1142,
3082,
452,
2168,
6786,
1199,
1805,
1543,
327,
18783,
938,
390,
77,
285,
1014,
4067,
15302,
1690,
260,
338,
274,
740,
285,
260,
338,
274,
2313,
50276,
21,
253,
15180,
10454,
310,
1077,
8214,
534,
310,
1199,
2169,
685,
1142,
2429,
3082,
50276,
22,
436,
2929,
310,
10945,
407,
5723,
2824,
43425,
285,
891,
4592,
281,
2278,
436,
2929,
2067,
2607,
3622,
253,
4477,
513,
417,
19071,
667,
13991,
275,
436,
501,
538,
3004,
7714,
253,
2022,
2133,
310,
9106,
253,
1072,
285,
760,
581,
10895,
323,
7103,
310,
4391,
50275,
2520,
2929,
29328,
247,
2581,
2570,
285,
8214,
747,
1332,
323,
440,
35421,
4735,
5438,
326,
17923,
42876,
1805,
685,
253,
1666,
25379,
1142,
3240,
1711,
327,
253,
873,
273,
15302,
597,
452,
6777,
281,
1246,
275,
253,
2929,
253,
47284,
812,
671,
320,
30909,
533,
253,
2934,
3133,
625,
2570,
685,
352,
310,
4722,
275,
512,
253,
2929,
1057,
417,
6054,
281,
253,
7465,
273,
8453,
285,
38135,
273,
17857,
32888,
5474,
339,
431,
248,
4477,
12661,
247,
2500,
493,
554,
440,
35421,
4735,
5438,
5933,
2916,
29883,
281,
10610,
25290,
7274,
253,
4477,
1611,
281,
9441,
1774,
2493,
875,
8557,
273,
3386,
275,
253,
2457,
3213,
247,
4216,
26405,
2746,
310,
908,
281,
3609,
253,
954,
1774,
3386,
5661,
1543,
921,
12532,
1543,
4583,
891,
1158,
352,
310,
271,
4722,
2746,
281,
440,
35421,
4735,
5438,
3738,
253,
15180,
2105,
1097,
8820,
285,
11935,
310,
1077,
1029,
247,
5921,
4315,
310,
908,
253,
2934,
273,
12203,
323,
2266,
4735,
13007,
310,
4722,
2299,
891,
452,
690,
7350,
5001,
253,
7680,
50276,
18,
690,
273,
253,
7089,
403,
417,
4518,
4767,
323,
4227,
275,
253,
4216,
5140,
2139,
253,
4477,
11352,
253,
4358,
884,
3386,
2139,
597,
873,
281,
5058,
512,
2193,
4577,
685,
253,
8876,
275,
619,
4743,
253,
4477,
943,
2486,
271,
28913,
1263,
326,
651,
17914,
690,
1708,
689,
512,
1110,
7089,
50276,
19,
253,
1072,
2934,
943,
320,
3732,
281,
253,
17927,
5203,
5978,
285,
253,
7103,
17082,
891,
1902,
281,
923,
849,
436,
5933,
37824,
672,
970,
247,
625,
2570,
2746,
685,
465,
30799,
751,
1086,
337,
390,
516,
22354,
374,
323,
4227,
50276,
20,
5001,
281,
253,
1895,
273,
253,
7200,
5926,
10793,
597,
2572,
253,
4236,
3386,
1384,
390,
2169,
891,
4282,
604,
627,
812,
320,
247,
1027,
4216,
26405,
5933,
326,
812,
3657,
352,
50275,
18,
1269,
466,
480,
48496,
1200,
781,
391,
50276,
14103,
10178,
74,
247,
4022,
480,
2517,
440,
35421,
3676,
21496,
323,
17524,
1783,
275,
5213,
8059,
327,
5145,
4715,
7266,
42035,
30910,
268,
1686,
83,
50276,
19,
30287,
259,
3641,
90,
4611,
246,
18734,
4113,
256,
1111,
2204,
4881,
299,
50276,
84,
814,
14059,
2902,
278,
4240,
480,
2988,
4715,
13358,
14237,
3066,
1491,
46875,
1881,
2321,
16390,
3733,
275,
5213,
8059,
327,
5145,
4715,
7266,
1458,
3680,
1010,
2251,
268,
1686,
83,
4583,
891,
1158,
352,
310,
271,
4722,
2934,
342,
12532,
1543,
533,
625,
4679,
452,
281,
320,
2218,
281,
4518,
1375,
253,
3045,
273,
253,
5933,
347,
973,
347,
247,
2590,
14720,
1602,
272,
512,
253,
7089,
253,
4477,
1160,
275,
352,
2490,
187,
4118,
18435,
27,
2520,
2929,
29328,
247,
747,
767,
3924,
1273,
2621,
440,
35421,
4735,
5438,
1332,
3066,
3640,
4499,
422,
940,
21755,
275,
253,
806,
3924,
247,
23507,
4116,
4315,
326,
6125,
1273,
1340,
9990,
310,
6311,
275,
253,
1273,
3924,
247,
38524,
4216,
1754,
327,
253,
6311,
4116,
4315,
310,
8818,
281,
1347,
4216,
26405,
323,
4735,
5438,
50275,
2520,
4081,
1332,
4428,
690,
747,
285,
4722,
5697,
285,
310,
4460,
275,
253,
440,
35421,
4735,
5438,
4758,
2167,
690,
4295,
824,
347,
253,
1273,
1340,
15430,
4315,
403,
417,
9106,
747,
253,
4081,
1332,
310,
22335,
3590,
253,
4477,
2429,
616,
1332,
342,
884,
3082,
1690,
2067,
3332,
3676,
3082,
327,
1249,
15302,
285,
5183,
5185,
11701,
50275,
35529,
627,
403,
690,
7350,
432,
253,
30628,
1014,
846,
253,
5955,
3408,
337,
253,
15180,
6733,
273,
253,
4081,
1332,
3133,
281,
320,
1698,
1580,
581,
4736,
273,
4735,
5438,
310,
281,
3885,
598,
15450,
8892,
253,
6733,
273,
4735,
5438,
3139,
943,
671,
320,
2783,
891,
1804,
253,
4477,
12106,
253,
15180,
3673,
44856,
273,
253,
4081,
1332,
285,
3157,
253,
6733,
374,
625,
28913,
2175,
476,
320,
2879,
281,
17093,
849,
253,
4081,
1332,
26586,
253,
39296,
3374,
273,
253,
4236,
3386,
495,
690,
17082,
751,
22296,
9162,
7200,
476,
320,
7826,
908,
347,
247,
7982,
2167,
22296,
9162,
310,
7479,
275,
253,
440,
35421,
4715,
4758,
3515,
253,
4679,
327,
690,
15302,
326,
452,
13301,
407,
34452,
1907,
327,
5203,
310,
581,
1039,
281,
7472,
253,
1332,
50276,
1189,
455,
253,
2929,
3400,
690,
747,
285,
4722,
5697,
2299,
1677,
253,
1840,
7350,
253,
38135,
285,
8453,
273,
253,
2929,
588,
29458,
3738,
359,
1158,
253,
2929,
310,
417,
4704,
323,
17857,
32888,
275,
436,
3790,
359,
2868,
326,
253,
2929,
651,
320,
247,
2266,
581,
604,
253,
7350,
476,
320,
973,
9713
] |
[
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1
] |
[
39296,
2523,
273,
253,
2801,
3169,
4735,
5438,
3082,
281,
18915,
436,
253,
4477,
14859,
253,
1273,
2621,
4735,
26677,
4315,
285,
4081,
247,
2500,
493,
486,
7792,
1690,
4735,
5921,
4315,
3066,
3640,
4499,
422,
940,
21755,
285,
4735,
5438,
327,
253,
34741,
5921,
4315,
3066,
4216,
26405,
9470,
4679,
5183,
253,
12510,
273,
253,
4081,
3082,
50276,
783,
16038,
310,
973,
5611,
342,
47386,
6667,
253,
4216,
26405,
3169,
7792,
310,
4460,
285,
4722,
281,
479,
534,
310,
1027,
432,
253,
17068,
2801,
3169,
4735,
5438,
253,
11727,
273,
3640,
4499,
422,
940,
21755,
310,
40452,
3590,
50276,
262,
310,
5125,
326,
253,
4477,
651,
751,
281,
513,
1529,
28913,
1263,
1293,
3640,
4499,
422,
940,
21755,
3738,
253,
4477,
2530,
253,
47386,
6667,
275,
4677,
337,
352,
651,
320,
1805,
281,
7568,
253,
4081,
1332,
1057,
417,
11089,
432,
436,
2523,
327,
253,
1072,
15302,
275,
253,
5661,
629,
50276,
7152,
33032,
2520,
2929,
29328,
247,
2500,
493,
486,
1273,
2621,
440,
35421,
4735,
5438,
3066,
3640,
4499,
422,
940,
21755,
1566,
326,
31167,
253,
1273,
2621,
26677,
4315,
342,
253,
806,
2621,
941,
4315,
323,
440,
35421,
4735,
5438,
247,
23507,
4116,
4315,
326,
476,
1957,
1273,
2621,
2493,
875,
3386,
310,
6311,
275,
253,
806,
3924,
247,
38524,
4216,
1754,
327,
253,
6311,
4116,
4315,
310,
6311,
281,
1347,
4216,
26405,
323,
4735,
5438,
20544,
337,
253,
4081,
1332,
310,
4722,
374,
5661,
1543,
403,
1175,
32213,
337,
690,
27228,
403,
417,
2590,
2217,
1903,
253,
806,
1386,
273,
3239,
577,
2139,
253,
85,
312,
310,
6726,
281,
320,
13123,
1249,
2139,
253,
34741,
4315,
476,
320,
2931,
407,
16186,
19,
374,
1580,
253,
305,
14340,
3733,
1232,
15771,
327,
17927,
13301,
849,
281,
5416,
253,
13367,
273,
253,
26986,
2158,
1543,
2797,
407,
268,
6357,
285,
465,
30799,
4722,
2934,
533,
690,
12744,
27228,
285,
42852,
5474,
33032,
24594,
5438,
11355,
1029,
6967,
941,
407,
12488,
28535,
6286,
27096,
3386,
436,
2929,
4081,
271,
440,
35421,
4735,
5438,
1332,
2602,
407,
16248,
253,
1491,
432,
253,
1273,
2621,
26677,
4315,
342,
253,
806,
2621,
941,
4315,
407,
16774,
4679,
253,
4477,
5183,
253,
12510,
273,
253,
4081,
1332,
50276,
296,
3755,
20556,
50276,
2520,
2929,
4081,
271,
4722,
4735,
5438,
5933,
534,
19732,
2961,
253,
1491,
432,
253,
1273,
2621,
26677,
4315,
342,
253,
806,
2621,
941,
4315,
327,
16774,
4679,
253,
4081,
1332,
2692,
8936,
3045,
689,
2067,
1666,
25379,
281,
690,
9017,
50275,
20881,
1255,
265,
50276,
1568,
403,
690,
273,
619,
5701,
34974,
891,
651,
11435,
352,
604,
253,
4477,
812,
1918,
247,
2380,
604,
891,
717,
3430,
4496,
3451,
479,
6701,
50275,
783,
4477,
4767,
326,
954,
273,
841,
3082,
4647,
253,
4872,
4735,
5438,
12624,
285,
3609,
253,
8612,
3386,
407,
19947,
616,
4735,
2801,
4972,
824,
5871,
1555,
253,
4735,
873,
10939,
285,
1891,
281,
18915,
253,
2570,
1029,
2621,
2954,
1223,
347,
2080,
347,
891,
871,
253,
1566,
275,
495,
2686,
476,
21349,
22059,
253,
2570,
4735,
7688,
285,
253,
3210,
275,
337,
285,
495,
403,
1097,
12112,
281,
14561,
4735,
5438,
594,
752,
403,
253,
7364,
273,
5368,
3082,
326,
253,
2929,
369,
2820,
281,
2953,
50275,
783,
4477,
5183,
253,
12510,
273,
253,
4081,
1332,
949,
16774,
4679,
4390,
627,
403,
1142,
747,
16936,
275,
4735,
5438,
323,
1650,
337,
285,
374,
891,
8344,
253,
4477,
11106,
337,
533,
2139,
417,
7277,
2602,
281,
625,
14561,
4735,
5438,
3210,
824,
347,
247,
23172,
604,
594,
253,
12510,
273,
253,
4081,
1332,
588,
320,
4942,
21414,
50275,
681,
48434,
2829,
374,
342,
4677,
495,
352,
310,
2540,
326,
253,
3045,
273,
2602,
310,
417,
594,
6474,
347,
643,
3082,
417,
760,
327,
298,
30366,
347,
253,
4477,
8042,
562,
594,
891,
11907,
253,
4477,
1537,
7277,
253,
7882,
273,
4236,
3386,
432,
2602,
342,
8245,
14023,
323,
1650,
513,
7882,
1783,
323,
4236,
3386,
432,
1027,
3082,
1078,
436,
2929,
310,
3863,
275,
20803,
27691,
390,
24331,
50275,
4064,
2829,
577,
275,
24864,
2144,
50276,
262,
3133,
326,
2602,
556,
642,
4755,
5750,
689,
643,
1666,
25379,
275,
2426,
273,
6733,
36908,
352,
604,
594,
6733,
556,
642,
5750,
285,
3045,
310,
417,
1512,
6474,
840,
752,
310,
253,
4096,
273,
2602,
812,
253,
4477,
19148,
390,
5513,
253,
16038,
625,
891,
1007,
3579,
281,
4854,
326,
6701,
50275,
29483,
595,
436,
2929,
310,
3839,
973,
3542,
533,
690,
5053,
690,
3374,
275,
436,
2929,
943,
320,
2007,
31637,
20188,
50272,
783,
8266,
323,
253,
13691,
275,
4677,
495,
310,
1512,
1355,
50272,
783,
1072,
2120,
1416,
285,
31931,
2492,
5420,
2067,
2069,
275,
253,
2929,
50272,
10383,
253,
10336,
273,
908,
1666,
25379,
253,
4477,
4767,
326,
323,
28669,
3671,
260,
3348,
285,
275,
567,
84,
359,
897,
4284,
7533,
2530,
275,
616,
13279,
1505,
11646,
2299,
352,
310,
417,
594,
2590,
323,
1650,
323,
260,
3348,
1097,
4872,
285,
14561,
5289,
403,
5469,
275,
495,
534,
581,
858,
253,
4477,
273,
436,
2929,
897,
50266,
18,
15761,
1162,
355,
6753,
36465,
11797,
440,
35421,
4735,
5438,
275,
5213,
8059,
327,
913,
26202,
982,
6519,
285,
2625,
5162,
4765,
50272,
19,
513,
21118,
1162,
355,
639,
79,
6932,
4735,
5438,
275,
6036,
19454,
266,
8059,
327,
5145,
4715,
285,
3640,
8900,
275,
16634,
6247,
50270,
20,
12910,
3964,
1314,
4688,
6356,
4273,
79,
490,
538,
518,
274,
490,
301,
285,
480,
1443,
1182,
276,
11859,
6753,
2083,
351,
398,
46350,
4735,
5438,
285,
14433,
275,
5213,
8059,
327,
5145,
4715,
6247,
275,
2087,
891,
1158,
436,
2929,
310,
4942,
4722,
285,
891,
369,
8523,
6110,
407,
253,
4060,
273,
436,
2929,
891,
651,
751,
281,
2572,
619,
4868,
604,
253,
4477,
812,
1918,
21414,
6128,
281,
253,
2045,
5701,
34974,
275,
253,
32213,
629,
6701,
5474,
33032,
1542,
440,
35421,
4735,
5438,
436,
2929,
29328,
247,
2500,
493,
486,
1273,
2621,
1332,
3640,
4499,
422,
940,
21755,
310,
671,
11217,
323,
4735,
4715,
4679,
327,
2710,
15302,
17813,
253,
12510,
50275,
1542,
440,
35421,
4735,
5438,
436,
2929,
29328,
247,
2500,
493,
486,
1273,
2621,
1332,
3640,
4499,
422,
940,
21755,
310,
671,
11217,
323,
4735,
4715,
4679,
327,
2710,
15302,
17813,
253,
12510,
50276,
856,
84,
50276,
18,
253,
4583,
9759,
285,
6003,
403,
1175,
50276,
19,
253,
3559,
1543,
403,
1175,
327,
841,
2905,
15302,
50276,
5040,
50276,
18,
619,
2022,
4468,
8696,
275,
253,
4679,
40515,
3671,
33526,
253,
1682,
1543,
2190,
253,
2429,
1666,
25379,
2299,
40515,
3671,
310,
3863,
275,
39951,
2284,
4050,
534,
310,
898,
1107,
3622,
253,
7756,
273,
253,
4081,
1332,
689,
40515,
3671,
310,
671,
16888,
327,
2067,
15302,
16280,
253,
3064,
2190,
841,
1666,
25379,
310,
671,
16888,
2167,
253,
4477,
671,
7277,
342,
767,
3332,
3082,
1690,
260,
3348,
285,
275,
567,
84,
616,
3045,
310,
1014,
2406,
685,
40515,
3671,
1580,
597,
8671,
1027,
15302,
275,
616,
3236,
9380,
50276,
45230,
253,
5661,
1543,
403,
417,
21414,
50275,
19,
253,
38135,
310,
417,
14127,
512,
841,
11911,
1690,
253,
1273,
2621,
1491,
452,
644,
973,
5421,
275,
253,
2170,
273,
440,
35421,
4715,
253,
4081,
1332,
310,
247,
5019,
273,
841,
3082,
253,
8668,
323,
440,
35421,
4735,
5438,
407,
4735,
2954,
4715,
285,
4216,
26405,
310,
417,
747,
432,
619,
1127,
273,
1859,
50276,
20,
323,
253,
17524,
4836,
1142,
3082,
452,
2168,
6786,
1199,
1805,
1543,
327,
18783,
938,
390,
77,
285,
1014,
4067,
15302,
1690,
260,
338,
274,
740,
285,
260,
338,
274,
2313,
50276,
21,
253,
15180,
10454,
310,
1077,
8214,
534,
310,
1199,
2169,
685,
1142,
2429,
3082,
50276,
22,
436,
2929,
310,
10945,
407,
5723,
2824,
43425,
285,
891,
4592,
281,
2278,
436,
2929,
2067,
2607,
3622,
253,
4477,
513,
417,
19071,
667,
13991,
275,
436,
501,
538,
3004,
7714,
253,
2022,
2133,
310,
9106,
253,
1072,
285,
760,
581,
10895,
323,
7103,
310,
4391,
50275,
2520,
2929,
29328,
247,
2581,
2570,
285,
8214,
747,
1332,
323,
440,
35421,
4735,
5438,
326,
17923,
42876,
1805,
685,
253,
1666,
25379,
1142,
3240,
1711,
327,
253,
873,
273,
15302,
597,
452,
6777,
281,
1246,
275,
253,
2929,
253,
47284,
812,
671,
320,
30909,
533,
253,
2934,
3133,
625,
2570,
685,
352,
310,
4722,
275,
512,
253,
2929,
1057,
417,
6054,
281,
253,
7465,
273,
8453,
285,
38135,
273,
17857,
32888,
5474,
339,
431,
248,
4477,
12661,
247,
2500,
493,
554,
440,
35421,
4735,
5438,
5933,
2916,
29883,
281,
10610,
25290,
7274,
253,
4477,
1611,
281,
9441,
1774,
2493,
875,
8557,
273,
3386,
275,
253,
2457,
3213,
247,
4216,
26405,
2746,
310,
908,
281,
3609,
253,
954,
1774,
3386,
5661,
1543,
921,
12532,
1543,
4583,
891,
1158,
352,
310,
271,
4722,
2746,
281,
440,
35421,
4735,
5438,
3738,
253,
15180,
2105,
1097,
8820,
285,
11935,
310,
1077,
1029,
247,
5921,
4315,
310,
908,
253,
2934,
273,
12203,
323,
2266,
4735,
13007,
310,
4722,
2299,
891,
452,
690,
7350,
5001,
253,
7680,
50276,
18,
690,
273,
253,
7089,
403,
417,
4518,
4767,
323,
4227,
275,
253,
4216,
5140,
2139,
253,
4477,
11352,
253,
4358,
884,
3386,
2139,
597,
873,
281,
5058,
512,
2193,
4577,
685,
253,
8876,
275,
619,
4743,
253,
4477,
943,
2486,
271,
28913,
1263,
326,
651,
17914,
690,
1708,
689,
512,
1110,
7089,
50276,
19,
253,
1072,
2934,
943,
320,
3732,
281,
253,
17927,
5203,
5978,
285,
253,
7103,
17082,
891,
1902,
281,
923,
849,
436,
5933,
37824,
672,
970,
247,
625,
2570,
2746,
685,
465,
30799,
751,
1086,
337,
390,
516,
22354,
374,
323,
4227,
50276,
20,
5001,
281,
253,
1895,
273,
253,
7200,
5926,
10793,
597,
2572,
253,
4236,
3386,
1384,
390,
2169,
891,
4282,
604,
627,
812,
320,
247,
1027,
4216,
26405,
5933,
326,
812,
3657,
352,
50275,
18,
1269,
466,
480,
48496,
1200,
781,
391,
50276,
14103,
10178,
74,
247,
4022,
480,
2517,
440,
35421,
3676,
21496,
323,
17524,
1783,
275,
5213,
8059,
327,
5145,
4715,
7266,
42035,
30910,
268,
1686,
83,
50276,
19,
30287,
259,
3641,
90,
4611,
246,
18734,
4113,
256,
1111,
2204,
4881,
299,
50276,
84,
814,
14059,
2902,
278,
4240,
480,
2988,
4715,
13358,
14237,
3066,
1491,
46875,
1881,
2321,
16390,
3733,
275,
5213,
8059,
327,
5145,
4715,
7266,
1458,
3680,
1010,
2251,
268,
1686,
83,
4583,
891,
1158,
352,
310,
271,
4722,
2934,
342,
12532,
1543,
533,
625,
4679,
452,
281,
320,
2218,
281,
4518,
1375,
253,
3045,
273,
253,
5933,
347,
973,
347,
247,
2590,
14720,
1602,
272,
512,
253,
7089,
253,
4477,
1160,
275,
352,
2490,
187,
4118,
18435,
27,
2520,
2929,
29328,
247,
747,
767,
3924,
1273,
2621,
440,
35421,
4735,
5438,
1332,
3066,
3640,
4499,
422,
940,
21755,
275,
253,
806,
3924,
247,
23507,
4116,
4315,
326,
6125,
1273,
1340,
9990,
310,
6311,
275,
253,
1273,
3924,
247,
38524,
4216,
1754,
327,
253,
6311,
4116,
4315,
310,
8818,
281,
1347,
4216,
26405,
323,
4735,
5438,
50275,
2520,
4081,
1332,
4428,
690,
747,
285,
4722,
5697,
285,
310,
4460,
275,
253,
440,
35421,
4735,
5438,
4758,
2167,
690,
4295,
824,
347,
253,
1273,
1340,
15430,
4315,
403,
417,
9106,
747,
253,
4081,
1332,
310,
22335,
3590,
253,
4477,
2429,
616,
1332,
342,
884,
3082,
1690,
2067,
3332,
3676,
3082,
327,
1249,
15302,
285,
5183,
5185,
11701,
50275,
35529,
627,
403,
690,
7350,
432,
253,
30628,
1014,
846,
253,
5955,
3408,
337,
253,
15180,
6733,
273,
253,
4081,
1332,
3133,
281,
320,
1698,
1580,
581,
4736,
273,
4735,
5438,
310,
281,
3885,
598,
15450,
8892,
253,
6733,
273,
4735,
5438,
3139,
943,
671,
320,
2783,
891,
1804,
253,
4477,
12106,
253,
15180,
3673,
44856,
273,
253,
4081,
1332,
285,
3157,
253,
6733,
374,
625,
28913,
2175,
476,
320,
2879,
281,
17093,
849,
253,
4081,
1332,
26586,
253,
39296,
3374,
273,
253,
4236,
3386,
495,
690,
17082,
751,
22296,
9162,
7200,
476,
320,
7826,
908,
347,
247,
7982,
2167,
22296,
9162,
310,
7479,
275,
253,
440,
35421,
4715,
4758,
3515,
253,
4679,
327,
690,
15302,
326,
452,
13301,
407,
34452,
1907,
327,
5203,
310,
581,
1039,
281,
7472,
253,
1332,
50276,
1189,
455,
253,
2929,
3400,
690,
747,
285,
4722,
5697,
2299,
1677,
253,
1840,
7350,
253,
38135,
285,
8453,
273,
253,
2929,
588,
29458,
3738,
359,
1158,
253,
2929,
310,
417,
4704,
323,
17857,
32888,
275,
436,
3790,
359,
2868,
326,
253,
2929,
651,
320,
247,
2266,
581,
604,
253,
7350,
476,
320,
973,
9713
] |
Below is given review of a research paper from conference journal. Please write a summary of the review.
### Review:
the paper concerns a learning framework to predict set by formulating it as a conditional density estimation problem the approach relies on deep energybased models and predicts multiple plausible sets using gradientguided sampling the suggested method has been evaluated on a variety of set prediction problems with competitive performance compared to few set prediction baselines the idea of predicting set using energybased models seems to be interesting it is also important to predict but more than one plausible set reflecting modesample diversity in the underlying set distribution of the data however my major concerns about the submission are 1 few unsubstantiated claims a all over the text it is claimed that the proposed model can capture and learn multimodal set densities and i am not sure how this is possible with the current simplified sampling strategy which use a perturbation of gradient similar to cheap mcmc with sgd with no theoretical guarantee can capture this complex underlying distribution b it is also inaccurate to claim that multimodal distribution over sets cannot be learned directly from a set loss a sentence in introduction the definition of set loss is unclear in this context but the losses can be derived by modelling a set distribution possibly multimodal parametrically where the parameters of this distribution can be learned using deep neural network 2 diverse but simplified experiments and unclear evaluations a the experiments are diverse enough but some of the setups are very simplified eg generation of polygons digits experiments which considers a perfect input x and two numbers only these simplifications might not necessarily reflect the superiority of the proposed approach compared to the baselines when tested in a realworld problem clever for the task of object detection also seems to be very simple dataset but even with this the proposed approach does not seem to be superior compared to dspn under specific iou thresholds why is this the case b the suggested approach is claimed to generate multiple plausible sets are the samples derived from diverse modes or it is an importance sampling i cannot find out these from the samples in the figures provided in the appendix how are these samples weighted and which sample is used for the valuations docsep summary this paper introduces a two step process to learning models for set prediction tasks such as identifying locations of objects in images generating polygons or point clouds the first phase learns an energy model of the probability of a given set y given features x modeled using deep networks this allows to use the negative loglikelihood as a loss rather than assignment based losses which allows to model multiple plausible sets given a specific choice of features to learn this model the authors approximate the nll by sampling from data and when necessary generating synthetic sets y from the current model iteration in the set prediction phase the learned energy model is optimized using noisy sgd during the first few iterations in order to potentially sample from multiple different optima quality this paper is well motivated and the experiments clearly show the benefits of avoiding assignmentbased loss functions however some statements and design choices would in my opinion have benefited from deeper theoretical or empirical justifications my main question is regarding the use of the noise z in section 22 the noise is motivated by the discovery of multiple optima do the authors have any results showing that this is 
indeed the downstream effect of the noise eg showing a greater diversity in generated polygons noisy sgd is also known to help with nonconvex optimization problems regardless of identifying multiple minima clarity overall this paper was clear but several clarifications could be added to improve readability such as providing the mathematical form of the hungarian and chamfer losses and avoiding the use of z for both the partition function zx theta and the random variable sim mathcal n0 epsilon i i am also curious to know how the sets are encoded previous work eg belanger et al looked at tasks where the ground set is finite and the zeroone encoding of the set is made continuous for the purpose of the optimization then projected back into discrete space however given the chosen tasks in this paper it seems like the encodings are simply continuous coordinates with the cardinality being constrained is this correct finally is there a minus sign missing from the derivation at eq 4 originality my understanding is that this paper proposes two key contributions learning the conditional energy density by sampling and generating synthetic examples using noisy sgd steps for the first s iterations during prediction if my understanding is correct i would have like to see these two contributions explored in more detail the learning procedure seems similar to noise contrastive estimation but with a more subtle definition of noisy samples is this point of view correct if so did you explore other noise distributions for the synthetic samples as mentioned above i would also be curious to know if the noisy sgd during estimation leads to improved diverse mode sampling as well significance modeling distributions over sets with neural networks is as the authors point out a difficult problem due in part to permutation invariance the use of the nll under a distribution over sets as a loss function is an elegant way of avoiding this problem and generalizes approaches that already assume a more specific energy form the significance of this work lies in how this nll is optimized and how good samples are then drawn from the learned distribution however i believe that a more thorough investigation of how the proposed method compares to other methods to learn the energy density or optimize the sampling process would increase this papers impactdocsepthis paper poses set prediction as a conditional density estimation problem and subsequently develops an energybased model traininginference procedure the motivation for this approach is twofold existing approaches which impose structure via loss functions can induce model bias based on the metric chosen and existing approaches also induce bias in the sense that they cannot learn multimodal distributions of outputs when for example we may want to observe and rank various candidate sets for a given input these motivations clearly frame the development of the methods in this paper and the experiments do a good job recalling these motivations as the focus for comparison to previous approaches examples in multiple domains are shown to support the hypothesis that this papers approach performs better than existing approaches overall the writing is clear and the paper flows well i think overall the paper combines techniques from various fields namely energybased methods and set prediction that renders it useful for the community ideally i would have liked to have seen some theoretical results based on a simplified setup indicating more rigorously the ability of this approach in 
provably adapting to illconditioned metrics and learning multimodal distributions a variety of results for langevin mcmc exist that can be exploited to provide this analysis in its current form i think the paper is acceptable for submission and is relevant to the community but some more theoretical analysis will definitely add value to a portion of readers like myself some questionsfeedback for the authors i think there may be a negative sign missing from equation 4 also it may be useful to further explain why contrasting energy for real and synthesized examples makes sense i know this is a common setup for structured prediction problems but i think it is worth paying some lip service in the text furthermore i found it somewhat odd that equation 5 was framed as an instantiation of langevin mcmc perhaps it may be more useful to frame it as an instance of sgld but not equation 6 at least 6a this also introduces a natural follow up of whether doing hamiltonian monte carlo based methods makes any difference in mixing efficiency on the experimental setups finally table 1 was somewhat confusing the caption explains that chamfer is bad on polygons and hungarian is bad for digits but the numbers dont seem to follow that description maybe the rows or columns got switcheddocsepauthors propose a new method for formulating set prediction tasks they propose to use a noisy energybased model with langevin mcmc noisy startup as their model the can approximate the gradient of the likelihood function by computing the enery of ground truth pairs and energy of synthesized pairs where the target is sampled from the model distribution a major advantage of this formulation with contrastive gan style loss competing synthesized vs gt and improving synthesized over the previous related work which optimized a distance metric suitable for sets is that it works on a wider range of tasks experiments show that most common metrics hungarian and chamfer distance both fail in certain tasks where the energy based contrastive approach prevails apparently they only need 1 synthetic set to approximate the gradient of partition function which is surprising given this one expects their computation cost and training time to be on par with metric based methods unfortunately such analysis is currently missing one of their main contributions is advocating for adding gaussian noise to the first 80 of the steps they reason it is an effective way of covering multi modal scenarios they achieve impressive results on anomaly detection tasks which they attribute mainly to their multi modal abilities it would be interesting to have an ablation on the anomaly detection task with no start up noise in the mcmc algorithm another question that arises from their stochastic generation is regarding mnist autoencoding and clevr task it is not clear if they compute the results based on how many samples per example pros the paper is relatively well written they cover several different tasks in their experimental analysis the merits of optimizing for a distribution over the sets is explained well is intuitive and experiments shows that it works they introduce several novel ideas that can be significant in this line of research later on cons lack of computation cost training time inference time analysis it is not clear if this method scales to larger tasks with many more elements in the set for example they can tackle shapenet point cloud reconstruction as a larger scale dataset their datasets are toy scale in general but given that their method 
consistently outperforms previous work is not a major shortcoming nit the overloading of variable z is confusing on page 2 z is both the partitioning function and a random noise added to the transition function for smoothing the gradients post author response thank you for addressing my concern about complexity and adding fig 6 it seems that it is still better than baselines in terms of complexity
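The reviews repeatedly refer to gradient-guided set sampling with Gaussian noise injected during roughly the first 80% of steps. The PyTorch sketch below is a hedged illustration of that style of sampler for a generic learned energy; the step size, noise scale, set shape, and toy quadratic energy are assumptions for demonstration and not the paper's settings.

```python
# Hedged sketch: descend a learned energy E(x, y) in y, adding Gaussian noise
# during an initial "exploration" phase so different runs can reach different
# modes of the conditional set distribution.
import torch

def sample_set(energy_fn, x, set_size, dim, steps=100, step_size=1e-2,
               noise_std=1e-2, noisy_fraction=0.8):
    y = torch.randn(set_size, dim, requires_grad=True)
    for t in range(steps):
        e = energy_fn(x, y)
        grad, = torch.autograd.grad(e, y)
        with torch.no_grad():
            y -= step_size * grad
            if t < noisy_fraction * steps:  # noisy start-up phase
                y += noise_std * torch.randn_like(y)
    return y.detach()

# toy quadratic energy just to show the call signature
energy = lambda x, y: ((y - x) ** 2).sum()
print(sample_set(energy, x=torch.ones(1, 2), set_size=5, dim=2).shape)
```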
### Summary:
|
the paper proposes to predict sets using conditional density estimates the conditional densities of the response set given the observed features are modeled through an energy based function the energy function can be specified using tailored neural nets like deep sets and is trained through approximate negative log likelihoods using sampling the paper was nice to read and was liked by all the reviewers the one thing that stood out to me was the emphasis on multimodality multi appears 51 times this could be toned down because little is said about the quality relative to the true py x and the focus is mainly on the lack of this in existing work
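To make the "approximate negative log likelihood using sampling" concrete, here is a self-contained toy sketch of the contrastive signal: lower the energy on observed sets and raise it on sets synthesized from the current model (for example with the sampler sketched earlier). The tiny permutation-invariant energy network, optimizer, and random placeholder tensors are illustrative assumptions, not the paper's architecture.

```python
# Hedged sketch of one contrastive training step for a conditional set EBM.
import torch
import torch.nn as nn

class SetEnergy(nn.Module):
    """Toy conditional energy E_theta(x, y); a stand-in for the deep-set style
    networks the paper would actually use."""
    def __init__(self, x_dim, y_dim, hidden=64):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(x_dim + y_dim, hidden), nn.ReLU(),
                                 nn.Linear(hidden, 1))

    def forward(self, x, y):
        # sum over set elements -> permutation-invariant scalar energy
        z = torch.cat([x.expand(y.shape[0], -1), y], dim=-1)
        return self.net(z).sum()

def nll_step(model, optimizer, x, y_data, y_model):
    """Lower energy of the observed set, raise it on the synthesized one."""
    loss = model(x, y_data) - model(x, y_model.detach())
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return float(loss.detach())

# usage with random placeholders (y_model would come from the noisy
# gradient-guided sampler sketched earlier)
model = SetEnergy(x_dim=2, y_dim=2)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x, y_data, y_model = torch.randn(1, 2), torch.randn(5, 2), torch.randn(5, 2)
print(nll_step(model, opt, x, y_data, y_model))
```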
|
[
6760,
327,
247,
5235,
273,
873,
10554,
3237,
342,
12085,
3045,
2429,
281,
1643,
873,
10554,
1666,
25379,
50275,
783,
2934,
273,
21565,
873,
970,
2341,
3169,
3210,
3133,
281,
320,
4722,
352,
310,
671,
1774,
281,
3283,
533,
625,
685,
581,
21541,
873,
18964,
10006,
4636,
9991,
275,
253,
6944,
873,
3268,
273,
253,
941,
50274,
35529,
619,
2201,
7350,
670,
253,
19529,
403,
50276,
18,
1643,
440,
44167,
4215,
3916,
50275,
66,
512,
689,
253,
2505,
352,
310,
7558,
326,
253,
4081,
1566,
476,
9232,
285,
3037,
23390,
26306,
873,
16689,
285,
891,
717,
417,
2119,
849,
436,
310,
1896,
342,
253,
1655,
21010,
10491,
5700,
534,
897,
247,
20452,
273,
11786,
2074,
281,
11142,
278,
3591,
68,
342,
256,
35333,
342,
642,
10527,
12215,
476,
9232,
436,
2570,
6944,
3268,
50276,
67,
352,
310,
671,
31215,
281,
1750,
326,
23390,
26306,
3268,
689,
5239,
2550,
320,
6311,
3587,
432,
247,
873,
2957,
247,
6197,
275,
10199,
253,
5426,
273,
873,
2957,
310,
12744,
275,
436,
3634,
533,
253,
11655,
476,
320,
6012,
407,
26278,
247,
873,
3268,
6830,
23390,
26306,
2236,
11656,
1037,
835,
253,
3602,
273,
436,
3268,
476,
320,
6311,
970,
3676,
11454,
2990,
50270,
19,
11117,
533,
21010,
4679,
285,
12744,
27163,
247,
253,
4679,
403,
11117,
2217,
533,
690,
273,
253,
873,
8777,
403,
1077,
21010,
24088,
5978,
273,
35182,
790,
50276,
11174,
953,
4679,
534,
19401,
247,
3962,
3280,
1269,
285,
767,
3904,
760,
841,
8077,
6787,
1537,
417,
7933,
4887,
253,
34385,
273,
253,
4081,
2746,
2429,
281,
253,
1666,
25379,
672,
5762,
275,
247,
1524,
10186,
1895,
50275,
2148,
332,
323,
253,
4836,
273,
1789,
5481,
671,
3133,
281,
320,
1077,
2969,
10895,
533,
1014,
342,
436,
253,
4081,
2746,
1057,
417,
1646,
281,
320,
8936,
2429,
281,
277,
1033,
79,
762,
2173,
891,
276,
26682,
2139,
310,
436,
253,
1083,
50275,
67,
253,
5125,
2746,
310,
7558,
281,
6635,
2709,
21541,
5239,
403,
253,
3530,
6012,
432,
11117,
10006,
390,
352,
310,
271,
6349,
10491,
891,
2550,
1089,
562,
841,
432,
253,
3530,
275,
253,
8442,
2530,
275,
253,
30762,
50276,
5430,
403,
841,
3530,
17375,
285,
534,
3410,
310,
908,
323,
253,
821,
12542,
50274,
7152,
33032,
6010,
436,
2929,
23970,
247,
767,
3213,
1232,
281,
4715,
3210,
323,
873,
10554,
8892,
824,
347,
12488,
8593,
273,
5113,
275,
3888,
11365,
35182,
790,
390,
1127,
16173,
50276,
783,
806,
3408,
33772,
271,
2341,
1566,
273,
253,
5912,
273,
247,
1677,
873,
340,
1677,
3386,
1269,
23115,
970,
3676,
6928,
436,
4483,
281,
897,
253,
4016,
2412,
7513,
10202,
347,
247,
2957,
2581,
685,
12714,
1754,
11655,
534,
4483,
281,
1566,
2709,
21541,
5239,
1677,
247,
2173,
4327,
273,
3386,
281,
3037,
436,
1566,
253,
4477,
16851,
253,
295,
620,
407,
10491,
432,
941,
285,
672,
3309,
11365,
13506,
5239,
340,
432,
253,
1655,
1566,
19502,
50276,
249,
253,
873,
10554,
3408,
253,
6311,
2341,
1566,
310,
18325,
970,
27620,
256,
35333,
1309,
253,
806,
1643,
25142,
275,
1340,
281,
7826,
3410,
432,
2709,
1027,
5556,
66,
50275,
15177,
436,
2929,
310,
973,
17194,
285,
253,
4679,
4518,
921,
253,
5373,
273,
17816,
12714,
3169,
2957,
3470,
2299,
690,
7234,
285,
2216,
10165,
651,
275,
619,
4743,
452,
37081,
432,
12861,
10527,
390,
16774,
816,
6787,
50275,
2577,
2022,
1953,
310,
5001,
253,
897,
273,
253,
6046,
1182,
275,
2593,
3307,
253,
6046,
310,
17194,
407,
253,
8900,
273,
2709,
5556,
66,
513,
253,
4477,
452,
667,
1543,
4645,
326,
436,
310,
6296,
253,
15450,
1055,
273,
253,
6046,
24088,
4645,
247,
3687,
9991,
275,
4561,
35182,
790,
27620,
256,
35333,
310,
671,
1929,
281,
1361,
342,
1327,
44181,
13757,
3237,
10159,
273,
12488,
2709,
46836,
50275,
498,
15752,
4583,
436,
2929,
369,
2590,
533,
2067,
8254,
6787,
812,
320,
2879,
281,
3157,
1239,
1430,
824,
347,
5277,
253,
15965,
830,
273,
253,
10416,
6656,
285,
45909,
1592,
11655,
285,
17816,
253,
897,
273,
1182,
323,
1097,
253,
10883,
1159,
1182,
89,
39116,
285,
253,
3632,
4778,
948,
14168,
1179,
295,
17,
299,
4277,
891,
50276,
74,
717,
671,
14338,
281,
871,
849,
253,
5239,
403,
16202,
2045,
789,
24088,
1112,
3751,
1162,
355,
3261,
387,
8892,
835,
253,
3216,
873,
310,
6486,
285,
253,
5058,
531,
9706,
273,
253,
873,
310,
1160,
5415,
323,
253,
4096,
273,
253,
13757,
840,
16589,
896,
715,
13358,
2317,
2299,
1677,
253,
6777,
8892,
275,
436,
2929,
352,
3133,
751,
253,
2349,
351,
723,
403,
3365,
5415,
11627,
342,
253,
46950,
1146,
20793,
310,
436,
3451,
50276,
71,
3341,
310,
627,
247,
19734,
861,
5816,
432,
253,
28529,
387,
16186,
577,
50274,
19164,
414,
619,
4685,
310,
326,
436,
2929,
29328,
767,
2234,
9021,
50275,
28269,
253,
17697,
2341,
4038,
407,
10491,
285,
11365,
13506,
6667,
50275,
5302,
27620,
256,
35333,
5018,
323,
253,
806,
256,
25142,
1309,
10554,
50276,
338,
619,
4685,
310,
3451,
891,
651,
452,
751,
281,
923,
841,
767,
9021,
14859,
275,
625,
2508,
253,
4715,
5199,
3133,
2074,
281,
6046,
4499,
422,
13418,
533,
342,
247,
625,
16105,
5426,
273,
27620,
3530,
310,
436,
1127,
273,
1859,
3451,
604,
594,
858,
368,
8338,
643,
6046,
10670,
323,
253,
13506,
3530,
50275,
284,
5393,
1840,
891,
651,
671,
320,
14338,
281,
871,
604,
253,
27620,
256,
35333,
1309,
13418,
5644,
281,
5520,
11117,
4438,
10491,
347,
973,
50275,
9188,
40348,
14053,
10670,
689,
5239,
342,
11454,
6928,
310,
347,
253,
4477,
1127,
562,
247,
2834,
1895,
1955,
275,
629,
281,
50276,
468,
10082,
318,
31429,
253,
897,
273,
253,
295,
620,
762,
247,
3268,
689,
5239,
347,
247,
2957,
1159,
310,
271,
20654,
1039,
273,
17816,
436,
1895,
285,
2087,
4219,
7274,
326,
2168,
5467,
247,
625,
2173,
2341,
830,
253,
8453,
273,
436,
789,
8696,
275,
849,
436,
295,
620,
310,
18325,
285,
849,
1175,
3530,
403,
840,
8392,
432,
253,
6311,
3268,
2299,
891,
2868,
326,
247,
625,
11080,
5839,
273,
849,
253,
4081,
1332,
26662,
281,
643,
3082,
281,
3037,
253,
2341,
4038,
390,
22318,
253,
10491,
1232,
651,
2572,
436,
9380,
3486,
7152,
33032,
2520,
2929,
24543,
873,
10554,
347,
247,
17697,
4038,
13418,
1895,
285,
9674,
24357,
271,
2341,
3169,
1566,
3733,
249,
1793,
5199,
253,
16038,
323,
436,
2746,
310,
767,
8089,
5368,
7274,
534,
16209,
2605,
3066,
2957,
3470,
476,
10808,
1566,
8492,
1754,
327,
253,
7982,
6777,
285,
5368,
7274,
671,
10808,
8492,
275,
253,
3282,
326,
597,
2550,
3037,
23390,
26306,
10670,
273,
18012,
672,
323,
1650,
359,
778,
971,
281,
10018,
285,
5958,
2710,
7431,
5239,
323,
247,
1677,
3280,
841,
42852,
4518,
3665,
253,
2440,
273,
253,
3082,
275,
436,
2929,
285,
253,
4679,
513,
247,
1175,
2628,
43800,
841,
42852,
347,
253,
2770,
323,
5301,
281,
2045,
7274,
6667,
275,
2709,
10625,
403,
2011,
281,
1329,
253,
9079,
326,
436,
9380,
2746,
17923,
1805,
685,
5368,
7274,
4583,
253,
4028,
310,
2590,
285,
253,
2929,
14221,
973,
891,
1158,
4583,
253,
2929,
24772,
5609,
432,
2710,
4910,
10775,
2341,
3169,
3082,
285,
873,
10554,
326,
29512,
352,
4217,
323,
253,
3114,
50275,
504,
595,
891,
651,
452,
10490,
281,
452,
2326,
690,
10527,
1543,
1754,
327,
247,
21010,
9978,
7809,
625,
8132,
29689,
253,
3745,
273,
436,
2746,
275,
872,
1598,
42174,
281,
2853,
44321,
17082,
285,
4715,
23390,
26306,
10670,
247,
5235,
273,
1543,
323,
298,
912,
8498,
278,
3591,
68,
2226,
326,
476,
320,
28734,
281,
2085,
436,
1783,
275,
697,
1655,
830,
891,
1158,
253,
2929,
310,
12207,
323,
19529,
285,
310,
4623,
281,
253,
3114,
533,
690,
625,
10527,
1783,
588,
7964,
823,
1318,
281,
247,
5110,
273,
10668,
751,
4266,
50276,
8826,
3533,
44333,
323,
253,
4477,
891,
1158,
627,
778,
320,
247,
4016,
861,
5816,
432,
5150,
577,
671,
352,
778,
320,
4217,
281,
2007,
5513,
2139,
42455,
2341,
323,
1524,
285,
17791,
6667,
2789,
3282,
891,
871,
436,
310,
247,
1846,
9978,
323,
18872,
10554,
3237,
533,
891,
1158,
352,
310,
4409,
10054,
690,
5541,
2579,
275,
253,
2505,
33810,
891,
1119,
352,
8489,
8909,
326,
5150,
608,
369,
29318,
347,
271,
8164,
2492,
273,
298,
912,
8498,
278,
3591,
68,
4931,
352,
778,
320,
625,
4217,
281,
3665,
352,
347,
271,
4227,
273,
48237,
392,
533,
417,
5150,
721,
387,
1878,
721,
66,
436,
671,
23970,
247,
3626,
956,
598,
273,
1880,
2509,
10546,
7839,
757,
1114,
442,
1113,
4213,
1754,
3082,
2789,
667,
3064,
275,
12480,
6733,
327,
253,
5661,
873,
8777,
4720,
2829,
337,
369,
8489,
21643,
253,
11743,
11424,
326,
45909,
1592,
310,
3076,
327,
35182,
790,
285,
10416,
6656,
310,
3076,
323,
24321,
533,
253,
3904,
13414,
1646,
281,
956,
326,
5740,
5046,
253,
10175,
390,
9930,
1694,
17609,
7152,
33032,
43355,
12661,
247,
747,
1332,
323,
830,
8287,
873,
10554,
8892,
597,
12661,
281,
897,
247,
27620,
2341,
3169,
1566,
342,
298,
912,
8498,
278,
3591,
68,
50276,
2369,
17976,
20500,
347,
616,
1566,
253,
476,
16851,
253,
11786,
273,
253,
12177,
1159,
407,
12672,
253,
546,
1771,
273,
3216,
5083,
8557,
285,
2341,
273,
17791,
8557,
835,
253,
2303,
310,
19958,
432,
253,
1566,
3268,
50276,
66,
2201,
5750,
273,
436,
15895,
342,
4499,
422,
36827,
3740,
2957,
11771,
17791,
4632,
305,
85,
285,
11138,
17791,
689,
253,
2045,
2905,
789,
534,
18325,
247,
4181,
7982,
7470,
323,
5239,
310,
326,
352,
2987,
327,
247,
14200,
2491,
273,
8892,
4679,
921,
326,
954,
1846,
17082,
10416,
6656,
285,
45909,
1592,
4181,
1097,
1891,
275,
2176,
8892,
835,
253,
2341,
1754,
4499,
422,
2746,
1404,
5351,
8505,
597,
760,
878,
337,
13506,
873,
281,
16851,
253,
11786,
273,
10883,
1159,
534,
310,
10084,
1677,
436,
581,
21973,
616,
13782,
2105,
285,
3733,
673,
281,
320,
327,
1061,
342,
7982,
1754,
3082,
19235,
824,
1783,
310,
4390,
5816,
50276,
531,
273,
616,
2022,
9021,
310,
43243,
323,
6240,
305,
12064,
6046,
281,
253,
806,
5096,
273,
253,
5018,
597,
1921,
352,
310,
271,
3576,
1039,
273,
10985,
4471,
30771,
15216,
597,
5115,
13943,
1543,
327,
30207,
5481,
8892,
534,
597,
11104,
7194,
281,
616,
4471,
30771,
15277,
352,
651,
320,
4722,
281,
452,
271,
28913,
327,
253,
30207,
5481,
4836,
342,
642,
1265,
598,
6046,
275,
253,
278,
3591,
68,
5933,
50276,
23955,
1953,
326,
15877,
432,
616,
19191,
5978,
310,
5001,
278,
79,
382,
6753,
27676,
285,
1391,
24987,
4836,
352,
310,
417,
2590,
604,
597,
11897,
253,
1543,
1754,
327,
849,
1142,
3530,
591,
1650,
50276,
856,
84,
253,
2929,
310,
4942,
973,
3542,
50276,
9328,
3835,
2067,
1027,
8892,
275,
616,
5661,
1783,
253,
16108,
273,
39793,
323,
247,
3268,
689,
253,
5239,
310,
5544,
973,
310,
27350,
285,
4679,
2722,
326,
352,
2987,
597,
9569,
2067,
4460,
5697,
326,
476,
320,
1534,
275,
436,
1386,
273,
2561,
1996,
327,
50275,
5040,
50276,
77,
471,
273,
13782,
2105,
3733,
673,
17032,
673,
1783,
352,
310,
417,
2590,
604,
436,
1332,
11498,
281,
4067,
8892,
342,
1142,
625,
3603,
275,
253,
873,
323,
1650,
597,
476,
18915,
439,
522,
257,
292,
1127,
9005,
14433,
347,
247,
4067,
4311,
10895,
616,
15302,
403,
20953,
4311,
275,
2087,
533,
1677,
326,
616,
1332,
12724,
41731,
13015,
2045,
789,
310,
417,
247,
2201,
2159,
4202,
50275,
8311,
262,
253,
689,
23333,
273,
4778,
1182,
310,
21643,
275,
3239,
374,
1182,
310,
1097,
253,
41463,
1159,
285,
247,
3632,
6046,
2879,
281,
253,
5502,
1159,
323,
36971,
253,
27935,
50275,
5996,
2488,
2380,
5717,
368,
323,
15974,
619,
4468,
670,
10454,
285,
6240,
3036,
721,
352,
3133,
326,
352,
310,
1335,
1805,
685,
1666,
25379,
275,
2426,
273,
10454,
187,
187,
4118,
18435,
27,
783,
2929,
29328,
281,
3283,
5239,
970,
17697,
4038,
8197,
253,
17697,
16689,
273,
253,
1234,
1910,
873,
1677,
253,
2540,
3386,
310,
23115,
949,
271,
2341,
1754,
1159,
253,
2341,
1159,
476,
320,
7616,
970,
27846,
11454,
37507,
751,
3676,
5239,
285,
310,
10166,
39445,
16851,
4016,
2412,
12177,
84,
970,
10491,
50275,
783,
2929,
369,
5322,
281,
1239,
285,
369,
10490,
407,
512,
253,
30628,
253,
581,
2181,
326,
6225,
562,
281,
479,
369,
253,
15075,
327,
23390,
351,
1319,
4471,
4620,
8319,
2069,
50276,
2520,
812,
320,
7020,
264,
1066,
984,
1652,
310,
753,
670,
253,
3290,
4103,
281,
253,
2032,
7239,
50276,
89,
285,
253,
2770,
310,
7194,
327,
253,
3480,
273,
436,
275,
5368,
789
] |
[
  … attention_mask column elided: a run of 1s, one per input token (every token attended, no padding) …
] |
[
  … labels column elided: token IDs that duplicate the input_ids array for this example …
] |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the paper proposes a way of exploring compressible nns across different architectures sparsity levels and quantization levels during training the proposed framework is called unified dnas for compressible udc nns and it can yield compressed nns that show better accuracymodel size tradeoffs than prior work the paper tackles an important problem and the proposed solution seems reasonable the authors claim that udc outperforms prior work in terms of accuracymodel size and the experiments seem to support this a brief description of the prior methods that the udc framework is compared to such as haq gong et al 27 mcdonnel 51 fbnet mbnetv2 choi et al 18 would be really helpful for readers in my opinion the paper is a bit hard to follow specifically the introduction could be a lot clearer with a more structured presentation and perhaps with a few subsections the authors discussed the limitations of their work docsepthis paper presents a differentiable nas approach targeting compressible nn for npus with low memory footprint the approach conducts a joint search over nn architecture weight bitwidths and layerwise weight sparsity levels to find models with the smallest model size strengths a theoretical lowerbound of the storage size that is cheap to be computed clear threestage training process to deal with quantization and sparsity improved paretofront of model size vs accuracy over previous work the search algorithm has better sample efficiency vs random method weaknesses make it clear that the target device is mobilebased npu with limited model storage models targeting small npu and mcu have stringent latency and memory utilization constraints which are not discussed in this paper limited comparison with other nas algorithms there are other nas papers for mcu please also cite and compare to them for example httpsarxivorgabs201014246 the limitations are adequately discussed docsepthe authors present an updated differential neural architecture search algorithm udc which incorporates model compression features such as weight sparsity and quantization as part of the search space for the ideal high accuracy small model size neural network nn model in addition keeping in mind the constraints typically faced by tinyml models the authors also enforce a strict limit on the model size to ensure it can be implemented in resource constrained iot applications further the presented algorithm enables effective exploration of accuracy vs model size tradeoff lastly the paper demonstrates the pareto dominance of nns designed by udc compared to prior art for a range of ml benchmarks and varying hardware constraints on ethos u55 npu the paper presents a thorough comparison of udc against past techniques that attempt similar cooptimization of ml models while accounting for hardware constraints udc tackles a wider range of constraints compared to all listed in prior art sections 3 and 4 describe the details of the design model and the proposed dnas algorithm these sections were somewhat hard to follow however it is understandable considering the mathematical nature of the problem it is not obvious what these variables denote kindly add a description section 6 shows a comparative study of generating nns using udc and other prior art for stateoftheart sota ml benchmarks under different memory size constraints figure 4 illustrates the superiority of udc in terms of accuracy and model size could the authors also shed light on the runtime it takes to converge to the final nn design udc picks compared to other approaches eg 
mcunet considering udc accounts for a larger set of constraints compared to other approaches it would be interesting if it can demonstrate comparable runtimes too figure 5b compares udc results to random search and the authors do share some insight on udc runtime compared to training a baseline network 2x if they could share this runtime information it would be handy also note that authors should be careful that while comparing to random search the resource utilization for udc should be similar eg running an equal number of parallel gpu instances for a fair comparison it would be interesting to know how long it takes to converge the random search to the optimal design picked up by udc within reasonable time constraints of course additionally could authors share how they selected the memory size constraint values for the experiments shared in the paper is there some understanding if the superiority of udc holds for largersmaller mem constraints and if yes how a user could understand if they fall in that range to conclude this work promises a very useful optimization tool for designing small but high accuracy nns that can fit on typical iot devices servicing tinyml applications authors do not present any data showing the timing performance of udc compared to other approaches some perspective on that would be helpful also any explanation on how the memory size constraints were selected for presented experiments would be helpful docsepthis paper is on model compression quantization and pruning with hardware constraints the proposed search method is an extension of differentiable nas to learn weight sparsity ratios and bitwidths per layer conceptually this work combines previous ideas such as dnas of layer width depth or operator and learning of layerwise bitwidth and sparsity and yields paretooptimal results on model size vs accuracy the methods used in the paper are not new but this work is a sort of combination of wellknown techniques thus making it work ie dnas for model architecture with pruning and quantization is a new contribution the related work part discusses the differences from prior work and the related work is adequately cited to my knowledge the submission is technically sound with mostly empirical results but it does not evaluate the weaknesses for example the searching costs of the proposed algorithm the writing is okay the design decisions and the proposed algorithm are described with details since the major contribution to my understanding is the practical issues of making dnas with compression techniques working i think more details are necessary to help readers and practitioners to use the findings in this work for example experiment setups and code examples are helpful especially if no source code is available the results are useful but not significant in some scenarios with strict limitations on model size the proposed method could be used the reported benefits model size vs accuracy are not unique the authors have addressed the potential societal impact the limitations are not addressed enough
### Summary:
|
in this paper the authors present a new way to obtain compressible neural networks to fit on resourceconstrained npubased hardware initial reviews were mixed but the authors successfully managed to respond to reviewers concerns during the rebuttal period several reviewers pointed out clarity issues but 1 some of these issues came from reviewers not reading the paper carefully enough and 2 others were properly addressed by the authors i also want to acknowledge that the most negative review is a short one falling below neurips quality standards after discussion all reviewers are leaning towards acceptance agreeing that the paper successfully demonstrates the superiority of the proposed method vs existing relevant baselines as a result i also recommend acceptance
|
[
  … input_ids column elided: token IDs encoding the Input and Output text of the example above …
] |
[
  … attention_mask column elided: a run of 1s, one per input token (every token attended, no padding) …
] |
[
  … labels column elided: token IDs that duplicate the input_ids array for this example …
] |
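For readers unfamiliar with the three array columns elided above, the sketch below shows how such rows are typically produced. This is a hedged illustration only: the tokenizer name, the max length, and the helper build_row are assumptions made for illustration and are not stated anywhere in this dataset.

```python
# Minimal sketch of how the input_ids / attention_mask / labels columns could be built.
# Assumption: a HuggingFace-style tokenizer; "gpt2" is a hypothetical choice, not the
# tokenizer actually used for this dataset.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")

def build_row(input_text: str, output_text: str, max_length: int = 2048) -> dict:
    """Turn one Input/Output pair into the three array columns shown in this dump."""
    full_text = input_text + "\n" + output_text
    enc = tokenizer(full_text, truncation=True, max_length=max_length)
    return {
        "input_ids": enc["input_ids"],            # token IDs encoding the concatenated text
        "attention_mask": enc["attention_mask"],  # 1 for every real token (hence the long runs of 1s)
        "labels": list(enc["input_ids"]),         # causal-LM targets: a copy of input_ids
    }

# Example usage with placeholder strings (not real rows from this dataset):
row = build_row("Below is given a review ... ### Summary:", "the paper proposes ...")
print(len(row["input_ids"]), row["attention_mask"][:5], row["labels"][:5])
```

In this layout the labels column simply mirrors input_ids, which matches the rows above where the two arrays contain identical token IDs.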
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper proposes a method called harmonicgan for unpaired imagetoimage translation the key idea is to introduce a regularization term on the basis of cyclegan which encourages similar image patches to acquire similar transformations two feature domains are explored for evaluating the patchlevel similarity including soft rgb histogram and semantic features based on vggnet in fact the key idea is very similar to that of distancegan the proposed method can be regarded as a combination of the advantages of distancegan and cyclegan thus the technical novelty is very limited in my opinion some experimental results are provided to demonstrate the superiority of the proposed method over cyclegan distancegan and unit given the limited novelty and the inadequate number of experiments i am leaning to reject this submission major questions 1 lots of method details are missing in section 332 what layers are chosen for computing the semantic features what exactly is the metric for computing the distance between semantic features 2 the qualitative results on the task horse2zebra and zebra2horse are not impressive obvious artifacts can be observed in the results although the paper claims that the proposed method does not change the background and performs more complete transformations the background is changed in the result for the horse2zebra case in fig 5 more qualitative results are needed to demonstrate the effectiveness of the proposed method 3 to demonstrate the effectiveness of a general unpaired imagetoimage translation method the proposed method is needed to be testified on more tasks 4 implementation details are missing i am not able to judge whether the comparisons are fair enough new comment i have read the authors explanations and clarifications that make me increase my rating regarding the technical novelty i still dont think this paper bears sufficient stuff if there is extra quota i would recommend accept docsepthis paper adds a spatial regularization loss to the wellknown cyclegan loss for unpaired imagetoimage translation zhu et al iccv17 essentially the regularization loss eq 6 is similar to imposing a crf conditional random field term on the network outputs encouraging spatial consistency between patches within each generated image the paper is clear and well written unpaired imagetoimage translation is an important problem the way the smoothness loss eq 6 is presented gives readers the impression that spatial pairwise regularization is new ignoring its long history eg crfs in computer vision not a single classical paper on crfs is cited putting aside classical spatial regularization works imposing pairwise regularization on the outputs of modern deep networks has been investigated in a very large number of works recently particularly in the context of weaklysupervised semantic cnn segmentation eg tang et al on regularized losses for weaklysupervised cnn segmentation eccv 18 lin et al scribblesup scribblesupervised convolutional networks for semantic segmentation cvpr 2016 among many other works very similar in spirit to this iclr submission these works impose withinimage pairwise regularization eg crf on the latent outputs of deep networks with the main difference that these works use cnn semantic segmentation classifiers whereas here we have a cyclegan for image generation also in the context of supervised cnn segmentation crfs have made a significant impact when used as postprocessing step eg very well known works such as deeplab by chen et al iclr15 and crfs as recurrent neural 
networks by zheng et al iccv 2015 it might be a valid contribution to evaluate spatial regularization eg crfs losses on image generation tasks such as cyclegan but the paper really needs to acknowledge very related prior works on regularization at least in the context of deep networks there are also related pioneering semisupervised deep learning works based on graph laplacian regularization eg westen et al deep learning via semisupervised embedding icml 2008 which the paper does not acknowledgediscuss the manifold regularization terminology is misleading the regularization is not over the feature space of image samples it is within the spatial domain of each generated image patch or pixel level so in my opinion crf or spatial regularization instead of manifold regularization is a much more appropriate terminology also i would not call this approach harmonicgan i would call it crfgan or spatiallyregularized gan the computation of harmonic functions is just one way among many other potentially better ways to optimize pairwise smoothness terms including the case of the used smoothness loss and by the way i did not get how the loss in 9 gives a harmonic function could you please clarify and give more details in my understanding the harmonic solution in zhu and ghahramani icml 2013 comes directly as a solution of the graph laplacian and it assumes some labeled points ie a semisupervised setting even if the solution is correct which i do not see how i do not think it is an efficient way to handle pairwiseregularization problems in image processing particularly when matrix w wij is dense which might be the case here unless you are truncating the gaussian kernel with some heuristics in this case backpropagating the proposed loss would be of quadratic complexity wrt the number of image patches again there is a long tradition in optimizing efficiently pairwise regularizers in visionlearning even in the case of dense affinity matrices and one very wellknown work which is currently being used a lot in the context imposing crf structure on the outputs of deep networks is krahenbuhl and koltun efficient inference in fully connected crfs with gaussian edge potentials nips 2011 this highly related and widely used inference work for dense pairwise regulation is not citeddiscussed neither the gaussian filtering ideas of the work of krahenbuhl and koltun which ease optimizing dense pairwise terms from quadratic to linear are applicable here as a gaussian kernel is used and are widely used in computer vision including closely related works imposing spatial regularization losses on the outputs of deep networks eg tang et al on regularized losses for weaklysupervised cnn segmentation eccv 18 among many others when using feature from pretraining vgg in the crf loss the comparison with unsupervised cyclegan is not fair in table 2 label translation on cityscapes cyclegan outperforms the proposed method in all metrics when only unsupervised histogram features are used which makes me doubt about the practical value of the proposed regularization in the context of imagetranslation tasks having said that the histogrambased regularization is helping in the medicalimaging application table 1 by the way the use of histograms of patches or superpixels as unsupervised features in pairwise regularization is not new neither see for instance lin et al scribblesup scribblesupervised convolutional networks for semantic segmentation cvpr 2016 also it might be better to use superpixels instead of patches so in summary the 
technical contribution is minor in my opinion imposing pairwise regularization on the outputs of deep networks has been done in many works but not for cyclegan optimization of the proposed loss as a harmonic function is not clear to me using vgg in the comparisons with cyclegan is not fair and the long history of closelyrelated spatial regularization terms eg crfs in computer vision is completely ignored minor please use term instead of constraint these are unconstrained optimization problems and there are no equality or inequality constraints here docsepsummary the paper proposes a new smoothness constraint in the original cyclegan formulation the cyclegan formulation minimizes reconstruction error on the input and there is no criterion other than the adversarial loss function to ensure that it produce a good output this is in sync with the observations from gokaslan et al eccv18 and bansal et al eccv18 a smoothness constraint is defined across random patches in input image and corresponding patches in transformed image this enables the translation network to preserve edge discontinuities and variation in the output and leads to better outputs for medical imaging image to labels task and horse to zebra and vice versa pros 1 additional smoothness constraints help in improving the performance over multiple tasks this constraint is intuitive 2 impressive human studies for medical imaging 3 improvement in the qualitative results for the shown examples in paper and appendix things not clear from the submission 1 the paper is lacking in technical details a what is the patchsize used for rgbhistogram b what features or convlayers are used to get the features from vgg 19 net c other than medical imaging where there isnt a variation in colors of the two domains it is not clear why rgbhistogram would work d the current formulation can be thought as a variant of perceptual loss from johnson et al eccv16 applied for the patches or including pair of patches in my opinion implementing via perceptual loss formulation would have made the formulation cleaner and simpler the authors might want to clarify as how it is different from adding perceptual loss over the pair of patches along with the adversarial loss one would hope that a perceptual loss would help improve the performance also see chen and koltun iccv17 2 the proposed approach is highly constrained to the settings where structure in inputoutput does not change i am not sure how would this approach work if the settings from gokaslan et al eccv18 were considered like cats to dogs where the structure changes while going from input to output 3 does the proposed approach also provide temporal smoothness in the output eg figure6 shows an example of man on horse being zebrafied my guess is that input is a small video sequence and i am wondering if it provides temporal smoothness in the output the failure on human body makes me wonder that smoothness constraints are helping learn the edge discontinuities what if the edges of the input using an edge detection algorithm such as hed from xie and tu iccv15 were concatenated to the input and used in formulation this would be similar in spirit to the formulation of deep cascaded binetworks from zhu et al eccv16
### Summary:
|
the paper introduces a method for unsupervised imagetoimage mapping using a new term in the objective function that enforces consistency in similarity between image patches across domains reviewers left constructive and detailed comments which the authors have made substantial efforts to address reviewers have ranked the paper as borderline and in the area chairs opinion most major issues have been addressed r3r2 novelty compared to distancegancrf limited authors have clarified contributions in reference to distancegancrf and demonstrated improved performance relative to several datasets r3r1 evaluation on additional datasets required authors added evaluation on 4 more tasks r3r1 details missing authors added details
|
[ input_ids token list omitted ] |
[ attention_mask (all ones) omitted ] |
[ labels token list omitted ] |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper presents a new method named lart that assigns weights to adversarial examples during adversarial training for better robustness compared to the previous baseline work ie iart the proposed method has overcame several disadvantages quantitative results demonstrate the effectiveness of the proposed method strengths 1 weighting different adversarial examples for training is intuitive and the proposed lrat addresses some existing problems in irat 2 the proposed method is effective as demonstrated by the experimental results weaknesses 1 the experiments are not convincing enough to demonstrate whether the proposed method is really better than sat in table 1 can sat reach a similar level of robustness at the same or at least close performance drop namely what if we slightly increase the pgd step in sat in tab 1 and the performance on natural example is expected to degrade while the robustness is expected to be increased when both sat and the proposed method degrades into the same or close natural example performance eg 8317 or 8102 which method provides better robustness the proposed method is an intuitive improvement on the existing adversarial example weighting method however the experimental results are not convincing enough to reveal the true improvements docsepthe paper studies the weight strategy of adversarial training the technical contribution of this paper is too limited and the experiments show very little improvements the paper studies the weight strategy of adversarial training the instance weighting is very old the readers can not see anything new insight from this paper the technique of this paper is very simple without theory analysis and the experiments seem that the proposed methods only obtain the marginal gains the paper is not machine learning paper i would like to recommend the authors to submit the paper to other application tracks the details comments are 1 why instance reweighting is necesary for adversarial training can you provide some theory insight for this from the experiments we can see that the instance reweighting obains some marginal improvements over sat but what is the time cost why not report the time in experiments if you cost much more time than sat but only obtains very little improve on the accuracy then it is not a smart method 2 why the safeness should be attackdependent can you provide some theory guarantee for this what kind of attack make the safeness change is there any possibility that different attackes do not change the safeness how different attack change the safeness what kind of attack change the safeness towars more safe or unsafe the main claim of this paper is not rigious at all 3 can you prove the proposed algorithm converge and what is the time cost 4why only consider linifity attack why not try l1 and l2 attack why only resnet resnet18 why not try resnet34 and resnet50 and some other networks why only use cifar10 why not try imagenet and some other datasets 1 why instance reweighting is necesary for adversarial training can you provide some theory insight for this from the experiments we can see that the instance reweighting obains some marginal improvements over sat but what is the time cost why not report the time in experiments if you cost much more time than sat but only obtains very little improve on the accuracy then it is not a smart method 2 why the safeness should be attackdependent can you provide some theory guarantee for this what kind of attack make the safeness change is there any possibility that different 
attackes do not change the safeness how different attack change the safeness what kind of attack change the safeness towars more safe or unsafe the main claim of this paper is not rigious at all 3 can you prove the proposed algorithm converge and what is the time cost 4why only consider linifity attack why not try l1 and l2 attack why only resnet resnet18 why not try resnet34 and resnet50 and some other networks why only use cifar10 why not try imagenet and some other datasets docsepthe paper studies the reweighting of adversarial training it first points out the problem of gairat gairat only improves the robust accuracy under pgd and lowers the accuracy of other attacks like autoattack based on the findings the authors propose a new method lrat which performs adversarial training using different attacks and gives different weights to the instances generated by different attacks pros 1 the analysis of gairat is thorough and insightful the paper perform various experiments showing that gairat overfits to pgd and increases the robust accuracy of pgd however it is more vulnerable under other attacks like cw and aa 2 the motivation of using different attacks during training phase is reasonable as the network overfits the attack in the training phase diverse attacks make the network robust to wider range of attacks and reduce the overfitting cons 1 the evaluation of lrat is incomplete the paper only evaluates lrat on cifar10 with the resnet18 the paper should perform experiments on more datasets such as cifar10 or svhn to demonstrate the effecitveness of lrat besides wideresnet3410 is the most commonly used network architecture for adversarial training lrat should also be tested on this network 2 the motivation of locally reweighting is unclear it is also shown in the experiment that the performance of lsat is close to lrat 3 the paper is hard to follow as the authors come up with some notation with definition for example i cannot understand tildexi in eqn 7 and alg 1 minors 1 in section 43 the paper shows that when using gairat with cw attack the accuracy of cw attack is still lower than pgd is it contradictory with analysis in section 1 and section 3 the analysis of gairat is thorough and insightful and the motivation of using different attacks is clear however the evaluation of lrat is incomplete and i cannot see the benefit of locally reweighting therefore i tend to reject the paper at current phase docsepauthors pointed out that instancesreweighted adversarial training irat suffer from vulnerability against the unseen types of attacks authors claimed that a large number of instances are actually overlooked in irat which may cause that phenomenon therefore the authors suggest locally reweighted adversarial training lrat which performs local reweighting between the adversarial variants pair strong points 1 this paper points out the issue of irat which is interesting 2 this paper is well written and easy to follow weak points the paper claim that irat misleads the weights between adversarial variants however there is a limited explanation of why irat mislead the weights and how local reweighting can prevent those there are limited experimental results to verify the efficacy of lrat model capacity there have not been explored lrat is effective in large parameter models such as wideresnet3410 dataset there is only a single dataset cifar10 result in the paper since the author has shown that there is an inconsistency of strong attacks on different datasets figure 2 im curious how local weights 
can vary across datasets computational efficacy how long do lratsaa and lratcw take to train compare to sat generality does lrat also can be applied to other adversarial training methods such as trades limited improvement i would suggest the authors run the proposed method multiple times and report the variance the comparison between the gap in average performance and the variance would convince the reader if it is a real improvement minor points there is a missing explanation of how to rank the attack in figure 2 overall i recommend reject to this paper i think the contribution is not sufficient for iclr the improvement is limited compared to sat and lsat which is train the model with previous attacks such as cw or apgddlr moreover there are limited comparisons as i elaborated in the weak points however if all my concerns resolve properly i will increase my score
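For context on the family of methods these reviews compare (SAT, IRAT/GAIRAT, and the proposed LRAT), the disagreement is essentially about how the per-instance losses on adversarial examples are weighted. The sketch below shows a generic reweighted adversarial-training step; it is not the paper's algorithm — weight_fn stands in for the geometry-aware rule of GAIRAT or the pairwise, attack-dependent rule of the proposed LRAT, and every hyperparameter shown is an assumed placeholder.

```python
# Generic (re)weighted adversarial training step: SAT is the uniform-weight special case.
import torch
import torch.nn.functional as F

def pgd_attack(model, x, y, eps=8/255, alpha=2/255, steps=10):
    """Standard l-infinity PGD used to craft the adversarial variants."""
    x_adv = (x + torch.empty_like(x).uniform_(-eps, eps)).clamp(0, 1).detach()
    for _ in range(steps):
        x_adv.requires_grad_(True)
        loss = F.cross_entropy(model(x_adv), y)
        grad = torch.autograd.grad(loss, x_adv)[0]
        x_adv = x_adv.detach() + alpha * grad.sign()
        x_adv = torch.min(torch.max(x_adv, x - eps), x + eps).clamp(0, 1).detach()
    return x_adv

def weighted_at_step(model, x, y, weight_fn=None):
    """One reweighted adversarial-training step; weight_fn=None recovers SAT."""
    x_adv = pgd_attack(model, x, y)
    logits = model(x_adv)
    per_example = F.cross_entropy(logits, y, reduction="none")
    if weight_fn is None:
        w = torch.ones_like(per_example)      # SAT: every instance counts equally
    else:
        w = weight_fn(logits.detach(), y)     # e.g. a geometry-aware rule as in GAIRAT
    w = w / w.sum() * len(w)                  # keep the average loss scale unchanged
    return (w * per_example).mean()
```

Setting weight_fn to None recovers standard adversarial training, which is why the reviewers insist on comparing against SAT at matched natural accuracy and matched training cost before crediting the reweighting rule.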
### Summary:
|
reviewers raised various concerns and authors sent in no rebuttal in view of the negative consensus this paper is then a clear rejection case
|
[ input_ids token list omitted ] |
[ attention_mask (all ones) omitted ] |
[ labels token list omitted ] |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper shows that randomly initialized rnns can display a type of chaos called liyorke chaos and the number of linear regions they have remains exponential with depth unlike in feedforward nets strengths the work seems technically sound although i have to admit i did not go through the proofs very carefully weaknesses please see questions and limitations below rnns are not a very commonly used architecture in machine learning any more so i doubt that these results will be of much interest to the neurips audience this paper in addition studies a peculiar type of rnn with a clip operation after the nonlinearity that calls into question even more the relevance of these findings for even standard rnn architectures that used to be used by the machine learning community it wasnt really clear to me what the implications of liyorke chaos or exponential number of linear regions are in practical terms beyond things like instability and exploding gradients etc the results are in general very far removed from the interests and concerns of the overwhelming proportion of machine learning practitioners today i believe a more specialized venue on dynamical systems may be more appropriate for this work finally in fig 4 having a jacobian norm greater than 1 seems like a very weak proxy for what the paper is primarily concerned with ie liyorke chaos docsepthe authors establish nonasymptotic probability bounds that a randomly initialized recurrent neural network is chaotic the analysis is performed as follows 1 the hidden state x_t is scalar and the state space is a compact interval x_t in [a, b] subset of the reals 2 relu is used as the activation function though the analysis only requires continuity rather than differentiability 3 using the specific definition of chaos due to li and yorke for interval maps the authors need to bound the probability that the rnn map f has period-three fixed points ie f^3(x) = x 4 the probability of such an event is then expressed in terms of the number of neurons k and the variance parameter sigma^2 of a gaussian distribution from which the weight matrix is sampled the submission offers a novel rigorous analysis that quantifies how commonly chaos is found in rnns with finitely many neurons unlike the meanfield approaches the limit k to infinity isnt necessary this allows one to establish chaos more rigorously than the use of lyapunov exponents usually allows for overall i believe this to be an extremely interesting study with excellent presentation i would like to offer some comments presented in chronological order that might help improve the manuscript further 1 i found figure 2 somewhat hard to understand at least initially it might be worthwhile displaying the equation for the tent map in the figure the graph added little intuition partly because nodes and edges arent explained it might also help to mention that the fnn graph is simply unrolled perhaps it would also help to indicate where the small gaussian perturbation is supplied what overall effect on the dynamics do stochastic perturbations to the b_i have 2 liyorke chaos is introduced but never contrasted with other definitions eg the one due to devaney this could also give an avenue to make explicit all the consequences of ly chaos for interval maps such as sensitive dependence on initial conditions / exploding gradients section 2.1 most of the cited references analyze their respective problems using ergodic theory the juxtaposition with ly chaos is entirely warranted but the phrasing of the sentence gives the impression that nonreliance on ly is somehow bad
paragraph rnns transitions in higher dimensions invoking ly chaos is warranted for interval maps for higher dimensional systems period-3 fixed points provide neither sufficient nor necessary conditions for chaos similarly requiring the existence of a scrambled set as in definition 2.1 can produce dynamics that do not have the sensitive dependence on initial conditions i believe this should be acknowledged in a similar vein paragraph techniques lines 143-159 gives an impression that the analysis is more general than the meanfield one rather than that it lies at the opposite end of the spectrum i believe it should be stated more clearly in the sections of the submission that the technical arguments apply only to rnns with a scalar hidden state x_t na docsepthis paper studies the behavior of a specific class of rnns at initialization using dynamical systems theory its main result shows that chaos can arise in rnns even in the absence of external input and this behavior holds independently of the width this result explains the scrambling phenomenon observed experimentally it also sheds light on the expressive power of rnns in terms of the number of linear regions formed by the nonlinear activations strengths provide first understanding of rnns via the lens of liyorke chaos overall well written weaknesses analysis is limited to scalar inputs and rnns with relu activation tanh activation is often used in practice there are no results connecting the behavior of input data to the chaotic behavior of inputdriven rnns which are highly relevant in practice in the checklist it was mentioned that limitations are described but i could not find any such discussions in the main paper docsepthe paper explores how randomly initialized rnns exhibit liyorke chaos with a certain probability which leads to exploding gradients and other undesirable phenomena by using the fact that a function containing a period-three point exhibits chaos some probabilistic bounds on the emergence of chaos in rnns are derived finally the theoretical results are evaluated with two toy examples strengths stunning maths interesting little experiments with code attached weaknesses short conclusion no immediately apparent benefit other than a new name for the phenomenon suspected errors figure 1 a legend would be helpful what is the blue line typo in caption adabehind figure 2 unclear how spikes in the plots correspond to linear regions lemma 4.2 some ys should be xs line 156 end something missing here not applicable imo
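To make the period-three criterion discussed in these reviews concrete, here is a minimal sketch (an editor's illustration, not code from the reviewed paper; the exact parameterization of the scalar clipped ReLU recurrent map is an assumption). It first verifies the period-3 orbit 2/9 → 4/9 → 8/9 of the tent map, which by the Li-Yorke "period three implies chaos" theorem already implies chaos, and then runs a crude numerical search for a period-3 point of a randomly initialized scalar map.

```python
# Editor's sketch, not the authors' code: random_relu_map is a hypothetical stand-in
# for a randomly initialized, clipped, scalar ReLU recurrent update.
import numpy as np

def tent(x):
    # tent map on [0, 1]: T(x) = 2x for x <= 1/2 and 2(1 - x) otherwise
    return 2.0 * x if x <= 0.5 else 2.0 * (1.0 - x)

# 2/9 -> 4/9 -> 8/9 -> 2/9 is a period-3 orbit of the tent map, so by the
# Li-Yorke theorem the tent map is chaotic on [0, 1].
for x0 in (2 / 9, 4 / 9, 8 / 9):
    assert abs(tent(tent(tent(x0))) - x0) < 1e-12

def random_relu_map(k=64, sigma=2.0, lo=0.0, hi=1.0, seed=0):
    """Hypothetical scalar state update with k ReLU units, clipped back to [lo, hi]."""
    rng = np.random.default_rng(seed)
    w, b, v = (rng.normal(0.0, sigma / np.sqrt(k), size=k) for _ in range(3))
    def f(x):
        h = np.maximum(w * x + b, 0.0)       # ReLU features of the scalar state x
        return float(np.clip(v @ h, lo, hi)) # clip keeps the state in [lo, hi]
    return f

def find_period_three(f, lo=0.0, hi=1.0, n=4001):
    """Crude grid search for a root of f(f(f(x))) - x that is not a fixed point of f."""
    xs = np.linspace(lo, hi, n)
    g = np.array([f(f(f(x))) - x for x in xs])
    for i in range(n - 1):
        if g[i] * g[i + 1] <= 0.0:
            x = 0.5 * (xs[i] + xs[i + 1])
            if abs(f(x) - x) > 1e-3:         # exclude ordinary fixed points
                return x
    return None

print(find_period_three(random_relu_map(seed=3)))  # None if no period-3 point was found
```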
### Summary:
|
this submission is borderline reviewer extu praises its theoretical contribution and highly recommends acceptance reviewers niom and yhqb are more tepid but still support acceptance in light of the sound theoretical analysis reviewer j5sz acknowledges that the analysis is sound but believes the models to which it pertains rnns with certain unconventional features are of little interest to the community and therefore argues for rejection while i agree with j5sz that the paper may not be of immediate interest to many practitioners i believe the theory it delivers is worthy on its own and may lead to further theoretical developments that will apply to more contemporary neural networks i thus recommend acceptance
|
[… input_ids: token-id array for this row omitted …] |
[… attention_mask: all-ones array omitted …] |
[… labels: token-id array omitted …] |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper presents a representation learning technique for dataefficient rl that makes use of invarianceequivariance of the environment wrt both state and agent actions this work builds on two ideas mdp homomorphism and symmetries in stateactions due to some inherent structure in the environment mdp homomorphism refers to a mapping of states and actions of an mdp to a new space where the mdp reward structure and dynamics are preserved such a homomorphism can be useful if the new mapped stateaction space is easier to work with the second idea is that we may know that an environment exhibits certain regularities ie may have a group structure where we know the effect of certain operations then building these regularities directly into our models can make training much more data efficient in this paper the authors combine these two ideas mdp homomorphism and symmetries in mdps such that states and actions are mapped to new spaces that satisfy the mdp homomorphism criteria along with the invarianceequivariance relations that result from the structure of the environment this is achieved by minimizing a loss function that consists of 3 terms the first term laet encourages the transition dynamics in the learned space to be the same as in the original mdp the second lget encourages the effect of group actions to be equivariant on the learned space and finally the third one lr encourages the rewards in the learned space to be the same as in the original mdp the authors test their technique in a rainbow dqn agent on atari levels under a datalimited setting and show that their technique albeit slightly outperforms previous techniques overall i think this is a quite interesting paper it presents a very principled approach to learning better representations for rl in a dataefficient manner perhaps the improvement in performance is not very dramatic and parts of this were already in earlier publications but i think the motivation the technical development and the final presented technique are all of interest to the community and will be valuable contributions i dont have many concerns about the paper but i would have liked to see some analysis of the learned representations and the transition models are there any interesting patterns emerging what does the new learned stateaction space look like also it was not clear to me why adding the loss term lget led to worse results in median scores it would be nice to discuss this briefly perhaps this is because the assumed group structure is not really suitable for some of the games some other points it would be nice to have colors in figure 2 as it stands it is a little bit hard to parse it would be nice to mention what the group se2 is at some point perhaps in the experiments section laet is always included in training but this is only mentioned afterwards it might be better to mention this earlier overall i think the technique presented in this paper is wellmotivated and addresses an important problem in rl perhaps the improvement in performance is small but i think the technical development and the final technique are valuable contributions
a linear assumption of the group action through the representation a decomposed structure is imposed on the representation of the symmetry group such that it allows representing the latent embedding space as a direct product of subgroups the authors propose 3 losses that encode the aforementioned constraints of the latent state and stateaction embedding as well as the invariance constraint of the reward i found the paper to be very well written i think the idea and method were clearly explained despite the complexity of the overall model combined with all the moving parts well done the authors perspective on equivariance in rl is interesting and novel to the best of my knowledge i was convinced by the motivations outlined and the consequent design choices the proposed method makes sense and is backed by interesting results i think it would benefit the paper to make it clear early on that the choice of the loss depends on a choice of subgroup blocks that depends on prior knowledge of the environment eg 2d translation rotation etc as this didnt come clearly to me until example 1 is presented i appreciate that the appendix includes a discussion on the choice of group for atari games but it would be better to have a more general discussion in the main text i understand that the authors wanted to show performance across many different environments to showcase the robustness of the performance but i would have liked to see more indepth experimental results in one or two tasks including learning curves especially given that dataefficiency is the main desired property minor comment i believe the beta above eq 18 should be geta the paper is well written and introduces a very interesting perspective on equivariant representations in rl the method proposed makes sense to me and the results seem promising
typo above equation 17 duplicate using above equation 18 beta and b are not consistent with figure 2 while i do find the idea of this paper novel and interesting the empirical evidence does not convince me of its superiority over prior methods in particular spr i will not recommend acceptance for its current form docsepthe paper focuses on the problem of incorporating symmetries into rl agents representations to improve data efficiency it proposes a modelbased representation learning method parameterized by a group action that is based on encoding states and stateaction pairs as elements of the groups representation in latent space the proposed method is evaluated against a number of baselines in the datalimited atari setting strengths the paper is wellmotivated and addresses an interesting and important problem the idea of embedding states and actions as matrices representing a group action is intriguing and may induce interesting properties in the representation beyond those studied in this paper the paper provides sufficient background to be understood by readers with some but not extensive experience in the topic weaknesses the justification for the proposed method is lacking section 4 provides a number of desiderata but there is no followup showing empirically or theoretically that the resulting method especially given the use of target networks and projections will actually satisfy these desiderata the use of lie groups is not motivated in section 3 as a reader i presumed that this class was chosen in order to justify the use of learned matrix embeddings in the proposed model but this would be worth clarifying additionally the basis of the lie algebra is not referred to in section 6 despite an indication that it will be used in section 3 the paper suffers from a lack of clarity for example in the discussion of symmetric mdps the group action is applied to both states and stateaction pairs but it is not clear whether these represent two separate actions or whether gcdot langle s a rangle has the same effect on states as g cdot s additionally the notation kappa mathcalg and taumathcalgmathbbs is not clear on a first read in general i think more explicitly stating the types of the mathematical objects used in the paper will make it much easier to parse by default i and probably many other rl readers default to assuming vectorvalued embeddings and it is easier to override this default when the output type of the embedding is made explicit section 4 is quite long and in many cases its not clear whether the proposed properties are desiderata or guaranteed to be satisfied by the proposed method for example the matrix embeddings of state actions seems to hold by construction but disentanglement presumably does not the empirical evaluations dont support the main goal of the paper adding the groupequivariant loss doesnt provide much improvement over the standard modellearning loss particularly when we look at median performance this suggests that most of the benefit over the baselines is coming from the modellearning component of the objective rather than the incorporation of symmetries further the method without the symmetrylearning loss is similar to many existing representation learning methods such as deepmdp psr pbl muzero latco and dbc the equivariant loss is therefore critical to the novelty of the proposed algorithm yet it doesnt seem to have a significant impact on overall performance it is possible that a more careful empirical analysis on environments with more explicit symmetries 
may reveal that this loss is indeed useful but the current experimental results are insufficient to show this to a satisfactory degree the experimental results are limited the paper only uses one evaluation scheme and doesnt provide much insight into whether the learned model really is equivariant to the set of transformations used further the baselines used in the paper dont seem to be described or cited which makes it difficult to determine whether a fair comparison is really being made between the different methods the baselines i was already familiar with mostly seem to make use of modelfree approaches combined with data augmentation which leaves the modellearning component of eqr as a confounding factor spr which does include a learned latentspace model seems to attain a higher median humannormalized score than eqr with the equivariant transition loss suggesting that at least part of the improved performance is coming from the specific parameterization of the model concrete avenues for improvement i am very intrigued by the use of matrix embeddings in the representation typically in rl we treat representations as vectors and it would be interesting to see whether enforcing linear actions of the transition and symmetry operators on the latent space induces particular structure it is possible that by looking at environments with more structure the proposed method might be shown to capture the appropriate inequivariances and therefore improve performance and sample efficiency i recommend incorporating evaluations on eg mujoco or some other environment with more explicit physicallybased invariances the focus of this paper is on equivariant representations for sample efficiency however it seems like an arguably more promising application is in improving generalization ensuring that symbols and their types are defined before they are used will make the paper much more readable section 4 should be rewritten to more clearly express which of the desiderata are definitely satisfied by the proposed method and which are not guaranteed to be satisfied it might also be interesting to include empirical investigations into the degree to which the proposed method satisfies these desiderata i recommend rejecting this paper due to a the limited empirical benefit of incorporating equivariance into the proposed representation learning method and b the lack of clarity in both the papers exposition and its claims about the proposed methods properties with additional work and a more informative evaluation setting i think the paper will be in a much better position for consideration in future conferences
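For readers trying to picture the objective these reviews keep referring to, the following is an editor's sketch of the general shape of such a three-term loss (transition consistency, group equivariance, reward prediction). The encoder phi, the latent transition model trans, the reward head, and the way the group element g is applied are all placeholders and assumptions, not the authors' actual parameterization.

```python
# Editor's sketch (hypothetical): the structure of a three-term representation
# learning objective of the kind described above. All callables are user-supplied
# stand-ins; nothing here is taken from the reviewed paper's code.
import numpy as np

def l_transition(phi, trans, s, a, s_next):
    # the latent prediction of the next state should match the encoding of the true next state
    return float(np.mean((trans(phi(s), a) - phi(s_next)) ** 2))

def l_equivariance(phi, act_latent, act_env, g, s):
    # acting with g in latent space should commute with acting with g on the raw state
    return float(np.mean((act_latent(g, phi(s)) - phi(act_env(g, s))) ** 2))

def l_reward(phi, reward_head, s, a, r):
    # the reward predicted from the latent state-action pair should match the true reward
    return float(np.mean((reward_head(phi(s), a) - r) ** 2))

def total_loss(batch, phi, trans, act_latent, act_env, reward_head, g,
               weights=(1.0, 1.0, 1.0)):
    s, a, r, s_next = batch
    return (weights[0] * l_transition(phi, trans, s, a, s_next)
            + weights[1] * l_equivariance(phi, act_latent, act_env, g, s)
            + weights[2] * l_reward(phi, reward_head, s, a, r))
```

In practice each term would be computed over minibatches with learned networks; the point of the sketch is only the structure the reviews describe: one term ties latent dynamics to the original MDP, one ties the group action to the latent space, and one ties rewards to the latent representation.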
### Summary:
|
this paper proposes to learn a latent space representation such that some linear equivariance and symmetry constraints are respected in the latent space with the goal to improve sample efficiency one core idea is that the latent space is also the same as the space of linear transformations used in the constraints which is shown to simplify some of the mathematical derivations experiments on the atari 100k benchmark demonstrate a statistical improvement over the spr baseline when using the se2 group of linear transformations as latent space following the discussion period most reviewers were in favor of acceptance however one reviewer remained unconvinced and after carefully reading the paper i actually share the same concerns ie that it is unclear under which conditions the proposed approach actually works and what makes it work i believe that as a research community we should value understanding over moving the needle on benchmarks especially when proposing such a complex method as this one see fig 5 more specifically 1 the method is only evaluated on atari games showing some improvements when using se2 and arguing that there are corresponding symmetries in such games there is however no analysis demonstrating or even hinting at the fact that the proposed technique is actually learning to take advantage of such symmetries nb i had a quick look at the animation added by the authors in the supplementary material but i do not see if/how they help on this point even if analyzing representations on atari may be tricky i believe that given the motivation of this new algorithm it must be evaluated on some toy example eg the pendulum mentioned throughout the paper to validate that it is learning what we want it to learn although i also agree with the authors that experimenting on a more complex benchmark like atari is equally important 2 the idea of embedding states into the same space as transformations is interesting and brings some advantages when writing down equations as demonstrated by the authors however there is no justification besides mathematical convenience and it doesnt seem intuitive to me at all why this should be a good idea considering that it ties the state representation to the mathematical representation of group transformations for instance what does the special group element e mean for a state and this coupling makes it difficult to interpret the effect of using a different group of transformations for instance when moving from gl2 to se2 is the observed benefit because we are using only specific transformations or simply because we are reducing the dimensionality of the state embedding note that in fig 4c the mlp variant has similar performance to gl2 and based on my understanding they use the same embedding dimensionality i believe it would be important to check what would happen with an mlp variant using the same dimensionality as se2 3 the effect of the lget loss is not convincing as pointed out by several reviewers i think it would have been an opportunity for the authors to investigate why especially since it seems to work in some games and not others but just focusing on here are the 17/26 games where it works better doesnt really bring added value here do these games have some specific properties that make them better candidates to take advantage of lget this could have been a very interesting insight if that was the case but as it is now i am not sure what we can learn from that 4 there are several implementation details some moving the final algorithm farther from
its theoretical justification that are not ablated making it difficult to understand their impact ex using target networks the choice of the value of m using projections onto the unit sphere of some arbitrary dimensionality how the s state is chosen in lget as a result we have here an algorithm with some interesting theoretical background but with a lot of moving components which when properly tweaked can lead to a statistically meaningful improvement on atari 100k without really understanding why i believe this is not quite enough for publication at iclr and i would encourage the authors to delve deeper into the understanding of their algorithm which i hope will bring useful insights to the research community working on representation learning
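As a small, hedged illustration of the "states embedded in the same space as the transformations" idea debated above (the construction below is the editor's guess at the general mechanism, not the authors' implementation): if a latent state is stored as an SE(2) matrix, a symmetry of the environment acts on it by a single matrix product, so the group action is linear by construction.

```python
# Editor's illustration, not the paper's code: a latent state stored as an SE(2)
# matrix, on which a planar rotation acts by plain matrix multiplication.
import numpy as np

def se2(theta, tx, ty):
    # homogeneous 3x3 matrix: rotate by theta, then translate by (tx, ty)
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, tx],
                     [s,  c, ty],
                     [0.0, 0.0, 1.0]])

z = se2(0.3, 1.0, -2.0)        # a latent state, itself an element of SE(2)
g = se2(np.pi / 2, 0.0, 0.0)   # an environment symmetry: rotate by 90 degrees

z_after = g @ z                # the group action on the latent state is linear
# the result is still a valid SE(2) element: its rotation block stays orthogonal
assert np.allclose(z_after[:2, :2] @ z_after[:2, :2].T, np.eye(2))
```

Swapping SE(2) for GL(2) in this sketch changes both which transformations are allowed and the dimensionality of the embedding, which is exactly the confound the meta-review points at.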
|
[… input_ids: token-id array for this row omitted …] |
[ attention_mask: all 1s, full list omitted ] |
[ labels: token-ID list omitted (appears to duplicate input_ids) ] |
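
Note on the numeric columns collapsed above: the three bracketed lists in each row appear to be the tokenized form of the two text columns. input_ids holds the token-ID encoding of the prompt, review, and summary text; attention_mask is a same-length vector of ones (no padding in the stored rows); labels carries the training targets and, in the rows shown here, appears to duplicate input_ids. A minimal sketch of how one might load a row and check this correspondence is below; the dataset path, tokenizer checkpoint, and column handling are assumptions made for illustration, not details stated anywhere in this dump.

```python
# Sketch: inspect one row and confirm the numeric columns line up with the text.
# The dataset path and tokenizer checkpoint below are placeholders.
from datasets import load_dataset
from transformers import AutoTokenizer

ds = load_dataset("path/to/review-summary-dataset", split="train")   # hypothetical path
tok = AutoTokenizer.from_pretrained("some-checkpoint")               # hypothetical checkpoint

row = ds[0]
decoded = tok.decode(row["input_ids"], skip_special_tokens=True)
print(decoded[:200])  # should begin with the review prompt text stored in the Input column

assert len(row["attention_mask"]) == len(row["input_ids"])
assert all(m == 1 for m in row["attention_mask"])   # the dumped rows contain no padding
assert row["labels"] == row["input_ids"]            # appears to hold for the rows shown above
```
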
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the paper proposes using metalearning and fast online adaptation of models to overcome the mismatch between simulation and the real world as well as unexpected changes and dynamics this paper proposes two modelbased metalearning reinforcement algorithms one based on maml and the other based on recurrence and experimentally shows how they are more sample efficient and faster at adapting to test scenarios than prior approaches including prior modelfree metalearning approaches i do have an issue with the way this paper labels prior work as modelfree metalearning algorithms since for example maml is a general algorithm that can be applied to modelfree and modelbased algorithms alike it would be more accurate in my opinion to label the contributions of this paper as modelbased instantiations of prior existing algorithms rather than new algorithms outright im a bit confused with equation 3 as the expectation is over a single environment and the trajectory of data is also sampled from a single environment but in the writing the paper describes the setting as a potentially different environment at every timestep equation 3 seems to assume that the subsequence of data comes from a single environment which contradicts what you say in the text as described equation 3 is then not really much different from previous episodic or task based formulations the results themselves are not unexpected as there has already been prior work that this paper also mentions showing that modelbased rl algorithms are more sample efficient than modelfree section 61 i like this comparison and showing how the errors are getting better for section 62 judging from the plots it doesnt seem you are doing any metalearning in this experiment so then are you just basically running a modelbased rl algorithm im very confused what you are trying to show are you trying to show the benefit of modelbased vs modelfree prior work has already done that are you trying to show that even just using a metalearning algorithm in an online setting results in good online performance then you should be comparing your algorithm to just a modelbased online rl algorithm you also mention that the asymptotic performance falls behind is this because your model capacity is low or maybe your mpc method is insufficient if so then wouldnt it be more compelling to like prior work combine this with a modelfree algorithm and get the best of both worlds section 63 results look good section 64 i really like the fact you have results on a real robot overall i think the paper does successfully show the sample complexity benefits and fast adaptation of modelbased metarl methods the inclusion of a real world robot experiment is a plus however the result is not particularly surprising or insightful as prior work has already shown the massive sample complexity improvement of modelbased rl methods update dec 4 2018 i have read the author response and they have addressed the specific concerns i have brought up i am overall positive about this paper and the new changes and additions so i will slightly increase my score though i am still concerned about the significance of the results themselves docsepthis work addresses the problem of online adapting dynamics models in the context of modelbased rl learning globally accurate dynamics model is impossible if we consider that environments are dynamic and we cant observe every possible environment state at initial training time thus learning dynamics models that can be adapted online fast to deal with unexpected und never 
seen before events is an important research problem this paper proposes to use metalearning to train an update policy that can update the dynamics model at test time in a sample efficient manner two methods are proposed grbal this method uses maml for metalearning rebal this method trains a recurrent network during metatraining such that it can update the dynamics effectively at test time when the dynamics change both methods are evaluated on several simulation environments which show that grbal outperforms rebal on average grbal is then evaluated on a real system the strengths of this paper are this work addresses an important problem and is well motivated experiments on both simulated and on a real system are performed the weaknesses the related work section is biased towards the ml community there is a ton of work on adapting inverse dynamics models in the robotics community this line of work is almost entirely ignored in this paper furthermore some important recent references for modelbased rl are not provided in the related work section pets 3 and mppi 2 although mppi is the controller that is used in this work as a framework for modelbased rl additionally existing work on modelbased rl with metalearning 1 has not been cited this is unacceptable there is no significant technical contribution the contribution is that existing metalearning methods have been applied to the modelbased rl setting even if noone has had that idea before it would be a minor contribution but given that there is prior work on metalearning in the context of modelbased rl this idea itself is not novel anymore two methods are provided without much analysis often authors refer to our approach but its actually not clear what they mean by our approach the authors cant claim modelbased meta rl as their approach while i commend the authors for performing both simulation and realworld experiments i find the that experiments lack a principled evaluation more details below feedback on experiments section 62 sample efficiency you compare apples to oranges here i have no idea whether your improvements in terms of sampleefficiency are due to using a modelbased rl approach or because your deploying metalearning it is well known that modelbased rl is more sample efficient but often cannot achieve the same asymptotic performance as modelfree rl since mppi is your choice of modelbased rl framework you would have to include an evaluation that shows results on mppi with model bootstrapping as presented in 2 to give us an idea of how much more sampleefficient your approach is section 63 fast adaptation and generalization while in theory one can choose the metalearning approach independently from the choice of modelbased controller in practice the choice of the mpc method is very important mppi can handle model inaccuracies very well almost to the point where sometimes adaptation is not necessary you cannot evaluate mppi with online adaptation to another mpc approach with another modellearning approach this does not give me any information of how your metalearning improves modeladaptation in essence these comparisons are meaningless to make your results more meaningful you need to use the same controller setup lets say mppi and then compare the following 1 mppi with your metatrained online adaptation 2 mppi results with a fixed learned dynamics model this shows us whether online adaptation helps 3 results of mppi with the initial dynamics model trained in the metatraining phase without online adaptation this will tell us whether the 
metatraining phase provides a dynamics model that generalizes better even without online adaptation 4 mppi with model bootstrapping as presented in 2 this will show whether your metatrained online adaptation actually outperforms simple online model bootstrapping in terms of sampleefficiency the key here is that you need to use the same modelbased control setup whether its mppi or some other method otherwise you cannot detangle the effect of controller choice from your metalearned online adaptation 64 realworld same comments as above comparisons are not meaningful 1 meta reinforcement learning with latent variable gaussian processes uai 2018 2 mppi with modelbootstrapping information theoretic mpc for modelbased reinforcement learning icra 2017 3 deep reinforcement learning in a handful of trials using probabilistic dynamics models nips 2018docsepthe authors introduce an algorithm that addresses the problem of online policy adaptation for modelbased rl the main novelty of the proposed approach is that it defines an effective algorithm that can easily and quickly adapt to the changing contextenvironments it borrows the ideas from modelfree rl maml to define the gradientrecursive updates of their approach and it incorporates it efficiently into their modelbased rl framework the paper is well written and the experimental results on synthetic and real world data show that the algorithm can quickly adapt its policy and achieve good results in the tasks when compared to related approaches while applying the gradient based adaptation to the modelfree rl is trivial and has previously been proposed in this work the authors do so by also focusing on the local context m steps within a klong horizon allowing the method to recover quickly if learning from contaminated data andor its global policy cannot generalize well to the local contexts although this extension is trivial it seems that it has not been applied and measured in terms of the adaptation speed in previous works theoretically i see more value in their second approach where they investigate the application of fast parameter updates within modelbased rl showing that it does improve over the mamlrl and nonadaptive modelbased rl approaches this is expected but to my knowledge has not been investigated to this extent before what i find is lacking in this paper is insight into how sensitive the algorithm is in terms of the km ratio and also how it affects the adaptation speed vs performance tables 35 show an analysis but those are for different tasks no theoretical analysis was performed to provide deeper understanding of it the model does solve a practical problem reducing the learning time and having more robust model however it would add more value to the current state of the art in rl if the authors proposed a method for optimal selection of the recovery points and also window ratio rl depending on the target task this would make a significant theoretical contribution and the method could be easily applicable to a variety of tasks where the gains in the adaptation speed are important
### Summary:
|
the authors consider the use of maml with model based rl and applied this to robotics tasks with very encouraging results there was definite interest in the paper but also some concerns over how the results were situated particularly with respect to the related research in the robotics community the authors are strongly encouraged to carefully consider this feedback as they have been doing in their responses and address this as well as possible in the final version
|
[ input_ids: token-ID list omitted ] |
[ attention_mask: all 1s, full list omitted ] |
[ labels: token-ID list omitted (appears to duplicate input_ids) ] |
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this work evfn aims to improve predictions of nbody system dynamics by combining continuous lie symmetries with permutation symmetry the authors propose to do this by encoding so3 invariant representations of each node followed by the use of a graph transformer and a vectorization block to estimate the vector field an evolving block is subsequently used to predict dynamics initial recommendation reject reason in my view in current format the weaknesses outweigh the strengths of the paper please see details below strengths 1 the idea of using a scalarization block a graph transformer and a vectorization block is interesting 2 the results on the synthetic task and the molecular conformer generation task are promising weaknesses 1 for me the math appears to be imprecise or wrong at many places which makes it a difficult read for example starting in def 21 shouldnt it be r vectors and s covectors also in the line the group actions from so3 acting should involve the dual representations 2 for me the authors make claims without sufficient proofs for eg on page 5 the bolded line doesnt have a proof can you please show the same especially given that the basis associated with the ij pair is different also the proof for proposition a2 is not easy to follow or check 3 scalability concerns in the tensor example above eq 34 the authors provide an example for the s0 case 20 and this in R^3 alone is of size 9 (3x3) and this needs to be saved and computed for each ij pair the authors also note that in their case they only use the 10 type tensors please add scalability results especially against en gnn which the authors compare against 4 appears to be missing some relevant work for eg 1 etc moreover the authors only appear to compare their model with the computationally efficient models like en gnn while ignoring models like lieconv 2 lietransformer 3 etc which can be used for the same task newtonian manybody system task references 1 anderson brandon truongson hy and risi kondor cormorant covariant molecular neural networks arxiv preprint arxiv:1906.04015 2019 2 finzi marc et al generalizing convolutional neural networks for equivariance to lie groups on arbitrary continuous data international conference on machine learning pmlr 2020 3 hutchinson michael j et al lietransformer equivariant selfattention for lie groups international conference on machine learning pmlr 2021 overall there are certain merits for the proposed architectures but the paper is hard to read due to some imprecise math lack of clarity proofs etc i initially suggest rejection docsepthis paper proposes a new model equivariant vector field network evfn which aims to solve the multibody system modeling problem specifically evfn introduces a new kind of equivariant basis which considers more interactions between two particles and employs a graph transformer architecture to learn the edgewise embeddings strengths 1 well written and easy to understand 2 the construction of the equivariant basis seems interesting weakness 1 the connection between evfn and egnn is not well discussed essentially egnn takes ||xi - xj||^2 as the geometric scalar and applies phi_e(·) to obtain the edgewise message see eq3 in 1 after obtaining the edgewise message egnn uses phi_x to project m_ij to a scalar and then vectorizes it along xi - xj in eq4 in 1 the key differences of evfn are 1 extending the equivariant basis from one dimension to three dimensions 2 replacing the mlp model phi_e as well as the aggregation operation in eq5 and eq6 in 1 with an existing graphtransformer block in this vein
evfn can be viewed as the extension of egnn with the new equivariant bases 2 the motivation behind the equivariant basis construction is not well elaborated in the multibody system modeling the equivariant basis (xi - xj)/||xi - xj|| has its physical meaning which indicates the force directions however the benefit of using the outer product of two coordinates to construct the equivariant basis is not clear since there is no explicit physical meaning of the outer product of two coordinates its better to give more indepth discussions about the choice of such equivariant bases here 3 there is no ablation study on the new equivariant basis and the graphtransformer on multibody system modeling as mentioned before the newly added equivariant bases have no explicit physical meaning for modeling the multibody system the main impact factor of the performance improvement of evfn looks vague on the other hand the ablation study on molecular conformation generation even implies that the contribution of the graphtransformer is slightly larger than that of the new equivariant basis evfn w/o gt gains the larger performance drop on 5 out of 8 metrics in table 3 if the performance improvement comes from the graphtransformer the overall contribution of this paper is limited therefore it is crucial to conduct the ablation study of the different components of evfn to show the contributions of the new equivariant basis on multibody system modeling for example apply the new equivariant bases on the egnn framework 4 the number of training samples used in the multibody modeling experiment is not reported meanwhile there is no analysis of the impact of the number of training samples for the different baselines 5 from table 1 the result of gcn surpasses many sophisticated baselines in the interpolation and extrapolation tasks such as egnn this is quite weird why references 1 en equivariant graph neural networks overall i think this paper is not ready to be accepted at this time docsepthe authors introduce a model to predict the time evolution of newton mechanical systems and small molecules the model takes a graphical representation as input and converts it to an se3 and permutation equivariant representation using physical principles ie a whitebox model this representation is passed through a learned graph transformer module to produce a vector field which is used to predict the time evolution of the system/molecule strengths the paper is well organized and well written besides some nonstandard use of english the mathematical presentation is fairly clear weaknesses it doesnt seem like the model produces equivariant representations i agree that the input to the graph transformer model is equivariant to se3 transformations but without some kind of constraint i dont see how the output could be moreover the empirical evidence does not support this claim table 1 col 7 questions so far we have achieved permutation and se3 equivariance p5 isnt it se3 invariance the initial centering giving translation invariance and the scalarization block giving so3 invariance did the authors experiment with other kinds of so3equivariant frames in the scalarization block i wonder if computing the frame from the neighborhood eg darboux frames would improve computational efficiency andor performance possible typos without concerning about breaking without concern about breaking p 9 verify the effectiveness of the propose method verify the effectiveness of the proposed method p 9 the trajectory points are uniformly sampleed the trajectory points are uniformly sampled p 6 we
sample 1 30 and 30 trajectories we sample 1 and 30 trajectories p 6 gnn which is superior to model p 1 vector filed vector field p2 we use xt to denote we use xit p2 then each point of then each point p2 let yi be the dual frame of be the dual frame on gradient filed gradient field as shown in table 1 with the original input evfn outperforms all other equivariant methods in the interpolation and extrapolation tasks p 7 thats not what table 1 shows fij at bt ct p 4 extra parentheses should there be a time index in a5a8 the main claim of the paper is that there is no information lost in the representations learned as far as i can tell this claim is not supported by the theory or experiments moreover it does not seem that the learned representations are equivariant docsepthe paper proposes an rotationally equivariant neural network based on a transform of particle pairs into a rotationally invariant reference frame this is applied to trajectory prediction of manybody particle systems and molecular conformer search transforming particle interaction into a rotationally invariant reference frames is a common approach in molecular mechanics usually achieved with inner coordinates such as distances angles dihedral angles etc these have the disadvantage that some coordinate information is lost however they fulfill all equivariance invariance properties wrt rotation reflection translation and permutation in local reference frames the authors claim that evfn is able to represent tensor information losslessly this is also true for previous equivariant approaches such as tfn egnn painn or nequip as long as no distance cutoff is used see for example dym and maron 2020 however in practice the distance cutoff is essential not only for computational efficiency but also to facilitate generalization across local environments this is not possible in the proposed approach the evfn transform is rotationally equivariant and translational invariant only with respect to global transforms this has serious consequences for its capability to generalize in particular for larger systems as an extreme example take two molecules or clouds of newtonian particles at a large enough distance such that the intermolecular interactions are negligible since the proposed coordinate is not translational invariant the pairwise representations are not invariant to rotating the two molecules separately therefore the proposed representation is not suitable to model general manyparticle systems despite these flaws the method shows good performance on the toy systems and conformer search a reason might be that for small systems generalization is not the deciding factor and the baseline methods do not retain the necessary information to better judge this tradeoff it would be helpful to apply the proposed approach to the related task of predicting molecular force fields eg on the md17 dataset where a lot of baselines are available for comparison further comments centering the system prevents application of the method to a box with periodic boundary conditions are there any ideas how to overcome this the background section could be made more accessible for readers not familiar with group theory and tensor algebra since this part is not really necessary to understand the method and proofs in eq 3133 sec 42 the iso17 dataset was introduced by schtt et al 2017 why are the equivariance errors in table 1 slightly higher for evfn than for the other equivariant methods the proposed transformation has the flaws outlined above which might prevent 
generalization across local environments more experiments are required to judge how severe these issues are in practice therefore i can not recommend acceptance at this stage
### Summary:
|
the paper proposes a symmetryinformed neural network for modelling manybody systems the network is empirically evaluated in the tasks of predicting newtonian trajectories and molecular conformations all four reviewers are critical of the paper and recommend rejection one weak three strong the reviews have flagged weaknesses and quality issues with several aspects of the submission including the proposed methodology the novelty of the contribution and the clarity of the presentation although detailed clarifications were provided by the authors most of the reviewers concerns remain and the consensus among reviewers remains to reject the paper consequently the current version of the paper does not appear to meet the quality standards for acceptance to iclr
|
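For readers skimming this record, the egnn-versus-evfn comparison quoted in the review above is easier to follow next to code. The sketch below is a minimal, illustrative PyTorch layer in the egnn style the reviewer describes: an edge message built from the invariant scalar ||xi - xj||^2 (eq 3 of the egnn paper) and a coordinate update that turns a per-edge scalar back into a vector along the single direction xi - xj (eq 4). It is not the evfn model itself; the reviewer's point is that evfn replaces this one-dimensional basis with a three-dimensional equivariant frame and swaps the MLPs and aggregation for a graph-transformer block, which is only indicated in comments here. Hidden sizes, the fully connected edge set, and the mean aggregation are assumptions made for brevity, not values taken from either paper.

```python
# Minimal sketch (not the papers' code) of an EGNN-style layer, mirroring the
# two steps the review quotes: an edge message built from the invariant scalar
# ||xi - xj||^2 (eq 3 of the egnn paper) and a coordinate update that turns a
# per-edge scalar back into a vector along the single direction xi - xj (eq 4).
# Per the review, evfn would replace this one-dimensional basis with a
# three-dimensional equivariant frame and swap the MLPs/aggregation for a
# graph-transformer block; that is only indicated in comments. Hidden sizes,
# the fully connected edge set, and the mean aggregation are assumptions.
import torch
import torch.nn as nn


class EGNNStyleLayer(nn.Module):
    def __init__(self, h_dim: int = 32, m_dim: int = 32):
        super().__init__()
        # phi_e: edge MLP on (h_i, h_j, ||xi - xj||^2) -> message m_ij
        self.phi_e = nn.Sequential(nn.Linear(2 * h_dim + 1, m_dim), nn.SiLU(),
                                   nn.Linear(m_dim, m_dim))
        # phi_x: projects m_ij to one scalar that weights the direction xi - xj
        self.phi_x = nn.Sequential(nn.Linear(m_dim, m_dim), nn.SiLU(),
                                   nn.Linear(m_dim, 1))
        # phi_h: invariant node-feature update from aggregated messages
        self.phi_h = nn.Sequential(nn.Linear(h_dim + m_dim, h_dim), nn.SiLU(),
                                   nn.Linear(h_dim, h_dim))

    def forward(self, h: torch.Tensor, x: torch.Tensor):
        # h: (n, h_dim) invariant features, x: (n, 3) coordinates
        n = h.shape[0]
        rel = x.unsqueeze(1) - x.unsqueeze(0)          # (n, n, 3): xi - xj
        dist2 = (rel ** 2).sum(-1, keepdim=True)       # (n, n, 1): invariant scalar
        hi = h.unsqueeze(1).expand(n, n, -1)
        hj = h.unsqueeze(0).expand(n, n, -1)
        m_ij = self.phi_e(torch.cat([hi, hj, dist2], dim=-1))   # eq 3 analogue
        mask = 1.0 - torch.eye(n).unsqueeze(-1)        # drop self-edges
        m_ij = m_ij * mask
        # eq 4 analogue: scalar weight per edge, re-vectorized along xi - xj.
        # evfn (per the review) would use a 3-d per-edge frame here and a graph
        # transformer instead of phi_e and the plain aggregation.
        x_new = x + (rel * self.phi_x(m_ij) * mask).mean(dim=1)
        h_new = self.phi_h(torch.cat([h, m_ij.sum(dim=1)], dim=-1))
        return h_new, x_new


if __name__ == "__main__":
    layer = EGNNStyleLayer()
    h, x = torch.randn(5, 32), torch.randn(5, 3)
    h2, x2 = layer(h, x)
    print(h2.shape, x2.shape)   # torch.Size([5, 32]) torch.Size([5, 3])
```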
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the paper proposes a gan variant called chaingan which expresses the generator as a base generator which maps the noise vector to a rough model sample followed by a sequence of editors which progressively refine the sample each component of the generator is trained independently to fool its own separate discriminator without backpropagating through the entire chain of editors the proposed chaingan model is trained on mnist cifar10 and celeba the paper presents model samples for all three datasets as well as inception scores for cifar10 i find the proposed idea simple and elegant but the evaluation lacking and as such im a bit hesitant to outright recommend accepting the paper evaluation is not very extensive or detailed inception scores are shown only for cifar10 and using two base generator architectures the inception score has known limitations and i would have expected the authors to also provide fid scores the main takeaway is also not articulated very clearly as far as i can tell it appears to be that chaingan allows to achieve similar performance with less tunable parameters but table 1 shows mixed results where chaingan outperforms the baseline dcgan architecture using fewer parameters but underperforms the baseline resnet architecture the way the experimental section is organized made it difficult for me to find my way around for example subsection titles are hard to locate due to the fact that figures and tables were placed immediately underneath them overall when the flow of the text is interrupted by a figure its hard to locate where to resume reading there is a connection to be made with other sequential generation approaches not to be confused with sequence generation such as lapgan draw and unrolled gans discussing the relationship to those approaches would in my opinion add more depth to the paperdocsepthe authors propose chaingan a gan architecture where the generator is supplemented with a series of editors that iteratively improve image quality in practice the algorithm also uses multiple critics discriminators although this is not explained until the experiments section the paper contains the germ of a powerful idea however it feels as if the authors havent yet come to grip with their own idea and architecture currently the role of the editors feels underspecified it is unclear and unexplored what architectures make for good editors exactly how editors should interact with the various losses and what the role of the critics ideas are proposed in related work should be in the experiments the editors sharpen image quality but the tradeoffs are not explored are more editors always better when does it saturate why adding a few editors and critics makes the architecture more parameterefficient but increases the number of losses what happens to wallclock training time moreover the paper is conflicted about the role of the critics is the core idea to have multiple generators discriminators or both what is moving the needle docsepsummary the authors present a straightforward method to improve generative quality of gans that can allow for fewer parameters by separating the task into a basicgeneration followed by a chain of multiple edits to the base generation with different editor networks the fact that separating the task allows for smaller networks is key to the reduction in parameters each editor is trained separately in alternance with its associated critic authors test their approach mostly on cifar10 as well as celeba and mnist pros the proposed method is simple and makes 
intuitive sense having editor generators acting like highlyconditioned gans should make their job easier to produce better samples empirical results show that when removing editors for evaluation some wellknown architecture dcgan wgangp can be outperformed with less parameters when comparing is scores interesting leads and negative results are discussed in section 5 cons comparisons with endtoend training seems inadequate the authors invalidate the endtoend approach with two arguments 1 that it doesnt allow for the removal of superfluous editors once the training is done section 33 and 2 that is scores are significantly lower section 42 table1 it seems that both these statements are true simply because endtoend learning is performed with a single score outputted by a single critic at the end of the chain while it would be entirely possible and simple to keep all discriminators and associated losses and train endtoend this would still push all editors to produce good samples thus allowing removal of editors at testtime and would probably yield better results than those reported in table 1 it could also invalidate the results shown in figure 4 left for me this is an important missing comparison as it might even yield better results than the proposed approach and invalidates one of the proposed advantage of the method i think this idea of sequential generation has been explored before eg stackgan 1 2 or lapgan 3 and others in which unconditional image generation is performed on relatively complicated datasets with a somewhat more principled way of actually simplifying the task of the base generator therefore i think important citations and valid comparisons are missing the only reported metric is the inception score is while most of the recent literature agrees that the frchet inception distance is a better metric i think fid should be presented as well to be better aligned with recent literature even in cases where comparison with previously reported performance is impossible if previous works only presented is if you want to be compared to in future work i think this is necessary it would be a good addition to have fidis scores for each editor output as we could see the quantitative increase in performance at each editing step in section 42 you specify that all experiments are done using the wgangp training formulation looking at table 1 this is unclear as you specify this training scheme only for model 6 and model 9 if all models use the same training scheme this information should be absent from the table celeba results experiments are reported in the main text without any results which are only in the appendix these results dont show any quantitative metrics and are visually disappointing it is hard to see if the editors actually improve the generation some of the main results or discussions are based on section 7 which is not the main article even though it is used as another section instead of an appendix i think section 7 should be separated into appendices a b etc maybe some important aspects of the research presented could fit into the main text given some removal of repetitions and some compression of the intro to gans which should be vastly known by the iclr community by now wrong citation format at the end of section 21 section 33 train the critic more than the generator in each iteration this could be clarified by stating exactly what you do training the critic for k steps for each generator steps its not always clear what the boldface represents throughout tables the discussion 
in section 5 about promising techniques explored is somewhat disappointing as efforts to investigate why training failed are not apparent every result shown from the proposed method is performed with small or tiny versions of existing architectures this method could have additional value if it could boost performance on the same architecture even if there are added editors and trainable parameters the fact that such results are absent makes me suspicious of such a behavior overall i think this paper presents a relevant and interesting idea however i think this idea has been explored before with more convincing results and with a more principled approach there are some important flaws in the comparisons made to assess the advantages of the method and the overall results fail to convince of any important benefit based on the proscons stated above i think this paper does not reach the required quality for acceptance in iclr 1 stackgan text to photorealistic image synthesis with stacked generative adversarial networks zhang et al 2017 2 stackgan realistic image synthesis with stacked generative adversarial networks zhang et al 2017 3 deep generative image models using a laplacian pyramid of adversarial networks denton et al 2015
### Summary:
|
the paper presents a ganbased generative model where the generator consists of the base generator followed by several editors each trained separately with its own discriminator the reviewers found the idea interesting but the evaluation insufficient no rebuttal was provided
|
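The chained-generator design that the reviews above describe (a base generator followed by editors, each fooling its own critic, with no backpropagation through the whole chain) is compact enough to sketch. The snippet below is a minimal, illustrative PyTorch version under several assumptions not taken from the paper: flat 784-dimensional samples, tiny MLPs for every module, two editors, and a plain non-saturating GAN loss where the paper reportedly uses the WGAN-GP formulation. The detach calls are the part that matters: they are what keeps each stage's training independent, which is the property one of the reviews contrasts with an end-to-end variant.

```python
# Minimal sketch (not the paper's code) of the chained-generator idea the
# reviews describe: a base generator maps noise to a rough sample, editors
# refine it, and each stage fools its own critic. The detach() calls are the
# key property being illustrated: no gradient flows back through the whole
# chain, so every stage trains independently. Flat 784-d samples, tiny MLPs,
# two editors, and a plain non-saturating GAN loss (the paper reportedly uses
# wgan-gp) are illustrative assumptions, not values from the paper.
import torch
import torch.nn as nn
import torch.nn.functional as F

Z_DIM, X_DIM = 64, 784                   # noise size, flattened sample size


def mlp(i: int, o: int) -> nn.Module:
    return nn.Sequential(nn.Linear(i, 256), nn.ReLU(), nn.Linear(256, o))


base_gen = mlp(Z_DIM, X_DIM)                                    # G0
editors = nn.ModuleList([mlp(X_DIM, X_DIM) for _ in range(2)])  # E1, E2
critics = nn.ModuleList([mlp(X_DIM, 1) for _ in range(3)])      # one per stage

g_opts = [torch.optim.Adam(m.parameters(), lr=2e-4) for m in [base_gen, *editors]]
d_opts = [torch.optim.Adam(c.parameters(), lr=2e-4) for c in critics]


def train_step(real: torch.Tensor) -> torch.Tensor:
    ones = torch.ones(real.shape[0], 1)
    zeros = torch.zeros(real.shape[0], 1)
    stage_out = base_gen(torch.randn(real.shape[0], Z_DIM))
    for k in range(len(critics)):
        # stage k output: base sample for k == 0, otherwise editor k's refinement
        fake = stage_out if k == 0 else editors[k - 1](stage_out.detach())
        # update critic k on real vs this stage's (detached) output
        d_loss = (F.binary_cross_entropy_with_logits(critics[k](real), ones) +
                  F.binary_cross_entropy_with_logits(critics[k](fake.detach()), zeros))
        d_opts[k].zero_grad(); d_loss.backward(); d_opts[k].step()
        # update generator/editor k to fool only its own critic
        g_loss = F.binary_cross_entropy_with_logits(critics[k](fake), ones)
        g_opts[k].zero_grad(); g_loss.backward(); g_opts[k].step()
        stage_out = fake.detach()        # next editor starts from the refined sample
    return stage_out


if __name__ == "__main__":
    refined = train_step(torch.randn(8, X_DIM))
    print(refined.shape)                 # torch.Size([8, 784])
```

Dropping the detach on the editor inputs and summing the per-stage generator losses would give the end-to-end variant with all critics kept, which is the missing comparison one of the reviews asks for.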
[
30003,
310,
1677,
2278,
273,
247,
2561,
2929,
432,
260,
2369,
1793,
6698,
15,
7764,
3630,
247,
6010,
253,
2278,
15,
187,
4118,
8439,
27,
187,
783,
2929,
29328,
247,
36827,
12955,
1925,
11450,
272,
266,
534,
30599,
253,
14156,
347,
247,
2613,
14156,
50276,
4609,
8115,
253,
6046,
4972,
281,
247,
7227,
1566,
3410,
50276,
25739,
264,
407,
247,
3425,
273,
23145,
50276,
4609,
31414,
39494,
253,
3410,
1016,
4445,
273,
253,
14156,
310,
10166,
10939,
281,
11213,
697,
1211,
4858,
7134,
12915,
1293,
896,
44263,
839,
949,
253,
2862,
5931,
273,
23145,
253,
4081,
11450,
272,
266,
1566,
310,
10166,
327,
278,
79,
382,
260,
338,
274,
740,
285,
6076,
5830,
253,
2929,
10262,
1566,
3530,
323,
512,
1264,
15302,
347,
973,
347,
39645,
7363,
323,
260,
338,
274,
740,
50276,
74,
1089,
253,
4081,
2934,
2969,
285,
20654,
533,
253,
7103,
14999,
285,
347,
824,
516,
247,
2372,
16063,
386,
281,
30227,
5583,
18738,
253,
2929,
50275,
15419,
2368,
310,
417,
1077,
9470,
390,
7000,
39645,
7363,
403,
2011,
760,
323,
260,
338,
274,
740,
285,
970,
767,
2613,
14156,
35615,
253,
39645,
4868,
556,
1929,
7364,
285,
891,
651,
452,
3264,
253,
4477,
281,
671,
2085,
269,
301,
7363,
253,
2022,
1379,
12594,
310,
671,
417,
35144,
1077,
4518,
347,
2080,
347,
891,
476,
2028,
352,
4620,
281,
320,
326,
11450,
272,
266,
4483,
281,
5115,
2074,
3045,
342,
1679,
10839,
494,
3602,
533,
2829,
337,
2722,
6804,
1543,
835,
11450,
272,
266,
41731,
13015,
253,
8245,
36196,
1247,
10336,
970,
11184,
3602,
533,
762,
468,
13015,
253,
8245,
501,
3024,
10336,
50276,
783,
1039,
253,
5661,
2593,
310,
10932,
1160,
352,
2834,
323,
479,
281,
1089,
619,
1039,
1475,
323,
1650,
19087,
14505,
403,
1892,
281,
19912,
1955,
281,
253,
958,
326,
8442,
285,
7180,
497,
4845,
4745,
21281,
731,
4583,
672,
253,
2685,
273,
253,
2505,
310,
21018,
407,
247,
4677,
697,
1892,
281,
19912,
835,
281,
21058,
4361,
50276,
9088,
310,
247,
4602,
281,
320,
1160,
342,
643,
22453,
5978,
7274,
417,
281,
320,
13477,
342,
3425,
5978,
824,
347,
13427,
1247,
3812,
285,
440,
9095,
305,
507,
16585,
253,
2954,
281,
1110,
7274,
651,
275,
619,
4743,
823,
625,
6864,
281,
253,
2929,
7152,
339,
431,
248,
4477,
12661,
11450,
272,
266,
247,
36827,
10336,
835,
253,
14156,
310,
19079,
342,
247,
2962,
273,
23145,
326,
10040,
3146,
3157,
2460,
3290,
275,
3946,
253,
5933,
671,
4648,
2709,
17139,
20741,
2392,
3738,
436,
310,
417,
5544,
1919,
253,
4679,
2593,
50275,
783,
2929,
4428,
253,
14638,
273,
247,
6422,
2934,
2299,
352,
9193,
347,
604,
253,
4477,
419,
2254,
2568,
1705,
281,
17628,
342,
616,
1211,
2934,
285,
10336,
4390,
253,
2554,
273,
253,
23145,
9193,
17433,
1553,
1245,
352,
310,
12744,
285,
35021,
2149,
752,
35615,
1056,
323,
1175,
23145,
4555,
849,
23145,
943,
8008,
342,
253,
2710,
11655,
285,
752,
253,
2554,
273,
253,
17139,
5697,
403,
4081,
275,
2905,
789,
943,
320,
275,
253,
4679,
253,
23145,
17614,
3878,
2460,
3290,
533,
253,
5454,
14273,
403,
417,
14859,
403,
625,
23145,
1900,
1805,
672,
1057,
352,
19004, 366, 2139, 6240, 247, 1643, 23145, …, 22559, 369, 2530 ] (input_ids) |
[ 1, 1, 1, …, 1 ] (attention_mask: 1604 entries, all 1) |
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, …, 22559, 369, 2530 ] (labels: 1604 token IDs) |
Below is given review of a research paper from cnoference journal. Please write a summary the review.
### Review:
this paper proposes a 3d molecular generative model desert for structurebased drug design in a zeroshot manner the method involves two stages first sketching the molecular shape in the protein pocket then generating molecules based on the shape based on the assumption of structure determines properties the method aims to find molecules whose shapes are complementary to the pocket and thus massive unbounded molecular data eg zinc can be utilized to train the shape2mol model strengths the main idea of drug design by sketching and generating is novel and wellmotivated the proposed method shows better performance over existing methods in the experiments the analysis in experiments is comprehensive weaknesses dont have a related work section discussing the connection between proposed shape2mol and existing shapebased molecular generation methods ref 49 53 the proposed method section is not very clear to me see details in questions the authors discuss some limitations in the experiment section there is not negative social impact docsepthis paper proposes a zeroshot drug design method based on sketch starting from the property that structure determines properties the paper uses a sketch model of protein pocket shapes and a pretrained generative model based on molecular shape sketches to achieve zeroshot drug design that does not rely on docking experimental data and docking simulations strengths the paper presents a molecular generation method based only on shape sketching which is a very novel approach also the generation method without using protein binding data and simulations is enlightening for future work weakness although the sketchbased generation idea is novel the generation model is still similar to the transformerbased sequence generation model used in machine translation tasks the paper is also not clear enough in the method description only the design of the encoding and decoding method and the paper citation of the model used are given the specific model architecture as well as the parameters should be given using only information about the shape of the protein pocket avoids some of the limitations of the data set and binding simulations however the model is limited by ignoring information about the different interaction forces hydrogen bonding stacking etc that occur when molecules bind to proteins depending on the type of atom since it completely abandoned such information during the generation phase the treelike structure is less expressive for molecules with complex structures moreover the connecting step based entirely on greedy algorithms may lead to unreasonable results docsepthis paper presents a new framework and pretraining scheme for designing pocketconditioned ligands specifically it leverages the shape voxel of ligands and protein pockets in pretraining a sketching network that specifies the shape of the protein pocket and learning a shape2mol network that predicts specific ligands that fit into the given voxel shapes the model were trained over billions of molecules in zinc database and achieved stateoftheart results strength this paper proposes an interesting idea to recognize the voxelshape of the protein pocket and leverage the information for pretraining it is quite different from the other line of work which leverages the atoms of both proteins and ligands that aim to learn the interaction for generative design it is interesting to see how this idea plays out and i would suggest adding more discussions about how it is considered in traditional structurebased drug 
design shapevoxel vs atomic interaction the proposed idea is proper in realworld scenario where the paired ligand and pocket data are limited thus i am convinced pretraining is needed weakness how is sampling achieved for generating diverse molecules for a specific pocket in the paper only part mentioning it refers to nucleus 68 does it imply that sampling is only involved after generating molecule from the shape efficiency is neither compared nor discussed the authors make a fair argument in the introduction about the drawback of geko which performs equally powerful as the proposed method given the proposed method also takes tons of training time it would be good to see the comparison and discussion many discussions or relations to the literature are not discussed eg how tokenization and lineraization are related to the literature of graph generation framgmentscaffoldbased molecule generation etc experiments are only done over 12 protein targets i would suggest adding more experiments eg 100 targets in 3d sbdd paper ablation study is not complete the comparison method especially deep learningbased sbdd 3d sbdd and ligan are based on atomic reconstructions it would be interesting to see whether the proposed model benefit from fragmentbased method or the new training framework many details are missing or less mentioned for example how the postprocessing is done how many bins are cut for the rotation and translation operations where the origin is how the tranformation is done etc in training decoding section 24 only ligands data are mentioned while no protein data are mentioned figure 10 is not clear not sure what shape rotation translation and category mean codes are not uploaded which hinders the reproducibility of this work it would be beneficial to the community if authors consider publishing the codes if the paper is accepted overall i think this paper makes a fair point and it is a good attempt in the direction of pocketconditioned ligand design i am willing to raise my score if the questions get answered the limitations are not discussed in the paper but may come from two sides 1 whether this method can generate diverse ligands that bind to protein targets 2 the efficiency of the method the first one concerns whether the formulation makes sense in the biological context or if not is still a good way to leverage unlabeled data the second one needs a bit more experiments to support
### Summary:
|
the paper makes a novel contribution to methods for generating novel molecules from scratch the core idea is to generate a shape that fits the molecular pocket without looking at the protein structure two out of three reviewers recommended acceptance reviewers emphasize that the method is innovative and interesting and the empirical performance appealing especially given that only the shape information is provided to the model strong performance is enabled by good design choices made across the paper such as including the pretraining stage the reviewer that recommend rejection raised issues related to the novelty and clarity of the paper however i believe the paper is sufficiently clear and novel to meet the bar for acceptance overall it is my pleasure to recommend acceptance of the paper
|
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, …, 273, 253, 2929 ] (input_ids: 1219 token IDs) |
[ 1, 1, 1, …, 1 ] (attention_mask: 1219 entries, all 1) |
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, …, 273, 253, 2929 ] (labels: 1219 token IDs) |
Below is given review of a research paper from cnoference journal. Please write a summary the review.
### Review:
this submission introduces a generalpurpose prior for option discovery from demonstrations minimum description length ie a compression objective the authors extend the vta model from kim et al 2019 to include a compression objective and a policy decoder which can constitutes the option policies the experiments demonstrate that the algorithm is effective in discrete action settings with both 2d and 3d observations the paper is wellwritten and the motivation and presentation is good and i think that finding good general priors for option discovery is of high interest the experiments are described clearly and although they clearly expose a very suitable structure for learning discrete options showcase the efficacy of the method sufficiently well as the authors note mdl for option discovery was recently proposed by zhang et al 2021 it would have been interesting to include this as another baseline for comparison in terms of ablations id encourage the authors to further analyze the effective numbers of skills that are used explaining the effect of filtering unused skills as described in section 5 in a similar vein i was wondering how robust the proposed algorithm is to the number of options that should be learned if i understand correctly in the experiments the number of objects to navigate to is known beforehand and the number of options is specified accordingly suppose that this information is hard to obtain from a given set of demonstrations how much would it hurt to specify too many or too few options to add to that in fig 8 appendix 7 out of 10 skills seem to result in the same actions why is that the conclusion in the main body mentions possible future work but fails to raise awareness on limitations or at least on open and as for now unaddressed open questions for example would the proposed method potentially work on the maze navigation tasks in d4rl httpssitesgooglecomviewd4rlhome i think that corresponding evaluations can be regarded as outside the scope of this work but if the authors see possible issues in applying the method to other popular offline rl tasks this would be an interest addition to the papers conclusion docsepthis paper presents a method from offline demonstration learning to hrl the paper builds upon the generative model of vta and add an information cost to enforce a better segmentation which ends up aligning with the ground truth boundaries the recovered options can be used in the hrl with augmented action space which shows good performance strengths the motivation is clear and the paper is well written and easy to follow i find the proposed method to be novel and clean the experiment results also suggest the method is effective at recovering the ground truth boundaries the connection to mdl also seems convincing the experiment on hrl and learning new tasks is also impressive weakness i would like to see discussion and comparison with compile 1 which i think is quite similar to this work since they are both based on generative models 1 httpsarxivorgabs181201483 not that i can see docsepthis paper proposes a framework to discover options from the offline data the framework is build upon the variational temporal abstractionvta method to discover semantically meaningful skills they involve an optimization object of compression they carry out experiments on 2d and 3d multitask domains the experiment results show that the proposed framework is able to discover meaningful options and reuses them for new tasks strengths 1 the proposed method is reasonable for option 
discovery 2 the analysis and introduction of the method is very clear and make it easy to follow weaknesses 1 although im not an expert in the field of option discovery from offline data the idea of penalizing option switching or reward option continuation is common is option discovery from online data eg optioncritic1 and 2 thus the novelty may be not enough 2 the choice of comparison method there are a lot of methods for offline option discovery proposed rencently can you explain the reason of choosing ddo 1 the optioncritic architecture 2 a compressioninspired framework for macro discovery the authors have adequately addressed the limitations and potential negative societal impact of their work docsepthis paper presents a regularization method for learning skills from data the authors are inspired by the minimum description length and combine the maximum likelihood objective with the compressing term to improve the effectiveness of learned skills the resulting approach love demonstrates the fasterconverging speed and semantically important results compared to prior work furthermore love can scale to tasks with image observations after reading other reviewers comments which are more positive than i thought and reading other rebuttals i decided to increase my score from five to six and then to seven strength 1 learning skills from demonstrations is an important task in hierarchical reinforcement learning 2 the proposed compression metric should be effective to improve the learned skills in hrl 3 the proposed visualizations in figure3 and table1 demonstrate the effectiveness of love in various perspectives 4 the code is provided weakness 1 the authors state in many places prior works do not directly learn skills from highdimensional pixel observations this is false in fact many work tries to learn skills on atari games realimage games such as the option critic architecture 4 and rapl 83 2 the proposed compression term is merely a trick beyond vta and the compression metric is already used in rapl 83 and the unsupervised skill discovery literature so the novelty is limited 3 the title is a bit misleading the title states learning options via compression however the compression is merely a regression term in the learning scheme and the major body of this approach is still vta the authors shortly describe the limitations in the conclusion part however i dont quite understand that part and i think the important limitations of this paper are not discussed refer to the above limitation part
### Summary:
|
this paper studies the problem of learning options in multitask reinforcement learning the authors note that previous works optimize an underspecified objective and they propose adding an extra term to the objective function that relates to the description lengths of skills the authors study their approach and show empirically that it scales to highdimensional problems and performs well compared to previous approaches the discovered skills can also be used to solve new tasks using fewer samples than previous approaches the initial reviews were overall very positive for this paper during the authorreviewer discussion the authors provide additional results and provided satisfying answers to most reviewer comments as a result the reviewers are unanimous that this work should be accepted and the discussion period did not reveal any other elements to report here i am pleased to recommend acceptance congratulations it seems like the current version of your manuscript already addresses most if not all of the points raised by reviewers in addition please do not forget to discuss the limitations of your current work including eg different domains that it might not work in see the comment from reviewer crht
|
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, …, 1531, 384, 209 ] (input_ids: 1318 token IDs) |
[ 1, 1, 1, …, 1 ] (attention_mask: 1318 entries, all 1) |
[ labels column: token-ID list omitted for readability ] |
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the authors shed light on linear gcns models and propose a new design which aim at generalizing gcns to continuous and and a linear propagation model inspired by newtons law it is based on the hypothesis that features propagation across nodes in a given graph follows the same process to do so the authors establish a link with heat kernels and formulate the problem as heat kernel learning within linear gcn model so that the network at the feature propagation step takes into consideration multihop neighboring systems to refine the features of a given central node in this paper the authors find out that features propagation based on heat kernel allows to control the oscillation between low and high frequencies controlling the appropriate level of granularities is quite a challenging task in deep learning mainly as the convolutional filters are biased toward low frequencies however important invariants and informative information for classification are within the chaos of high frequencies in order to control that the authors explain that gcn based heat kernels can act as a lowpass filter cutoff this combination of gcn and heat kernels are empirically validated considering node and graph classification tasks the settings are clear and the comparison with related works is convincing one of the strong points of the paper is its capacity to provide comparable results stateoftheart with a reasonable complexity in space with the advantage of being more simple and interpretable compared to existing related methods moreover a theoretical analysis from a spectral standpoint is introduced clearly it consists at setting link between linear gcn and heat kernels as well as with finite difference methods however its not clear how the proposed gcn tackles the problem of oversmoothness and graph isomorphism they are among the most challenging problems in graph learning from that l derive two questions 1 to what extent gcn based on heat kernel formulation is able to mitigate oversmoothness 2 is the proposed heat kernel function injective so that it is able to distinguish two nonisomorphic graphs one possible weakness of the proposed design is that it can be applicable only on graphs with fixed topology and size as it is the case for all spectral gcn filters are not transferable across graphs since they are basis dependent however in real world problems graphs are irregular the overall approach is original well placed in the litterature and the paper is well written the authors conduct both theoretical and practical studies to show that this research direction could be important to improve existing gcn models for that reason l propose to accept the paper docsepthis paper studies semisupervised node classification in graph data one powerful approach to the task is graph convolutional networks which use discrete layers to perform information propagation the paper generalizes gcns into a continuous model via heat kernel where the proposed model uses continuous layers for information propagation the authors conduct both theoretical and empirical analysis of the proposed model experiments on several standard datasets show promising results overall the paper studies an important problem in graph machine learning and proposes a principled approach which combines graph neural networks with heat kernels and gives a new way of analyzing existing graph neural networks however the paper also has several weaknesses 1 the novelty is limited although the idea of developing continuous propagation layers is interesting the idea 
has been explored by many recent works for example 123 use a graph neural network to define an ode which leads to a continuous feature propagation layer for node classification 4 uses a linear ode for feature propagation which is very similar to the method proposed here and the only difference is that the ode in 4 also incorporates some constant term besides lx the authors should explain and clarify the difference between this work and existing works 2 the results are worse than sota methods in experiments the authors conduct experiments in many standard datasets eg cora citeseer pubmed and the proposed method shows promising results however the compared methods used in experiments are not competitive enough the strongest baseline methods in table 3 are gcn and gat and in table 4 they are graphsage and gcn all these methods are proposed before 2017 and recently there are many more competitive methods proposed to make the results more convincing it is helpful to compare against some recent graph neural networks for node classification besides i also have some questions regarding the model detail 1 about t in equation 6 in equation 6 the analytical form of xt is given by xteltx where t can have a high impact on the results if t is very small then elt becomes an identity matrix and hence ht will be very close to x if t is very large then elt becomes an matrix whose elements are all close to 0 and thus all the rows in ht will be almost the same yielding an oversmoothing problem in practice what would be a proper value of t moreover if we look at figure 6 even when t is very large eg t20 the accuracy is still very high especially on cora which indicates that the model does not suffer from oversmoothing in practice but if we check the analytical form xteltx when t is large all the rows in ht become very similar which may lead to oversmoothing and a low accuracy i wonder how does the proposed model manage to avoid oversmoothing in practice could the authors elaborate on that 2 about feature dimensionality in the propose method the hidden matrix ht has the same size as the feature matrix x if the feature dimensionality of a dataset is very high which is quite common in practice then computing ht can entail high cost is there a way to deal with the potential problem 3 about the time complexity in section 36 the authors mention that the time complexity of data processing is oke what is k here references 1 poli michael et al graph neural ordinary differential equations arxiv preprint arxiv191107532 2019 2 deng zhiwei et al continuous graph flow for flexible density estimation arxiv preprint arxiv190802436 2019 3 zhuang juntang et al ordinary differential equations on graph networks 2019 4 xhonneux louispascal ac meng qu and jian tang continuous graph neural networks arxiv preprint arxiv191200967 2019docsepthis submission introduced a new graph convolutional operator based on heat diffusion named heat kernel gcn hkgcn first continuoustime heat diffusion on graphs is reviewed where the solution is given by the heat equation 6 then the authors showed that classical gcn can be approximated in the same formulation through discretization the proposed method hkgcn is similar to the simplified gcn sgc by wu et al the learning procedure is quite straightforward first a heat diffusion is performed on the input features for a prespecified time t then logic regression is performed on the diffused features to train the classifier the sgc model performs several times of neighborhood averaging as in vanilla gcn 
based on a polynomial spectral filter of order one while hkgcn performs heat diffusion with higherorder polynomial terms my main criticism is that applying heat diffusion on graph convolution is not new and the relationship with previous works is not clearly stated see the cited xu et al 2019a or not mentioned klicpera 2019 xu et al 2019c where similar formulations and ideas already appeared the main novelty of the proposed hkgcn therefore is on a combination of heat kernel and the sgc approach this combination is not nontrivial enough and may not be significant enough to be published in iclr the hkgcn method is motivated by the oscillation problem of gcn how does the heat kernel help avoid oscillation after the introduction of the heat equation there should be some theoretical statements to provide a solution to this problem and to correspond to this motivation ideally to show how hkgcn is different with the gcn approach instead the oscillation problem is mainly solved by numerical simulation on the toy example and informal arguments as another novelty the authors revealed that gcn can be approximated using heat diffusion under the same formulation again the connection although interesting is not a major contribution empirically the authors tested the hkgcn method on commonly used citation datasets and an ogb graph of arxiv articles on both transitive and inductive learning tasks the huge speed improvement is mainly due to the same trick as sgc is used in hkgcn there is no activation between the convolution layers which allows a precomputation step followed by an extremely simplified learning step logistic regression this improvement is due to sgc and is expected by looking at the accuracy scores the main comparison is hkgcn vs sgc because of their similarities it is not convincing that using heat diffusion hkgcn instead of graph convolution sgc can bring a notable performance improvement in most of the time the improvement is quite marginal overall this technical novelty and empirical significance are limited there are not theoretical statements in this paper and i am evaluating it as an algorithmic contribution i am recommending a weak rejection more comments the title is too broad please be more specific toy example in the introduction as this toy is mentioned again in later text please explain in more detail the computation of gcn vs hkgcn as you started from gnn it is good to cite some original gnn paper gori et al 2005 scarselli et al 09 eq9 niter is hard to read in this template most of the citations should use citep instead of cite references klicpera 2019 diffusion improves graph learning klicpera et al 2019 xu et al 2019c graph wavelet neural network xu et al 2019 after rebuttal thank you for the revision and the clarifications no one has made a clear connection between gcn and the heat kernel for example in klicpera 2019 section 2 the heat kernel is discussed as a special case we dont simply combine sgc and heat kernel clearly the only difference between the proposed method in section 34 and sgc is that the authors used heat diffusion as the spectral filter matrix while sgc used a polynomial filter the kth power of the normalized adjacency matrix furthermore the other reviewers raised similar works such as graph ode which further reduces the novelty of this work docsepthis submission proposes to use heat kernel as the propagation matrix in graph convolutional networks the authors show that heat kernels can induce more smooth propagation behavior than the commonly used discrete 
propagation the submission also designs an efficient method to calculate the heat kernel using chebyshev expansion strength the proposed method is well motivated and heat kernel is relatively wellunderstood the writing is clear and easytounderstand the proposed method demonstrates improved performance on node classification datasets while inheriting the efficiency of simple graph convolution the submission collects a larger scale arxiv graph dataset which can be useful for the community will the authors release this dataset in the future this submission has done a detailed and thorough evaluation of the proposed method i appreciate the effort to analyze tildet which should be helpful for practitioners weakness sgc is known to be ineffective for graph classification tasks does the proposed model also inherits this downside table 3 lacks comparisons to more recent graph neural networks like graph markov neural networks however i dont think this is critical given the efficiency and strong performance of the proposed method on larger graph datasets updated given that the reviewers have not reached a consensus i want to add more discussion to my review to facilitate the ac to make the decision i would also include a few more quick todos for the authors and hope they can help add evidence to my argument 1 this submission proposes to replace the propagation in gnns with heat kernel the main motivation for this method is that the laplacian filters tend to oscillate as illustrated by figure 1 the heatkernel provides a continuous convergence process and intuitively may address the oscillation process importantly i believe the motivation of this method is not to prevent oversmoothing but to prevent overoscillation i believe this intuition is sound but encourage the authors to do more to validate this hypothesis i appreciate the ablation study in figure 6 as one more analysis i suggest the authors to add the performance curves of sgc to figure 6 under the same setting if this theoretical intuition is valid we should expect hkgcn and sgc to behave more differently when the propagation degree is low within 210 my understanding is that both hkgcn and sgc are efficient so this experiment shouldnt take long ideally the authors can update this result at least on 12 datasets within the discussion period 2 i am also very impressed by the efficiency of hkgcn this submission has experimented with to my knowledge the largest publicly available graph dataset arxiv which contains more than one million nodes according to the authors hkgcn can be trained for this arxiv dataset in 485s which is impressive note that here the heat diffusion matrix does not need to be computedstored explicitly following the setup in sgc hkgcn only needs to compute the propagated features in a preprocessing step i suggest the authors to give more concrete numbers to illustrate the efficiency of this method for this arxiv dataset what kind of hardware is required how much actual ram did you use to compute the preprocessing step 3 after reading other reviews i now realize that this submission is not the first to introduce heat kernels into gcns among the papers pointed out by other reviewers i find 1 to be most relevant in that they also proposed the usage of heat kernels can the authors also clarify the difference between this submission and 2 based on this novelty concern i have lowered my review score to 6 1 xu et al graph wavelet neural network iclr 2019 2 xu et al graph convolutional networks using heat kernel for semisupervised 
learning ijcai 2019
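a minimal sketch of the heat-kernel propagation that the reviews keep returning to, x_t = exp(-tL) x, and of its small-t versus large-t behaviour; this is not code from the reviewed submission, and the function names and the toy 4-node graph are assumptions made purely for illustration

```python
# Illustrative sketch only -- not code from the reviewed submission.
# Shows heat-kernel feature propagation H(t) = exp(-t * L) @ X on a tiny graph
# and its limiting behaviour as t grows (rows converge toward a common value,
# which is the over-smoothing concern raised in the reviews).
import numpy as np
from scipy.linalg import expm

def normalized_laplacian(A):
    """Symmetric normalized Laplacian L = I - D^{-1/2} A D^{-1/2}."""
    d = A.sum(axis=1)
    d_inv_sqrt = np.where(d > 0, 1.0 / np.sqrt(d), 0.0)
    return np.eye(len(A)) - (A * d_inv_sqrt[:, None]) * d_inv_sqrt[None, :]

def heat_propagate(A, X, t):
    """Propagate node features X with the heat kernel exp(-t L)."""
    L = normalized_laplacian(A)
    return expm(-t * L) @ X  # exact matrix exponential, fine for small graphs

# tiny 4-node path graph with 2-dimensional node features
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
X = np.random.RandomState(0).randn(4, 2)

for t in (0.1, 1.0, 20.0):
    H = heat_propagate(A, X, t)
    # spread of the rows: small t keeps H close to X, large t makes rows nearly identical
    print(t, np.ptp(H, axis=0))
```

for large graphs one would not form exp(-tL) explicitly; the submission discussed above reportedly uses a chebyshev expansion for that, and the sketch only illustrates the qualitative small-t / large-t behaviour the reviewers ask about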
### Summary:
|
this paper has been evaluated by four reviewers who overall hesitated between borderline reject and accept in general as rev 4 points out this paper appears to cope with the overoscillation rather than the oversmoothing aspect of gcn modeling something worth clarifying rev 3 also rightly points out that the connection between the heat kernel and gcn was in fact established in previous works as was the connection between the sgc polynomial filter and the heat diffusion spectral filter matrix it is hard to overlook the impression that this work builds heavily on sgc therefore while ac sympathizes with the idea it is also difficult to overlook the incremental nature of the paper and the paper cannot be accepted in its current form
|
[ input_ids column: token-ID list omitted for readability ] |
[ attention_mask column: all values are 1; full list omitted for readability ] |
[ labels column: token-ID list omitted for readability ] |
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the paper frames active learning as an integer optimization problem that minimises the Wasserstein distance between the selected labelled subset and the unlabelled pool of data. This is done in a feature space, in this case trained using self-supervised methods on all the data. The method outperforms existing active learning methods for very small labelling budgets, with theoretical guarantees for the integer optimisation problem (although not for the performance in terms of model accuracy itself).

Strengths: tackles active learning from a slightly different approach; the Wasserstein distance is well justified and the bounds of the integer optimisation problems are well specified (proof not checked carefully, however); reasonable improvements over previous approaches; well written paper, with clear algorithms and pseudocode provided for ease of understanding and reproducibility; thorough evaluation in the low-data regime against other SOTA active learning methods in a number of scenarios (with SSL, the classical active learning setting, and domain adaptation); time and complexity analysis provided.

Weaknesses / potential areas of improvement: it is stated that there is improvement for high-budget settings, but no evidence is provided in the text for that; in fact, there seems to be a trend towards a drop-off in accuracy as the budget increases, and it would be nice to see that shown, even if it does occur. There is limited analysis of scaling to larger unlabelled datasets or datasets with more classes; I am skeptical that performance would be good or that the method would scale well, although again training time is often cheaper than labelling, so this is not necessarily a problem, but some experiment here would be useful.

Other questions: were the features used for greedy k-centers the same as the features used in your approach, or did you continue to use the VGG16 features that were used in Sener and Savarese?

Overall: a well written and polished paper with a novel angle on the active learning problem, with some theoretical guarantees and good experimental performance in low-budget scenarios. I would like to see some additional results, but this is a good paper.

This work tackles the problem of active learning. In every iteration, the subset of the unlabeled data pool that needs to be labeled is selected by posing the selection as an integer programming (IP) problem that minimizes the discrete Wasserstein distance. The generalized Benders decomposition algorithm is used to solve this IP through relaxations; it is shown to converge, and some acceleration techniques are also provided. All of the above is supported through extensive empirical analysis.

Strengths: the paper is well written and describes the problem in detail. I believe using the Wasserstein distance to bound the core-set loss is a unique and interesting way of dealing with the active learning problem, and the approach is also backed by strong empirical analysis. I find this a very strong paper. I did not get to check all the proofs, but I have checked some of them and they seem correct.

Weaknesses: I don't find any weaknesses in the paper. This seems like very relevant work and a strong accept in my view. Overall, a strong accept for tackling an important problem.

This paper proposes a batch-mode active learning method that chooses a subset of data points to be labeled by approximating the whole dataset in terms of Wasserstein distance, which is an upper bound of the generalization error. The selection is formulated as a large-scale mixed integer program, and the authors propose to solve it by GBD; for acceleration, some additional constraints are proposed. Experimental results show that the proposed method is better than or competitive with baseline methods like k-center, k-medoids, and WAAL.

Strong points: the paper discusses why the Wasserstein distance is minimized (theorem 1); the experiments show the effectiveness of the proposed method in the downstream task (active learning); the presentation is clear.

Weak points: the effect of the embedding method is not investigated. SimCLR is used for obtaining the encoding, but I am curious what happens if other embedding methods are used; intuitively the embedding is very important, because the proposed method assumes that nearby points in the embedding space have the same label. In the second paragraph of section 5 the authors say "our approach is also effective without self-supervised features"; what is the meaning of this sentence? Scalability of the proposed method seems to be low: the experiments include relatively small datasets. Does the proposed method scale even for datasets including millions of data points, or larger budget values? For example, the table includes larger budgets b, with results from 1000 to 6000; how large is the computation cost? This concern comes mainly from real application scenarios: ML projects usually have a budget to label more than 1000 data points and a much larger unlabeled data pool, for which we need to compute the Wasserstein distance.

Questions: the inequality in p. 2 seems to be an equality; in what sense do you argue that the RHS upper-bounds the generalization error? How do we solve WRMP; do we use an ILP solver? This paper deals with batch-mode active learning; is it possible to extend the proposed method to classical sequential active learning?

This paper is well written and motivated, with a theoretical guarantee. Before seeing the other reviews I am positive about accepting the paper, although I have some concerns (e.g. scalability); please answer and clarify my concerns described above.

This paper studies active learning from a mixed integer programming perspective. The active learning strategy is representation-based and aims at selecting a core set that minimizes the Wasserstein distance. Various tricks (e.g. enhanced optimality cuts and pruning) from the integer programming literature are used to accelerate the algorithm. Empirical results show advantages of the proposed algorithm over existing ones.

The paper is well written and easy to follow. The authors provide an integer programming based approach for representation-based active learning, which shows empirical advantages over existing ones. I summarize my questions as follows. 1) In section 3.2 the authors compare the size of the original optimization problem in eq. 4 and the relaxed one; in the relaxed problem, however, one needs to compute the Wasserstein distance at each iteration. What is the computational complexity of computing the mentioned Wasserstein distance? It would be great to include the explicit complexity rather than just saying that efficient algorithms exist. 2) The authors mention that the proposed method usually takes longer to make selections; I wonder how well the proposed method performs when it is given the same running time as other baselines (e.g. 3 minutes for both k-medoids and the proposed method). I think overall the approach makes sense, but I didn't find significantly novel contributions, either theoretically or empirically; maybe the authors could point out their novel contributions. As a result, I'll vote for a weak acceptance.
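To make the objective these reviews keep referring to concrete, here is a minimal sketch of Wasserstein core-set selection in feature space. It is not the authors' implementation: the submission under review solves the selection as a mixed-integer program with generalized Benders decomposition, whereas this sketch uses a naive greedy loop purely to illustrate the quantity being minimized. It assumes the POT library (`ot`) and a matrix `Z` of self-supervised features; both the greedy heuristic and the synthetic data are illustrative stand-ins.

```python
# Illustrative sketch only: the paper discussed above solves this selection as a
# mixed-integer program (via generalized Benders decomposition); a naive greedy
# loop is used here purely to make the Wasserstein core-set objective concrete.
# Assumes the POT library (`pip install pot`) and self-supervised features `Z`.
import numpy as np
import ot  # Python Optimal Transport


def wasserstein_to_pool(Z, subset_idx):
    """W-distance between the empirical distribution of a candidate labelled
    subset and the full unlabelled pool, measured in feature space."""
    S = Z[subset_idx]
    M = ot.dist(S, Z, metric="euclidean")      # ground-cost matrix
    a = np.full(len(S), 1.0 / len(S))          # uniform weights on the subset
    b = np.full(len(Z), 1.0 / len(Z))          # uniform weights on the pool
    return ot.emd2(a, b, M)                    # exact optimal-transport cost


def greedy_select(Z, budget):
    """Pick `budget` points whose empirical distribution best matches the pool."""
    selected = []
    for _ in range(budget):
        remaining = [i for i in range(len(Z)) if i not in selected]
        scores = [wasserstein_to_pool(Z, selected + [i]) for i in remaining]
        selected.append(remaining[int(np.argmin(scores))])
    return selected


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    Z = rng.normal(size=(200, 16))   # stand-in for self-supervised features
    print(greedy_select(Z, budget=5))
```

The sketch only shows what `wasserstein_to_pool` measures; the actual selection in the paper replaces the greedy loop with an exact integer formulation, which is where the optimality guarantees the reviewers mention come from.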
### Summary:
|
this is an interesting submission which was overall well received by the reviewers. I would recommend that the authors discuss further the vast modern literature on efficient computation of Wasserstein distances and their minimization; see e.g. Peyré and Cuturi (2019) and references therein.
|
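On the meta-review's pointer to the optimal-transport literature, the snippet below is a hedged, generic example of the entropic-regularised (Sinkhorn) approximation surveyed by Peyré and Cuturi; it is standard POT usage, not code from the submission, and the feature clouds `Za` and `Zb` are synthetic stand-ins.

```python
# Minimal sketch (not from the paper): Sinkhorn iterations trade the exact LP
# for a much cheaper approximation, which matters when the unlabelled pool is
# large. Assumes the POT library; `Za` / `Zb` are the two feature clouds compared.
import numpy as np
import ot


def sinkhorn_distance(Za, Zb, reg=0.05):
    M = ot.dist(Za, Zb, metric="euclidean")
    M /= M.max()                               # rescale costs for numerical stability
    a = np.full(len(Za), 1.0 / len(Za))
    b = np.full(len(Zb), 1.0 / len(Zb))
    return ot.sinkhorn2(a, b, M, reg)          # entropic-regularised OT cost


rng = np.random.default_rng(1)
print(sinkhorn_distance(rng.normal(size=(500, 16)), rng.normal(size=(2000, 16))))
```

Whether such an approximation is acceptable inside the selection loop is essentially the computational-complexity question raised by the last reviewer above.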
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper presents an inpainting-based method for anomaly localization on images. At training time, a conditional GAN-based generative modeling approach is adopted; at test time, a mask matrix is adaptively estimated by thresholding the structural similarity index measure (SSIM) between the original images and the reconstructed images. The idea is very intuitive, and experiments demonstrate improved performance, especially on textures, over two recent baseline methods.

Strong points: this paper improves the test time in unsupervised anomaly detection with an iteration scheme; although similar ideas of iteratively refining the low-error regions while leaving alone the high-error regions have been proposed before, the residual-thresholding scheme proposed in this paper looks even simpler. Another novelty is at training time: the proposed approach uses a conditional and GAN-based approach, in contrast with autoencoder-based reconstruction approaches, which may lead to blurry image reconstruction.

Weak points: the experimental evaluation is not quite convincing. Some of these numbers look worse than what is reported by Bergmann et al. 2018 and Dehaene et al. 2020; are any settings different when reproducing the baseline results? The qualitative comparison with Dehaene et al. 2020 is missing in figure 3. Figure 4 clearly shows the AUC improves with iterations; would this SSIM-threshold scheme also benefit other approaches, e.g. a conditional autoencoder? Are these performance improvements over baselines mainly attributed to the choices of free-form random mask and GAN, or to the proposed SSIM-threshold scheme? Would it be better than other iterative schemes, such as iterative projection with the same trained model? More ablation studies would be helpful.

Other points to clarify: I guess the table 1 AUROC is for pixel-wise segmentation rather than image classification, right? How does the checkerboard mask initialize the test (applying different masks to 4 copies of images and then merging)? What is the stopping rule of the iteration?

Overall, I think the idea in this paper is very interesting; I still have some reservations about the experimental evaluation, as explained above, against me voting for acceptance.

This approach relies on learning merely from normal training samples and then distinguishing between normal and abnormal events with respect to a threshold on the reconstruction error. I3AD exploits a combination of a per-pixel identity function and a conditional autoencoder, which is capable of encoding only normal regions and decoding just anomalous regions.

Weaknesses: although this paper has attempted to propose an efficient algorithm for anomaly detection, the general idea of this paper is not sufficiently innovative. The most important works on the topic are not cited. The proposed method is not comprehensively compared with the other solutions. A comprehensive analysis of the complexity of the proposed method is required; it seems the proposed method achieves its state-of-the-art performance at the expense of complexity.

Strengths: this paper is well written, so the structure is easy to follow. The majority of the obtained results are better than the state of the art, as reported in this paper. The proposed method has tried to tackle two noticeable issues of using autoencoders for the task of anomaly detection, including their defect of inappropriately modeling small detail, as well as the existing object mismatching given that the models are trained to minimize total reconstruction errors.

This work proposed a novel learning strategy for unsupervised anomaly detection. In particular, the authors propose to use an iterative mask generation process based on image inpainting and reduction of a structural similarity metric (SSIM) between the input image and its reconstructed version. For evaluation purposes the authors resort to the public MVTec benchmark, showing better results than the baselines. Please find my comments below.

Strengths: the paper is generally well written and easy to follow; results show that the proposed method outperforms the baselines.

Weaknesses: the methodological contribution is marginal/incremental. Similarly to several methods in weakly supervised segmentation (see [1] for example), the authors use iterative steps to refine the initial segmentation mask; the only difference is that, instead of mining regions based on classification activation maps and classification scores, the authors use a structural similarity pixel-wise metric. Beyond that, there is nothing novel in the proposed methodology. Some ideas/motivations are unclear; for example, I don't really understand why the mask initialization is needed at test time. Further, the authors also mention in appendix D that, to cover whole images, they are split into x-by-x patches and then aggregated into four masks; please provide more details, since this is unclear. The literature review is poorly conducted, with many relevant recent papers missing; furthermore, the literature comes late in the paper. In addition to the previous paper in weakly supervised segmentation, the works in [2-6] are recent works in anomaly detection, just to name a few; the authors should include all these papers, discuss their limitations, and show how the proposed work can overcome these drawbacks. The authors mention that autoencoders produce blurry images, which is true; nevertheless, works including an adversarial discriminator have somehow addressed the issue of blurred reconstructed images. Given all this, the authors should better motivate their work. Among the previously missing papers there is the work in [2], which also employs an inpainting strategy coupled with an adversarial model; what are the differences with respect to this work? Experiments need to be significantly extended. First, the authors merely include two methods in their evaluation, while there exist more than those used in the comparisons; this is particularly important since some recent methods which have been omitted in the literature review made by the authors (e.g. [4]) significantly outperform the proposed method (0.863 vs 0.90 in AUC). Second, the authors report results in terms of AUC, while I strongly suggest that they report the accuracy for individual classes and the AUC as the average over the classes; the reason for this is to better compare to related work (see for example table 6 in [4]).

References:
[1] Wang et al., Weakly-supervised semantic segmentation by iteratively mining common object features, CVPR 2018.
[2] Sabokrou et al., AVID: adversarial visual irregularity detection, ACCV 2018.
[3] Perera et al., OCGAN: one-class novelty detection using GANs with constrained latent representations, CVPR 2019.
[4] Venkataramanan et al., Attention guided anomaly localization in images, ECCV 2020.
[5] Deecke et al., Image anomaly detection with generative adversarial networks, in Joint European Conference on Machine Learning and Knowledge Discovery in Databases, 2018.
[6] Li et al., Exploring deep anomaly detection methods based on capsule net, in Canadian Conference on Artificial Intelligence, 2020.

Minor: the paper by David Dehaene, Oriel Frigo, Sebastien Combrexelle and Pierre Eline, "Iterative energy-based projection on a normal data manifold for anomaly localization" (arXiv:2002.03734), is an ICML 2020 published paper.

This paper presents a method for contrastive anomaly detection (AD) using an iterative masked conditional autoencoder inpainting approach. An autoencoder network is trained using an adversarial approach to reconstruct a randomly masked part of the input image. At test time, a mask is derived from the generated anomaly map using the SSIM index between the input and the reconstructed image and used to mask the input, so that the process can be repeated n times. The method is shown to produce SOTA results on the MVTec AD benchmark.

Pros: the method is novel and interesting; the iterative masked approach does provide a principled way to increase the signal-to-noise ratio in the reconstructed images.

Cons: the numbers given in table 1 for the baseline SOTA methods are not from the literature. The authors reproduced the numbers, but they appear to be much lower than those published; for example, the published values for the average AUC over 15 classes in Bergmann 2019 are L2: 0.82 / SSIM: 0.86, while in table 1 these values are L2: 0.76 / SSIM: 0.76. This is a substantial difference which is not explained at all; it makes the proposed method look much better than SOTA, while in reality it appears to have similar performance. The number of iterations is not discussed; this is an important hyperparameter, as it affects the overall speed of the approach. Only fig. 4 shows the progression of the AUC during iterations, but no discussion is provided; also, the actual number of iterations used to generate the table 1 AUC scores is not given. Assuming the complexity is higher than other SOTA methods due to the iterative aspect, and given that the performance is similar to SOTA (see the above point), this approach now looks a lot less convincing. The paper is not well written and suffers from grammatically wrong and overly complicated sentences, to the point where it becomes distracting and confusing.

Overall, this is an interesting and novel approach to contrastive anomaly detection; however, I think the paper is not ready for publication. The baseline SOTA numbers need to be explained or changed to reflect previously published numbers, the computational complexity of the approach needs to be compared to these baselines, and finally the paper needs to be seriously proofread.
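A schematic of the test-time loop the reviews describe: inpaint the masked image, compare it to the original with per-pixel SSIM, threshold the SSIM map to form the next mask, and repeat. The `inpaint_model` below is a hypothetical stand-in for the trained conditional generator, and the threshold, iteration count and grayscale assumption are illustrative choices, not the papers' settings.

```python
# Sketch of the iterative SSIM-threshold masking loop described in the reviews.
# `inpaint_model` is a hypothetical stand-in for the trained conditional
# GAN / autoencoder generator; this is not the authors' implementation.
import numpy as np
from skimage.metrics import structural_similarity


def iterative_anomaly_map(image, inpaint_model, n_iters=4, thresh=0.5):
    """image: float32 grayscale array in [0, 1]; returns a per-pixel anomaly map."""
    mask = np.zeros_like(image, dtype=bool)        # start with nothing masked out
    ssim_map = np.ones_like(image)
    for _ in range(n_iters):
        reconstruction = inpaint_model(image, mask)
        _, ssim_map = structural_similarity(
            image, reconstruction, full=True, data_range=1.0
        )
        # low structural similarity -> likely anomalous -> mask it out next round
        mask = ssim_map < thresh
    return 1.0 - ssim_map                          # high values mark anomalies


if __name__ == "__main__":
    identity_inpainter = lambda img, mask: img     # trivial stand-in for the generator
    demo = np.random.default_rng(0).random((64, 64)).astype(np.float32)
    print(iterative_anomaly_map(demo, identity_inpainter).shape)
```

Note that the fixed `n_iters` used here sidesteps the stopping-rule and iteration-count questions that two of the reviewers raise about the actual method.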
### Summary:
|
the initial reviews were a bit split: R4 was slightly positive, R3 was slightly negative, and both R1 and R2 voted for rejection. The main issues were the lack of proper comparisons with the SOTA methods and missing references. In the rebuttal the authors added additional experiments as requested, but R1 and R2 were not convinced by the new results; in particular, R1 pointed out that even the unsupervised setup in [4] achieved 0.89 AUC, outperforming the 0.86 reported in the paper. The AC agrees with R1 and R2 that the paper cannot be published until more thorough comparisons are conducted.
|
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper presents a model for unsupervised pretraining for the task of question generation the model first predicts the number of answer present in a given paragraph and then selects the topk answer spans from the paragraph after selecting the answer spans the model then tries to generate the answer containing sentence by inputting the paragraphs less the answercontaining sentence and also the answer span the key idea is that this unsupervised pretraining strategy is close to the actual task of generating question given the context and the answer this is also the key differentiator between this work and other existing pretraining strategies for question generation eg alberti et al 2019 the paper outperforms other existing methods of question generation on several datasets two splits of squad ms marco newsqa both on automatic and a small scale human evaluation efficacy of the pretraining scheme is shown via the fact that the pretraining scheme also improves other question generation model lastly training a qa model on the synthetic generated questions improves downstream performance of a qa model and the difference is greater in lowdata setting suggesting the applicability of this pretraining scheme in lowdata regime although no experiments in a specific lowdata domain is reported strengths the pretraining scheme is closer to the original task than other existing methods and can be easily scaled the paper is well written and easy to follow and the experiments and ablations were exhaustive improvements in lowdata qa setting is promising and shows the scope of usefulness of this work weaknesses i am not sure about the need to predict the number of unique answers in a paragraph or that if it makes sense to do so i think it is hard to estimate the number of questions that can be generated from a given text and regressing to a value which is present in current datasets might be suboptimal this is because the current qa datasets do not ask annotators to generate all possible question from a paragraph instead i believe you could have taken a principled approach of considering namedentities as answers and trying to generate questions for each entity or a set of entities together this would also eliminate the need to predict the number of answer spans in a paragraph at once i think it would be useful for the paper to categorize the kinds of reasoning that the generated question requires to answer for example is it just fuzzy pattern matching kind of questions or does answering the question require any kind of reasoning such as multihop or numerical reasoning it was not clear to me how you generate questions which are unanswerable eg those in squad 20 although it is good to see that training on the synthetically generated questions help in lowdata settings i think the paper would be stronger and more convincing if an actual experiment was done on a domain which has less data for example you could try the bioasq datasets which have very less annotations and is in the biomedical domain it would be interesting to see if pretraining on a scientific corpus is actually helpful moreover some questions in bioasq need reasoning such as handling list questions counting and it would be interesting to see if the performance on those questions improve overall i think the current experiments are reasonably welldone and i think the paper would be much stronger if it was tested on an actual domain which has lowdata and also if the paper discusses categorizes the kind of reasoning that the generated questions 
require docsepthis paper proposes a pretraining technique for questiongeneration models the pretraining involves generating the sentence which contains an answer candidate from the context around it through several wellexecuted experiments the paper shows that this type of pretraining can improve performance of several existing question generation models and the resulting synthetic questions generated help in augmenting reading comprehension datasets strengths this is solid empirical work with several detailed experiments showing the utility of the pretraining method on multiple benchmarks and with multiple models it was particularly nice to see evaluation on korean as well as english the authors clearly put effort in explaining all the methods and experiments precisely and with details weaknesses the technical contributions of the paper are rather limited pretraining has been of much interest in the nlp community recently and this paper follows a line of related works which have proposed similar techniques eg inverse cloze task lee et al acl 2019 t5 raffel et al jmlr 2020 there isnt much discussion of these existing papers either it is not clear how much of the benefit of pretraining comes from the specific approach used here versus the fact that there is some pretraining on the decoder which generates the questions we could learn more about this if there was a comparison to other pretrained models which have a decoder eg t5 bart lewis et al 2020 in terms of the question generation task it is well known now that squad questions have a bias towards high lexical overlap with the passage due to the manner in which they were constructed see lee et al above this raises the question whether the approach in this paper can generalize to datasets where this bias does not exist eg natural questions this seems to be a limitation of not just paper but the prior works as well other comments questions where do the ground truth answers come from during pretraining is the answer prediction model also pretrained in section 21 it says that the mse loss is computed using k but k comes from the floor function which does not support backpropagation do you use the output fk instead to compute the mse in the same section i assume a softmax is applied on si eij before taking the crossentropy loss in this case is eij normalized over all spans in the passage or only the ones which start at i is wot an embedding or a probability there seems to be confusion at the end of section 2 some more discussion of the unilm baseline would be good for people not familiar with that work missing discussion of related work yang zhilin et al semisupervised qa with generative domainadaptive nets acl 2017 how does the model generate unanswerable questions for squad 20docsepthe submission introduces a new approach for pretraining models for question generation conditioned on an answer and a document given a document a set of k answers is chosen by a classifier then the sentence containing the answer is removed and a sequence to sequence model is used to reconstruct it this pretraining task more closely matches the desired end task results show that this method improves question generation performance compared to previous work and that the synthetic questions can be used to improve question answering models overall i think the approach makes a lot of sense however i think it needs to compare with stronger pretraining baselines and needs a more thorough comparison of the effect on question answering performance the paper primarily 
evaluates on question generation for a range of datasets and shows improvements on both automated and human evaluation metrics i think its fair to say that the question generation is quite a niche task perhaps because of limited applications indeed the abstract of the paper pitches the task as being primarily useful for generating synthetic data for question answering thats fine but given this motivation i think the evaluation should focus much more on the downstream impact on question answering id really like to see if the method still improves question generation performance on top of more recent sequence to sequence models such as t5 or bart which have both been available for almost a year as far as i know these both significantly outperform unilm on all the comparisons ive seen the pretraining objectives used there which involve predicting masked spans seem closely related to the proposed method this experiment would help understand whether the proposed method adds anything on top of more general pretraining approaches the paper convincingly demonstrates that synthetic questions improve a baseline bert question answering model which was already shown by eg alberti et al 2019 however i dont think the paper does much to suggest that the synthetic questions from the proposed method are better for qa than other approaches for example alberti et al appear to report similar numbers with their method its important to include a baseline set of synthetic questions as its not at all clear that improvements on question generation metrics will correlate with usefulness for qa training again the question answering results would be more convincing if they hold up with more recent models than bert which achieve better scores without synthetic questions table 11 does give a result using electra but gives the baseline electra model an exact match score of 874 the electra paper instead lists 880 slightly higher than the results using synthetic questionsdocsep summary this work presents a new multistep method to pretrain a question generation system which can then be used to create synthetic data to improve a machine reading comprehension system first the authors train a system to identify spans in a text paragraphs which would constitute likely answers for questions about the paragraph then they pretrain a system to generate those questions by taking the selected answer and surrounding sentences as input and generating the sentence which contained the answer finally they use real mrc data to finetune the question generation system the authors first provide a direct evaluation of their method by showing a consistent improvement in referencebased metrics comparing to gold questions for a given paragraph answer then shows that the generated synthetic data also leads to improvement on the downstream mrc task for the squad dataset especially when using less annotated data the results provided are encouraging and accompanied by a number of enlightening supporting experiments the paper could still be improved by clarifying specific points clarity while the experimental setting description gives a broad idea of what the authors did some details are missing for full reproducibility most notably there is very little information on the first step setup training the answer selection system the authors also describe their setting based on bert but not the one with unilm what is the dimension of the indicator vector that tells the model where the answer is located is there a particular reason why the authors decided 
to use an indicator vector rather than eg use a sep token dynamic prediction of the number of answers in a paragraph doesnt seem to account for much improvement can the authors provide some measure of statistical significance otherwise the method really is novel enough without that part did the model evaluate the effect of using synthetic data on the other tasks correctness the claims are mostly well supported by the experiments to the exception of the dynamic k prediction additional citations the pretraining sentence prediction method is related to latent retrieval for weakly supervised open domain question answering lee et al acl 2019
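For concreteness, a minimal sketch of the sentence-reconstruction pretraining pairs described in the first review above, assuming the answer spans are already given (the paper predicts them with a separate model); the function names and the naive sentence splitter here are illustrative, not taken from the paper:

```python
# Minimal sketch: build (context + answer) -> answer-containing-sentence pairs
# for pretraining a question-generation model. All names are illustrative only.
import re

def split_sentences(paragraph):
    # naive splitter; a real pipeline would use spaCy/nltk and character offsets
    return [s.strip() for s in re.split(r"(?<=[.!?])\s+", paragraph) if s.strip()]

def make_pretraining_pairs(paragraph, answer_spans):
    """For each selected answer span, the target is the sentence containing it;
    the source is the rest of the paragraph plus the answer span."""
    pairs = []
    sentences = split_sentences(paragraph)
    for answer in answer_spans:
        for i, sent in enumerate(sentences):
            if answer in sent:
                context = " ".join(sentences[:i] + sentences[i + 1:])
                source = f"answer: {answer} context: {context}"
                pairs.append((source, sent))
                break
    return pairs

# usage
paragraph = ("The Amazon is the largest rainforest on Earth. "
             "It spans nine countries. Most of it lies in Brazil.")
print(make_pretraining_pairs(paragraph, ["nine countries", "Brazil"]))
```

Training a sequence-to-sequence model on such (source, target) pairs is what the unsupervised pretraining stage amounts to; the actual construction in the paper additionally predicts how many answer spans to extract per paragraph.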
### Summary:
|
all reviewers appreciate the good quality of this submission with a good idea and solid execution as said by r3 the paper is clearly written and the addition during the discussion have greatly improved it as acknowledged by all reviewers however a major weakness of the submission still needs to be addressed before a publication at iclr as said in the paper the task of question generation is a task whose main impact is to improve downstream tasks and primarily qa the evaluations follow that and extraexperiments eg bioasq and discussion wrt state of the art eg alberti et al reinforce them yet as pointed out by r1 r4 the effect on downstream qa performance is only shown for weaker models than the current state of the art eg t5 bart since the rebuttal period was not long enough to run these experiments it is impossible to assess how the proposed approach compare to them with the current draft adding the experiments on t5 small is a step in the right direction but it is not enough for that without those experiments one can not conclude that this pretraining strategy would also help over the strongest existing pretrained model the authors should run those experiments to make the arguments presented in the submission much stronger
|
[ …input_ids: token-ID encoding of the example above (entries omitted)… ] |
[ …attention_mask: all-ones list (entries omitted)… ] |
[ …labels: token-ID list, apparently duplicating input_ids (entries omitted)… ] |
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
in this paper the authors construct a spiking neural network that performs metropolishastings sampling of a target posterior distribution for eg estimating external features given noisy sensory input and extend it to continuous time dynamics demonstrate its relationship to efficient balanced networks show how the network can be augmented by imposing population geometry to implement the generalized stochastic gradient mcmc and demonstrate that geometryaware networks are superior to geometrynaive networks especially with increasing correlation and posterior dimension im quite naive to the field of spiking network as samplers so i cannot really evaluate the originality significance or the technical quality of the work broadly speaking i believe the work is of interest to the computational neuroscience community since the bayesian brain is an active field of research and therefore a central question is how a biologically realistic network ie spiking network could implement such sampling dynamics in addition at a superficial level the work appears to be carefully done and cites relevant literature as well as carefully discussing its relations to previous work and its limitations based on the figures it seems that the central claim of the paper is supported that access to the natural geometry of the sampling space enables more efficient and accurate sampling there are several assumptions that seem biologically implausible to me many of which the authors touch on in the discussion and i understand certain assumptions are required for such theoretical work and cannot comment on whether they are standard for the field however on the issue of long membrane time constant nu1 it would be nice to see ifwhen the results break down with a shorter integration window similarly the fact that only one neuron is allowed to spike at a time significantly dampens my enthusiasm for the work since at some point the actual model become so far away from the very nicely stated high level motivation there are some lowlevel editing mistakes that can improve the quality of the work eg figure 4 panels a and b are labeled as b and c extensive discussion of the papers limitations and assumptions and no discussion of negative societal impact though i dont foresee any immediate impact docsepnew framework for constructing fast sampling dynamics in spiking neural networks encompasses many of the previously proposed solutions as limit cases but also allows for a unified treatment and interesting variations strengths clear biological and computational motivation interesting knowledge transfer between machine learning and computational neuroscience interesting mechanics deterministic dynamics stochastic spike generation process usually stochastic dynamics deterministic theshold weaknesses restricted to multivariate gaussian posteriors although the time varying mean makes it somewhat more unusualinteresting as a setup dependence of input the actual inference part missing in the construction for simplicity i would agree this is fine for static posteriors but seems awkward once there is a time constant in the posterior changes themselves emphasis on the practically relevant scale of sampling being unexplored is an overstatement of fact both the hennequin and savin type of dynamics can be used to extract moments of interest at time scale of hundreds of milliseconds explicitly with spiking neurons at least for the second case which many would argue is perceptually relevant enough especially given the inherent tradeoff between 
precision and time for all sampling based codes that is not to say that there is no room for alternative models of fast sampling but its not nearly as bad as the introduction would make you believe very preciseartificial architectural constraints on the solution pair of readout pools tied weights etc in the neurally relevant regime the dynamics are not guaranteed to have the target posterior statistics unclear why the series of approximatons should be expected to be constrained rather than lead to accumulation of errors and big deviations from the target minor strictly speaking it is not the dimensionality of the latent space per se but rather posterior entropy that limits sampling speed although admittedly the two tend to go together in simple cases this is semantics i would say that typically the goal of bayesian perception is stated as computing a posterior over latent variables not parameters potentially relevant refs radford neal tech report on speeding up sampling by dropping the detailed balance requirement and the old lars buesing sampling paper on a biological realization of that idea refractory period of sampling no issues docsepthe present paper studies sampling in spiking network models and tries to unify the langevin sampling in the efficient balanced networks and metropolishastings sampling in networks with probabilistic spike rules and it also studies how the geometric structure in the neural code speeds up the sampling the strength of this paper is its unifying nature and its proposal of a probabilistic spike rule implementing metropolishastings sampling is novel the present study did some theoretical derivations in linking the sampling dynamics with neural dynamics but some of the presentations is not very clear major i have a couple of major concerns about this present paper a strong requirement of neural sampling is that the neural circuit dynamics with fixed parameters is able to sample posteriors with different uncertainties it is not clear whether this can be achieved in the current framework the proposed network model seems not able to achieve this if i understood correctly in that the network parameters in eq 12 depend on the mean and the variance explained by the text in line 160 this implies that to sample distribution with different parameters we need to adjust the network connections as mentioned in eq 14 that the d matrix only modifies the temporal dynamics but leaves the sampling distribution unchanged that means sampling in the euclidean space and the natural space including the inverse fisher information matrix line 195 should not change the sampling distribution however from fig 2 and fig 3 the two sampling trajectories blue vs red differ a lot and have different variances i am wondering whether the sampling is correct or not writing the writing in sec 21 is not clear and i feel difficulty sometimes in following the flow a lot of questions about the presentation remain it is not clear why the problem from discrete and signconstrained spikes can be solved by imposing a finetuned balancing condition on the readout weights lines 9398 more explanations are needed also it is not clear the motivation for choosing a neuron uniformly at random line 104 in eq 1 should i interpret the estimate hatthetat as a sample at time t is the uncertainty of the sampling distribution represented by the fluctuation of hatthetat over time it is better to explicitly mention this in the beginning eqs 3 and 4 the argument in the distribution in the denominator of eq 4 ie 1etagamma 
rt1 is not consistent with hatthetat1 in the denominator of eq 3 it is unclear how eq 6 and 10 are related to the undefined generative model which was mentioned in line 71 is there an implicit assumption of the linear gaussian generative model because the terms in eq 10 are the derivative of quadratic terms moreover it is not clear how psi is defined at this moment until later i read the text in line 160 also in eq 6 the input to the neural dynamics is not the observation x defined in the generative model line 71 but the latent variable theta directly more explanations are needed here line 195 it is not clear what the matrix g is until i read line 232 please define symbols when they first appear the limitation of the current framework is the analogdigital conversion dac in sampling a continuous distribution in discrete spiking networks as mentioned in line 238 this makes people feel that the spiking network is not designed by implementing sampling although the error of dac can be reduced by considering population geometry or in large number limit similar feelings were mentioned by the authors in the paper i hope to see more discussion of alternative solutions to the dac error one possible solution is utilizing the math techniques in probabilistic population code to sample continuous distribution in discrete spikes where the representation of a continuous distribution in spiking networks comes from the continuously smooth tuning curves of neurons docsepin this work the authors demonstrate the possibility of implementing sampling algorithms through spiking neural networks they provide details on how to realize the metropolishastings sampler using the probabilistic spike rules overall the work is interesting and extends the potential application of spiking neural networks strengths 1 the application is relatively new for snn 2 the writing is clear weakness 1 the motivation for applying snn for sampling is not convincing enough 2 the comparison to traditional sampling is not enough na
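For reference, the sampler that the reviewed probabilistic spike rule is said to implement is standard random-walk Metropolis-Hastings; the following is a minimal NumPy sketch on a correlated 2D Gaussian target, a generic illustration rather than the paper's spiking implementation, with all names chosen here for the example:

```python
# Minimal sketch of random-walk Metropolis-Hastings on a correlated 2D Gaussian
# target. Plain NumPy; no spiking-network machinery is modelled here.
import numpy as np

rng = np.random.default_rng(0)
mu = np.array([1.0, -0.5])
cov = np.array([[1.0, 0.8], [0.8, 1.0]])        # correlated posterior
cov_inv = np.linalg.inv(cov)

def log_target(theta):
    d = theta - mu
    return -0.5 * d @ cov_inv @ d               # unnormalised log density

def mh_sample(n_steps=5000, step=0.5):
    theta = np.zeros(2)
    samples = []
    for _ in range(n_steps):
        proposal = theta + step * rng.standard_normal(2)
        log_alpha = log_target(proposal) - log_target(theta)
        if np.log(rng.uniform()) < log_alpha:    # accept/reject step
            theta = proposal
        samples.append(theta.copy())
    return np.array(samples)

samples = mh_sample()
print(samples.mean(axis=0), np.cov(samples.T))   # should approach mu and cov
```

In the reviewed construction, the role of the accept/reject draw is reportedly played by the stochastic spike decision of a single, uniformly chosen neuron, which is where the reviewers' concerns about biological plausibility and sampling correctness arise.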
### Summary:
|
although some reviewers have reservations about strong modelling assumptions the main contribution of the paper is clearly presented and technically sound
|
[ …input_ids: token-ID encoding of the example above (entries omitted; the list continues beyond this excerpt)…
1006,
800,
1566,
534,
369,
5393,
275,
1386,
11102,
310,
627,
271,
15424,
9376,
273,
253,
4872,
305,
12064,
1006,
800,
1566,
984,
253,
2426,
275,
16186,
884,
403,
253,
4309,
273,
21396,
2426,
25761,
352,
310,
417,
2590,
849,
3714,
74,
310,
2931,
387,
436,
2774,
1919,
1996,
891,
1239,
253,
2505,
275,
1386,
12036,
671,
275,
16186,
721,
253,
3280,
281,
253,
11454,
8062,
310,
417,
253,
8310,
1269,
2931,
275,
253,
1006,
800,
1566,
1386,
11102,
533,
253,
21624,
4778,
39116,
3587,
625,
22909,
403,
3058,
1060,
50275,
1282,
23627,
352,
310,
417,
2590,
752,
253,
4315,
305,
310,
1919,
891,
1239,
1386,
26972,
4496,
4853,
14217,
672,
597,
806,
3176,
50276,
783,
12291,
273,
253,
1655,
7792,
310,
253,
7370,
35399,
9436,
277,
317,
275,
10491,
247,
5415,
3268,
275,
13358,
653,
16434,
6928,
347,
5393,
275,
1386,
27518,
436,
2789,
952,
1928,
326,
253,
653,
16434,
2990,
310,
417,
4158,
407,
16994,
10491,
3738,
253,
2228,
273,
277,
317,
476,
320,
3777,
407,
7296,
3072,
12087,
390,
275,
1781,
1180,
2701,
2074,
10450,
497,
5393,
407,
253,
4477,
275,
253,
2929,
891,
3524,
281,
923,
625,
5955,
273,
5795,
5482,
281,
253,
277,
317,
2228,
581,
1896,
2900,
310,
17617,
253,
14168,
5609,
275,
37851,
3072,
2127,
281,
3410,
5415,
3268,
275,
13358,
34635,
835,
253,
6779,
273,
247,
5415,
3268,
275,
653,
16434,
6928,
3249,
432,
253,
14949,
6032,
25184,
9191,
273,
8512,
50276,
7152,
339,
9852,
436,
789,
253,
4477,
7568,
253,
6387,
273,
16994,
10491,
11333,
949,
653,
16434,
11454,
6928,
597,
2085,
4278,
327,
849,
281,
8968,
253,
1313,
18427,
763,
42118,
1775,
17407,
970,
253,
37851,
24147,
4803,
4583,
253,
789,
310,
4722,
285,
8725,
253,
2442,
2898,
273,
653,
16434,
11454,
6928,
50276,
296,
3755,
20556,
337,
253,
2898,
310,
4942,
747,
323,
3802,
79,
50276,
19,
253,
4028,
310,
2590,
50276,
20881,
1255,
337,
253,
16038,
323,
9433,
3802,
79,
323,
10491,
310,
417,
21414,
2217,
50276,
19,
253,
5301,
281,
5899,
10491,
310,
417,
2217,
50276,
2072,
50276,
187,
187,
4118,
18435,
27,
20261,
690,
30628,
452,
33196,
670,
2266,
26278,
13260,
253,
2022,
7680,
273,
253,
2929,
310,
4518,
3559,
285,
22335,
3590
] |
[
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1
] |
[
30003,
310,
1677,
2278,
273,
247,
2561,
2929,
432,
260,
2369,
1793,
6698,
15,
7764,
3630,
247,
6010,
253,
2278,
15,
187,
4118,
8439,
27,
187,
249,
436,
2929,
253,
4477,
50276,
17439,
247,
653,
16434,
11454,
2990,
326,
17923,
1313,
18427,
763,
42118,
10491,
273,
247,
2303,
12637,
3268,
323,
24088,
26230,
6024,
3386,
1677,
27620,
17872,
3280,
285,
9017,
352,
281,
5415,
673,
8062,
50276,
48387,
366,
697,
2954,
281,
5919,
16645,
6928,
50276,
9029,
849,
253,
2990,
476,
320,
31612,
407,
23254,
3072,
12087,
281,
3359,
253,
14923,
19191,
11786,
278,
3591,
68,
50276,
395,
7568,
326,
12087,
13823,
6928,
403,
8936,
281,
12087,
2072,
422,
6928,
3340,
342,
3629,
5921,
285,
12637,
7877,
50276,
303,
3240,
27785,
281,
253,
1673,
273,
653,
16434,
2990,
347,
1775,
446,
398,
594,
891,
2550,
1663,
7472,
253,
3236,
414,
8453,
390,
253,
7681,
3290,
273,
253,
789,
21450,
8288,
891,
2868,
253,
789,
310,
273,
1600,
281,
253,
15180,
6551,
21559,
3114,
1580,
253,
17699,
16561,
3998,
310,
271,
3939,
1673,
273,
2561,
285,
3103,
247,
4275,
1953,
310,
849,
247,
35605,
15958,
2990,
26332,
653,
16434,
2990,
812,
3359,
824,
10491,
8062,
275,
1635,
387,
247,
28019,
1268,
253,
789,
4620,
281,
320,
9257,
2218,
285,
28070,
4623,
6239,
347,
973,
347,
9257,
16585,
697,
2493,
281,
2045,
789,
285,
697,
7364,
1754,
327,
253,
8442,
352,
3133,
326,
253,
4275,
1750,
273,
253,
2929,
310,
4516,
326,
2289,
281,
253,
3626,
12087,
273,
253,
10491,
2317,
13276,
625,
5919,
285,
7899,
10491,
50276,
9088,
403,
2067,
13260,
326,
1646,
35605,
3898,
666,
917,
281,
479,
1142,
273,
534,
253,
4477,
5181,
327,
275,
253,
5955,
285,
891,
2096,
2176,
13260,
403,
2424,
323,
824,
10527,
789,
285,
2550,
4385,
327,
1880,
597,
403,
2629,
323,
253,
1673,
2299,
327,
253,
2523,
273,
1048,
6384,
673,
3638,
8794,
18,
352,
651,
320,
5322,
281,
923,
604,
9453,
253,
1543,
2740,
1066,
342,
247,
12217,
9554,
3497,
12014,
253,
958,
326,
760,
581,
23586,
310,
4136,
281,
24147,
387,
247,
673,
3012,
16109,
561,
619,
23027,
323,
253,
789,
1580,
387,
690,
1127,
253,
4588,
1566,
2489,
594,
2080,
1977,
432,
253,
1077,
23395,
4767,
1029,
1268,
16038,
50276,
9088,
403,
690,
1698,
5251,
14835,
16503,
326,
476,
3157,
253,
3290,
273,
253,
789,
24088,
4677,
577,
12471,
247,
285,
270,
403,
13130,
347,
270,
285,
260,
9470,
5955,
273,
253,
9380,
7364,
285,
13260,
285,
642,
5955,
273,
4016,
38058,
3486,
2167,
891,
13414,
32734,
667,
8993,
3486,
5474,
33032,
1826,
7792,
323,
26736,
3809,
10491,
8062,
275,
653,
16434,
11454,
6928,
37035,
1142,
273,
253,
3786,
4081,
5482,
347,
2701,
2219,
533,
671,
4483,
323,
247,
27998,
1971,
285,
4722,
10575,
20544,
50276,
8250,
7534,
285,
15180,
16038,
50275,
47606,
3640,
3700,
875,
5145,
4715,
285,
15180,
6551,
21559,
50276,
47606,
17823,
30027,
8062,
19191,
24147,
5978,
1232,
3798,
19191,
8062,
30027,
253,
1200,
744,
50276,
20881,
1255,
265,
50276,
44255,
281,
21471,
305,
12064,
20731,
17327,
3738,
253,
673,
11962,
1599,
2789,
352,
8489,
625,
11555,
47606,
347,
247,
9978,
50276,
39606,
273,
3280,
253,
4588,
17032,
629,
5816,
275,
253,
5140,
323,
17647,
891,
651,
5194,
436,
310,
4030,
323,
4228,
20731,
17327,
533,
3133,
19328,
2378,
627,
310,
247,
673,
3638,
275,
253,
12637,
2544,
3746,
50276,
18532,
327,
253,
18236,
4623,
4311,
273,
10491,
1146,
35021,
2149,
310,
271,
689,
25322,
273,
958,
1097,
253,
23394,
570,
16573,
285,
5745,
249,
1511,
273,
8062,
476,
320,
908,
281,
4908,
9506,
273,
1600,
387,
673,
4311,
273,
8307,
273,
38178,
11120,
342,
653,
16434,
8512,
387,
1878,
323,
253,
1273,
1083,
534,
1142,
651,
9059,
310,
591,
916,
1230,
4623,
2217,
3340,
1677,
253,
12794,
5454,
2727,
875,
12320,
285,
673,
323,
512,
10491,
1754,
11646,
326,
310,
417,
281,
1333,
326,
627,
310,
642,
2316,
323,
5795,
3210,
273,
3809,
10491,
533,
697,
417,
4829,
347,
3076,
347,
253,
10199,
651,
1056,
368,
2868,
50276,
635,
10799,
435,
11232,
27934,
10806,
327,
253,
2900,
4667,
273,
49914,
24283,
12331,
13461,
3966,
50276,
249,
253,
5723,
595,
4623,
9459,
253,
8062,
403,
417,
16293,
281,
452,
253,
2303,
12637,
9990,
12744,
2139,
253,
2962,
273,
4020,
255,
790,
943,
320,
3264,
281,
320,
20793,
2581,
685,
1421,
281,
12037,
273,
6332,
285,
1943,
21492,
432,
253,
2303,
50276,
37585,
50276,
30862,
314,
8288,
352,
310,
417,
253,
7877,
1319,
273,
253,
21624,
2317,
591,
396,
533,
2581,
12637,
15579,
326,
7787,
10491,
3885,
3738,
47421,
253,
767,
5257,
281,
564,
2366,
275,
2969,
2219,
50276,
2520,
310,
35185,
891,
651,
1333,
326,
5431,
253,
4736,
273,
17699,
16561,
13071,
310,
4767,
347,
12672,
247,
12637,
689,
21624,
4903,
417,
3602,
50276,
11714,
4303,
4623,
1275,
84,
1985,
4379,
425,
267,
13817,
1304,
327,
43088,
598,
10491,
407,
18752,
253,
7000,
6654,
8284,
285,
253,
1711,
298,
1032,
270,
955,
272,
10491,
2929,
327,
247,
7534,
22786,
273,
326,
2934,
34708,
2180,
273,
10491,
50276,
2369,
3374,
5474,
339,
431,
248,
1246,
2929,
2175,
10491,
275,
653,
16434,
2990,
3210,
285,
14177,
281,
440,
1419,
253,
298,
912,
8498,
10491,
275,
253,
5919,
16645,
6928,
285,
1313,
18427,
763,
42118,
10491,
275,
6928,
342,
37851,
24147,
4803,
285,
352,
671,
2175,
849,
253,
17856,
2605,
275,
253,
11454,
2127,
18819,
598,
253,
10491,
253,
4757,
273,
436,
2929,
310,
697,
440,
5411,
3753,
285,
697,
10419,
273,
247,
37851,
24147,
4086,
16994,
1313,
18427,
763,
42118,
10491,
310,
4460,
253,
1246,
1263,
858,
690,
10527,
3538,
569,
275,
20057,
253,
10491,
8062,
342,
11454,
8062,
533,
690,
273,
253,
27228,
310,
417,
1077,
2590,
50275,
24330,
50276,
74,
452,
247,
4564,
273,
2201,
7350,
670,
436,
1246,
2929,
50275,
66,
2266,
8284,
273,
11454,
10491,
310,
326,
253,
11454,
5049,
8062,
342,
4229,
3602,
310,
2104,
281,
3410,
20731,
17327,
342,
1027,
20418,
352,
310,
417,
2590,
1880,
436,
476,
320,
6786,
275,
253,
1655,
7792,
253,
4081,
2990,
1566,
3133,
417,
2104,
281,
5115,
436,
604,
891,
7192,
9113,
275,
326,
253,
2990,
3602,
275,
16186,
1249,
3469,
327,
253,
1599,
285,
253,
11041,
5544,
407,
253,
2505,
275,
1386,
12036,
436,
8018,
326,
281,
3410,
3268,
342,
1027,
3602,
359,
878,
281,
4575,
253,
2990,
10291,
50274,
284,
5393,
275,
16186,
1638,
326,
253,
277,
4315,
760,
771,
7790,
253,
11935,
8062,
533,
6505,
253,
10491,
3268,
19965,
326,
2097,
10491,
275,
253,
299,
26365,
2317,
285,
253,
3626,
2317,
1690,
253,
13737,
27633,
1491,
4315,
1386,
23627,
943,
417,
1818,
253,
10491,
3268,
2299,
432,
3036,
374,
285,
3036,
495,
253,
767,
10491,
24102,
4797,
4632,
2502,
9184,
247,
2257,
285,
452,
1027,
48894,
891,
717,
12371,
1880,
253,
10491,
310,
3451,
390,
417,
50275,
17695,
50276,
783,
4028,
275,
4706,
3127,
310,
417,
2590,
285,
891,
1928,
10183,
4536,
275,
1563,
253,
2685,
247,
2257,
273,
3533,
670,
253,
9759,
3464,
50275,
262,
310,
417,
2590,
2139,
253,
1895,
432,
13358,
285,
861,
48454,
34635,
476,
320,
14042,
407,
23254,
247,
1442,
292,
37437,
26259,
1617,
327,
253,
49914,
13461,
3104,
898,
26221,
625,
22909,
403,
3058,
671,
352,
310,
417,
2590,
253,
16038,
323,
13887,
247,
23586,
17568,
387,
3632,
1386,
12131,
50275,
249,
16186,
337,
943,
891,
4665,
253,
6642,
7856,
783,
41506,
347,
247,
3410,
387,
673,
246,
310,
253,
11649,
273,
253,
10491,
3268,
6607,
407,
253,
31608,
273,
7856,
783,
41506,
689,
673,
352,
310,
1805,
281,
11120,
3748,
436,
275,
253,
5068,
50275,
2574,
84,
495,
285,
577,
253,
4154,
275,
253,
3268,
275,
253,
12619,
273,
16186,
577,
26332,
337,
292,
356,
1861,
37523,
18,
310,
417,
5185,
342,
7856,
783,
41506,
18,
275,
253,
12619,
273,
16186,
495,
50275,
262,
310,
12744,
849,
16186,
721,
285,
884,
403,
2905,
281,
253,
17011,
1006,
800,
1566,
534,
369,
5393,
275,
1386,
11102,
310,
627,
271,
15424,
9376,
273,
253,
4872,
305,
12064,
1006,
800,
1566,
984,
253,
2426,
275,
16186,
884,
403,
253,
4309,
273,
21396,
2426,
25761,
352,
310,
417,
2590,
849,
3714,
74,
310,
2931,
387,
436,
2774,
1919,
1996,
891,
1239,
253,
2505,
275,
1386,
12036,
671,
275,
16186,
721,
253,
3280,
281,
253,
11454,
8062,
310,
417,
253,
8310,
1269,
2931,
275,
253,
1006,
800,
1566,
1386,
11102,
533,
253,
21624,
4778,
39116,
3587,
625,
22909,
403,
3058,
1060,
50275,
1282,
23627,
352,
310,
417,
2590,
752,
253,
4315,
305,
310,
1919,
891,
1239,
1386,
26972,
4496,
4853,
14217,
672,
597,
806,
3176,
50276,
783,
12291,
273,
253,
1655,
7792,
310,
253,
7370,
35399,
9436,
277,
317,
275,
10491,
247,
5415,
3268,
275,
13358,
653,
16434,
6928,
347,
5393,
275,
1386,
27518,
436,
2789,
952,
1928,
326,
253,
653,
16434,
2990,
310,
417,
4158,
407,
16994,
10491,
3738,
253,
2228,
273,
277,
317,
476,
320,
3777,
407,
7296,
3072,
12087,
390,
275,
1781,
1180,
2701,
2074,
10450,
497,
5393,
407,
253,
4477,
275,
253,
2929,
891,
3524,
281,
923,
625,
5955,
273,
5795,
5482,
281,
253,
277,
317,
2228,
581,
1896,
2900,
310,
17617,
253,
14168,
5609,
275,
37851,
3072,
2127,
281,
3410,
5415,
3268,
275,
13358,
34635,
835,
253,
6779,
273,
247,
5415,
3268,
275,
653,
16434,
6928,
3249,
432,
253,
14949,
6032,
25184,
9191,
273,
8512,
50276,
7152,
339,
9852,
436,
789,
253,
4477,
7568,
253,
6387,
273,
16994,
10491,
11333,
949,
653,
16434,
11454,
6928,
597,
2085,
4278,
327,
849,
281,
8968,
253,
1313,
18427,
763,
42118,
1775,
17407,
970,
253,
37851,
24147,
4803,
4583,
253,
789,
310,
4722,
285,
8725,
253,
2442,
2898,
273,
653,
16434,
11454,
6928,
50276,
296,
3755,
20556,
337,
253,
2898,
310,
4942,
747,
323,
3802,
79,
50276,
19,
253,
4028,
310,
2590,
50276,
20881,
1255,
337,
253,
16038,
323,
9433,
3802,
79,
323,
10491,
310,
417,
21414,
2217,
50276,
19,
253,
5301,
281,
5899,
10491,
310,
417,
2217,
50276,
2072,
50276,
187,
187,
4118,
18435,
27,
20261,
690,
30628,
452,
33196,
670,
2266,
26278,
13260,
253,
2022,
7680,
273,
253,
2929,
310,
4518,
3559,
285,
22335,
3590
] |
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the paper has a short analysis on rounding schemes comparing nearestrounding to stochastic rounding for quantized training it also introduces luq a new quantized training scheme with a fp4 format for the gradients in order to make this work efficiently in hardware they introduce a new hardware block that does multiplicationless gradient calculations for the backward pass this paper is interesting but leaves me with too many questions and uncertainties on all three of its contributions to make a proper judgment it would be great if the authors could address my questions below for clarity i appreciate the rounding scheme analysis wrt the mse i havent seen that specifically before and its insightful to understand why stochastic rounding on the forward pass is not necessarily a good idea there is one small caveat that should be mentioned and that is that noise sometimes helps for proper convergence and regularization dropout adds noise and increases the mse but still sometimes helps it would be good if the authors rephrased this section slightly to take this into account properly the conclusion on page 3 is much too fast and needs to be worked out significantly more for clarity paraphrasing unbiased gradients are necessary for convergence therefore gradients should be quantized with stochastic rounding i cant find the statements regarding the biased gradients in bottou 2010 perhaps the authors could point me to where exactly this is mentioned i would also like to understand better what the bias of the gradients actually means and why this would be so detrimental if you talk about the bias of the forward pass given the stochastic distribution i understand however its unclear to me what the effect of this is on the gradients themselves it would be good if the authors worked this out what are the gradients biased with respect to the fp32 gradients is that even the correct thing to compare to to me figure 1 is insufficient evidence that a bias is necessarily the issue in training this effect would have to be disentangled from other effects that adding noise to a network would have on training as it stands the difference could come from just the addition of noise similarly the argument that rdn should be used in the forward pass goes much too fast for me as well it would be nice if the authors showed explicitly how the linearity causes a nonbiased estimate of the gradients in the forward pass section 31 extra sampling would also linearly increase the cost of the method so why would this be better than simply training more iterations on different data as opposed to the same data a discussion of the overheadcomplexity of this method already here would be much appreciated for the results section i also dont see how the overhead comes into play with the comparison the raw luq method itself section 32 wouldnt we generally want the activations in a lower bitwidth as well method it is unclear to me if the weights in this scheme are kept in a quantized format or if they are in floating points with the common shadowweight approach please take more care in describing this properly if the weights are in floating point that would mean a very significant overhead during training as the floating points would have to be loaded somewhere then quantized then an operation is to be applied to them in full quantized training you would want to have the weights quantized as well but i see no mention of this similarly are the updates to the weights quantized see eg the wage paper underflow threshold what are you taking 
the maximum over if you are taking a maximum over all the values in the tensor you are quantizing you will run into problems with quantization as described in the inhindsight quantization paper httpsarxivorgpdf210504246pdf essentially for activation quantization if you need any statistic of a tensor to do the quantization you have to write significant amounts of data to memory which is very slow you might as well not have done activationgradient quantization at that point overhead of smp and fnt this number shouldn be 8x compared to fp16 compared to fp32 it would be 16x at least main results it seems you are using at least a different resnet18 model compared to the ultralow paper as the baseline accuracy is 697 for your model but ultralow reports 694 could some difference not be explained based on diffent models the delta in performance is the same roughly 07 for resnet18 also how does this method compare to other works before it like wage httpsarxivorgpdf180204680pdf and other papers that are based on this you mention that the shortcut connection in resnets and the depthwise layers are kept in full precision for the depthwise layer does that mean the input activations to it or the output activations for both i dont think this is necessarily trivial to do its not a given that a fixed point training engine also has a fullprecision engine on its die having that would significantly increase the die size similarly although the compute is small of this solution the data movement is not data movement is a very important metric on efficient devices the ones we would be doing quantized training for this needs to be addressed in the context of overall efficiency mfbprop i definitely appreciate the hardware implementation as if youd have just the fp7 calculation mac array you might as well just do int8 computations all the way as opposed to going throug the trouble of quantizing everything to int4fp4 but this section does highlight one problem with this method dedicated hardware is necessary for this method to work efficiently this greatly limits the scope of this work in its practicality and this brings me back to the above data movement questions that were glossed over the suggested method is not a general method for quantized training and more of a hardware and software go handinhand kind of method but many of the practical details of the hardware implementation and costsoverhead are left out please comment on this typeos page 3 roundtozero nearest page 4 avoid of clipping of page 5 the a different samples a page 6 4bit training in various dnn models in on this is an interesting paper with some interesting insights but the paper tries to do too much in too small a space this leads me to have too many open questions for a good rating the paper would likely have been better if it honed in on a single aspect that they present and did so in a clearer fashion with more bases covered the conclusions on page 3 are drawn much too fast and are unclear the authors have to convince the reader a lot harder to make the claims they make on the one hand the papers main method only seemingly work more efficiently with very dedicated hardware implementations but several important aspects of efficient hardware implementations of training such as data movement are seemingly ignored i would be willing to increase my rating if the authors addressed my above questions rewrote section 2 significantly with a more convincing analysis and added a better complexity analysis of the overhead of their method in terms of data 
movement postrebuttal given the authors excellent response to my questions and the questions of the other authors i am increasing my review to a 5 for a 6 or higher i definitely need the efficiency conundrum resolved as for now i just dont see how to resolve the dichotomy between being either a generally applicable method on commonday and general hardware or a method that is really aimed at hyperefficient dedicated implementations docsepthe paper focuses on reducing the computational footprint of training neural networks using quantization they investigate the importance of having unbiased gradients and show that stochastic rounding is important for the backwards pass they propose a logarithmic unbiased quantization luq scheme which achieves sota results for 4bit training on imagenet results are further improved with multiple sampling smp and a final full precision finetuning fnt at the cost of extra compute overhead finally they introduce a hw block which exploits luq by avoiding multiplication which can reduce the multiplier area by 5 times strong points theoretically wellmotivated choice of quantization scheme for forward and backward path the logarithmic unbiased quantization method is theoretically sound and well explained strong results for 4 bit training of various imagenet models clear ablation studies for the contribution of smp and fnt in table 1 most parts of the paper are well written and easy to follow weak points it is unclear what the contributions of unbiased stochastic pruning and logarithmic unbiased rounding in luc are an ablation study would be insightful on this the smp method for reducing variance requires further clarification it is unclear where and when this exactly is applied for example are the multiple samples only used to calculate the weight update similar to the second sample in sun et al or is the average of multiple samples also used in back propagating to he other layers if the latter then this seems to likely imply that a higher bit width needs to be used in the matmul of the next layer how does the multiple samples compare in performance and complexity to using a different bit width fnt fullprecision finetuning makes the comparison to other lowbit training methods unfair but results are also show without it and show competitive results the transform to standard fp7 lacks detail and explanation it is hard to understand how the inputoutput table is constructed while the paper is easy to read and clearly written many small details are missing especially on the experimentation part see questions below questions what is the hw impact of rounding to nearest power this seems significantly more complicated than adding uniform noise as it is the case for uniform quantization how do they define the pruning threshold are they learnedupdated i could not find any details on this in the paper how do you deal with bn what quantization approach is applied in the forward pass weights and activations rounding to nearest is clear but how are the ranges definedlearned etc editorial notes few appreciations introduced are unclear what they stand for eg smp fnt rdn the later i assume comes from roundingtonearest but would than not rtn be the right appreciation typo below equation 9 roundingtozero page 3 will not make help making the loss estimate unbiased overall a well written paper that is easy to follow except a few parts highlighted above the proposed approach is somewhat novel it combines stochastic rounding with logarithmic quantization and stochastic pruning the main 
strength are that most parts are well motivated and the strong 4 bit training results the main weaknesses are around smp and some other parts that are unclear overall the contributions slightly outweigh the weaknesses of the paper docsepthis paper proposes techniques to quantize the gradients in the backpropagation algorithm down to 4 bits to do so the authors propose the 130 radix2 fp4 format as well as bias removal and variance reduction techniques in order to achieve sota accuracy the method requires an additional stage of high precision finetuning while this paper is interesting and presents promising results i did have many questions listed hereafter introduction radix4 representation is criticized arguing that conversion to radix2 requires an explicit multiplication to modify exponent and mantissa this is untrue in the paper by sun et al that the authors cite the radix4 format does not use a mantissa it only uses 3 exponent bits and therefore converting to radix2 simply requires appending the exponent field by a zero section 2 in the calculation of rounding variance and mse why is the input x considered deterministic in section 2 conclusion the following claim is made the forward phase should be quantized deterministically using rdn since stochastic rounding will not make help making the loss estimate unbiased due to the nonlinearity of the loss and activation functions while unnecessarily increasing the meansquareerror here the authors are claiming that the use of nonlinear activations and loss will negate the fact that sr is unbiased why is that the case is this a known result from prior arts if so the authors should add a reference as was done in this same paragraph regarding the works of chen et al and bottou the luq proposed in eq 1112 is not unbiased when nb1 ie in the quantization region corresponding to the largest magnitudes usually such boundary cases do not matter but given that here there are only 16 regions and that the data is assumed to be heavytailed most of the data would actually fall in this final region this issue might be significant can the authors comment on what happens in the final quantization region related to the above this is more of a suggestion rather than setting an underflow value as quantization metaparameter why isnt the max of the tensor used instead this would avoid the above issue have the authors considered that did tuning an underflow hyperparameter yield better results in eq14 why is a subtraction of a 12 term needed when rdn is defined as per eq5 also this equation is improperly typeset one final question the issue of quantization bias occurring in gradients and weight updates has previously been studied in the paper below can the authors compare their findings and check if their conclusions are consistent with prior arts sakr et al pertensor fixedpoint quantization of the backpropagation algorithm in iclr 2019 the experimental results and crucially the implementation details in the appendix look good finally there are many typos and grammatical errors throughout the paper i urge the authors to perform a spell check the paper is interesting tackles an important problem and presents promising results therefore i do not recommend a clear reject however there are unfortunately many issues i though were unclear and hence raised in my detailed review therefore i cannot recommend acceptance of the paper if these issues are not addressed hence for now i am rating the paper as a weak reject docsepthe paper proposes some techniques to train a deep 
neural network in 4bit and provides a theoretical analysis the experimental results support the effectiveness of the proposed method although some part of the method still needs high precision the reviewer thinks that it could open a door to training the deep neural network with ultralow bits strength 1 overall the paper is well written quite thorough and wellcited the theoretical analysis related to stochastic rounding and roundingtonearest are well justified 2 the paper achieved superior training results in 4bit training 3 the authors not only provided the 4bit training algorithm but also suggested dedicated hardware blocks weaknesses 1 some parts of the proposed method still need highprecision 2 some experimental results mobilenetv2 and resnext50 are not provided because of the time deadline 3 the hardware implementation costs are evaluated in terms of logical area but i think it should also be analyzed in terms of memory area since the multiple sampling smp and underflow threshold computation parts need additional memories minor comments 1 in figure 1 b and c which rounding methods are adopted for bwd and fwd respectively 2 in order to reduce variance the author performs multiple samplings and averages them but as the sampling increases does not it get closer to the rdn 3 there is a typo on page 5 the a different samples the different samples 4 the authors mentioned that we increase all network parts to fullprecision except the weights for the highprecision finetuning if activation is also trained with full precision here after finetuning the activation precision should be lowered back to 4 bits was there any performance degradation at this point this paper proposed several techniques for effective 4bit training some parts of the proposed method still require highprecision but it might be a good starting point for full 4bit training in the near future
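The rounding trade-off these reviews keep returning to (round-to-nearest for the forward pass, stochastic rounding for the gradients) is easy to check numerically. The sketch below is only an illustration of that general point, not the paper's code: the step size, the repeated value `g`, and the helper names are assumptions made for the example, and a plain uniform grid stands in for the paper's FP4 format.

```python
# Illustrative comparison of round-to-nearest (RDN) and stochastic rounding (SR).
# Everything here (step, g, n) is an assumed toy setting, not taken from the paper.
import numpy as np

rng = np.random.default_rng(0)
step = 0.25            # spacing of a uniform quantization grid
g = 0.10               # a fixed value (think: a small gradient) quantized many times
n = 1_000_000

def rdn(v, step):
    """Deterministic round-to-nearest on the grid."""
    return np.round(v / step) * step

def sr(v, step, rng):
    """Stochastic rounding: round up with probability equal to the fractional part."""
    lo = np.floor(v / step) * step
    p_up = (v - lo) / step
    return lo + step * (rng.random(np.shape(v)) < p_up)

x = np.full(n, g)
for name, q in [("RDN", rdn(x, step)), ("SR ", sr(x, step, rng))]:
    print(f"{name}: mean error = {np.mean(q - x):+.4f}   MSE = {np.mean((q - x) ** 2):.4f}")

# Typical output: RDN always maps 0.10 to 0.0, so its mean error is -0.10 (a
# systematic bias) while its MSE is the smaller of the two; SR has mean error
# close to 0 (unbiased) but a larger MSE.  That is the asymmetry behind using
# RDN where bias is tolerable (forward pass) and SR where unbiasedness matters
# (gradients).
```

With a symmetric input distribution the aggregate bias of RDN can average out, which is why the per-value view above is the relevant one for gradient components that repeatedly point in the same direction.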
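On the radix-4 point raised in the third review (that converting a radix-4 exponent to radix 2 only needs a zero bit appended, with no multiplication), a two-line check makes the arithmetic explicit. The bit width and the absence of an exponent bias below are simplifying assumptions for the example, not the actual FP4 layout of Sun et al.

```python
# A radix-4 exponent code e encodes the magnitude 4**e == 2**(2*e), so the
# radix-2 exponent is just 2*e: shift the bit pattern left by one (append a zero).
for e in range(8):                      # toy 3-bit radix-4 exponent field, no bias
    e2 = e << 1                         # append a zero bit -> 4-bit radix-2 exponent
    assert 4.0 ** e == 2.0 ** e2
    print(f"radix-4 exp {e:03b} -> radix-2 exp {e2:04b}  (magnitude {4.0 ** e:g})")
```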
### Summary:
This paper proposes a method for 4-bit quantized training of neural networks (forward and backward), obtaining SOTA 4-bit training quantization, motivated by an analysis of rounding schemes, an important aspect of quantized training. The main concern from the reviewers was that the approach is not practical (both a general concern and of specific note here, since the word is used in the title and motivation of the work) due to a lack of compatibility with current general-purpose hardware and a lack of suitability of the approach for specialized hardware, so it is unclear what the actual use case for the approach is. The authors argued that (1) this is not a problem on some hardware and (2) past works have not been held to this standard; I did not find the authors to provide a strong argument during the discussion period to address these concerns.
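To make the summary's description of the method more concrete, here is a rough sketch of the core quantizer as the reviews describe it: power-of-two magnitude levels, stochastic rounding between adjacent levels so that each value is preserved in expectation, and stochastic pruning below an underflow threshold. This is a reconstruction from the review text, not the authors' LUQ implementation: `alpha`, `n_levels`, the clamping at the top level, and all variable names are assumptions for illustration, and the real FP4 encoding, scaling, and hardware path are omitted.

```python
# Hedged sketch of a logarithmic unbiased quantizer (assumed details, not the paper's code).
import numpy as np

rng = np.random.default_rng(0)

def log_unbiased_quantize(x, alpha, n_levels=8, rng=rng):
    """Quantize x onto sign * {0, alpha, 2*alpha, 4*alpha, ...} unbiasedly (in expectation).

    alpha    : assumed underflow threshold (smallest non-zero magnitude)
    n_levels : assumed number of power-of-two magnitude levels
    """
    sign = np.sign(x)
    mag = np.abs(x)
    top = alpha * 2.0 ** (n_levels - 1)
    out = np.zeros_like(x)

    # 1) Stochastic underflow pruning: a value below alpha becomes alpha with
    #    probability mag/alpha and 0 otherwise, so its expectation is preserved.
    small = mag < alpha
    keep = rng.random(x.shape) < (mag / alpha)
    out[small & keep] = alpha

    # 2) Stochastic rounding between adjacent power-of-two levels (unbiased in value).
    #    Values above the largest level are simply clamped here -- the biased
    #    top-bin corner case one of the reviews points out.
    mag_c = np.minimum(mag, top)
    lo = alpha * 2.0 ** np.floor(np.log2(np.maximum(mag_c, alpha) / alpha))
    lo = np.minimum(lo, top)
    hi = np.minimum(2.0 * lo, top)
    denom = np.where(hi > lo, hi - lo, 1.0)
    p_up = np.clip((mag_c - lo) / denom, 0.0, 1.0)
    rounded = np.where(rng.random(x.shape) < p_up, hi, lo)
    out[~small] = rounded[~small]

    return sign * out

# Toy check of (approximate) unbiasedness on a tensor with a non-zero mean.
g = rng.standard_normal(1_000_000) * 1e-2 + 5e-3
q = log_unbiased_quantize(g, alpha=1e-3)
print("mean(g) =", g.mean())
print("mean(q) =", q.mean())          # should be close to mean(g)
print("levels  =", np.unique(np.abs(q))[:6])
```

In a real 4-bit format a sign bit plus a 3-bit exponent would index these levels; the sketch only shows why the expectation of each quantized value matches the input, which is the property the rounding analysis argues matters for the gradients.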
4477,
12661,
253,
11084,
1985,
895,
19,
44296,
21,
5981,
347,
973,
347,
8492,
8570,
285,
11041,
5141,
5609,
275,
1340,
281,
5115,
256,
5503,
7200,
253,
1332,
4419,
271,
3081,
3924,
273,
1029,
12320,
1442,
292,
25004,
1223,
436,
2929,
310,
4722,
285,
10262,
12532,
1543,
891,
858,
452,
1142,
3533,
7117,
38492,
50276,
46089,
1985,
895,
21,
6779,
310,
23159,
16425,
326,
9436,
281,
1985,
895,
19,
4419,
271,
6843,
25219,
281,
10007,
23653,
285,
18006,
14302,
436,
310,
440,
5672,
275,
253,
2929,
407,
5101,
1162,
355,
326,
253,
4477,
26542,
253,
1985,
895,
21,
5981,
1057,
417,
897,
247,
18006,
14302,
352,
760,
4648,
495,
23653,
9886,
285,
3103,
22022,
281,
1985,
895,
19,
3365,
4419,
622,
1946,
253,
23653,
1673,
407,
247,
5058,
50276,
4674,
374,
275,
253,
10272,
273,
46551,
11041,
285,
278,
339,
2139,
310,
253,
3280,
1269,
2783,
30027,
50276,
249,
2593,
374,
6452,
253,
1563,
1750,
310,
1160,
50276,
783,
3579,
3408,
943,
320,
2677,
1025,
11544,
18260,
970,
391,
17915,
1580,
19191,
46551,
588,
417,
1056,
1361,
2403,
253,
2957,
6642,
38663,
1955,
281,
253,
14561,
414,
273,
253,
2957,
285,
5743,
3470,
1223,
48312,
3629,
253,
2097,
8974,
3775,
1060,
253,
4477,
403,
15081,
326,
253,
897,
273,
14561,
1396,
569,
285,
2957,
588,
2297,
366,
253,
958,
326,
49975,
310,
38663,
2139,
310,
326,
253,
1083,
310,
436,
247,
1929,
906,
432,
2720,
14635,
604,
594,
253,
4477,
943,
823,
247,
3806,
347,
369,
2218,
275,
436,
1072,
12494,
5001,
253,
2987,
273,
260,
864,
1162,
355,
285,
3673,
276,
50276,
783,
26535,
82,
4081,
275,
16186,
1903,
805,
310,
417,
38663,
672,
295,
67,
18,
26332,
275,
253,
36643,
2919,
3969,
281,
253,
6253,
32800,
3798,
824,
7548,
2219,
513,
417,
2647,
533,
1677,
326,
1060,
627,
403,
760,
1668,
4811,
285,
326,
253,
941,
310,
8025,
281,
320,
3573,
1767,
7193,
954,
273,
253,
941,
651,
2686,
2965,
275,
436,
2457,
2919,
436,
2523,
1537,
320,
1534,
476,
253,
4477,
4385,
327,
752,
6569,
275,
253,
2457,
36643,
2919,
50276,
4919,
281,
253,
1840,
436,
310,
625,
273,
247,
14876,
2581,
685,
4758,
271,
762,
5449,
1318,
347,
36643,
21543,
274,
6245,
2139,
310,
2649,
253,
2781,
273,
253,
13148,
908,
3185,
436,
651,
3693,
253,
1840,
2523,
452,
253,
4477,
2783,
326,
858,
25184,
271,
762,
5449,
4373,
19484,
4917,
1805,
1543,
50276,
249,
16186,
1047,
2139,
310,
247,
38171,
273,
247,
1249,
1307,
3058,
672,
391,
17915,
310,
2931,
347,
591,
16186,
22,
671,
436,
5150,
310,
28203,
3510,
292,
50276,
531,
2457,
1953,
253,
2523,
273,
36643,
8492,
12952,
275,
27935,
285,
2801,
11269,
556,
3786,
644,
5421,
275,
253,
2929,
2708,
476,
253,
4477,
7277,
616,
4342,
285,
2451,
604,
616,
11815,
403,
5185,
342,
2720,
14635,
256,
518,
83,
1162,
355,
6925,
11313,
4229,
3659,
36643,
273,
253,
896,
44263,
318,
5933,
275,
17857,
32888,
6247,
50276,
783,
5661,
1543,
285,
29325,
1365,
253,
7092,
4278,
275,
253,
30762,
1007,
1175,
50276,
71,
3341,
627,
403,
1142,
963,
993,
285,
47412,
474,
6332,
4768,
253,
2929,
891,
21434,
253,
4477,
281,
1347,
247,
15368,
2451,
253,
2929,
310,
4722,
39223,
271,
1774,
1895,
285,
10262,
12532,
1543,
3103,
891,
513,
417,
5583,
247,
2590,
12009,
2299,
627,
403,
19235,
1142,
3374,
891,
2167,
497,
12744,
285,
7613,
5439,
275,
619,
7000,
2278,
3103,
891,
2550,
5583,
14924,
273,
253,
2929,
604,
841,
3374,
403,
417,
9713,
7613,
323,
1024,
891,
717,
13716,
253,
2929,
347,
247,
5075,
12009,
5474,
339,
431,
248,
2929,
29328,
690,
5609,
281,
6194,
247,
3676,
11454,
2990,
275,
577,
2713,
285,
3400,
247,
10527,
1783,
253,
5661,
1543,
1329,
253,
12510,
273,
253,
4081,
1332,
3738,
690,
629,
273,
253,
1332,
1335,
3198,
1029,
12320,
253,
37317,
11121,
326,
352,
812,
1527,
247,
3369,
281,
3733,
253,
3676,
11454,
2990,
342,
4054,
1544,
319,
9886,
50276,
45563,
50276,
18,
4583,
253,
2929,
310,
973,
3542,
3240,
11080,
285,
973,
68,
959,
253,
10527,
1783,
2905,
281,
19191,
46551,
285,
46551,
48619,
4885,
403,
973,
17285,
374,
253,
2929,
6786,
8936,
3733,
1543,
275,
577,
2713,
3733,
495,
253,
4477,
417,
760,
2530,
253,
577,
2713,
3733,
5933,
533,
671,
5125,
9940,
10309,
8336,
50276,
20881,
1255,
265,
50276,
18,
690,
4243,
273,
253,
4081,
1332,
1335,
878,
1029,
40540,
374,
690,
5661,
1543,
31551,
257,
292,
87,
19,
285,
501,
8384,
1235,
403,
417,
2530,
984,
273,
253,
673,
20639,
495,
253,
10309,
7092,
4815,
403,
6760,
275,
2426,
273,
13760,
2170,
533,
891,
1158,
352,
943,
671,
320,
5867,
275,
2426,
273,
3541,
2170,
1580,
253,
2709,
10491,
924,
81,
285,
762,
5449,
7887,
13782,
4243,
878,
3081,
12959,
50275,
37585,
5701,
50276,
18,
275,
4677,
337,
270,
285,
260,
534,
46551,
3082,
403,
8671,
323,
270,
14066,
285,
269,
14066,
2975,
374,
275,
1340,
281,
4796,
11041,
253,
2488,
17923,
2709,
1775,
22945,
285,
31218,
731,
533,
347,
253,
10491,
5459,
1057,
417,
352,
755,
8003,
281,
253,
391,
17915,
495,
627,
310,
247,
1745,
80,
327,
3239,
608,
253,
247,
1027,
3530,
50276,
783,
1027,
3530,
577,
253,
4477,
5393,
326,
359,
2572,
512,
2990,
4243,
281,
2120,
40540,
3707,
253,
13461,
323,
253,
1029,
40540,
1442,
292,
25004,
604,
5743,
310,
671,
10166,
342,
2120,
12320,
1060,
846,
1442,
292,
25004,
253,
5743,
12320,
943,
320,
17892,
896,
281,
577,
9886,
50276,
4238,
627,
667,
3045,
11961,
387,
436,
1127,
50274,
2520,
2929,
4081,
2067,
5609,
323,
3576,
577,
2713,
3733,
690,
4243,
273,
253,
4081,
1332,
1335,
2430,
1029,
40540,
533,
352,
1537,
320,
247,
1175,
4983,
1127,
323,
2120,
577,
2713,
3733,
275,
253,
2822,
2852,
2490,
187,
4118,
18435,
27,
2520,
2929,
29328,
247,
1332,
323,
577,
2713,
2677,
1025,
3733,
273,
295,
2224,
3579,
285,
19265,
13546,
256,
5503,
577,
2713,
3733,
36643,
17194,
407,
271,
1783,
273,
46551,
15849,
271,
1774,
4809,
275,
2677,
1025,
3733,
253,
2022,
7350,
432,
253,
30628,
497,
326,
253,
2746,
369,
417,
8542,
1097,
247,
2087,
4468,
285,
273,
2173,
3877,
1060,
1580,
253,
3159,
310,
908,
275,
253,
4060,
285,
16038,
273,
253,
789,
1955,
281,
3480,
273,
22862,
342,
1655,
2087,
4096,
10309,
285,
3480,
273,
45984,
273,
253,
2746,
323,
18052,
10309,
594,
352,
310,
12744,
752,
253,
4588,
897,
1083,
310,
323,
253,
2746,
253,
4477,
9125,
326,
337,
436,
310,
417,
247,
1895,
327,
690,
10309,
285,
374,
326,
2469,
2987,
452,
417,
644,
2918,
281,
436,
2629,
891,
858,
417,
1089,
253,
4477,
281,
2085,
247,
2266,
4154,
1309,
253,
5955,
2180,
281,
2953,
841,
7350
] |
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the authors present a method to improve the performance of graph embeddings by using psl to reason using ontology axioms to predict the types of entities the iterefin method is able to the predicted types as supervision to finetune the embeddings complex and conve the authors propose an iterative method where the predictions from embeddings are fed back to the psl model to infer new types the experiments are performed on corrupted versions of nell yago310 fb15k237 and wordnet using the methodology introduced in pujaras kgi work the experiments show substantial improvement on datasets with rich ontologies not wordnet the effects of iteration are minimal so it is not clear that they are useful as some iterations result in slight improvements while others result in loss of performance the ablation studies show that range and subclass are the most important axioms and other have minimal or no effect additional details would be useful about the number of each type of constraint in the psl model as it is not clear whether the contribution is due to the number of character of the constraint the importance of the range constraint seems correlated to the method for introducing noise in the evaluation datasets pros interesting approach for combining two different approaches to reasoning good experiments to show the benefits of the method cons the claim that the iterative methods is helpful which is part of the name of the system is not supported by the experiments no data on execution times and scalability all experiments are on small or medium size datasets insufficient analysis of the contribution of different axioms table 5 is not enough the paper is well written and easy to follow substantial room is spent on the analysis of the iterative method which in my opinion is not producing the desired results the space could be used to describe the method in more detail and include additional experiment resultsdocsepthe paper addresses the kg refinement task aiming to improve kgs that were built via imperfect automated processes either by removing incorrect relationships or by adding missing links the key insight is to augment kg embeddings not only with implicit type information but also with explicit types produced by an ontologybased system the resulting algorithm typeex leverages the benefits of structured often humancrafted information and the versatility of continuous embeddings the model proceeds in two stages first the pslkgi component which takes in kg triples ontology information and inference rules produces type information for entities and relations in the second stage these are passed to the typeex module which appends this explicit type information to a implicit type embeddings and b generalpurpose kg embeddings these two steps can optionally be repeated for multiple iterations in a loop while the highlevel picture is clear there are a few details about the information flow and implementation that are harder to figure out at the end of section 3 the authors write it is also important to note that pslkgi also generates a number of candidate facts that are not originally in the kg by softinference over the ontology and inference rules it is not obvious in figure 1 when this happens how is this model parameterized what exactly is trainable in the conclusions sections the authors write we will look in ways to combine such methods at the training level which raises the previous question again how are the types produced by pslkgi converted to continuous representations is it a simple dictionary 
lookup the authors validate their models on four datasets with ontologies of various sizes they compare against multiple baselines including pslkgi alone generic kg embeddings alone and generic kg embeddings implicit type embeddings showing their work outscores previous work one observation is that the datasets are prepared somewhat artificially noise is programmatically inserted in the kgs and the model is expected to detect these alterations and its not entirely clear how well this added noise correlates with the noise encountered in realworld kgs it would be interesting to provide results on a downstream task eg kgbased question answering with and without kg refinement to get an understanding of how much this step helps however in authors defense they are following the same procedure as previous work and do make an effort to ensure the denoising task is reasonably hard eg half of the corrupted facts have entities that are type compatible to the relation of the fact the ablation studies are insightful they look into how the number of loop iterations affect performance on various datasets the impact of threshold hyperparameters and the impact of various ontological rules overall i think the paper is well written it has a clear goal and convincing evidence to achieve it however i would have liked to see a clearer explanation of the algorithm and more implementation detailsdocsepsummary the authors propose a new method for identifying noisy triples in knowledge graphs that combines ontological information using probabilistic soft logic with an embedding method such as conve or complex they show that combining these two approaches improves performance on four different datasets with varying amounts of ontological information provided clarity the paper is generally clear and wellwritten i think it would be helpful to give a little more detail on how pslkgi works for example its not entirely clear to me how it outputs the type information originality significance as mentioned in the paper implicit type embeddings have been incorporated into embedding methods but more extensive ontological information has not been used in this way they also show that doing so results in improved performance over competitive baselines for this task pros novel use of ontological features with more recent embedding approaches for kg refinement performance improvement over competitive baselines cons the analysis feels a bit lacking see comments below for more thoughts here comments it doesnt seem like the iterative aspect of the model actually helps from figure 2 it only appears to hurt performance on some datasets and from figure 3 the change in accuracy appears to be minimal and not obviously more than just noise i would be curious to see how much using the full ontology improves things vs just using explicit type information dom ran and type labels for entities also if most of the benefit comes from the type information it might be interesting to see how much pslkgi is actually adding or whether you can just apply the type labels directly for the datasets that you have them anyways while perhaps not in the scope of this paper it would also be interesting to see how incorporating the ontological information affected other kg tasks like link prediction docsepthis paper tackles the task of knowledgebase refinement the described method uses an approach based on cotraining using a combination of two models the first model pslkgi and the second model is a knowledge graph embeddings model complex or conve the 
experiments are conducted on four datasets based on nell yago fb15k and wn18 the idea of combining 2 conceptually different approaches like pslkgi and graphembeddings work well however it is not surprising that it works better than a single model it is observed in many works this does not need a citation that if you combine the prediction of multiple models into an ensemble model it would work better than a single model this is especially true if the models have a different nature additionally a cotraining setup similar to the one presented here would expectedly boost the performance further in that case comparing the combined system to a single pslkgi or single kge model is not enough in order to claim some supreme performance of the method it should be compared to similar methods that combine multiple models and ensembles of the same model types some additional experiments are performed which could be of interest to the community with some further analysis and insights the analysis of the number of feedback iterations is interesting in order to know the limits of this type of refinement it is very hard to see the difference from figure 6 but for some of the datasets ex nell it seems 6 steps do not seem to be enough to see the peak also it is not clear if the difference in the performance is significant for most datasets more insights into why more steps work or do not work are needed the ablation of types in table 5 might be interesting but it needs further discussions what does it mean for a kg and each ontology that for example rng is the most important type how does that help us to know more about them and how is that related to the number of instances in each type displayed in table 2 some minor comments it is worth noting that embeddings with a few recent exceptions fatemi et al 2019 do not make use of any form of taxonomicontological rules there is work that uses taxonomyrules with some examples being kale guo et al 2016 asrx minervini 2017 ntp minervini et al 2020 in table 5 please add the difference of the ablated results to the overall all for better reading
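The two-stage pipeline the reviews above describe (PSL-based type inference over the ontology, whose output supervises an embedding model such as ComplEx or ConvE, repeated in a co-training loop) can be summarized in a short sketch. This is a minimal illustration under stated assumptions: every function name, signature, and threshold below is invented for exposition and is not the authors' implementation.

```python
# Illustrative sketch only: the helpers are assumed stand-ins, not the actual
# PSL-KGi or embedding code from the paper under review.

def psl_kgi_infer_types(triples, ontology, rules):
    """Stage 1 (assumed): soft inference over ontology axioms and inference
    rules, yielding type scores for entities/relations and candidate facts."""
    return {}  # placeholder output

def score_with_embeddings(triples, type_info, model="ComplEx"):
    """Stage 2 (assumed): append the explicit type information to implicit-type
    and general-purpose KG embeddings, fine-tune, and score each triple."""
    return {t: 1.0 for t in triples}  # placeholder scores

def refine_kg(triples, ontology, rules, n_iters=3, threshold=0.5):
    """Iterative co-training: PSL types supervise the embeddings, and the
    embedding scores are fed back for the next round of type inference."""
    triples = set(triples)
    for _ in range(n_iters):
        type_info = psl_kgi_infer_types(triples, ontology, rules)
        scores = score_with_embeddings(triples, type_info)
        # triples scoring below the (assumed) threshold are treated as noise
        triples = {t for t, s in scores.items() if s >= threshold}
    return triples
```

Under these assumptions, refine_kg returns the subset of triples that survive thresholding after a fixed number of co-training rounds, which is the loop whose usefulness the reviewers question in figures 2, 3, and 6.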
### Summary:
this paper proposes a novel method, iterefine, for cleaning up noise in kgs. the method combines the advantages of ontological information and inference rules with kg embeddings via iterative cotraining. iterefine improves denoising of kgs on multiple datasets, while the importance of multiple iterations is mixed. reviewers agree that the combination of two significantly different types of reasoning is a promising direction
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper provides an interesting way to combine regularized approximate minimal polynomial extrapolation and optimistic methods i like the idea and the experimental results look promising however i have some concerns im wondering if the comparison with the baseline is fair actually one iteration of optimisticamsgrad is more expensive than one iteration of amsgrad since it requires to compute mt1 the authors should explain to what extend this computation is significantly cheaper that a backprop if it actually is the discussion after theorem 1 is not clear to me it is not clear whether or not optimisticamsgrad has a better regret that amsgrad you did not compare the sum of the two additional term with the term with a logt second line of 8 with second lien of 9 do you get a better regret that osqrtt a better constant moreover you did not justify why it is reasonable to assume that each gti2hti2sqrtvt1i are bounded im also concerned about the definition of dinfty did you prove that this constant is not infinite reddi et al 2018 proposed a projected version of their algorithm and did the analysis assuming that the constraints set was bounded in your algorithm 2 would you project in line 8 and 9 or only on line 9 i think that the easiest fix would be to provide a projected version of your algorithm and to do your analysis with the standard assumption that the constraints set is bounded the description of alg 2 is not clear notice that the gradient vector is computed at wt instead of wt12 why would it be wt12 in comparison to what also wt 12 is updated from wt 12 instead of wt same the comments are made without any description of the algorithm fact is if the reader is not familiar with the algorithm which is introduced in the following page the whole paragraph is hard to catch actually page 6 you explain how the optimistic step of daskalikis et al 2018 is unclear but you can merge the updates lines 8 and 9 to wt1 wt etat1 frac4 ht11beta1 sqrtvt etat racthetatsqrtvt etat frac4 ht1beta1 sqrtvt1 plug line 8 in line 9 and then plug line 9 at time t to get a very similar update if you look more closely at daskalakis et al 2018 their guess mt1 is gt finally you theorem 2 is stated in a bit unfair way since you also require beta1 sqrtbeta2 moreover it seems that theorem 2 is no longer true anymore if as you says you impose that the second moment of adamdisz is monotone adding the maximization step about the experiments i do not understand why there is the number of iterations in the left plots and the number of epochs on the right plots it makes the plots hard to compare you should compare your method to extragradient methods to sum up this paper introduce interesting results the combination of scieur et al 2016 and optimistic online learning is really promising and solid theoretical results are claimed however some points should be clarified see my comments above especially i think that the authors focused too much sometimes being unfair in their discussion and propositions on showing how their algorithm is better than daskalakis et al 2018 whereas as they mentioned it the goals are different adamdisz is designed to optimize games and is similar to extragradient it is know that extragradient methods are slower than gradient methods for single objective minimization because of the extrapolation step using a too conservative signal for single objective minimization some minor remarks page one nesterovsmethod which can be much smaller than sqrtt of ftrl if one has a good guess you could refer to section 
3 or something else because otherwise this sentence remains mysterious what is a good guess or maybe you could say that standard good guesses are either the previous gradient or the average of the previous gradients it combines the idea of adagrad duchi et al 2011 which has individual learning rate for different dimensions what is the other thing combined beginning of sec 31 psit represent diagvt12 after authors response as developed in my response on dinfty assumption to the reviewers i think that the assumption that dinfty bounded is a critical issue that is why i am moving down my gradedocsepin this manuscript the authors borrow the idea of optimism from the online learning literature and apply it to two frequently used methods for neural network training amsgrad and adam more or less replicating the theory known in the literature they give a regret analysis the manuscript ends with a comparison of the optimistic methods against their plain counterparts on a set of test problems this is a wellwritten paper filling a gap in the literature through the contribution does not seem significant the results do support that such extensions should be out there in addition to a few typos some clarification on several points could be quite useful 1 it is not clear why the authors use this particular extrapolation algorithm 2 if we have the past r1 gradients can we put them into use for scaling the next direction like in quasinewton methods 3 the following part of the sentence is not clear the gradient vectors at a specific time span is assumed to be captured by 5 4 nabla is missing at the end of the line right after equation 6 5 the second line after lemma 2 should be it does not matter how the word not is missing docsepthe paper proposes new online optimization algorithms by adding the idea of optimistic updates to the already popular components of adaptive preconditioning and momentum as used in amsgrad and adam such optimistic schemes attempt to guess the yetunseen gradients before each update which can lead to better regret guarantees when the guesses are accurate in a certain sense this in turn can lead to faster convergence when the resulting algorithm is used in an optimization framework the specific contribution of the present paper is proving formally that optimistic updates can indeed be combined with advanced methods like adam and amsgrad also providing a regret analysis of the former algorithm on the practical front the authors also propose a method closely resembling anderson acceleration for guessing the next gradient and the eventual scheme is shown to work well empirically in training deep neural networks the idea of optimistic updates has been popular in recent years within the onlinelearning literature and has been used with particularly great success for achieving improved convergence guarantees for learning equilibria in games more recently optimistic updates have also appeared in more practical settings such as training gans where they were shown to improve stability of training the present paper argues that the idea of optimism can be useful for largescale optimization as well if the gradient guesses are chosen appropriately i have lukewarm feelings about the paper on the positive side the proposed method is a natural and sensible combination of solid technical ideas and its theoretical analysis appears to be correct as the authors point out their algorithm incorporates the idea of optimism in a much more natural way than the related optimistic adam algorithm previously 
proposed by daskalakis et al 2018 does the experiments also indicate some advantage of optimism in the studied optimization problems on the other hand the theoretical contribution is marginal the algorithm and its analysis is a straightforward combination of previous ideas and the result itself doesnt strike me as surprising at all then again perhaps this is more of a presentation issue as it may be the case that the authors did not manage to highlight clearly enough the technical challenges they needed to overcome to prove their theoretical results furthermore i find the method for guessing the gradients to be rather arbitrary and poorly explainedat least im not sure if anyone unfamiliar with the mentioned gradient extrapolation methods would find this approach to be sensible at all i am not fully convinced by the experimental results either since i have an impression that the gradientguessing method only introduces yet another knob to turn when tuning the hyperparameters and its not clear at all that this new dimension would indeed unlock levels of performance that were not attainable before indeed the authors seem to fix all hyperparameters across all experiments and only switch around the optimistic components rather than finding the best tuning for each individual algorithm and comparing the respective results also i dont really see any qualitative improvements in the learning curves due to the new componentsbut maybe i just cant read these graphs properly since i have more of a theorist background the writing is mostly ok although there is room for improvement in terms of english use especially on the front of articles which seem to be off in almost every sentence overall i dont feel very comfortable about suggesting acceptance mostly because i find the results to be rather unsurprising i suggest that the authors try to convince me of the nontrivial challenges arising in the analysis or about the definite practical advantage that optimism can buy for largescale optimization detailed comments pp1 abstract we consider new variants of optimization algorithmsthis sentence is rather vague and generic i guess you wanted to refer to convex optimization algorithms which is actually what you consider in the paper no need to be embarrassed about assuming convexity pp1 a general nuisance with the typesetting that already shows on the first page is that italic and small capital fonts are used excessively and without any clearly identifiable logic please simplify pp1 adagrad exploits the geometry of the data and performs informative updatethis makes it sound like other algorithms make noninformative updates pp1 regret was not defined even informally in the introduction yet already some regret bounds are compared highlighting that one can be much smaller than osqrtt this is not very friendly for readers with no prior experience in online learning pp1 their regret analysis are the regret analysis in online learningwhat is this sentence trying to say pp2 for this discussion of ftrl it would be useful to remark that this algorithm really only makes sense if the loss function is convex also related to this discussion you mention that the bound for optimistic ftrl can be much smaller than sqrtt but never actually say that sqrtt is minimax optimalwithout this piece of context this statement has little value pp3 adam does not converge to some specific convex functionsi guess i understand what this sentence is trying to say but it certainly doesnt say it right why would an algorithm converge to a function 
pp3 bottom this description of extrapolation methods is utterly cryptic what is xt here what is the fixed point x why is this scheme applicable at all here why would one believe the errors to be nearlinear in this case would this argument work at all for nonconvex objectives
pp5 lemma 1 why would one expect dinfty to be finite in order to ensure this one would need to project the iterates to a compact set
pp5 right after lemma 1 it does matter how gt is generated it does not matter how gt is generated
pp6 top we claimed that it is smaller than osqrtt so that we are good here where exactly did this claim appear and in what sense are we good here also the norms in this paragraph should be squared
pp6 sec 32 while this section makes some interesting points its tone feels a bit too apologetic eg saying that you are aware of a previous algorithm thats similar to yours and doubling down on the claim that the goals are different makes the text feel like youre taking a defensive stance even though i cant see a clear reason for this in my book the approach you propose is clearly different and more natural for the purpose of your study

docsepthis paper combines recent results in online learning and convex optimization specifically adaptivity momentum and optimism the authors add an optimistic gradient prediction step into the amsgrad algorithm proposed by reddi et al 2018 moreover they propose using the rmpe algorithm of scieur et al 2016 to come up with the gradient prediction step the new method that they introduce is called optimistic amsgrad and the authors present both theoretical guarantees as well as numerical experiments justifying this new method the paper is relatively wellwritten and the authors do a good job of explaining recent work on adaptivity momentum and optimism in online learning and convex optimization to motivate their algorithm the algorithm is also presented clearly and the fact that the method is accompanied by both a regret bound as well as numerical experiments is appreciated

at the same time i found the presentation of this work to be a little misleading the idea of applying optimism to adam was already presented in daskalakis et al 2018 the algorithm in that paper is in fact called optimistic adam i found it very strange that the authors chose to rename that algorithm in this paper there are two main differences between optimistic adam in daskalakis et al 2018 and optimistic amsgrad the first is the extension from adam to amsgrad which involves an extra maximization step line 7 in algorithm 2 that is immediate the second is the choice of gradient prediction method since daskalakis et al 2018 were concerned with equilibrium convergence they opted to use the most recent gradient as the prediction on the other hand the authors in this work are concerned with general online optimization so they use a linear combination of past gradients as the prediction based on a method introduced by scieur et al 2016 on its own i do not find this extension to be sufficiently novel or significant to merit publication

the fact that this paper includes theoretical guarantees for optimistic amsgrad that were missing in daskalakis et al 2018 for optimistic adam does make it a little more compelling however i found the bound in theorem 1 to be a little strange in that 1 it doesnt reduce to the amsgrad bound when the gradient predictions are 0 and 2 it doesnt seem better than the amsgrad or optimistic ftrl bounds the authors claim to justify 2 by saying that the extra gt ht term is osqrtt but the whole appeal of adaptive algorithms is that the sqrtt terms are datadependent the empirical results also do not include error bars which makes it hard to judge their significance there were also many grammatical errors and typos in the paper

other comments and questions
1 page 1 their theoretical analysis are the regret analysis in online learning grammatical error
2 page 2 the concern is that how to get good mt grammatical error
3 page 3 as the future work grammatical error
4 page 3 nestrovs method typo
5 page 4 with input consists of grammatical error
6 page 4 call algorithm 3 with what is the computational cost of this step one of the main benefits of algorithms like amsgrad is that they run in od time with very mild constants
7 page 4 for this extrapolation method to well well the gradient vectors at a specific time span is assumed to be captured by 5 if the gradient does not change significantly this will be a mild condition if the gradient doesnt change significantly then choosing mt gt would also work well wouldnt it can you come up with examples of objectives for which this method makes sense even toy ones would strengthen this paper
8 page 5 equation 8 as discussed above this bound doesnt appear to reduce to the amsgrad bound for mt 0 which makes it a little unsatisfying the fact that there is an extra expression that isnt in terms of the gradient prediction error that one has for optimistic ftrl also makes the bound a little strange
9 page 7 the conduct optimisticamsgrad with different values of r and observe similar performance you should mention that you show the performance for some of these different values in the appendix
10 page 7 multiclassification problems typo
11 page 7 figure 1 without error bars its impossible to tell whether these results are meaningful moreover its strange to evaluate algorithms with online convex optimization guarantees on offline nonconvex problems
12 page 7 widely studied and playing grammatical error
13 page 8 a potential directions grammatical error
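editor's note for readers skimming this review: the snippet below is a minimal sketch, not the paper's algorithm 2 and not the reviewers' text. it only illustrates the general shape of an amsgrad-style update combined with a gradient-prediction ("optimistic") step, where the prediction mt is formed from past gradients. the uniform averaging used here, the function names, the hyperparameters, and the exact placement of the extra step are illustrative assumptions -- the paper under review builds its prediction with the rmpe extrapolation of scieur et al 2016, which is not reproduced here.

```python
# Minimal sketch (assumptions flagged above); NOT the paper's pseudocode.
import numpy as np

def predict_next_gradient(past_grads):
    # Hypothetical stand-in for the paper's extrapolation step: a plain
    # linear combination (here, a uniform average) of recent gradients.
    return np.mean(past_grads, axis=0)

def optimistic_amsgrad_sketch(grad_fn, w, steps, lr=1e-3,
                              beta1=0.9, beta2=0.999, eps=1e-8, history=5):
    m = np.zeros_like(w)       # first-moment (momentum) estimate
    v = np.zeros_like(w)       # second-moment estimate
    v_hat = np.zeros_like(w)   # running max of v -- the AMSGrad correction
    past = []
    for _ in range(steps):
        g = grad_fn(w)                        # observed gradient
        m = beta1 * m + (1 - beta1) * g
        v = beta2 * v + (1 - beta2) * g * g
        v_hat = np.maximum(v_hat, v)          # extra maximization step of AMSGrad
        w = w - lr * m / (np.sqrt(v_hat) + eps)
        # "optimistic" part: also move along a predicted next gradient.
        past = (past + [g])[-history:]
        g_pred = predict_next_gradient(past)
        w = w - lr * g_pred / (np.sqrt(v_hat) + eps)
    return w
```

in this simplified picture, the reviewer's point 1 about theorem 1 amounts to asking what the stated regret bound gives back when the prediction g_pred is identically zero, i.e. when the second half-step disappears.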
### Summary:
|
the reviewers expressed some interest in this paper but overall were lukewarm about its contributions r4 raises a fundamental issue with the presentation of the analysis see the dinfty assumption the ac thus goes for a revise and resubmit
|
[
285,
10254,
347,
908,
275,
717,
84,
4971,
285,
38622,
824,
28684,
15849,
3177,
281,
5476,
253,
2568,
328,
16564,
27935,
1078,
1016,
5731,
534,
476,
1421,
281,
1805,
14938,
23632,
672,
253,
5476,
265,
403,
7899,
275,
247,
2176,
3282,
436,
275,
1614,
476,
1421,
281,
7938,
14940,
672,
253,
4795,
5933,
310,
908,
275,
271,
13757,
7792,
253,
2173,
7680,
273,
253,
1246,
2929,
310,
18597,
19186,
326,
28684,
11269,
476,
6296,
320,
5678,
342,
7269,
3082,
751,
38622,
285,
717,
84,
4971,
671,
5277,
247,
14938,
1783,
273,
253,
3438,
5933,
327,
253,
8542,
2914,
253,
4477,
671,
12661,
247,
1332,
8244,
37427,
285,
3796,
17680,
323,
29985,
253,
1735,
11786,
285,
253,
27585,
6974,
310,
2011,
281,
789,
973,
45190,
275,
3733,
3676,
11454,
6928,
50276,
783,
2934,
273,
28684,
11269,
556,
644,
4633,
275,
3332,
1107,
1561,
253,
3909,
28269,
6239,
285,
556,
644,
908,
342,
3782,
1270,
2323,
323,
17170,
5520,
14940,
23632,
323,
4715,
45571,
5182,
275,
3958,
625,
4102,
28684,
11269,
452,
671,
5420,
275,
625,
8542,
7533,
824,
347,
3733,
305,
507,
835,
597,
497,
2011,
281,
3157,
7882,
273,
3733,
253,
1246,
2929,
8219,
326,
253,
2934,
273,
36970,
476,
320,
4217,
323,
1236,
2510,
25912,
13757,
347,
973,
604,
253,
11786,
5476,
265,
403,
6777,
20420,
50276,
74,
452,
298,
17936,
44041,
10450,
670,
253,
2929,
327,
253,
2762,
1930,
253,
4081,
1332,
310,
247,
3626,
285,
24600,
5019,
273,
4891,
7681,
5697,
285,
697,
10527,
1783,
4620,
281,
320,
3451,
347,
253,
4477,
1127,
562,
616,
5933,
31167,
253,
2934,
273,
36970,
275,
247,
1199,
625,
3626,
1039,
685,
253,
2905,
28684,
38622,
5933,
3786,
4081,
407,
277,
1945,
267,
30441,
1162,
355,
4765,
1057,
253,
4679,
671,
5224,
690,
5750,
273,
36970,
275,
253,
5421,
13757,
3237,
50276,
251,
253,
643,
1133,
253,
10527,
7680,
310,
16888,
253,
5933,
285,
697,
1783,
310,
247,
15246,
5019,
273,
2045,
5697,
285,
253,
906,
3139,
36908,
9974,
479,
347,
10084,
387,
512,
840,
969,
4931,
436,
310,
625,
273,
247,
9759,
2523,
347,
352,
778,
320,
253,
1083,
326,
253,
4477,
858,
417,
8722,
281,
6780,
4518,
2217,
253,
7681,
7881,
597,
3058,
281,
11399,
281,
5276,
616,
10527,
1543,
33810,
891,
1089,
253,
1332,
323,
29985,
253,
27935,
281,
320,
2581,
10341,
285,
15225,
5544,
255,
1878,
516,
417,
2119,
604,
3780,
32139,
342,
253,
5393,
11786,
26480,
17888,
3082,
651,
1089,
436,
2746,
281,
320,
24600,
387,
512,
50276,
74,
717,
417,
4751,
13762,
407,
253,
5661,
1543,
2057,
1580,
891,
452,
271,
13214,
326,
253,
11786,
4297,
41804,
1332,
760,
23970,
2568,
1529,
47133,
281,
1614,
672,
25184,
253,
4373,
22041,
285,
697,
417,
2590,
387,
512,
326,
436,
747,
7877,
651,
6296,
19444,
2308,
273,
3045,
326,
497,
417,
20685,
494,
1078,
6296,
253,
4477,
1646,
281,
4993,
512,
4373,
22041,
2439,
512,
4679,
285,
760,
5234,
1475,
253,
28684,
4295,
2581,
685,
4560,
253,
1682,
25184,
323,
1016,
2060,
5933,
285,
10941,
253,
9056,
1543,
671,
891,
13414,
1663,
923,
667,
18276,
11701,
275,
253,
4715,
9191,
1955,
281,
253,
747,
4295,
2858,
5046,
891,
816,
16216,
1239,
841,
14580,
6283,
1580,
891,
452,
625,
273,
247,
29075,
382,
4114,
50276,
783,
4028,
310,
6571,
8718,
3738,
627,
310,
2316,
323,
7756,
275,
2426,
273,
48087,
897,
3340,
327,
253,
2914,
273,
7774,
534,
1646,
281,
320,
745,
275,
2761,
1046,
6197,
50276,
1189,
455,
891,
13414,
1928,
1077,
9848,
670,
7738,
14924,
6571,
984,
891,
1089,
253,
1543,
281,
320,
2581,
5061,
321,
20733,
891,
1804,
326,
253,
4477,
1611,
281,
18578,
479,
273,
253,
37825,
7881,
14475,
275,
253,
1783,
390,
670,
253,
19040,
8542,
5750,
326,
36970,
476,
4489,
323,
1236,
2510,
25912,
13757,
50276,
5992,
7193,
5701,
50275,
377,
18,
12002,
359,
1908,
747,
11640,
273,
13757,
5933,
296,
8701,
6197,
310,
2581,
21248,
285,
12314,
891,
5476,
368,
3078,
281,
3730,
281,
17133,
13757,
11333,
534,
310,
2686,
752,
368,
1908,
275,
253,
2929,
642,
878,
281,
320,
30069,
670,
7384,
17133,
414,
50276,
377,
18,
247,
2087,
41843,
342,
253,
3510,
33513,
326,
2168,
2722,
327,
253,
806,
3239,
310,
326,
36037,
280,
285,
1355,
5347,
36622,
403,
908,
45891,
285,
1293,
667,
4518,
38640,
9317,
4496,
25636,
50276,
377,
18,
519,
356,
4614,
50276,
15083,
80,
953,
253,
12087,
273,
253,
941,
285,
17923,
27096,
5535,
255,
678,
261,
2789,
352,
3590,
751,
643,
11333,
1056,
1327,
37650,
800,
11269,
50276,
377,
18,
14938,
369,
417,
2931,
1014,
4151,
595,
275,
253,
10199,
2568,
2168,
690,
14938,
14493,
403,
2429,
27321,
326,
581,
476,
320,
1199,
4577,
685,
258,
2609,
85,
436,
310,
417,
1077,
11453,
323,
10668,
342,
642,
2720,
2793,
275,
3909,
4715,
50276,
377,
18,
616,
14938,
1783,
403,
253,
14938,
1783,
275,
3909,
4715,
5371,
310,
436,
6197,
2820,
281,
1333,
50276,
377,
19,
323,
436,
5955,
273,
269,
18609,
352,
651,
320,
4217,
281,
7579,
326,
436,
5933,
1663,
760,
2789,
3282,
604,
253,
2957,
1159,
310,
17133,
671,
2905,
281,
436,
5955,
368,
3748,
326,
253,
3033,
323,
28684,
269,
18609,
476,
320,
1199,
4577,
685,
8084,
85,
533,
1620,
2686,
1333,
326,
8084,
85,
310,
7221,
991,
8654,
14920,
436,
5313,
273,
3634,
436,
3908,
556,
1652,
1318,
50276,
377,
20,
38622,
50276,
18566,
417,
29623,
281,
690,
2173,
17133,
3470,
74,
5476,
891,
2096,
752,
436,
6197,
310,
2820,
281,
1333,
533,
352,
5604,
36908,
1333,
352,
987,
2139,
651,
271,
5933,
29623,
281,
247,
1159,
50276,
377,
20,
5004,
436,
5740,
273,
26480,
17888,
3082,
310,
23228,
10105,
280,
752,
310,
209,
633,
1060,
752,
310,
253,
4229,
1127,
1269,
2139,
310,
436,
6974,
7763,
387,
512,
1060,
2139,
651,
581,
2868,
253,
6332,
281,
320,
2822,
8172,
275,
436,
1083,
651,
436,
4154,
789,
387,
512,
323,
1327,
44181,
16566,
50276,
377,
22,
18057,
337,
2139,
651,
581,
1902,
277,
3259,
281,
320,
6486,
275,
1340,
281,
5416,
436,
581,
651,
878,
281,
2199,
253,
10040,
684,
281,
247,
8566,
873,
50276,
377,
22,
987,
846,
18057,
337,
352,
1057,
2647,
849,
305,
85,
310,
4561,
50276,
262,
1057,
417,
2647,
849,
305,
85,
310,
4561,
50276,
377,
23,
1755,
359,
7558,
326,
352,
310,
4577,
685,
258,
2609,
85,
594,
326,
359,
403,
1175,
1060,
2811,
4555,
858,
436,
1750,
3176,
285,
275,
752,
3282,
403,
359,
1175,
1060,
671,
253,
22429,
275,
436,
12494,
943,
320,
30044,
50276,
377,
23,
4706,
4567,
1223,
436,
2593,
2789,
690,
4722,
2792,
697,
10541,
9193,
247,
2372,
1512,
15251,
1999,
24088,
3981,
326,
368,
403,
6600,
273,
247,
2045,
5933,
28763,
2074,
281,
13298,
285,
35373,
1066,
327,
253,
1750,
326,
253,
7342,
403,
1027,
2789,
253,
2505,
1928,
751,
368,
250,
3192,
247,
14397,
22567,
1014,
2167,
891,
16216,
923,
247,
2590,
1921,
323,
436,
275,
619,
1984,
253,
2746,
368,
12661,
310,
4518,
1027,
285,
625,
3626,
323,
253,
4096,
273,
634,
1263,
7152,
33032,
2520,
2929,
24772,
3332,
1543,
275,
3909,
4715,
285,
17133,
13757,
5742,
5223,
2351,
10254,
285,
36970,
253,
4477,
823,
271,
28684,
11786,
10554,
3213,
715,
253,
717,
84,
4971,
5933,
4081,
407,
28159,
74,
1162,
355,
4765,
25761,
597,
12661,
970,
253,
40373,
365,
5933,
273,
660,
26730,
1162,
355,
4022,
281,
1705,
598,
342,
253,
11786,
10554,
3213,
253,
747,
1332,
326,
597,
9569,
310,
1925,
28684,
717,
84,
4971,
285,
253,
4477,
1246,
1097,
10527,
23632,
347,
973,
347,
10704,
4679,
816,
5411,
436,
747,
1332,
50276,
783,
2929,
310,
4942,
973,
15720,
285,
253,
4477,
513,
247,
1175,
2628,
273,
15571,
3332,
789,
327,
5223,
2351,
10254,
285,
36970,
275,
3909,
4715,
285,
17133,
13757,
281,
41509,
616,
5933,
253,
5933,
310,
671,
3559,
4518,
285,
253,
958,
326,
253,
1332,
310,
11704,
407,
1097,
247,
14938,
3033,
347,
973,
347,
10704,
4679,
310,
14109,
50276,
255,
253,
1072,
673,
891,
1119,
253,
9759,
273,
436,
789,
281,
320,
247,
1652,
24363,
253,
2934,
273,
9433,
36970,
281,
38622,
369,
2168,
3559,
275,
277,
1945,
267,
30441,
1162,
355,
4765,
253,
5933,
275,
326,
2929,
310,
275,
958,
1925,
28684,
38622,
891,
1119,
352,
1077,
8921,
326,
253,
4477,
9703,
281,
41838,
326,
5933,
275,
436,
2929,
627,
403,
767,
2022,
3910,
875,
28684,
38622,
275,
277,
1945,
267,
30441,
1162,
355,
4765,
285,
28684,
717,
84,
4971,
253,
806,
310,
253,
6880,
432,
38622,
281,
717,
84,
4971,
534,
8687,
271,
4465,
11903,
1320,
3213,
1386,
818,
275,
5933,
374,
326,
310,
8993,
253,
1273,
310,
253,
4327,
273,
11786,
10554,
1332,
1580,
277,
1945,
267,
30441,
1162,
355,
4765,
497,
7514,
342,
12902,
14940,
597,
33549,
281,
897,
253,
954,
3332,
11786,
347,
253,
10554,
327,
253,
643,
1133,
253,
4477,
275,
436,
789,
403,
7514,
342,
2087,
3909,
13757,
594,
597,
897,
247,
4872,
5019,
273,
2469,
27935,
347,
253,
10554,
1754,
327,
247,
1332,
5611,
407,
660,
26730,
1162,
355,
4022,
327,
697,
1211,
891,
513,
417,
1089,
436,
18149,
281,
320,
10481,
4460,
390,
1534,
281,
15785,
9311,
50275,
783,
958,
326,
436,
2929,
3797,
10527,
23632,
323,
28684,
717,
84,
4971,
326,
497,
5816,
275,
277,
1945,
267,
30441,
1162,
355,
4765,
323,
28684,
38622,
1057,
1056,
352,
247,
1652,
625,
18511,
2299,
891,
1119,
253,
3033,
275,
10012,
337,
281,
320,
247,
1652,
8921,
275,
326,
337,
352,
36908,
4796,
281,
253,
717,
84,
4971,
3033,
672,
253,
11786,
13650,
403,
470,
285,
374,
352,
36908,
1646,
1805,
685,
253,
717,
84,
4971,
390,
28684,
269,
18609,
14493,
253,
4477,
1750,
281,
15249,
374,
407,
3981,
326,
253,
4465,
305,
85,
50276,
384,
1307,
310,
258,
2609,
85,
533,
253,
2644,
4549,
273,
17825,
11333,
310,
326,
253,
8084,
85,
2426,
403,
2856,
324,
2662,
253,
16774,
1543,
671,
513,
417,
2486,
2228,
8965,
534,
2789,
352,
1892,
281,
5963,
616,
8453,
50275,
9088,
497,
671,
1142,
47412,
474,
6332,
285,
963,
993,
275,
253,
2929,
50275,
977,
5701,
285,
3533,
337,
3239,
337,
616,
10527,
1783,
403,
253,
14938,
1783,
275,
3909,
4715,
47412,
474,
2228,
374,
3239,
374,
253,
4468,
310,
326,
849,
281,
755,
1175,
26301,
47412,
474,
2228,
495,
3239,
495,
347,
253,
2852,
789,
47412,
474,
2228,
577,
3239,
495,
15178,
287,
10936,
1332,
1745,
80,
50276,
22,
3239,
577,
342,
3280,
8414,
273,
47412,
474,
2228,
721,
3239,
577,
1067,
5933,
495,
342,
752,
310,
253,
15180,
2105,
273,
436,
3213,
581,
273,
253,
2022,
5373,
273,
11333,
751,
717,
84,
4971,
310,
326,
597,
1408,
275,
7687,
673,
342,
1077,
11134,
14637,
50276,
24,
3239,
577,
323,
436,
26480,
17888,
1332,
281,
973,
973,
253,
11786,
11390,
387,
247,
2173,
673,
13905,
310,
8025,
281,
320,
10848,
407,
608,
604,
253,
11786,
1057,
417,
1818,
3012,
436,
588,
320,
247,
11134,
1617,
604,
253,
11786,
36908,
1818,
3012,
840,
13887,
26301,
50276,
7332,
651,
671,
789,
973,
651,
2649,
352,
476,
368,
1705,
598,
342,
6667,
273,
16566,
323,
534,
436,
1332,
2789,
3282,
1014,
20953,
4394,
651,
17084,
436,
2929,
854,
3239,
608,
5150,
854,
347,
5469,
1840,
436,
3033,
36908,
3176,
281,
4796,
281,
253,
717,
84,
4971,
3033,
323,
26301,
50276,
17,
534,
2789,
352,
247,
1652,
43288,
3184,
253,
958,
326,
627,
310,
271,
4465,
2048,
326,
310,
2649,
275,
2426,
273,
253,
11786,
10554,
2228,
326,
581,
556,
323,
28684,
269,
18609,
671,
2789,
253,
3033,
247,
1652,
8921,
898,
3239,
818,
253,
2589,
28684,
1317,
4971,
342,
1027,
2193,
273,
391,
285,
10018,
2074,
3045,
368,
943,
3748,
326,
368,
921,
253,
3045,
323,
690,
273,
841,
1027,
2193,
275,
253,
30762,
884,
3239,
818,
23559,
14407,
1877,
3237,
1745,
80,
1903,
3239,
818,
4677,
337,
1293,
2228,
8965,
697,
7479,
281,
2028,
1880,
841,
1543,
403,
14282,
25761,
697,
8921,
281,
7472,
11333,
342,
3909,
17133,
13757,
23632,
327,
28841,
1327,
44181,
3237,
1249,
3239,
818,
7561,
5421,
285,
4882,
47412,
474,
2228,
2145,
3239,
854,
247,
2442,
10746,
47412,
474,
2228,
50254,
50276,
187,
187,
4118,
18435,
27,
783,
30628,
4469,
690,
1600,
275,
436,
2929,
533,
4583,
497,
298,
17936,
44041,
670,
697,
9021,
391,
21,
16540,
247,
7936,
2523,
342,
253,
9759,
273,
253,
1783,
923,
253,
277,
3259,
9376,
253,
913,
3021,
4566,
323,
247,
49620,
285,
501,
538,
2225
] |
[
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1
] |
[
285,
10254,
347,
908,
275,
717,
84,
4971,
285,
38622,
824,
28684,
15849,
3177,
281,
5476,
253,
2568,
328,
16564,
27935,
1078,
1016,
5731,
534,
476,
1421,
281,
1805,
14938,
23632,
672,
253,
5476,
265,
403,
7899,
275,
247,
2176,
3282,
436,
275,
1614,
476,
1421,
281,
7938,
14940,
672,
253,
4795,
5933,
310,
908,
275,
271,
13757,
7792,
253,
2173,
7680,
273,
253,
1246,
2929,
310,
18597,
19186,
326,
28684,
11269,
476,
6296,
320,
5678,
342,
7269,
3082,
751,
38622,
285,
717,
84,
4971,
671,
5277,
247,
14938,
1783,
273,
253,
3438,
5933,
327,
253,
8542,
2914,
253,
4477,
671,
12661,
247,
1332,
8244,
37427,
285,
3796,
17680,
323,
29985,
253,
1735,
11786,
285,
253,
27585,
6974,
310,
2011,
281,
789,
973,
45190,
275,
3733,
3676,
11454,
6928,
50276,
783,
2934,
273,
28684,
11269,
556,
644,
4633,
275,
3332,
1107,
1561,
253,
3909,
28269,
6239,
285,
556,
644,
908,
342,
3782,
1270,
2323,
323,
17170,
5520,
14940,
23632,
323,
4715,
45571,
5182,
275,
3958,
625,
4102,
28684,
11269,
452,
671,
5420,
275,
625,
8542,
7533,
824,
347,
3733,
305,
507,
835,
597,
497,
2011,
281,
3157,
7882,
273,
3733,
253,
1246,
2929,
8219,
326,
253,
2934,
273,
36970,
476,
320,
4217,
323,
1236,
2510,
25912,
13757,
347,
973,
604,
253,
11786,
5476,
265,
403,
6777,
20420,
50276,
74,
452,
298,
17936,
44041,
10450,
670,
253,
2929,
327,
253,
2762,
1930,
253,
4081,
1332,
310,
247,
3626,
285,
24600,
5019,
273,
4891,
7681,
5697,
285,
697,
10527,
1783,
4620,
281,
320,
3451,
347,
253,
4477,
1127,
562,
616,
5933,
31167,
253,
2934,
273,
36970,
275,
247,
1199,
625,
3626,
1039,
685,
253,
2905,
28684,
38622,
5933,
3786,
4081,
407,
277,
1945,
267,
30441,
1162,
355,
4765,
1057,
253,
4679,
671,
5224,
690,
5750,
273,
36970,
275,
253,
5421,
13757,
3237,
50276,
251,
253,
643,
1133,
253,
10527,
7680,
310,
16888,
253,
5933,
285,
697,
1783,
310,
247,
15246,
5019,
273,
2045,
5697,
285,
253,
906,
3139,
36908,
9974,
479,
347,
10084,
387,
512,
840,
969,
4931,
436,
310,
625,
273,
247,
9759,
2523,
347,
352,
778,
320,
253,
1083,
326,
253,
4477,
858,
417,
8722,
281,
6780,
4518,
2217,
253,
7681,
7881,
597,
3058,
281,
11399,
281,
5276,
616,
10527,
1543,
33810,
891,
1089,
253,
1332,
323,
29985,
253,
27935,
281,
320,
2581,
10341,
285,
15225,
5544,
255,
1878,
516,
417,
2119,
604,
3780,
32139,
342,
253,
5393,
11786,
26480,
17888,
3082,
651,
1089,
436,
2746,
281,
320,
24600,
387,
512,
50276,
74,
717,
417,
4751,
13762,
407,
253,
5661,
1543,
2057,
1580,
891,
452,
271,
13214,
326,
253,
11786,
4297,
41804,
1332,
760,
23970,
2568,
1529,
47133,
281,
1614,
672,
25184,
253,
4373,
22041,
285,
697,
417,
2590,
387,
512,
326,
436,
747,
7877,
651,
6296,
19444,
2308,
273,
3045,
326,
497,
417,
20685,
494,
1078,
6296,
253,
4477,
1646,
281,
4993,
512,
4373,
22041,
2439,
512,
4679,
285,
760,
5234,
1475,
253,
28684,
4295,
2581,
685,
4560,
253,
1682,
25184,
323,
1016,
2060,
5933,
285,
10941,
253,
9056,
1543,
671,
891,
13414,
1663,
923,
667,
18276,
11701,
275,
253,
4715,
9191,
1955,
281,
253,
747,
4295,
2858,
5046,
891,
816,
16216,
1239,
841,
14580,
6283,
1580,
891,
452,
625,
273,
247,
29075,
382,
4114,
50276,
783,
4028,
310,
6571,
8718,
3738,
627,
310,
2316,
323,
7756,
275,
2426,
273,
48087,
897,
3340,
327,
253,
2914,
273,
7774,
534,
1646,
281,
320,
745,
275,
2761,
1046,
6197,
50276,
1189,
455,
891,
13414,
1928,
1077,
9848,
670,
7738,
14924,
6571,
984,
891,
1089,
253,
1543,
281,
320,
2581,
5061,
321,
20733,
891,
1804,
326,
253,
4477,
1611,
281,
18578,
479,
273,
253,
37825,
7881,
14475,
275,
253,
1783,
390,
670,
253,
19040,
8542,
5750,
326,
36970,
476,
4489,
323,
1236,
2510,
25912,
13757,
50276,
5992,
7193,
5701,
50275,
377,
18,
12002,
359,
1908,
747,
11640,
273,
13757,
5933,
296,
8701,
6197,
310,
2581,
21248,
285,
12314,
891,
5476,
368,
3078,
281,
3730,
281,
17133,
13757,
11333,
534,
310,
2686,
752,
368,
1908,
275,
253,
2929,
642,
878,
281,
320,
30069,
670,
7384,
17133,
414,
50276,
377,
18,
247,
2087,
41843,
342,
253,
3510,
33513,
326,
2168,
2722,
327,
253,
806,
3239,
310,
326,
36037,
280,
285,
1355,
5347,
36622,
403,
908,
45891,
285,
1293,
667,
4518,
38640,
9317,
4496,
25636,
50276,
377,
18,
519,
356,
4614,
50276,
15083,
80,
953,
253,
12087,
273,
253,
941,
285,
17923,
27096,
5535,
255,
678,
261,
2789,
352,
3590,
751,
643,
11333,
1056,
1327,
37650,
800,
11269,
50276,
377,
18,
14938,
369,
417,
2931,
1014,
4151,
595,
275,
253,
10199,
2568,
2168,
690,
14938,
14493,
403,
2429,
27321,
326,
581,
476,
320,
1199,
4577,
685,
258,
2609,
85,
436,
310,
417,
1077,
11453,
323,
10668,
342,
642,
2720,
2793,
275,
3909,
4715,
50276,
377,
18,
616,
14938,
1783,
403,
253,
14938,
1783,
275,
3909,
4715,
5371,
310,
436,
6197,
2820,
281,
1333,
50276,
377,
19,
323,
436,
5955,
273,
269,
18609,
352,
651,
320,
4217,
281,
7579,
326,
436,
5933,
1663,
760,
2789,
3282,
604,
253,
2957,
1159,
310,
17133,
671,
2905,
281,
436,
5955,
368,
3748,
326,
253,
3033,
323,
28684,
269,
18609,
476,
320,
1199,
4577,
685,
8084,
85,
533,
1620,
2686,
1333,
326,
8084,
85,
310,
7221,
991,
8654,
14920,
436,
5313,
273,
3634,
436,
3908,
556,
1652,
1318,
50276,
377,
20,
38622,
50276,
18566,
417,
29623,
281,
690,
2173,
17133,
3470,
74,
5476,
891,
2096,
752,
436,
6197,
310,
2820,
281,
1333,
533,
352,
5604,
36908,
1333,
352,
987,
2139,
651,
271,
5933,
29623,
281,
247,
1159,
50276,
377,
20,
5004,
436,
5740,
273,
26480,
17888,
3082,
310,
23228,
10105,
280,
752,
310,
209,
633,
1060,
752,
310,
253,
4229,
1127,
1269,
2139,
310,
436,
6974,
7763,
387,
512,
1060,
2139,
651,
581,
2868,
253,
6332,
281,
320,
2822,
8172,
275,
436,
1083,
651,
436,
4154,
789,
387,
512,
323,
1327,
44181,
16566,
50276,
377,
22,
18057,
337,
2139,
651,
581,
1902,
277,
3259,
281,
320,
6486,
275,
1340,
281,
5416,
436,
581,
651,
878,
281,
2199,
253,
10040,
684,
281,
247,
8566,
873,
50276,
377,
22,
987,
846,
18057,
337,
352,
1057,
2647,
849,
305,
85,
310,
4561,
50276,
262,
1057,
417,
2647,
849,
305,
85,
310,
4561,
50276,
377,
23,
1755,
359,
7558,
326,
352,
310,
4577,
685,
258,
2609,
85,
594,
326,
359,
403,
1175,
1060,
2811,
4555,
858,
436,
1750,
3176,
285,
275,
752,
3282,
403,
359,
1175,
1060,
671,
253,
22429,
275,
436,
12494,
943,
320,
30044,
50276,
377,
23,
4706,
4567,
1223,
436,
2593,
2789,
690,
4722,
2792,
697,
10541,
9193,
247,
2372,
1512,
15251,
1999,
24088,
3981,
326,
368,
403,
6600,
273,
247,
2045,
5933,
28763,
2074,
281,
13298,
285,
35373,
1066,
327,
253,
1750,
326,
253,
7342,
403,
1027,
2789,
253,
2505,
1928,
751,
368,
250,
3192,
247,
14397,
22567,
1014,
2167,
891,
16216,
923,
247,
2590,
1921,
323,
436,
275,
619,
1984,
253,
2746,
368,
12661,
310,
4518,
1027,
285,
625,
3626,
323,
253,
4096,
273,
634,
1263,
7152,
33032,
2520,
2929,
24772,
3332,
1543,
275,
3909,
4715,
285,
17133,
13757,
5742,
5223,
2351,
10254,
285,
36970,
253,
4477,
823,
271,
28684,
11786,
10554,
3213,
715,
253,
717,
84,
4971,
5933,
4081,
407,
28159,
74,
1162,
355,
4765,
25761,
597,
12661,
970,
253,
40373,
365,
5933,
273,
660,
26730,
1162,
355,
4022,
281,
1705,
598,
342,
253,
11786,
10554,
3213,
253,
747,
1332,
326,
597,
9569,
310,
1925,
28684,
717,
84,
4971,
285,
253,
4477,
1246,
1097,
10527,
23632,
347,
973,
347,
10704,
4679,
816,
5411,
436,
747,
1332,
50276,
783,
2929,
310,
4942,
973,
15720,
285,
253,
4477,
513,
247,
1175,
2628,
273,
15571,
3332,
789,
327,
5223,
2351,
10254,
285,
36970,
275,
3909,
4715,
285,
17133,
13757,
281,
41509,
616,
5933,
253,
5933,
310,
671,
3559,
4518,
285,
253,
958,
326,
253,
1332,
310,
11704,
407,
1097,
247,
14938,
3033,
347,
973,
347,
10704,
4679,
310,
14109,
50276,
255,
253,
1072,
673,
891,
1119,
253,
9759,
273,
436,
789,
281,
320,
247,
1652,
24363,
253,
2934,
273,
9433,
36970,
281,
38622,
369,
2168,
3559,
275,
277,
1945,
267,
30441,
1162,
355,
4765,
253,
5933,
275,
326,
2929,
310,
275,
958,
1925,
28684,
38622,
891,
1119,
352,
1077,
8921,
326,
253,
4477,
9703,
281,
41838,
326,
5933,
275,
436,
2929,
627,
403,
767,
2022,
3910,
875,
28684,
38622,
275,
277,
1945,
267,
30441,
1162,
355,
4765,
285,
28684,
717,
84,
4971,
253,
806,
310,
253,
6880,
432,
38622,
281,
717,
84,
4971,
534,
8687,
271,
4465,
11903,
1320,
3213,
1386,
818,
275,
5933,
374,
326,
310,
8993,
253,
1273,
310,
253,
4327,
273,
11786,
10554,
1332,
1580,
277,
1945,
267,
30441,
1162,
355,
4765,
497,
7514,
342,
12902,
14940,
597,
33549,
281,
897,
253,
954,
3332,
11786,
347,
253,
10554,
327,
253,
643,
1133,
253,
4477,
275,
436,
789,
403,
7514,
342,
2087,
3909,
13757,
594,
597,
897,
247,
4872,
5019,
273,
2469,
27935,
347,
253,
10554,
1754,
327,
247,
1332,
5611,
407,
660,
26730,
1162,
355,
4022,
327,
697,
1211,
891,
513,
417,
1089,
436,
18149,
281,
320,
10481,
4460,
390,
1534,
281,
15785,
9311,
50275,
783,
958,
326,
436,
2929,
3797,
10527,
23632,
323,
28684,
717,
84,
4971,
326,
497,
5816,
275,
277,
1945,
267,
30441,
1162,
355,
4765,
323,
28684,
38622,
1057,
1056,
352,
247,
1652,
625,
18511,
2299,
891,
1119,
253,
3033,
275,
10012,
337,
281,
320,
247,
1652,
8921,
275,
326,
337,
352,
36908,
4796,
281,
253,
717,
84,
4971,
3033,
672,
253,
11786,
13650,
403,
470,
285,
374,
352,
36908,
1646,
1805,
685,
253,
717,
84,
4971,
390,
28684,
269,
18609,
14493,
253,
4477,
1750,
281,
15249,
374,
407,
3981,
326,
253,
4465,
305,
85,
50276,
384,
1307,
310,
258,
2609,
85,
533,
253,
2644,
4549,
273,
17825,
11333,
310,
326,
253,
8084,
85,
2426,
403,
2856,
324,
2662,
253,
16774,
1543,
671,
513,
417,
2486,
2228,
8965,
534,
2789,
352,
1892,
281,
5963,
616,
8453,
50275,
9088,
497,
671,
1142,
47412,
474,
6332,
285,
963,
993,
275,
253,
2929,
50275,
977,
5701,
285,
3533,
337,
3239,
337,
616,
10527,
1783,
403,
253,
14938,
1783,
275,
3909,
4715,
47412,
474,
2228,
374,
3239,
374,
253,
4468,
310,
326,
849,
281,
755,
1175,
26301,
47412,
474,
2228,
495,
3239,
495,
347,
253,
2852,
789,
47412,
474,
2228,
577,
3239,
495,
15178,
287,
10936,
1332,
1745,
80,
50276,
22,
3239,
577,
342,
3280,
8414,
273,
47412,
474,
2228,
721,
3239,
577,
1067,
5933,
495,
342,
752,
310,
253,
15180,
2105,
273,
436,
3213,
581,
273,
253,
2022,
5373,
273,
11333,
751,
717,
84,
4971,
310,
326,
597,
1408,
275,
7687,
673,
342,
1077,
11134,
14637,
50276,
24,
3239,
577,
323,
436,
26480,
17888,
1332,
281,
973,
973,
253,
11786,
11390,
387,
247,
2173,
673,
13905,
310,
8025,
281,
320,
10848,
407,
608,
604,
253,
11786,
1057,
417,
1818,
3012,
436,
588,
320,
247,
11134,
1617,
604,
253,
11786,
36908,
1818,
3012,
840,
13887,
26301,
50276,
7332,
651,
671,
789,
973,
651,
2649,
352,
476,
368,
1705,
598,
342,
6667,
273,
16566,
323,
534,
436,
1332,
2789,
3282,
1014,
20953,
4394,
651,
17084,
436,
2929,
854,
3239,
608,
5150,
854,
347,
5469,
1840,
436,
3033,
36908,
3176,
281,
4796,
281,
253,
717,
84,
4971,
3033,
323,
26301,
50276,
17,
534,
2789,
352,
247,
1652,
43288,
3184,
253,
958,
326,
627,
310,
271,
4465,
2048,
326,
310,
2649,
275,
2426,
273,
253,
11786,
10554,
2228,
326,
581,
556,
323,
28684,
269,
18609,
671,
2789,
253,
3033,
247,
1652,
8921,
898,
3239,
818,
253,
2589,
28684,
1317,
4971,
342,
1027,
2193,
273,
391,
285,
10018,
2074,
3045,
368,
943,
3748,
326,
368,
921,
253,
3045,
323,
690,
273,
841,
1027,
2193,
275,
253,
30762,
884,
3239,
818,
23559,
14407,
1877,
3237,
1745,
80,
1903,
3239,
818,
4677,
337,
1293,
2228,
8965,
697,
7479,
281,
2028,
1880,
841,
1543,
403,
14282,
25761,
697,
8921,
281,
7472,
11333,
342,
3909,
17133,
13757,
23632,
327,
28841,
1327,
44181,
3237,
1249,
3239,
818,
7561,
5421,
285,
4882,
47412,
474,
2228,
2145,
3239,
854,
247,
2442,
10746,
47412,
474,
2228,
50254,
50276,
187,
187,
4118,
18435,
27,
783,
30628,
4469,
690,
1600,
275,
436,
2929,
533,
4583,
497,
298,
17936,
44041,
670,
697,
9021,
391,
21,
16540,
247,
7936,
2523,
342,
253,
9759,
273,
253,
1783,
923,
253,
277,
3259,
9376,
253,
913,
3021,
4566,
323,
247,
49620,
285,
501,
538,
2225
] |
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper finds an interesting problem with the residual networks ie the network loafing problem they find that the subnetworks will perform worse than those that are individually trained inspired by social psychology they propose a stimulative training strategy which randomly samples a residual subnetwork and calculates the kldivergence loss between the sampled subnetwork and the residual network to provide additional supervision to the subnetwork the proposed strategy is validated by a comprehensive theoretical and empirical analysis

originality i like the idea to view the residual network from a social psychological perspective this perspective is creative following the unraveled view they see the residual network as an ensemble of subnetworks and study the performance of each subnetwork they discover a novel problem called network loafing which is analogous to the problem in social psychology where an individual contributes less when in a group
quality in general the paper is technically sound
clarity overall the paper is well written and easy to understand the motivation and the contributions are clear and the whole paper is organized well some minor issues with the writing some experimental details are missing eg how do you measure the performance of subnetworks will you retrain the last logits layer see the questions above

docsepthis paper explores the use of a training method for residual networks called stimulative training this method is inspired by the psychological phenomenon of social loafing where members of a group give less effort when in a group compared to when not in a group the paper hypothesizes a similar network loafing phenomenon for neural networks where subnetworks in a larger network would perform suboptimally compared to when a similar network is trained by itself the paper devises a training strategy called stimulative training that samples random subnetworks and minimizes a kl divergence loss between the output of the sampled subnetwork and the full network the paper performs a few experiments to show that the loafing problem exists with traditional training strategies and is eliminated or reduced with their method their method improves the robustness of the network to deletion or permutation of layers and improves the overall performance of the network for cifar 100 and imagenet datasets

strengths connecting the training of neural networks to the social loafing phenomenon was an interesting point showing that simply getting each individual subnetwork more supervision to a common goal leads to an improvement in performance is also an interesting result as i could imagine that it could lead to worse generalization on test set something like the group think phenomenon i think the types of experiments selected were good they covered a range of questions about this phenomenon

weaknesses as far as i can tell the results are based on only a single run of data to draw the conclusions that the paper wants us to i think they need to show that the results are replicable for example even if we say network loafing exists its unclear if we can conclude that it is a problem with a single run of data table 1 shows the results comparing the final accuracy of conventional training with stimulative training and the improvement is small to conclude that it is significant the paper needs to show that it is replicable the method seems fairly similar to self distillation in self distillation they have some added machinery to be able to use the method on architectures where the output of each layer might not be the same size in this paper because the architectures are limited to residual networks that machinery is not needed still it seems to be a special case of self distillation for residual networks although the paper does examine the specific phenomenon of network loafing it would be interesting to see something like selfdistillation as a baseline
for figure 4 please use more distinctive markers its difficult to tell the difference between markers of the same color
other than showing reproducibility id mostly want to see a comparison to or discussion of a method like selfdistillation which is not limited by the type of architecture which this method is assuming the network loafing phenomenon exists with other architectures as well why shouldnt we use something like selfdistillation instead of this method since its more general is there a big improvement in the ease of use or performance with resnets when using this method also i am curious as to how much longer this method takes compared to conventional training

docsepthe paper proposes a view on residual network training from a social psychology perspective an analogy is made where individual subnetworks of a resnet are compared to people that are assigned a group project social loafing is the phenomenon that work is often not equally distributed in such groups the paper argues that subnetworks do not perform equally on the classification task in a similar fashion and proposes two mechanisms random sampling of blocks and kl divergence between intermediate representations as a remedy

strengths the results show that the proposed mechanisms do improve the performance of the overall network and also individual subnetworks the writing of the paper is clear and the goal method and results are presented in an easily understandable format the paper contains many results of training networks with error bars which has high computational load

weaknesses framing the grounding of the work in social psychology is extremely far fetched resblocks are compared to people in a complex social group setting the paper itself compares training a network to polynomial regression which has nothing to do with the complex social dynamics in a work group there are no insights gained from this analogy and comparing a couple of resblocks to people contributes to false ai hype and other ethical issues i strongly advocate for removing this framing from the paper
learning redundant blocks a large portion of the analysis is aimed at the performance of individual sub networks permuting and deleting residual blocks and the kl divergence between intermediate representations in all these experiments the proposed training strategy shows better results than conventional training this is not very surprising as the additional objective of adding a kl divergence term between representations and sampling subnetworks trains the model exactly for these tasks moreover both objectives essentially train the individual block to be redundant this is another breakdown point for the loafing analogy where the optimal group work outcome is not achieved when every member does the same job it is questionable if this is a desired behavior as cnns are commonly understood to learn a hierarchical structure of representations if blocks can be reshuffled or deleted without affecting the performance much this structure is lost it is possible that this effect is less important on small images such as cifar100
comparisons the method is only compared to conventional training similar approaches have been explored that add auxiliary losses distill representation student teacher kd 1219 or drop residual blocks 22 and it would be important to understand their effect on the network in comparison to the proposed approach this would also make the comparison of dropping and reshuffling blocks more interesting
limitations and societal impact are not discussed in the paper especially because of the framing as inspired by social psychology it would be very important to discuss both aspects in the paper

docsepthe authors argue that subnetworks in resnet tend to perform suboptimally due to a social loafing effect that is since no single subnetwork is accountable for the final performance they all tend to rely on the aggregate group performance rather than achieving competitive individual performance in order to alleviate this problem the authors propose including an additional loss term that encourages subnetworks to match the predictions of the group in experiments the authors show that not only their loss solves the loafing problem but also improves the overall final performance an additional theoretical analysis shows that improving subnetworks is tied to improving overall network performance

overall review overall the authors present an interesting phenomenon in residual networks in an ingenious way social loafing while making use of that knowledge to improve such architecture on the other hand the experimental evaluation leaves many questions open see below in its current state i think this work does not meet neurips standards but i encourage the authors to continue working on it and to submit a rebuttal

strengths the authors succeed at showing that social loafing does happen in residual networks and that fixing it results in a performance improvement the experimental design succeeds at convincing the reader that theres loafing and that it can be fixed the authors include a theoretical analysis and reproducibility details in the appendix

weaknesses reading the submission generates a series of questions that are not answered by this work for instance does loafing happen in stochastic depth networks in fact it would be nice to include stochastic depth networks as a baseline in all the figures can your method be applied to pretrained neural networks what if instead of the kl you used the same crossentropy loss as in the output the authors of densenet a state that this kind of neural network is less redundant than resnet since each layer is aware of what the other layers are learning it would be interesting if you checked whether densenet also suffers from loafing the block dropping strategy is the same as in stochastic depth networks thus i would devote less space to it in the paper and state that you use the same strategy while stochastic depth networks reduce the training computation time this work increases it
a huang gao et al densely connected convolutional networks proceedings of the ieee conference on computer vision and pattern recognition 2017

detailed comments
originality i found the connection with social loafing original the method to solve it is a combination of multiple existing ideas but i think that it can be considered novel given the context in which it is introduced
quality the overall quality is good but the experimental quality could be improved for example you could explore if loafing happens or if it is solved with other methods such as stochastic depth or densenet
clarity the text is clear and easy to read there are some minor typos
significance given the performance improvement introduced by the proposed method it could be adopted by future work a major drawback is the increase in computational cost due to having to compute two losses every time
reproducibility the authors provide details to reproduce their work however they do not provide the final code
there is no limitations section i encourage the authors to add one discussing issues such as the computational cost or the applicability of their method beyond resnets such as transformers
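editor's note: since the reviews above paraphrase the training recipe in words only, here is a compact pytorch-style sketch of the idea as they describe it -- sample a random residual subnetwork, run the same batch through both the full network and the subnetwork, and add a kl-divergence term between the two output distributions on top of the usual classification loss. this is a reconstruction from the reviews, not the authors' released code; the masked-forward interface (keep_blocks), the 0.5 block-sampling probability, the temperature, the loss weight lam, and the choice to detach the full network's output (treating it as a fixed teacher) are all assumptions made for illustration.

```python
# Reconstruction from the review text; NOT the authors' implementation.
# Assumes a ResNet-like model whose forward() accepts an optional boolean
# keep_blocks mask selecting which residual blocks are executed (hypothetical API).
import random
import torch.nn.functional as F

def stimulative_training_step(model, x, y, optimizer, num_blocks,
                              temperature=1.0, lam=1.0):
    optimizer.zero_grad()

    # Full network: ordinary supervised loss.
    logits_full = model(x)                       # all residual blocks active
    loss_cls = F.cross_entropy(logits_full, y)

    # Sample a random subnetwork by dropping a random subset of residual blocks.
    keep = [random.random() < 0.5 for _ in range(num_blocks)]
    logits_sub = model(x, keep_blocks=keep)      # hypothetical masked forward

    # Extra supervision: pull the subnetwork's prediction toward the full network's.
    p_full = F.softmax(logits_full.detach() / temperature, dim=1)
    log_p_sub = F.log_softmax(logits_sub / temperature, dim=1)
    loss_kl = F.kl_div(log_p_sub, p_full, reduction="batchmean")

    loss = loss_cls + lam * loss_kl
    loss.backward()
    optimizer.step()
    return loss.item()
```

whether the full network's logits should be detached here is exactly the kind of design detail the reviews do not pin down; the repeated comparison to self-distillation suggests either a fixed-teacher or a jointly trained variant could be meant.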
### Summary:
|
this paper proposes to study the loafing problem in deep resnets which suggests that the sub networks of a deep resnet perform significantly worse than the same architecture trained alone it proposes a simple technique which jointly trains the main network and minimizes the kl divergence between the main networks output and that of a random subnetwork it is shown empirically that this technique improves the final accuracy for both the main network and random subnetworks the reviewers agreed that the loafing problem is an interesting phenomenon but also raised concerns about both the motivation and presentation and the comparison with similar techniques like deep supervision and self distillation the authors provided extensive responses with new additional experimental results after the discussion phase the reviewers reached a consensus of acceptance conditioned on the authors carefully addressing the framing of loafing and making clear that the loafing term is just a loose analogy without any real implications to biology the ac agrees that the problem identified in this paper is interesting and can have implications to both regularization and model compression however the authors should try to remove the excessive reference to the social psychology aspects which does not provide scientific justification to the method but rather could invite unnecessary confusion and controversy
|
[
310,
22335,
3590,
50276,
498,
15752,
4583,
253,
2929,
310,
973,
3542,
285,
3477,
281,
2096,
253,
16038,
285,
253,
9021,
403,
2590,
285,
253,
2644,
2929,
310,
10932,
973,
690,
5884,
3374,
342,
253,
4028,
690,
5661,
4278,
403,
5816,
24088,
849,
513,
368,
2557,
253,
3045,
273,
749,
3024,
4896,
588,
368,
851,
1949,
253,
1390,
2412,
953,
3828,
50276,
2887,
253,
3533,
1840,
5474,
33032,
2520,
2929,
33826,
253,
897,
273,
247,
3733,
1332,
323,
12541,
6928,
1925,
7328,
800,
3733,
436,
1332,
310,
11797,
407,
253,
12264,
11562,
273,
2675,
48518,
272,
835,
2758,
273,
247,
1387,
1918,
1679,
3434,
672,
275,
247,
1387,
2429,
281,
672,
417,
275,
247,
1387,
253,
2929,
6482,
4219,
247,
2074,
2990,
48518,
272,
11562,
323,
11454,
6928,
835,
749,
3024,
4896,
275,
247,
4067,
2990,
651,
1347,
749,
32581,
595,
2429,
281,
672,
247,
2074,
2990,
310,
10166,
407,
3139,
253,
2929,
1474,
3013,
247,
3733,
5700,
1925,
7328,
800,
3733,
326,
3530,
3632,
749,
3024,
4896,
285,
46926,
247,
27451,
23279,
2957,
875,
253,
3453,
273,
253,
19958,
749,
18428,
285,
253,
2120,
2990,
253,
2929,
17923,
247,
1643,
4679,
281,
921,
326,
50275,
783,
48518,
272,
1895,
4961,
342,
5899,
3733,
8130,
285,
310,
17527,
43408,
342,
616,
1332,
50276,
14094,
1332,
19132,
253,
31640,
273,
253,
2990,
281,
17404,
468,
10082,
318,
273,
8090,
50276,
303,
856,
1634,
253,
4583,
3045,
273,
253,
2990,
323,
260,
338,
274,
2233,
285,
4440,
257,
292,
15302,
20544,
50276,
11025,
272,
253,
3733,
273,
11454,
6928,
281,
253,
2675,
48518,
272,
11562,
369,
271,
4722,
1127,
4645,
326,
3365,
2970,
1016,
2060,
749,
18428,
625,
20446,
281,
247,
1846,
4736,
5644,
281,
271,
7756,
275,
3045,
310,
671,
271,
4722,
906,
347,
891,
812,
8564,
326,
352,
812,
1421,
281,
7197,
26647,
327,
1071,
873,
1633,
751,
253,
1387,
1158,
11562,
50276,
74,
1158,
253,
3510,
273,
4679,
4236,
369,
1175,
597,
6107,
247,
2491,
273,
3533,
670,
436,
11562,
50276,
20881,
1255,
265,
50276,
284,
2080,
347,
891,
476,
2028,
253,
1543,
403,
1754,
327,
760,
247,
2014,
1408,
273,
941,
281,
3812,
253,
11815,
326,
253,
2929,
5605,
441,
281,
891,
1158,
597,
878,
281,
921,
326,
253,
1543,
403,
7446,
494,
323,
1650,
1014,
604,
359,
1333,
2990,
48518,
272,
4961,
697,
12744,
604,
359,
476,
7525,
326,
352,
310,
247,
1895,
342,
247,
2014,
1408,
273,
941,
2829,
337,
2722,
253,
1543,
10941,
253,
2457,
7200,
273,
6041,
3733,
342,
7328,
800,
3733,
285,
253,
7756,
310,
1355,
281,
7525,
326,
352,
310,
1534,
253,
2929,
3198,
281,
921,
326,
352,
310,
7446,
494,
50276,
783,
1332,
3133,
9648,
2074,
281,
1881,
940,
21755,
275,
1881,
940,
21755,
597,
452,
690,
2879,
20949,
281,
320,
2104,
281,
897,
253,
1332,
327,
35615,
835,
253,
3453,
273,
1016,
3828,
1537,
417,
320,
253,
1072,
1979,
275,
436,
2929,
984,
253,
35615,
403,
3710,
281,
12541,
6928,
326,
20949,
310,
417,
3058,
1335,
352,
3133,
281,
320,
247,
2714,
1083,
273,
1881,
940,
21755,
323,
12541,
6928,
3738,
253,
2929,
1057,
9186,
253,
2173,
11562,
273,
2990,
48518,
272,
352,
651,
320,
4722,
281,
923,
1633,
751,
1881,
8155,
21755,
347,
247,
8245,
50276,
1542,
4677,
577,
4496,
897,
625,
21488,
9588,
697,
2834,
281,
2028,
253,
3064,
875,
9588,
273,
253,
1072,
3295,
643,
685,
4645,
38041,
2654,
6571,
971,
281,
923,
247,
5301,
15779,
10055,
273,
247,
1332,
751,
1881,
8155,
21755,
534,
310,
417,
3710,
407,
253,
1511,
273,
10336,
534,
436,
1332,
310,
7384,
253,
2990,
48518,
272,
11562,
4961,
342,
643,
35615,
347,
973,
2139,
943,
2649,
359,
897,
1633,
751,
1881,
8155,
21755,
3185,
273,
436,
1332,
1580,
697,
625,
2087,
310,
627,
247,
1943,
7756,
275,
253,
11990,
273,
897,
390,
3045,
342,
501,
47301,
672,
970,
436,
1332,
671,
891,
717,
14338,
347,
281,
849,
1199,
3356,
436,
1332,
3936,
2429,
281,
6041,
3733,
5474,
339,
431,
248,
2929,
29328,
247,
1859,
327,
12541,
2990,
3733,
432,
247,
2675,
20162,
8668,
271,
24760,
310,
1160,
835,
2060,
749,
3024,
4896,
273,
247,
501,
3024,
403,
2429,
281,
952,
326,
403,
7922,
247,
1387,
2199,
2675,
48518,
272,
310,
253,
11562,
326,
789,
310,
2223,
417,
9696,
5939,
275,
824,
2390,
253,
2929,
8219,
326,
749,
3024,
4896,
513,
417,
1347,
9696,
327,
253,
9162,
4836,
275,
247,
2074,
8142,
285,
29328,
767,
6297,
3632,
10491,
273,
8336,
285,
27451,
23279,
875,
10444,
14237,
347,
247,
16748,
20544,
50276,
783,
1543,
921,
326,
253,
4081,
6297,
513,
3157,
253,
3045,
273,
253,
4583,
2990,
285,
671,
2060,
749,
3024,
4896,
50276,
783,
4028,
273,
253,
2929,
310,
2590,
285,
253,
4736,
1332,
285,
1543,
403,
3559,
275,
271,
4354,
34007,
5981,
50275,
783,
2929,
4428,
1142,
1543,
273,
3733,
6928,
342,
2228,
8965,
534,
556,
1029,
15180,
3301,
50276,
20881,
1255,
265,
50276,
925,
6472,
253,
3216,
272,
273,
253,
789,
275,
2675,
20162,
310,
6685,
2080,
8264,
2147,
501,
27027,
403,
2429,
281,
952,
275,
247,
2570,
2675,
1387,
4758,
253,
2929,
3139,
26662,
3733,
247,
2990,
281,
14189,
9077,
534,
556,
2717,
281,
513,
342,
253,
2570,
2675,
8062,
275,
247,
789,
1387,
627,
403,
642,
16039,
12103,
432,
436,
24760,
285,
10941,
247,
4564,
273,
501,
27027,
281,
952,
17904,
281,
3221,
23105,
31012,
285,
643,
16289,
3374,
891,
7052,
21424,
323,
11922,
436,
39926,
432,
253,
2929,
50276,
28269,
28116,
8336,
247,
1781,
5110,
273,
253,
1783,
310,
11205,
387,
253,
3045,
273,
2060,
749,
6928,
8143,
9634,
285,
37193,
12541,
8336,
285,
253,
27451,
23279,
875,
10444,
14237,
275,
512,
841,
4679,
253,
4081,
3733,
5700,
2722,
1805,
1543,
685,
6041,
3733,
436,
310,
417,
1077,
10084,
347,
253,
3081,
8103,
273,
6240,
247,
27451,
23279,
1307,
875,
14237,
285,
10491,
749,
3024,
4896,
18784,
253,
1566,
4555,
323,
841,
8892,
25761,
1097,
16566,
9093,
6194,
253,
2060,
2972,
281,
320,
28116,
436,
310,
1529,
19501,
1127,
323,
253,
48518,
272,
24760,
835,
253,
8654,
1387,
789,
6454,
310,
417,
6786,
672,
1046,
3558,
1057,
253,
1072,
2628,
352,
310,
30455,
604,
436,
310,
247,
6799,
3879,
347,
260,
79,
2224,
403,
7744,
7192,
281,
3037,
247,
24498,
2605,
273,
14237,
604,
8336,
476,
320,
40206,
31377,
390,
16737,
1293,
13567,
253,
3045,
1199,
436,
2605,
310,
3663,
352,
310,
1896,
326,
436,
1055,
310,
1679,
1774,
327,
1355,
3888,
824,
347,
260,
338,
274,
2313,
50275,
681,
1148,
10047,
253,
1332,
310,
760,
2429,
281,
6041,
3733,
2074,
7274,
452,
644,
14859,
326,
823,
24026,
11655,
940,
408,
6779,
5974,
9732,
465,
69,
1249,
746,
390,
5926,
12541,
8336,
1423,
285,
352,
651,
320,
1774,
281,
2096,
616,
1055,
327,
253,
2990,
275,
5301,
281,
253,
4081,
2746,
436,
651,
671,
1056,
253,
5301,
273,
18752,
285,
40206,
47587,
8336,
625,
4722,
50276,
17465,
569,
285,
38058,
3486,
403,
417,
5469,
275,
253,
2929,
3340,
984,
273,
253,
39926,
347,
11797,
407,
2675,
20162,
352,
651,
320,
1077,
1774,
281,
2319,
1097,
7794,
275,
253,
2929,
50275,
7152,
339,
431,
248,
4477,
9059,
326,
749,
3024,
4896,
275,
501,
3024,
5257,
281,
1347,
749,
32581,
595,
1955,
281,
247,
2675,
48518,
272,
1055,
326,
310,
1580,
642,
2014,
749,
18428,
310,
28461,
323,
253,
2457,
3045,
597,
512,
5257,
281,
10725,
327,
253,
19737,
1387,
3045,
2581,
685,
17170,
12085,
2060,
3045,
275,
1340,
281,
33623,
436,
1895,
253,
4477,
12661,
1690,
271,
3081,
2957,
1307,
326,
29426,
749,
3024,
4896,
281,
3761,
253,
13650,
273,
253,
1387,
50276,
249,
4679,
253,
4477,
921,
326,
417,
760,
616,
2957,
35910,
253,
48518,
272,
1895,
533,
671,
19132,
253,
4583,
2457,
3045,
271,
3081,
10527,
1783,
2722,
326,
11138,
749,
3024,
4896,
310,
12331,
281,
11138,
4583,
2990,
3045,
50276,
1189,
455,
2278,
50276,
1189,
455,
253,
4477,
1246,
271,
4722,
11562,
275,
12541,
6928,
275,
271,
35604,
784,
1039,
2675,
48518,
272,
1223,
2403,
897,
273,
326,
3640,
281,
3157,
824,
10336,
327,
253,
643,
1133,
253,
5661,
7103,
6505,
1142,
3533,
1527,
923,
2708,
275,
697,
1655,
1375,
891,
1158,
436,
789,
1057,
417,
2525,
5723,
2824,
7465,
533,
891,
11907,
253,
4477,
281,
4035,
2444,
327,
352,
285,
281,
11929,
247,
30080,
22559,
50276,
296,
3755,
20556,
50275,
783,
4477,
9302,
387,
4645,
326,
2675,
48518,
272,
1057,
5108,
275,
12541,
6928,
285,
326,
18505,
352,
1543,
275,
247,
3045,
7756,
50276,
783,
5661,
2216,
44584,
387,
21414,
253,
9414,
326,
253,
373,
48518,
272,
285,
326,
352,
476,
320,
4229,
50276,
783,
4477,
2486,
247,
10527,
1783,
285,
38041,
4278,
275,
253,
30762,
50276,
20881,
1255,
265,
50275,
24042,
253,
19529,
15693,
247,
2962,
273,
3533,
326,
403,
417,
9577,
407,
436,
789,
323,
4227,
1057,
48518,
272,
5108,
275,
19191,
6864,
6928,
275,
958,
352,
651,
320,
5322,
281,
19191,
6864,
6928,
347,
247,
8245,
275,
512,
253,
8442,
476,
634,
1332,
320,
3732,
281,
3215,
11273,
11454,
6928,
752,
604,
3185,
273,
253,
27451,
368,
908,
253,
1072,
2831,
290,
10144,
2957,
347,
275,
253,
3453,
50275,
783,
4477,
273,
12006,
257,
292,
247,
1375,
326,
436,
2238,
273,
11454,
2990,
310,
1679,
28116,
685,
501,
3024,
1580,
1016,
3828,
310,
6600,
273,
752,
253,
643,
8090,
403,
4715,
352,
651,
320,
4722,
604,
368,
10141,
1880,
12006,
257,
292,
671,
27171,
432,
48518,
272,
50276,
783,
2972,
18752,
5700,
310,
253,
1072,
347,
275,
19191,
6864,
6928,
3021,
891,
651,
34097,
1679,
2317,
281,
352,
275,
253,
2929,
285,
1375,
326,
368,
897,
253,
1072,
5700,
50276,
6050,
19191,
6864,
6928,
4796,
253,
3733,
13782,
673,
436,
789,
5459,
352,
50275,
66,
30287,
606,
305,
8500,
1162,
355,
42350,
4802,
27311,
267,
6928,
10061,
273,
253,
26332,
1796,
8059,
327,
4382,
8113,
285,
3102,
8981,
4240,
50275,
5992,
7193,
5701,
50276,
19164,
414,
50276,
74,
1119,
3236,
253,
4602,
342,
2675,
48518,
272,
253,
1332,
281,
8415,
352,
310,
247,
5019,
273,
2709,
5368,
5697,
533,
891,
1158,
326,
352,
476,
320,
2783,
4460,
1677,
253,
3634,
275,
534,
352,
310,
5611,
50276,
15177,
50276,
783,
4583,
3290,
310,
1175,
533,
253,
5661,
3290,
812,
320,
5520,
323,
1650,
368,
812,
8338,
604,
48518,
272,
6569,
390,
604,
352,
310,
14042,
342,
643,
3082,
824,
347,
19191,
6864,
390,
12006,
257,
292,
50276,
498,
15752,
50276,
783,
2505,
310,
2590,
285,
3477,
281,
1239,
627,
403,
690,
5884,
963,
993,
50276,
9188,
40348,
50276,
28821,
253,
3045,
7756,
5611,
407,
253,
4081,
1332,
352,
812,
320,
8671,
407,
2852,
789,
247,
2201,
32489,
310,
253,
2572,
275,
15180,
2105,
1955,
281,
1907,
281,
11897,
767,
11655,
1046,
673,
50275,
250,
5551,
33593,
50276,
783,
4477,
2085,
4278,
281,
18302,
616,
789,
2299,
597,
513,
417,
2085,
253,
2457,
2127,
627,
310,
642,
7364,
2593,
891,
11907,
253,
4477,
281,
823,
581,
16585,
3374,
824,
347,
253,
15180,
2105,
390,
253,
30437,
273,
616,
1332,
4457,
501,
47301,
824,
347,
4979,
398,
2490,
187,
4118,
18435,
27,
2520,
2929,
29328,
281,
1263,
253,
48518,
272,
1895,
275,
3676,
501,
47301,
534,
5936,
326,
253,
749,
6928,
273,
247,
3676,
501,
3024,
1347,
3012,
7197,
685,
253,
1072,
10336,
10166,
3815,
352,
29328,
247,
2969,
5853,
534,
26277,
18784,
253,
2022,
2990,
285,
253,
27451,
23279,
253,
2022,
6928,
3453,
285,
326,
273,
247,
3632,
749,
18428,
352,
310,
2011,
45190,
326,
436,
5853,
19132,
253,
2457,
7200,
323,
1097,
253,
2022,
2990,
285,
3632,
749,
3024,
4896,
253,
30628,
5821,
326,
253,
48518,
272,
1895,
310,
271,
4722,
11562,
533,
671,
5439,
7350,
670,
1097,
253,
16038,
49836,
285,
5301,
342,
2074,
5609,
751,
3676,
22296,
285,
1881,
940,
21755,
253,
4477,
2530,
9470,
6128,
342,
747,
3081,
5661,
1543,
846,
253,
5955,
3408,
253,
30628,
4925,
281,
247,
13969,
273,
14924,
27039,
327,
326,
253,
4477,
9257,
2953,
253,
39926,
273,
48518,
272,
285,
1056,
2590,
326,
253,
48518,
272,
1307,
310,
816,
247,
13155,
24760,
1293,
667,
1524,
12739,
281,
16775,
253,
913,
18726,
326,
253,
1895,
3636,
275,
436,
2929,
310,
4722,
285,
476,
452,
12739,
281,
1097,
37820,
285,
1566,
13800,
2299,
253,
4477,
943,
1611,
281,
5386,
253,
13622,
3806,
281,
253,
2675,
20162,
7794,
534,
1057,
2085,
8249,
22861,
281,
253,
1332,
533,
2581,
812,
19864,
15279,
13775,
285,
16305,
209
] |
[ ... attention_mask column: a long run of 1s, omitted ... ] |
[ ... token-ID column (most likely the labels field for the preceding example): integer entries omitted ... ] |
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper introduces a methodology of graph learning for dynamic graphs where the dynamics are encoded in the representation to obtain improved results on graph classification tasks this framework includes a temporal graph encoder that uses attention mechanisms to generate representations as well as a metalearning component that ensures easy knowledge transfer experiments are carried out on two temporal graph datasets to show strong performance in graph classification

strengths
the idea of enriching a graph representation using knowledge of its temporal dynamics is quite nice the authors split the time attention into three parts nodelevel intrasnapshot and intersnapshot the nodelevel attention mechanism supplies temporal sensitivity to the node representations the intrasnapshot mechanism adds in a graph autoencoder to learn how to reconstruct the adjacency matrix of a snapshot and the intersnapshot attention mechanism weights different snapshots according to discriminative power the modular design is helpful in understanding how different aspects of the pipeline can be improved in future work the additional metalearning module is a nice addition and it is useful to see how to interface this module with learning dynamic graph representations

weaknesses / points to clarify
- the use of te in te is unclear from figure 2 could the authors please clarify how te is obtained currently it seems that we have a set of edges at ts0 with maxte 2 and that the new edges added at ts 1 have te3 ie incremented by 1 similarly at ts2 the new edges are labeled with te4 so the figure doesnt shed light into the differences between the two timescales te and ts beyond increment by 1 section a2 contains further mention of te but without explanation so i still dont fully understand how te is generated i think this issue can be fixed with a few lines of clarifying language
- in section 411 where is the attention in the intrasnapshot time attention module it seems that here we just learn how to reconstruct the adjacency matrix at each snapshot perhaps this can be clarified in the author response
- possibly more seriously im concerned about the novelty of the proposed framework consider references a and b and the further references contained therein to attentionbased models for learning dynamic graph representations these do not appear as baselines in the current work and in fact do not even appear as references it seems that the metalearning portion in the current work is new but is that the only source of novelty i hope the authors will be able to clarify these connections in their response

references
- a0: sankar a, wu y, gou l, zhang w, yang h (2018) dynamic graph representation learning via self-attention networks. arxiv preprint arXiv:1812.09430
- a1: sankar a, wu y, gou l, zhang w, yang h (2020, january) dysat: deep neural representation learning on dynamic graphs via self-attention networks. in proceedings of the 13th international conference on web search and data mining, pp 519-527 (a1 is an expanded version of a0)
- b: rossi e, chamberlain b, frasca f, eynard d, monti f, bronstein m (2020) temporal graph networks for deep learning on dynamic graphs. arxiv preprint arXiv:2006.10637

i think the paper has good experimental results and that the incorporation of the metalearning framework is useful however i am concerned that the authors have left out important references and would welcome a thorough evaluation of these references and ideally other important references contained therein and how they compare to the current work currently i will lean toward reject but welcome a discussion with the authors on the novelty of their work

docsep

the authors present a novel method for learning representations for timevarying graphs which allows for incorporating information at different timescales using their streamingsnapshot model the streamingsnapshot model has the following parts each snapshot sts vts ets has edges of the form vi vj te in ets where te denotes the time at which edge was formed and is present since then the snapshots sts are at a different timescale te and ts are not comparable with the overall learning representation being cal s to ref learning this representation is used for downstream fewshot classification task for dynamic graphs and is evaluated on two scenarios timevarying biological proteinprotein interaction networks and timevarying social networks

the metatag architecture has the following components
- timeaware node representation: the edge creation time te is used to learn a timeaware node representation bf ute using attentionbased weighting of neighbouring nodes features concatenated with a learnable time kernel algorithm 1 the snapshot feature matrix uts takes the node representation by considering the latest edge for node u and using the attention mechanism above to get the influence of earlier edges
- intrasnapshot representation: this is constructed using a standard representation loss using a gcnbased encoderdecoder architecture followed by a permutationinvariant readout to obtain a vector representation for the snapshot
- overall representation: the overall representation for the timevarying graph is a weighted average using attention pooling learnt parameter of different snapshot representations this representation is used downstream for the classification task
- classification head: based on the prototypical approach snell 2017 resulting in an overall endtoend differentiable model with a weighted average of reconstruction loss and classification loss further the model allows adaptation to new tasks with different classification labels by finetuning on a small test set

fewshot learning experiments are shown on biological and social network datasets in the appendix showing efficacy of the approach compared to static graph representation methods as well as tdgraphembed doc2vec style method for embedding temporal graphs including augmentation with protonet for fewshot learning comparison

strengths
- first attempt to provide an endtoend differentiable model for handling timevarying graphs
- addresses separate timescales using the streamingsnapshot model
- adapts and uses existing methodology where appropriate gcns readout for snapshot representation protonets for fewshot learning learnable time kernels and attention for aggregating the timeaware influence of neighbouring node features into the node representation

weaknesses
1. the paper needs some rewriting starting with separate notions of time streamingsnapshot classic stochastic theory handles this by considering te as discrete time and cal st tts is the observation event when we see the interactions as described by the authors the snapshots model episodic slowchanging and periodical patterns however the model does not correspond to time elapsed between consecutive snapshots s1 s s1 beyond ordering further it is not clear how to relate ts to te which is the real time i would recommend the authors call it k snapshots and not call it snapshot time for clarity
2. the attention weighting across snapshots is agnostic to elapsed time meaning that fewshot learning results will work well if the time between consecutive snapshots is the same across the original task and the new task see point 3 below as well
3. the biological dataset dppin fu and he 2021 is not 12 separate timevarying graphs rather one large graph and a single timevarying gene expression dataset from which dynamics are inferred over twelve different subgraphs this means in the finetuning setting the learnt weighting equation 7 benefits from the fact that the temporal dynamics are representing the same underlying timescale snapshots in test and train correspond to real gene expression values in the same experiment at the same instant which is why the attention weighting works well i would be very interested in seeing how fewshot learning works when considering two different gene expression arrays for deriving the underlying dynamics
4. would it make sense to reorder algorithm 1 lines 12 in terms of the latest edge connecting to a node for a snapshot ie to generate bf uts for each node v select the latest edge v v t and use the timeaware attention mechanism to get the influence of preexisting edges v v t t le t into its node representation from a computational perspective is only bf utv computed or are all bf utv computed first and only the latest one selected in bf uts
5. a when computing bf uts suppose you have only the following edges v1 v2 0 v2 v3 1 v3 v4 2 would the final bf u u10 u21 u32 u42 b in algorithm 1 should t te be t le te otherwise consider two edges x y te3 and x y te3 then the influence of the other edge is missed further how is this addressed in bf uts an example in the text would clarify here
6. in the appendix a2 the lines "for each temporal graph 36 edge timestamps together describe three consecutive metabolic cycles in each graph we take a subgraph by extracting an interval of 5 edge timestamps every 3 edge timestamps the subgraph shares the same class label with its original entire graph therefore we have 11 temporal subgraphs per class" are directly copied from https://arxiv.org/pdf/2107.02168.pdf ("in each graph we take a subgraph by extracting the time interval of five timestamps every three timestamps the subgraph shares the same class label with its original entire graph therefore we have eleven temporal subgraphs per class") it is not clear to me how eleven and not twelve temporal subgraphs are obtained further there is a mismatch in the table 1 classes shown as 1 and section 52 line 1 given the 12 classes the description of the dataset should be expanded upon and made consistent

minor corrections
- equation 1 bf yj should be bf xj
- in section 3 the authors should define a timestamped edge vi vj te ts my understanding is that this means an edge between nodes vi and vj that formed at microscopic time scale te and is present whenever the tsth snapshot was taken clearly sometime after max te vi vj te ts in sts
- page 9 line 1 "and fine tune a few times on tildegtrainsupport and report the accuracy on tildegtrainquery" should this be "fine tune a few times on tildegtestsupport and report the accuracy on tildegtestquery"
- on page 3 it should be made clear whether bf ats the snapshot adjacency matrix is just the presence of an edge at that instant regardless of when it was created ie i j te in ets rightarrow aijts1

the current paper addresses an important problem using a smart approach but requires significant rewriting as well as better empirical evaluation results on social network data are only marginally better than a much simpler approach gl2vec protonet while the biological timevarying graph dataset is not actually different tasks since the underlying dynamics for all the 12 networks are learnt from a single geneexpression dataset which means train and test settings share the same underlying biological process and timescale i do not believe the paper is ready for publication in the present form

docsep

this paper considers the graph metric learning problem where the underlying graphs are temporal the key idea of obtaining a higher classification accuracy is to use a bilevel metalearning paradigm it essentially contains two parts 1 a prototypical temporal graph encoder where the model uses multiscale time attention to capture temporal information and 2 a metalearner where it uses the bilevel paradigm proposed in finn et al 2017 the authors apply the proposed method to the task of graph classification on two realworld datasets compared with other baseline methods metatag achieves better performance in terms of classification accuracy

the strengths are
1. the problem considered in this paper is interesting and new specifically the authors consider graph metric learning but in the temporal setting
2. the authors proposed a new method which includes two parts to capture the temporal information the key of the proposed is based on three different attentions previous works such as xu et al 2020 and yang et al 2021 have demonstrated the power of timerelated attention layers
3. the temporal graph classification experiments on two datasets demonstrate that the proposed method learns much better classifiers than other baseline methods

the weaknesses are
1. the contribution of the streamingsnapshot model is not critically novel the essential idea of the streamingsnapshot model is to combine the discretetime dynamic graph dtdg and the continuoustime dynamic graph ctdg model together see more details in [1] the authors use both two models but may have different time granularity
2. the technical contributions are limited the whole framework may look interesting and new but it is based on finn et al 2017 the time attention layers are commonly used in temporal graphs which are also proposed in previous work
3. the experiments are limited first of all the run time analysis is missing from both theoretical and empirical perspectives more explanations are needed on the results of gl2vec protonet the experimental setup and parameter settings are unclear

minors
1. it is a little bit unfair to directly use caw and tgat under the graph metric learning setting what is the specific loss used for these two
2. table 3 metatag textscmetatag
3. why would directly adding protonet onto caw and tgat be a reasonable thing
4. i expect detailed experimental setups like parameter tuning in the appendix could you list the experimental details in the appendix
5. what is the scalability of this method compared with two strong baselines 1 gl2vec protonet and 2 tdgraphembed protonet it seems that gl2vec protonet is competitive on the social network datasets but bad on the biobased datasets why is this the case

[1] kazemi sm, goel r, jain k, kobyzev i, sethi a, forsyth p, poupart p. representation learning for dynamic graphs: a survey. j mach learn res, 2020, 21(70):1-73

in general it is an interesting paper the experimental results look promising however the overall quality is not strong enough for acceptance there is space for improvement in experiments for example the comparison between the proposed method and baseline methods more discussion on the experimental results is also needed

docsep

the paper proposed a method for metric learning for fewshot examples where each example is a temporal graph modeled with two timescales

the definition and example of the temporal graph are confusing i could see that a temporal graph consists of snapshots i could not see why in a snapshot there are edge timestamps do you assume the node set fixed do you mean that a snapshot is also a temporal graph with edges that appear and disappear at different times (a) if yes then i could not see it in figure 2 i could not see the meaning of te012 in the figure id be assuming (a) then the statements like "previous methods only focus on one time scale and ignore the whole lifetime evolution representation" are difficult to understand i dont see why methods for temporal graphs do not take all the times of the graphs into account ie do not take the whole graphs into account

the method used in the paper is a combination of different standard methods available there is nothing wrong with that but it would be useful to prove the usefulness for each of these choices

overall i could see that the proposed method is supposed to be general with multiscale timestamps and claim that due to its generality it is better than all other methods available out there this kind of message in my opinion needs to be more specific on when/why the problem is not clearly defined and hard to read the method is general and claimed to be better than others due to its generality i do not see too much novelty in the method the results look good
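The second and third reviews above describe the same pipeline in words: a temporal graph encoder that produces one embedding per graph, a prototypical classification head in the style of snell 2017, and a training objective that mixes a reconstruction loss with the classification loss. The sketch below is not the paper's code; it is a minimal, generic PyTorch illustration of a prototypical episode loss and a weighted combination with a reconstruction term. The `encoder` and `decoder` modules and all other names are hypothetical placeholders, not identifiers from the paper.

```python
# Illustrative sketch only (not the paper's implementation): a prototypical
# classification head over graph embeddings plus a weighted sum of
# classification and reconstruction losses, as the reviews describe in words.
import torch
import torch.nn.functional as F

def prototypical_loss(support_emb, support_y, query_emb, query_y, n_classes):
    """support_emb: [n_support, d]; query_emb: [n_query, d]."""
    # One prototype per class: the mean of that class's support embeddings.
    prototypes = torch.stack(
        [support_emb[support_y == c].mean(dim=0) for c in range(n_classes)]
    )                                                # [n_classes, d]
    # Negative squared Euclidean distance to each prototype acts as the logit.
    logits = -torch.cdist(query_emb, prototypes) ** 2  # [n_query, n_classes]
    return F.cross_entropy(logits, query_y)

def episode_loss(encoder, decoder, support_graphs, support_y,
                 query_graphs, query_y, n_classes, lambda_rec=0.5):
    # `encoder` maps a temporal graph to an embedding vector; `decoder` returns
    # a scalar reconstruction loss for a graph. Both are hypothetical modules
    # standing in for the snapshot encoder/decoder the reviews mention.
    support_emb = torch.stack([encoder(g) for g in support_graphs])
    query_emb = torch.stack([encoder(g) for g in query_graphs])
    cls_loss = prototypical_loss(support_emb, support_y, query_emb, query_y, n_classes)
    rec_loss = torch.stack([decoder(g) for g in support_graphs]).mean()
    # Weighted average of classification and reconstruction objectives.
    return cls_loss + lambda_rec * rec_loss
```

In the setup the reviewers describe, the reconstruction term would come from a GCN-based encoder/decoder over each snapshot's adjacency matrix; here it is abstracted behind a single hypothetical `decoder` call.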
### Summary:
|
the paper proposes a new method for representation learning of timevarying graphs which uses a streamingsnapshot model to describe graphs on different time scales and metalearning for adaptation to unseen graphs. reviewers highlighted as a strength that the paper proposes an interesting approach for modeling temporal dynamics in graphs, which is of interest to the iclr community. however, reviewers also raised concerns regarding the novelty of the contributions, the empirical evaluation (also with regard to related work), and the clarity of presentation. in addition, there was no author response. all reviewers and the ac therefore agree that the paper is not yet ready for publication at iclr at this point
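The meta-review and reviewer 3 both point to the bilevel meta-learning paradigm of finn et al. 2017 and to adaptation by brief fine-tuning on a small support set of an unseen task. As a rough illustration only, and not the authors' exact procedure, the sketch below shows the test-time adaptation step the reviews describe: copy the meta-trained model, fine-tune it for a few steps on the new task's support set, then report accuracy on that task's query set. All module and variable names are assumptions.

```python
# Illustrative sketch only: few-shot adaptation of a meta-trained model to a
# new task (fine-tune on the support set, evaluate on the query set).
import copy
import torch

def adapt_and_evaluate(meta_model, support_loader, query_loader,
                       loss_fn, inner_steps=5, inner_lr=1e-3):
    # Work on a copy so the meta-trained parameters are left untouched.
    model = copy.deepcopy(meta_model)
    opt = torch.optim.SGD(model.parameters(), lr=inner_lr)
    model.train()
    for _ in range(inner_steps):                 # brief inner-loop fine-tuning
        for graphs, labels in support_loader:    # support set of the new task
            opt.zero_grad()
            loss = loss_fn(model(graphs), labels)
            loss.backward()
            opt.step()
    model.eval()
    correct = total = 0
    with torch.no_grad():                        # evaluation on the query set
        for graphs, labels in query_loader:
            pred = model(graphs).argmax(dim=-1)
            correct += (pred == labels).sum().item()
            total += labels.numel()
    return correct / total
```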
|
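The bracketed columns that follow this row (and the ones collapsed earlier) are the machine-readable form of the same text: token IDs for the prompt, reviews, and summary, an attention mask of 1s over real tokens, and label IDs for the training loss. The exact tokenizer and labeling scheme used to build this dataset are not stated in the dump, so the sketch below is only one common way such columns are produced, assuming a Hugging Face-style tokenizer; the model name and the choice of loss masking are assumptions.

```python
# Illustrative sketch only: one common way Input/Output text is turned into
# input_ids / attention_mask / labels columns like those in this dump.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")   # hypothetical tokenizer choice

def build_row(input_text: str, output_text: str) -> dict:
    # Concatenate prompt and target, tokenize once, and train with a plain
    # language-modelling loss; some pipelines instead mask the prompt part of
    # `labels` with -100 so only the summary contributes to the loss.
    enc = tokenizer(input_text + " " + output_text)
    return {
        "input_ids": enc["input_ids"],
        "attention_mask": enc["attention_mask"],  # all 1s when nothing is padded
        "labels": list(enc["input_ids"]),
    }
```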
[ ... token-ID column (most likely the input_ids field for the row above): integer entries omitted ... ] |
[ ... attention_mask column: a long run of 1s, omitted ... ] |
[ ... token-ID column (most likely the labels field): integer entries omitted; the list is cut off here
275,
253,
4677,
50275,
301,
320,
7384,
247,
840,
253,
7234,
751,
2045,
3082,
760,
2770,
327,
581,
673,
4311,
285,
11823,
253,
2644,
12702,
5606,
6779,
403,
2834,
281,
2096,
891,
13414,
923,
2139,
3082,
323,
11935,
14580,
513,
417,
1379,
512,
253,
2069,
273,
253,
14580,
715,
2395,
26332,
513,
417,
1379,
253,
2644,
14580,
715,
2395,
50275,
783,
1332,
908,
275,
253,
2929,
310,
247,
5019,
273,
1027,
2629,
3082,
2130,
627,
310,
2717,
3430,
342,
326,
533,
352,
651,
320,
4217,
281,
5276,
253,
31471,
323,
1016,
273,
841,
10165,
50275,
1189,
455,
891,
812,
923,
326,
253,
4081,
1332,
310,
6326,
281,
320,
2087,
342,
1554,
2865,
1079,
4522,
383,
11441,
285,
1750,
326,
1955,
281,
697,
31376,
352,
310,
1805,
685,
512,
643,
3082,
2130,
562,
627,
436,
2238,
273,
8169,
275,
619,
4743,
3198,
281,
320,
625,
2173,
327,
672,
22309,
253,
1895,
310,
417,
4518,
2931,
285,
1892,
281,
1239,
253,
1332,
310,
2087,
285,
7558,
281,
320,
1805,
685,
2571,
1955,
281,
697,
31376,
891,
513,
417,
923,
1512,
1199,
38135,
275,
253,
1332,
253,
1543,
1007,
1175,
50275,
187,
187,
4118,
18435,
27,
783,
2929,
29328,
247,
747,
1332,
323,
6779,
4715,
273,
673,
39381,
272,
14580,
534,
4648,
247,
5542,
723,
79,
27760,
1566,
281,
6266,
14580,
327,
1027,
673,
11498,
285,
5148,
613,
920,
323,
5223,
279,
281,
39709,
14580,
30628,
16318,
347,
20544,
326,
253,
2929,
29328,
271,
4722,
2746,
323,
14053,
11935,
8062,
275,
14580,
534,
273,
1600,
281,
253,
17857,
32888,
3114,
2299,
30628,
5439,
671,
7350,
5001,
253,
38135,
273,
9021,
253,
16774,
7103,
671,
342,
2743,
281,
2905,
789,
347,
973,
347,
253,
19843,
273,
9759,
275,
1635,
627,
369,
642,
2488,
2380,
512,
30628,
285,
253,
913,
5194,
3103,
326,
253,
2929,
310,
417,
2568,
4704,
323,
9311,
387,
17857,
32888,
387,
436,
1127
] |
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the paper proposes to make any neural network equivariant by symmetrizing over a subset of the group rather than over the whole group. if the subset selection f(x), depending on the input x, is equivariant, f(gx) = g f(x), then the symmetrization is equivariant. the authors furthermore prove: 1) when interested in invariant prediction, the subset can be chosen in the quotient g/g_x, where g_x is the stabilizer subgroup of x; 2) when symmetrizing with a random subsample of f(x), the probability of a particular subsample that deviates from symmetrizing with all of f(x) by less than some epsilon is bounded below; 3) when using the symmetrization of a universal model, the resulting model class is universal in the class of equivariant functions.

strengths: the paper proposes a very practical strategy for building equivariant nets. the universality proof helps convince the reader to use this method. the paper considers and experiments on three different instantiations of the method, showing wide applicability. the experimental results show the method performs competitively.

weaknesses: i dont understand whats happening in theorem 4. it considers a subsample $\hat\mu$ of f(x) to be good when the symmetrizer that uses the subsample is epsilon-close to the full f(x) symmetrizer; then it says that the probability of one particular good subsample is bounded below. however, that bound seems vacuous, as plugging in any reasonable number brings the bound quickly close to 0. also, it is counterintuitive why the bound should become looser as epsilon grows or as k grows. what one would want instead is a lower bound on the probability that we get any $\hat\mu$ that is epsilon-close to the full f(x) symmetrizer, and we want this bound to get higher when epsilon or k increases. the line below theorem 4 draws a conclusion that would follow from a theorem as i propose it above, not from the theorem in the paper as it is currently stated. unless im completely misunderstanding, theorem 4 can best be removed from the paper. it is a bit unclear when the results apply to finite and infinite groups and frames f(x): everywhere a summation symbol is used, but in some places f(x) is infinite. in the infinite cases, which measure should then be used? can one always use some canonical haar-like measure, in particular in the proofs of theorem 1 and theorem 4? this should be discussed. the writing of the paper can be improved. i dont follow the choice of f(x) for the e(d) case: which are the 2^d o(d) matrices? perhaps the authors can elaborate in more detail on one of the examples of how to construct f(x) in the main paper and then do the other two in the appendix. i would like some more theoretical discussion about the choice of f(x): does the choice of f(x) affect the output, and if so, how? is f(x) required to be continuous? does a continuous nontrivial f(x) always exist? what does the topology on 2^g look like? how does this affect the continuity of the symmetrized function, and if it is discontinuous, does that affect the universality? when can f(x) be chosen to be finite? other comments: why do gamlp and gaginid only get a 50 score on expclassify? are you there using a finite subsample of g or of f(x), and could you give any insight into why wed expect complete failure for a g subsample and complete success for an f(x) subsample? in the proof of theorem 5, which norm is used for rho2(g)? it cant be the max k-norm, because k is a subset of the input of phi, not the output; is it the operator norm? i would like theorem 1 to be put in the main paper, as it shows why the key construction is correct. i think a citation would be appropriate to finzi et al 2020, generalizing convolutional neural networks for equivariance to lie groups on arbitrary continuous data, as they also consider sampling from the group to build equivariant networks. why have the authors chosen the name frame for f(x)? i know frame as a set of vectors, or in the context of a frame bundle. i think this is a great paper, as it proposes a new practical method for building equivariant networks which is broadly applicable, universal, and performs well in practice. i have serious concerns about theorem 4; if the authors convince me why it makes sense, or if they remove it, i will increase my score. ive updated my score after the response and revision.

summary and contributions: the paper introduces a framework called frame averaging (fa) that can adapt existing backbone architectures to become invariant/equivariant to new symmetry types. it achieves this by averaging over an input-dependent frame, which outputs a subset of the group. frame averaging is often much more efficient to compute than averaging over the entire group, while at the same time it guarantees exact invariance/equivariance. on the technical side, the paper also proves that fa-based models have the same expressive power as the original backbone architectures. on the empirical side, the paper provides new classes of models using fa, such as universal euclidean motion invariant point cloud networks and message passing gnns, and demonstrates their practical effectiveness on several tasks.

strengths: fa is a very simple framework, yet it is potentially very useful. group equivariance is an important form of inductive bias in deep learning architectures, but designing architectures that have such equivariance is challenging. it would be very useful if we could adapt any backbone architecture to become invariant/equivariant to a certain group; however, the previously studied group averaging is computationally infeasible when the group is large, so this paper, which greatly reduces the computational cost of group averaging, could be potentially very useful. i really like this work and will be looking forward to seeing future development of this work. technical statements are all sound and proved: the mathematical statements in this paper all look correct to me, they have provided proofs, and these statements are clear and rigorous. impressive results: using the simple idea of frame averaging, the paper demonstrates state-of-the-art results on several tasks; the results are impressive, which suggests that the fa framework can be very useful in practice.

weaknesses: lack of simple examples: while it is not hard to check the correctness of all these statements, it takes me some time to form an intuition of what is proposed in this paper. it would help me a lot if the authors could provide a simple example at the beginning to give readers some intuition; for example, it might be good to work through an example of making an mlp translation equivariant with the simplest possible construction of frames. insufficient study and explanation of the proposed frames: the construction of frames in section 3.1 seems to come out of nowhere; while i could check they are indeed equivariant, i dont think i understand the motivation or thinking process behind such designs. furthermore, can the frames be simplified? as an example, if we let the input x be a function on the group, ie x = f(g) for g in g, can we let F(f) = argmax_g f(g)? i might be wrong, but i think this simple construction is also equivariant. is the number of elements output by frames the smaller the better, or is there a balance between performance and computational efficiency? questions: how stable are the proposed frames? for the proposed frames, i would be interested to know how stable they are; that is to say, if i add noise to the inputs, will the output subset of the group be significantly different?

the paper studies an important problem that could potentially have great impact: how to adapt existing architectures to become invariant/equivariant to a certain group while maintaining the expressive power and computational efficiency of the original backbone model. the paper provides a simple yet effective solution; the technical statements are sound and the empirical results are impressive.

this paper proposes a method to transform a model into an invariant/equivariant model using the reynolds operator. in addition, they define a notion called the equivariant frame to compute the reynolds operator efficiently.

strengths: this paper proposes a general method for transforming existing models into invariant/equivariant models using reynolds operators; combined with other methods, it provides state-of-the-art experimental results.

weaknesses / questions: first of all, i would like to point out that the idea of using reynolds operators to transform a model into an invariant model is found in kicki et al 2021, where the construction is exactly the same except for the use of frames, and the authors should be cited for this paper. also, the name frame is confusing with the concept of frame in differential geometry and should be given a different name. theorem 1: it is proved that if a frame is an equivariant function, then a partial sum over the frame gives a transformation to an invariant/equivariant function, but the fact that the frame is defined depending only on the input space is not appropriate. for example, if we have an invariant model f, the proposed method will sum over frames to convert it to an invariant function, which will increase the computational complexity; the transformation should be done without any transformation for the invariant model f. this leads to the problem of overlapping invariants when combined with other models in the experimental section. also, since calculating the specific frame itself involves mathematical difficulties, i believe that the evaluation should be done on the model for which the frame has been calculated, ie for which the experiment has been conducted. point cloud models: the s_n × e(d) or se(d)-invariant model is constructed by computing a frame for e(d) or se(d) and transforming the deep sets model with that frame. the simple question is: why dont you construct a frame for s_n × e(d) or se(d)? if this method is good enough, the point permutation action should also be subject to the frame averaging model; i would like you to explain the rationale for not doing so. graph models: two types of models have been proposed, mlp-fa and gnn-fa. the problem with mlp-fa is that it uses the adjacency matrix itself as input, so when the number of nodes is large the input space is also large and the number of parameters is significantly larger than, for example, the model in maron et al. is it possible to train mlp-fa with, say, 50 nodes? also, in a real task the number of nodes in a graph can take many different values; is it possible to train mlp-fa for such a case? in gnn-fa, the problem seems to be that invariance is calculated redundantly, as described above, and the contribution of this method is not clear. references: zaheer et al, deep sets, neural information processing systems (neurips) 2017; maron et al, invariant and equivariant graph networks, international conference on learning representations (iclr) 2019; kicki et al,
a new approach to design symmetry invariant neural networks, international joint conference on neural networks (ijcnn) 2021.

the structure of the proposed model has already been seen in kicki et al 2021, except for the use of frames, and even if frames are used, mlp-fa for example does not seem to be able to handle changes in inputs such as changes in the number of nodes in the graph. when combined with the existing strong methods it gives good results, but this could be achieved by, for example, concatenating two gnns and then transforming them with an fnn, so we decided that it is not worthy of evaluation. after some discussion, the score was raised because the theoretical uncertainties and doubts were resolved.

the authors introduce frame averaging (fa), a general framework for adapting known architectures to become invariant or equivariant with respect to a general group by using the group averaging operator. the idea of fa is to replace the averaging operator over the entire group by the averaging over a smaller set of group elements, while still achieving the full invariant/equivariant property.

strengths: the idea of replacing the averaging operator over the entire group by the fa is innovative. fa is simpler to compute and has less complexity in comparison with the averaging operator over the entire group. the authors prove that the fa-based models preserve the universality property of their backbone architectures. the fa framework is then applied to design several invariant and equivariant architectures for point clouds and graphs.

weaknesses: my concern is mostly about the incompleteness of the framework. it may happen that for some $\mathbf{x}$ the frame $\mathcal{F}(\mathbf{x})$ is a small set, but for some other $\mathbf{x}$ it is large or even infinite; for example, the frame choice in subsection 3.1 is in this case. in this case, a deeper analysis of how to separate these two cases, and how to deal with $\mathcal{F}(\mathbf{x})$ being large or even infinite, is necessary. it is also not clear how the fa is adapted in the dalocalpointnet; a detailed explanation here would be helpful. no comparisons with the previous equivariant architectures are presented; therefore it is hard to estimate how novel and efficient the framework is in the world of current group equivariant architectures. some technical parts are quite compact and not easy to read; maybe the reason is that the paper includes rich contents and the authors tried to fit all of them in 9 pages. in addition, i have some comments on the technical parts of the paper. the definition of the frame $\mathcal{F}(\mathbf{x})$ for the case of point clouds and g = e(d) on page 5 is not clear: why is $\mathcal{F}(\mathbf{x})$ the collection of e(3) euclidean transformations defined there, while in this case i think $\mathcal{F}(\mathbf{x})$ must be a subset of e(d)? i am curious to know why generically the frame would consist of 2^d elements, while for rare inputs $\mathbf{x}$ the frame can be an infinite set; i think this remark is crucial for the proposed method, as it affects the size of the fa framework. i think the proof of proposition 1 (appendix a.6) has some flaws. line 3: what is lambda? it is not defined yet. lines 6-7: it seems to me that you need to prove that $\mathbf{R}_0$ consists of eigenvectors of $(\mathbf{X}\mathbf{R} + \mathbf{1}\mathbf{t}^T)^T (I - \frac{1}{n}\mathbf{1}\mathbf{1}^T)(\mathbf{X}\mathbf{R} + \mathbf{1}\mathbf{t}^T)$, which is $(\mathbf{R}^T\mathbf{X}^T + \mathbf{t}\mathbf{1}^T)(I - \frac{1}{n}\mathbf{1}\mathbf{1}^T)(\mathbf{X}\mathbf{R} + \mathbf{1}\mathbf{t}^T)$, rather than $c\,\mathbf{R}\,\mathbf{X}^T (I - \frac{1}{n}\mathbf{1}\mathbf{1}^T)\,\mathbf{X}\,\mathbf{R}^T$ as you claimed; if this is exactly the case, i do not know how it can be true.
### Summary:
the submission proposes a method to make a preexisting model equivariant to desired symmetries: frame averaging. the strategy relies on a significant reduction of the number of symmetries to average over with respect to the reynolds operator and uniform subsampling. the paper also demonstrates the usefulness of this method theoretically (universal approximation result) and practically (competitive performance). the contributions are clear and the core idea is simple. i recommend this paper for acceptance with spotlight.
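The reviews and the summary above all describe the same core construction: choose an input-dependent frame F(x) (for E(d), the centroid together with the 2^d sign choices of the eigenvectors of the centered covariance) and average the backbone over it to obtain an exactly invariant model. The following is a minimal NumPy sketch of that idea, assuming generic inputs with distinct covariance eigenvalues; the toy backbone `phi`, its weights, and the helper names are illustrative stand-ins rather than the paper's actual models.

```python
# Minimal sketch of frame averaging for E(d)-invariant prediction on a point
# cloud X (n x d): the frame is built from the centroid and the eigenvectors
# of the centered covariance, with 2^d sign choices, and an arbitrary
# (non-invariant) backbone is averaged over the frame-aligned inputs.
import itertools
import numpy as np

def frame(X):
    """Return a list of (R, t) pairs forming the frame F(X) for g = E(d)."""
    n, d = X.shape
    t = X.mean(axis=0)                       # centroid (translation part)
    C = (X - t).T @ (X - t)                  # centered covariance (d x d)
    _, V = np.linalg.eigh(C)                 # columns are eigenvectors
    # Generically the eigenvectors are unique up to sign, giving 2^d elements;
    # degenerate eigenvalues (the "rare inputs" the reviews mention) are not handled.
    return [(V * np.array(signs), t)
            for signs in itertools.product([-1.0, 1.0], repeat=d)]

def phi(X):
    """Toy backbone for the d = 2 demo: any function of raw coordinates, not invariant."""
    return np.tanh(X @ np.array([0.7, -1.3])).sum()

def frame_average(backbone, X):
    """Invariant symmetrization: average the backbone over frame-aligned inputs."""
    return np.mean([backbone((X - t) @ R) for R, t in frame(X)])

# Sanity check: the averaged model should be (numerically) E(2)-invariant.
rng = np.random.default_rng(0)
X = rng.normal(size=(10, 2))
theta = 0.9
R = np.array([[np.cos(theta), -np.sin(theta)], [np.sin(theta), np.cos(theta)]])
X_moved = X @ R.T + np.array([3.0, -2.0])
print(frame_average(phi, X), frame_average(phi, X_moved))  # approximately equal
```

Rotating and translating the point cloud leaves the frame-averaged output unchanged, which is the exact invariance the summary refers to; averaging over only the 2^d generic frame elements rather than the whole continuous group is what keeps the computation cheap.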
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
originality/novelty: the two key contributions are, to the best of my knowledge, novel in the context of lda and require nontrivial efforts to develop, especially for the theoretical results. significance/impact: byzantine failure is an important issue in distributed and federated learning, and providing a principled method to handle it in the context of lda may be of interest to the community. correctness/technical quality: the paper provides theoretical results on the estimation error bounds and support recovery of the proposed algorithm, which leverage results from high dimensional statistics and appear to be sound to the best of my knowledge. quality of experiments: overall the empirical studies support the main claims. reproducibility: the proofs and code are available, and the experiment details are clearly described in the paper. clarity of writing: the paper is well written and clear; there are quite some notations, but they are well introduced before use.

originality/novelty, significance/impact: the idea of performing median aggregation is not new and has been previously studied in distributed learning; the contribution may seem incremental and may be considered as a combination of an existing distributed lda method and an existing method to handle byzantine failure. quality of experiments: the simulations can be improved by considering more parameters involved in the bound, as well as comparing both variants (mean and median) of the proposed algorithm in the simulations; more details in q5. for the local machines, section 2 mentions that the method by tian and gu 2017 requires o(np^2) computation and that the local computation complexity of our method is o(p^2), which is sufficiently reduced compared with tian and gu 2017. however, if i understand correctly, o(p^2) refers to the complexity per iteration, and therefore the total complexity of the proposed algorithm is o(tp^2); in that case, how is o(tp^2) sufficiently reduced relative to o(np^2) by tian and gu 2017? it might be informative to include both variants (mean and median) of the proposed algorithm together in some experiments with synthetic data, eg figures 1 and 2; this could provide a sense of how the median variant performs under a normal distributed system and how the mean variant performs under failure. the current experiments with synthetic data (section 4.1) only focus on different iteration numbers and failure rates; however, there are more parameters involved in the theoretical bounds, thus it would be more compelling to also include simulations to verify them, eg varying dimensions and sample sizes. can the authors provide a brief explanation or simulations about whether the proposed algorithm works, or how to extend it, in cases where the samples are not evenly distributed? this seems to be a more realistic setting in practice. i would also suggest the authors explain whether the median aggregation scheme applies to the method by tian and gu 2017, eg by applying it to eq. 3.5. after reading the rebuttal: i appreciate the authors detailed response; i have read the rebuttal and the other reviews, and my concerns have been well resolved.

the problem considered here is fundamental and worth studying. the proposed algorithms are more efficient than existing ones like tian and gu 2017; this advantage is significant because applications like cloud storage often involve large data sizes, and time efficiency is thus essential. under certain assumptions the theoretical guarantees are near-optimal, connoting the satisfactory performance of the algorithm. resilience against local machine failures and corruption is a good plus for satisfying practical needs. the no-failure case (system i) is a particular case of system ii, hence the authors should mention the necessity of the former; my understanding is that the no-failure scenario guarantee is only slightly better than the other due to the additional 1/n term (omitting sqrt(s)); unless m is as large as n, this particular term is no more significant than the leading sqrt(log p / n) term, so maybe we do not need the mean-based algorithm and a separate formulation besides system ii. the assumption that the master machine is free from failure or corruption seems to simplify the problem quite a bit; i wonder if the authors have considered cases where only alpha < 1/2 is available and we dont know which machines, including the master one, are genuine. in the latter case we may still borrow techniques from robust statistics to prune and filter bad samples and aggregate the information appropriately. please also refer to other sections. the writing is okay but not easy to follow and needs improvement. some suggestions: it may be better to collect all the necessary notations and present them together in a single place; some appear at the beginning of section 1.3 and others, like n_k and n, are spread across the remaining paragraphs. i would like to see more discussions and experimental results on the time efficiency of the algorithms; the authors claimed this attribute as the primary advantage over tian and gu 2017 but provided minimal discussion on this, and instead i saw lots of comments on the algorithms learning errors.

this paper provides two distributed machine learning methods, and the related theoretical analysis is clear. 1) the scales of the two real datasets used in this study are not really large and not convincing, and the improvement in performance is limited. 2) the environment requirements of the algorithms are not described clearly; some more analysis and more data sets should be included in the experimental part.
### Summary:
meta review: the authors consider learning a distributed multiclass sparse lda model, particularly including a fraction of byzantine failures (arbitrary behavior), and providing robustness by performing median-based aggregation. the main contribution of the work is the theoretical analysis, eg showing only a slight decrease in efficiency for the added robustness. the main weakness of the paper is probably its narrow scope: although the general setting considered (distributed learning, fault tolerance) is of broad interest, the work is only analyzing a very simple classifier. the novelty of the proposed estimator itself is also limited (applying a csl framework and using the median to provide robustness in the aggregation step). the empirical assessment could also be stronger; for example, r1 provided a number of specific suggestions for more compelling evaluation. the authors response provided a number of additional experimental results that are helpful in understanding the impact of their method and should be included if accepted; however, the significance of these updates without a revision makes it difficult to judge the final product. there are also a number of minor typos in the draft that should be updated.
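The reviews and the meta review revolve around one mechanism: each local machine sends an estimate, and the master aggregates with a coordinate-wise median so that a fraction of byzantine machines cannot drag the result arbitrarily far, unlike the mean. Below is a minimal NumPy simulation of that aggregation step, assuming a toy local estimator (the sample mean of gaussian data) and a crude corruption model; none of the parameter names or the estimator correspond to the paper's actual sparse-lda procedure.

```python
# Minimal sketch of median vs. mean aggregation across machines: m machines
# each send a local estimate of a p-dimensional parameter, a fraction alpha
# of them are byzantine and return arbitrary values, and the master
# aggregates either with the plain mean or the coordinate-wise median.
import numpy as np

rng = np.random.default_rng(1)
p, m, n_per_machine, alpha = 50, 20, 200, 0.2   # dimension, machines, local sample size, failure rate
theta_true = np.zeros(p)
theta_true[:5] = 1.0                            # sparse ground-truth parameter

# Each machine estimates theta from its own local samples.
local_estimates = np.stack([
    rng.normal(loc=theta_true, scale=1.0, size=(n_per_machine, p)).mean(axis=0)
    for _ in range(m)
])

# Byzantine machines overwrite their message with arbitrary garbage.
n_bad = int(alpha * m)
local_estimates[:n_bad] = rng.normal(loc=50.0, scale=10.0, size=(n_bad, p))

theta_mean = local_estimates.mean(axis=0)           # non-robust aggregation
theta_median = np.median(local_estimates, axis=0)   # robust aggregation

print("error of mean aggregation:  ", np.linalg.norm(theta_mean - theta_true))
print("error of median aggregation:", np.linalg.norm(theta_median - theta_true))
```

With these settings the median aggregate stays near the true parameter while the mean aggregate is pulled far away, which is the qualitative behaviour the reviewers ask to see compared in the synthetic-data figures.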
|
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper uses experimental measurements and empirical curvefits of learning curves to study their interaction with training protocols such as transfer learning and data augmentation they define the learning curve to be the test error as a function of training set size i appreciate the authors goal of relating the scaling behavior of learning curves with common choices in nn training deep learning models are often complex and hard to model from first principles trying to understand and design deep learning models using scaling laws empirical measurements and powerlaw fits seems like a great idea however i am not entirely sure if the conclusions of this paper are convincing enough to be sufficiently useful to the iclr readership for example a the authors conclude that pretraining on similar domains nearly always helps compared to training from scratch in agreement with their initial guess before they ran their experiments as far as i can tell they reached this conclusion because one model they plot in figure 1d which was trained from random initializations does worse on average than the 3 models they finetuned all of which were pretrained on larger datasets it has already been reported in literature that pretraining on a similar dataset does not always outperform training from scratch 1 already cited 2 3 in fact ref 3 has shown that depending on the dataaugmentation settings pretraining can significantly hurt final performance compared to training from scratch furthermore ref 2 found that pretraining will not be helpful for certain dataset pairs even if the target dataset has similar classes as the source dataset eg cars in fgvc cars and imagenet i find the conclusion from ref 2 more convincing and precise since they have evaluated transferring to 12 different datasets using 16 different architectures compared to the 2 different target datasets in this paper and only 1 architecture authors mention that they trained 8 different architectures but am i understanding correctly that figure 1d only includes a single architecture that is not pretrained b building on item a i am not sure if the model trained without pretraining is representative of what researchers would use accuracies are not reported in the text but it reads as though the no pretr model achieves around 72 accuracy on cifar100 fig 1d it is relatively easy to achieve above 80 with a standard architecture such as wideresnet2810 other work cited above have already seen that pretraining is more likely to do better than models trained from scratch if the models trained from scratch are not trained as optimally as they could be common culprits are not training for long enough not using sufficient regularization etc for this reason in order to claim something as strong as popular beliefs 1 it is important to have a good baseline for the model trained from scratch note that i am not asking for a stateoftheart results here but a result that is comparable to models trained in literature one reason could potentially be that the authors trained on 40k samples of cifar100 instead of the commonly used 45k but it is still easy to get 80 with a standard training protocol initialized to random weights on 40k samples of cifar100 c i worry that similar issues exist for other conclusions of the paper for example the conclusions about increasing the network depth are very interesting but i am not convinced that the paper has the experiments to justify them it is crucial to optimize the regularization techniques for each width and depth separately 
before one can make a statement about whether increasing the depth or width is generally helpful or not i hope that the authors continue this work expand and improve their experimental setup relating learning curves to models that are of interest to the community with performance that matches what is reported in literature would be a great first step this way the readers can easily judge if the connections observed in the paper would be applicable to the standard training protocols a next step could be to show that using the insights that are developed in this paper the authors can achieve a result that was not possible without their insights 1 he kaiming ross girshick and piotr dollar rethinking imagenet pretraining proceedings of the ieee international conference on computer vision 2019 2 kornblith simon jonathon shlens and quoc v le do better imagenet models transfer better proceedings of the ieee conference on computer vision and pattern recognition 2019 3 zoph barret et al rethinking pretraining and selftraining arxiv preprint arxiv:2006.06882 2020docsepin this paper the authors first propose a simple weighted least squares method to compute the learning curve (error plotted against dataset size) where error is modelled with the form error = alpha + eta * n^(-gamma) for parameters alpha eta gamma here gamma is taken to be 0.5 while alpha and eta are estimated from the data this also allows an estimate of data reliance in essence the slope of error wrt dataset size computing how much error decrease is dependent on dataset size the authors then perform an extensive experimental evaluation on varying sized subsets of cifar100 and places365 across multiple different neural architectures and varying choices of finetuning pretraining linear classifier frozen feature training varying architecture size and data augmentation they fit learning curves on these different empirical configurations estimating data reliance along with extrapolating error finding interesting conclusions such as finetuning outperforming linear classifiers even on small datasets and larger architectures actually improving data reliance even in small data settings both the learning curve computations and empirical evaluations are interesting and i recommend accepting this paper however i would strongly suggest changing the layout of the paper in particular splitting up figure 1 into several figures to better emphasize some of the different takeaways eg perhaps section 32 which consists of recapping weighted least squares could be moved to the supplementary as it is it is very difficult to follow the main takeaways from the different experiments and even the insights given by fitting the learning curve and computing data reliance if the authors can provide such a revised version i would definitely consider further increasing my score minor comments i appreciate the open acknowledgement of some of the limitations of the method i also liked the summary table deep learning quiz summarizing some of the conclusions of figure 1 which would have been hard to absorb otherwise are there assumptions about the dataset and task that must be made for fitting learning curves and predicting data reliance to work well for example what if the model is trained to memorize random labelsdocsep summary the authors conduct an investigation of learning curves on image classification models to understand tradeoffs between error and dataset size across different design choices these learning curves help clarify the relationship between error and data reliance as a function of
choices such as depth width pretraining and finetuning comments the introduction and motivation for the work is clear and well written i found the work wellcontextualized within existing work the learning curve is a nice way to understand tradeoffs in design choices for a given model a single choice of optimizer is used and learning curves could conceivably vary between different optimizers it would be good to explore this further the experiments are thorough and have interesting conclusions that are applicable to researchers / practitioners eg deeper networks are suitable for smaller datasets recommendation justification i vote to accept the paper the authors do an excellent job of motivating the importance of learning curves and are systematic about their experimentation and analysis minor feedback missing header and page number on first page i like the tf quiz i think itd help to put more detailed descriptions of some of the procedures eg "finetuning the entire network is only helpful if the training set is large" is vague if you do not quantify helpful nor define an alternative such as finetuning just the final layer typo pseudeoinverse appendix discussion on impact of gamma gamma 0 05 would probably be clearer as gamma in [0, 0.5]docsepthis paper advocates for studying the effect of design choices in deep learning via their effect on entire learning curves test error vs num samples n as opposed to their effect only for a fixed n this is a valid and important message and it is indeed an aspect that is often overlooked in certain domains however although this paper addresses an important issue there are methodological concerns described below which prevent me from recommending acceptance in summary the paper oversimplifies certain important aspects in both the setup and the experiments concerns 1 my main concern is that the discussion of learning curves ignores the effect of model size prior work including kaplan 2020 and rosenfeld 2020 has shown that learning curves exhibit quantitatively different behavior when models are overparameterized vs underparameterized in particular learning curves are only known to exhibit clean powerlaw behavior when model size is not the bottleneck eg if model size is scaled up correspondingly to data size there is no discussion of the model size issue in the present work this may be problematic since data from small n are used to extrapolate to large n but the model size is held fixed concretely a full discussion of how to evaluate and interpret learning curves should account for the effect of model size 2 the curvefitting procedure is nonstandard and produces some questionable extrapolations this is concerning because one of the stated contributions of this paper is to propose an experimental methodology specifically a if the true parametric form is a powerlaw with some gamma ≠ 0.5 why are the learning curves plotted assuming gamma = 0.5 (figure 1) in the regression estimate (equation 7) why is the exponent gamma encouraged to be close to 0.5 note that the theoretical justification for gamma = 0.5 (table 2) is weak it only includes parametric bounds nonparametric rates are in general different from gamma = 0.5 b several of the curves in figure 1 predict crossovers which we do not expect to occur at n = infinity for example figure 1g predicts that an ensemble of 6 resnet18s will be better than 1 resnet50 at n = infinity which we do not expect c in general the curves are extrapolated from only 5 points it would be more convincing to see more data sizes tested 3 regarding experimental setup and
conclusions a why are there experiments for cifar100 but not cifar10 most of the current experiments have high error rates (above 20%) so it would have been nice to see how the curve fits perform down to low error rates (below 5%) as we would see on cifar10 b the claim that pretraining does not bias the classifier is too strong to be supported by the experiments certainly this does not hold for any arbitrary pretraining dataset but perhaps it holds for natural pretraining datasets close to the ones tested here in general several of the experimental claims are too strong in this way they make universal statements but are only tested in a few limited respects further experiments would give more evidence to these claims for example it is speculated on pg 11 that gamma does not depend much on the model architecture does this continue to hold for mlps only convnets are tested in this paper summary the motivation of this paper is very good but the proposed experimental methodology is somewhat lacking this paper would be much improved by more thorough experiments and analysis and more nuanced discussion of the experimental conclusions comments / clarifications which do not affect the score why are the experiments done using the ranger optimizer would any conclusions differ if we use standard optimizers (sgd / adam) i would suggest moving section 32 to the appendix since the mechanics of least squares is likely familiar to readers this would open more space for further discussion of figure 1 experiments edit after rebuttal changed score from 5 to 6 see below
### Summary:
|
this paper studies the relationship between test error as a function of training set size and various design choices of neural network training overall all of the reviewers are excited about the prospect of relating error curves to neural network design choices but different reviewers complain about the rigor of empirical evaluation and the accuracy of conclusions given limited data points i agree with reviewers on both points ie the paper studies different design choices but does not do a thorough job studying those design choices moreover it is not clear what aspects of the study are directly related to error curves vs a standard correlation study done in prior work eg in do better imagenet models transfer better for usefulness of imagenet pretraining so overall i believe not only the empirical evaluation needs improvement but also the story needs refinement i am looking forward to seeing this paper published in other ml venues
|
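to make the parametric learning-curve fit discussed in the reviews above concrete, a minimal sketch that fits error = alpha + eta * n^(-gamma) to a handful of (dataset size, test error) measurements; the synthetic data points and the use of scipy's curve_fit are assumptions for illustration and are not the paper's exact weighted least squares procedure

```python
import numpy as np
from scipy.optimize import curve_fit

def power_law(n, alpha, eta, gamma):
    # test error as a function of training-set size n
    return alpha + eta * n ** (-gamma)

# synthetic learning-curve measurements (training-set size, test error)
n = np.array([1e3, 2e3, 5e3, 1e4, 2e4, 4e4])
err = power_law(n, 0.05, 3.0, 0.5) + 0.002 * np.random.default_rng(1).standard_normal(n.size)

(alpha, eta, gamma), _ = curve_fit(power_law, n, err, p0=[0.1, 1.0, 0.5])
print(f"alpha={alpha:.3f}  eta={eta:.2f}  gamma={gamma:.2f}")

# one way to read off how much error still depends on data: the local slope
# of the fitted curve with respect to log dataset size at the largest n
print("d(error)/d(ln n) at n=4e4:", -eta * gamma * 4e4 ** (-gamma))
```

fitting in log-size space like this also makes the reviewers' extrapolation concern easy to reproduce: parameters estimated from only a few small-n points can predict very different behavior at large n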
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper studies gederated generative adversarial networks federated gans in particular the authors propose a new method uagan which is claimed to be better than earlier approaches i have several concerns first the writing can be improved significantly second the statements in the text are often rather vague this makes it hard to understand what are the results and to verify their correctness eg theorems are formulated either vaguely or unprecisely moreover the theorems refer to optimal discriminators however there is no guarantee that one can find such an optimal discriminator or that one can even verify whether a discriminator is optimal in fact the paper doesnt define the term optimal precisely even if it would be possible to learn an optimal discriminator then there is no bound on the amount of time this would take as that depends on various parameters of the learning problem and the learning algorithm the paper contains more vague statements eg remark 1 claims the method preserves privacy but doesnt define what information is kept private for example it seems the size of the private datasets is needed for the central weighting and hence made public moreover there is no proof that from the information leaving the individual centers nothing about the sensitive data can be inferred for example from the point of view of differential privacy even if only aggregates are revealed if their revealed value is exact then probably epsilondelta differential privacy is not guaranteed in conclusion it is hard for the average reader to understand the paper due to a lack of precision and the paper insufficiently specifies definitions assumptions made and precise formulations of results some details the first line of the abstract suggests that gans are federated networks while the second line of the abstract correctly states that this is called federated gan and the first line of the introduction correctly describes gan as generating realistic data algorithm 1 eq equation 2 repetition last line page 3 algorithm equation 1 algorithm 1 top of page 4 the generator gz seems to depend on an argument z of which the nature is not revealed immediately later equation 2 suggests that z can be drawn from mathcaln01 and hence is a real number it is unclear why the argument of g would be just onedimensional top of page 4 while the word discriminator is used very frequently no precise definition of this concept is provided nor an explanation of how the discriminators are obtained with the help of eq 2 some readers may be able to guess that dj must have values in the open interval 01 definition 1 as the symbol q is already used for distributions it is preferably to not use it for a probability in 01 too after definition 1 the central idea of ua framework framework needs an article just before section 31 we will provide error bound an error bound theorem 1 uses jenson shannon divergence loss which isnt defined in the paper and no definition is cited jenson shannon divergence is a somewhat wellknown concept in probability theory but even for those knowing this it is unclear how to get from it to jenson shannon divergence loss equation 6 in theorem 1 seems to give an expression similar to the jensonshannon divergence definition but doesnt appear a statement the theorem is claiming to be true the rest of the sentence refers to q and q but q is a bound variable in eq 6 ie it has no meaning outside the scope of argminq and q doesnt occur in the formula so why do you say where q it is hence hard to parse the theorem 
statement and discover what is the claim exactly a proof is provided in appendix however the proof first says to prove the theorem we first introduce the following lemma the text next states lemma 1 but never returns to the proof of theorem 1 as the next title says proof of theorem 4 docsepthis paper proposes an algorithm for training gans in federated learning setups the federated learning task is to train a centralized generative model using the samples distributed across a group of local nodes in a network toward this goal the paper proposes universal aggregation gan uagan in which at every iteration the server communicates a batch of samples produced by the generator to the local nodes and then the local nodes optimize their discriminator using their observed real and received fake samples overall the uagan approach is technicallysound and potentially useful for various distributed learning problems however i still have a few comments regarding the papers theoretical and numerical results i look forward to the authors response to my comments my main concern is on training the discriminator in uagan according to algorithm 1 uagan trains k discriminator functions ie one discrminator per local node in the network in a typical federated learning problem there might exist hundreds of nodes where each node observes only tens or hundreds of training samples in such scenarios the generalization error of training one discriminator per local node can be very large since every discriminator is trained using a limited number of real data points on the other hand the generalization error seems to be significantly smaller if one uses standard fedavg for training a single discriminator network currently there are no theoretical guarantees or numerical analysis on the generalization properties of uagan i think analyzing the generalization behavior either theoretically or numerically is needed to address this comment especially becuase the paper mostly focuses on the theoretical properties of uagan the papers numerical expeirments only consider setups with nonidentical distributions across nodes also in the experiments k is chosen to be moderately small while the number of training samples at every node is relatively large i recommend providing some numerical results for larger federated learning settings with smaller training sets for example one can consider 100 local nodes with only 500 training samples available at every node these numbers sound closer to a real federated learning problem than the ones used in the papers experiments in addition lines 9 and 14 in algorithm 1 should clearly state the optimization procedures for updating the discriminator and generator parameters instead of only referring to the optimization problems in the current version it remains unclear how many gradient steps are used to optimize the parameters at every iteration and how the performance of the trained gan is affected by the number of gradient steps applied for optimizing the discriminator parameters at every communication rounddocsepthis paper proposes federated gan with multiple private discriminators the authors also analyze the optimality of the proposed federated gan the review has several concerns on the current submission 1 is minimax problem5 the total loss of uagan that is optimized by algorithm 1 to solve the minimax problem 5 di for i12 k is coupled hencewhy di defined in2is the solution of 5when g is fixed 2 when and how does the nash equilibrium of minimax problem 5holds 3 lack of experiments for 
federated unconditional gan to verify the theory docsepsummary the paper proposes a new method uagan to train gans in a federated learning setup the method simulates a central discriminator dua such that the odds values of the central discriminator is equivalent to the weighted sum of the local discriminators the central generator is then trained based on the simulated central discriminator the paper provides theoretical analysis on its proposed method and conducts experiments on toy datasets and mixtures of real world datasets to simulate a federated learning setup pros the problem of training gans in a federated learning setup is an important problem to tackle as it naturally provides a secure way to train a generative model the idea of simulated central discriminator based on the weighted odds value of the individual discriminators is quite interesting and seems to outperform a simple weighted sum of the discriminator gradients the results provided in the experimental section are reasonable and outperforms similar gan setups on nonidentical data setups i also appreciate the inclusion of an accuracy metric in evaluating the different methods as it shows the effectiveness of generated data in training of downstream tasks cons the paper provided a theoretical analysis of the performance of uagan under conditions where some discriminators are trained suboptimally it would be nice to also have some experiments in the experimental section that shows the results of this setup ie with smaller dataset size for some of the discriminators the experimental results will also be more convincing if results of training the uagan on the mixture of all three real world datasets font mnist and fashionmnist are shown overall the proposed method of training gans in federated learning setup shows fairly convincing results with some additional experimental results i believe this will be a good submission for iclr i thank the authors for the additional experiments which have marginally satisfied my initial concerns ideally more setup can be experimented i keep my original rating
### Summary:
|
the paper presents a provable correct framework namely universal aggregation for training gans in federated learning scenarios it aims to address an important problem the proposed solution is wellgrounded with theoretical analysis and promising empirical results the paper receives mixed ratings and therefore there were extensive discussions one the positive end some reviewers think that the authors feedback provide clarification to confusing part of the paper on the negative side the authors feedback also confirms some of the concerns raised in the reviews 1 it was confirmed that there is no guarantee that one can find an nearly optimal discriminator which decreases the impact of the work as in practice we work with nonoptimal discriminators and hence some of the results couldnt apply 2 it was confirmed that no privacy guarantees can be given this is concerning since the complexity of gans wont prohibit skilled attackers from inferring some information while it is true that some of the guarantees would be hard to achieve even for a traditional gan the paper sets up a highexpectation at the beginning of the paper but fails to satisfy the readers with enough evidence in addition the writing can be significantly improved to ensure precise formulations and consistency the added experiment results are useful but stronger empirical results could help alleviate the issues in theoretical results in summary the paper has built solid foundations for a good piece of work but the current version could benefit from one more round of revision to become a strong publication in the future
|
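The UA-GAN reviews in this record describe a central discriminator that is simulated by taking a weighted sum of the local discriminators' odds. As a rough illustration of that aggregation step only (not the paper's full training loop), here is a minimal sketch; the function name, the weight normalization, and the numerical clipping are assumptions made for the example, and the weights are imagined to be proportional to each node's dataset size rather than taken from the paper.

```python
import numpy as np

def ua_aggregate(local_probs, weights, eps=1e-8):
    """Simulate a central discriminator from K local ones.

    local_probs : length-K array of each local discriminator's output D_j(x) in (0, 1)
                  for the same sample x.
    weights     : length-K array of non-negative mixture weights (e.g. proportional
                  to each node's dataset size).

    Following the reviewers' description, the *odds* of the simulated central
    discriminator are the weighted sum of the local odds; the paper's exact
    formula may differ.
    """
    local_probs = np.clip(np.asarray(local_probs, dtype=float), eps, 1.0 - eps)
    weights = np.asarray(weights, dtype=float)
    weights = weights / weights.sum()           # normalize to a convex combination

    local_odds = local_probs / (1.0 - local_probs)   # odds of each local D_j
    central_odds = np.sum(weights * local_odds)      # weighted-sum aggregation
    return central_odds / (1.0 + central_odds)       # map back to a probability

# Example: three nodes with unequal data shares
d_ua = ua_aggregate(local_probs=[0.9, 0.4, 0.7], weights=[0.5, 0.3, 0.2])
print(f"simulated central discriminator output: {d_ua:.3f}")
```

Under this reading, a node holding more data pulls the aggregated odds toward its own discriminator, which is consistent with the reviewers' remark that the local dataset sizes are needed for the central weighting.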
[ input_ids: token-ID sequence omitted (integer IDs) ] |
[ attention_mask: sequence of 1s omitted (same length as input_ids) ] |
[ labels: token-ID sequence omitted (integer IDs) ] |
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
1 this paper designs a randomized substitutionbased approach model to detect textual adversarial examples 2 a number of experiments have been performed to evaluate the proposed model 1 this paper is not well motivated as there exists a good number of related work why you propose this randomized synonym substitutionbased approach what is the relationship between this work and the literature 2 it is hard to figure out a meaningful realworld application which needs nlp defense against adversarial examples 3 the compared methods are not strong enough and thus the results are not convincing it is suggested to start with a more reasonable example to demonstrate the necessity of the proposed work the related work should be well discussed to show the contribution of this manuscript experiments should be greatly enhanced docsep1 the motivation of the paper is clear the whole manuscript is wellorganized and easy to follow 2 the experiments are sufficient enough and validate the hypothesis at the beginning of the paper 3 the proposed method rsv is simple but effective which seems applicable to reality situation 1 the key idea of this paper is not innovative enough the rs module only substitutes the word in the replacement sequence at the training phase randomly which seems more like a data augmentation method to this end i suggest the authors move a step forward to give more theoretical explanations on how rsv works rather than just using empirical experiments for validation besides i am wondering whether this rs can be applied with other related tasks 2 since every word needs a synonym set the authors should evaluate the efficiency of this work please refer to weaknesses docsep the paper is well written and the structure is crystal clear and helps to understand the idea and how it is implemented the method is well explained both using a verbose description and an algorithmic one alg1 the idea behind the method seems to be well justified by the experiments described in section 32 with some doubts reported in the next section the model is compared to two recent proposed methods ie the baseline models the method obtains very high performances compared to the baseline models the are only two main weakness that the code is not available section 3 the motivation is based only on 1000 samples from one dataset why did you not tried on many samples from different datasets the authors propose a new textual adversarial example detection method rsv the method has the goal to contrast synonym substitutionbased textual adversarial attacks by substituting random words in the text samples with the aim of destroying the mutual interaction between words this interaction is used by attackers to trick the model and its destruction in adversarial samples improves the models accuracy from a preliminary experiment seems clear that benign samples are not affected by the substitution of some words with their synonyms while adversarial ones are hardly affected helping the model to correctly classify them starting from this analysis the authors propose this method the method involves the creation of k variation of each sample and making the model predict a label for each variation if the most predicted label is equal to the one predicted by the model for the original sample then the text was not corrupted otherwise yes the sample is an adversarial one the paper is well written and the proposed method is effective it would be interesting to see it in other tasks moreover i suggest you try to eliminate the s hyperparameter 
using a fixed set of stopwords not dependent by the samples the are only two main weakness that the code is not available section 3 the motivation is based only on 1000 samples from one dataset why did you not tried on many samples from different datasets other some uncovered points section 32 if you mask randomly how do you know that is the break of mutual interaction of words that leads to these results in the adversarial samples maybe you masked the adversarial words and this increases the accuracy a small analysis of this case would have been interesting last paragraph of section 32 even in this case how could you say that the performances improvement is due to substitution of surrounding words respect to adversarial word or of the adversarial word itself is clear that the probability of substituting a surrounding word is significantly higher but a more indepth analysis would have been interesting section 33 for the research of synonyms why not use a regexbased algorithm in this case you are almost sure to take a synonym while using the embedding produced by a neural network could not be safe section 33 p is a fixed number or a probability in alg 1 and in the rest of the paper it seems a probability and this makes sense however here you say we randomly sample p words this is probably a mistake docsep1 the motivation is reasonable by substituting words with synonyms successful attacks are rare and most of the time the prediction results remain unchanged 2 the method is simple and effective and the presentation of the method is clear 3 the experimental results show the effectiveness of the method 1 rsv generates k25 texts and queries the attacked model separately for one detection even for k10 a value to obtain stable performance the cost is a bit high in contrast disp and fgws restore benign texts directly which should be much faster see above
### Summary:
|
meta review for the most part reviewers felt the proposed method was proven to be effective and that the paper constitutes a useful contribution
|
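The RSV reviews in this record describe the detection rule as: build k randomized synonym-substituted variants of a text, let the attacked classifier vote over them, and flag the input as adversarial when the majority label disagrees with the prediction on the original text. The sketch below reconstructs only that voting logic under stated assumptions: `classify` and `synonym_substitute` are hypothetical stand-ins for the attacked model and the synonym-replacement step, and the values of k and p are illustrative rather than taken from the paper.

```python
import random
from collections import Counter

def rsv_detect(text, classify, synonym_substitute, k=10, p=0.25, seed=0):
    """Flag `text` as adversarial if randomized synonym substitution flips
    the model's majority prediction.

    classify(text) -> label                  # the attacked classifier (assumed given)
    synonym_substitute(text, p, rng) -> str  # hypothetical helper: replaces roughly
                                             # a fraction p of the words with synonyms
    """
    rng = random.Random(seed)
    original_label = classify(text)

    # Vote over k randomized variants of the input.
    votes = Counter(
        classify(synonym_substitute(text, p, rng)) for _ in range(k)
    )
    majority_label, _ = votes.most_common(1)[0]

    is_adversarial = majority_label != original_label
    return is_adversarial, majority_label
```

As one reviewer notes, each detection issues k extra queries to the attacked model, so a larger k buys more stable votes at a proportionally higher query cost.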
[ input_ids: token-ID sequence omitted (integer IDs) ] |
[ attention_mask: sequence of 1s omitted (same length as input_ids) ] |
[
30003,
310,
1677,
2278,
273,
247,
2561,
2929,
432,
260,
2369,
1793,
6698,
15,
7764,
3630,
247,
6010,
253,
2278,
15,
187,
4118,
8439,
27,
187,
18,
186,
2520,
2929,
11809,
247,
14871,
19137,
3169,
2746,
1566,
281,
2736,
45860,
48960,
6667,
374,
186,
66,
1180,
273,
4679,
452,
644,
2684,
281,
7472,
253,
4081,
1566,
50274,
18,
186,
2520,
2929,
310,
417,
973,
17194,
347,
627,
4961,
247,
1175,
1180,
273,
2905,
789,
2139,
368,
12661,
436,
14871,
2753,
7983,
19137,
3169,
2746,
752,
310,
253,
2954,
875,
436,
789,
285,
253,
6239,
50276,
19,
186,
262,
310,
1892,
281,
4677,
562,
247,
14282,
1524,
10186,
2898,
534,
3198,
295,
24343,
5684,
1411,
48960,
6667,
50276,
20,
186,
783,
2429,
3082,
403,
417,
2266,
2217,
285,
3021,
253,
1543,
403,
417,
21414,
50274,
262,
310,
5125,
281,
1265,
342,
247,
625,
5272,
1650,
281,
7568,
253,
15504,
273,
253,
4081,
789,
253,
2905,
789,
943,
320,
973,
5469,
281,
921,
253,
7680,
273,
436,
7714,
4679,
943,
320,
10260,
8655,
5474,
33032,
18,
186,
783,
16038,
273,
253,
2929,
310,
2590,
253,
2644,
7714,
310,
973,
34092,
285,
3477,
281,
956,
374,
186,
783,
4679,
403,
4209,
2217,
285,
17813,
253,
9079,
387,
253,
5068,
273,
253,
2929,
495,
186,
783,
4081,
1332,
391,
11427,
310,
2969,
533,
3576,
534,
3133,
7763,
281,
6612,
4112,
50275,
18,
186,
783,
2234,
2934,
273,
436,
2929,
310,
417,
16694,
2217,
253,
14208,
6333,
760,
10829,
2279,
253,
3159,
275,
253,
5407,
3425,
387,
253,
3733,
3408,
12421,
534,
3133,
625,
751,
247,
941,
42072,
1332,
281,
436,
990,
891,
1804,
253,
4477,
2118,
247,
3213,
3579,
281,
1918,
625,
10527,
22909,
327,
849,
391,
11427,
2987,
2581,
685,
816,
970,
16774,
4679,
323,
12820,
16280,
891,
717,
12371,
1880,
436,
14208,
476,
320,
3732,
342,
643,
2905,
8892,
374,
186,
17480,
1046,
3159,
3198,
247,
2753,
7983,
873,
253,
4477,
943,
7472,
253,
6733,
273,
436,
789,
50276,
32897,
3730,
281,
32213,
50276,
7152,
33032,
253,
2929,
310,
973,
3542,
285,
253,
2605,
310,
9266,
2590,
285,
7729,
281,
2096,
253,
2934,
285,
849,
352,
310,
9009,
50276,
783,
1332,
310,
973,
5544,
1097,
970,
247,
48656,
5740,
285,
271,
5933,
280,
581,
20320,
18,
50276,
783,
2934,
3212,
253,
1332,
3133,
281,
320,
973,
17285,
407,
253,
4679,
2529,
275,
2593,
4567,
342,
690,
24626,
2361,
275,
253,
1735,
2593,
50276,
783,
1566,
310,
2429,
281,
767,
3332,
4081,
3082,
26332,
253,
8245,
3210,
50276,
783,
1332,
31326,
1077,
1029,
16226,
2429,
281,
253,
8245,
3210,
50276,
783,
403,
760,
767,
2022,
14855,
50275,
3529,
253,
2127,
310,
417,
2130,
50276,
4674,
495,
253,
16038,
310,
1754,
760,
327,
9098,
3530,
432,
581,
10895,
2139,
858,
368,
417,
3597,
327,
1142,
3530,
432,
1027,
15302,
50275,
783,
4477,
12661,
247,
747,
45860,
48960,
1650,
5481,
1332,
391,
11427,
253,
1332,
556,
253,
4736,
281,
4499,
2753,
7983,
19137,
3169,
45860,
48960,
8104,
407,
40944,
3632,
3000,
275,
253,
2505,
3530,
342,
253,
4388,
273,
25473,
253,
15577,
5016,
875,
3000,
436,
5016,
310,
908,
407,
40567,
281,
10480,
253,
1566,
285,
697,
12536,
275,
48960,
3530,
19132,
253,
3210,
7200,
50275,
4064,
247,
12611,
3368,
3133,
2590,
326,
21690,
3530,
403,
417,
5876,
407,
253,
19137,
273,
690,
3000,
342,
616,
2753,
2421,
983,
1223,
48960,
4394,
403,
10693,
5876,
9073,
253,
1566,
281,
9113,
30215,
731,
4983,
432,
436,
1783,
253,
4477,
12661,
436,
1332,
253,
1332,
8687,
253,
8869,
273,
465,
7629,
273,
1016,
3410,
285,
2403,
253,
1566,
3283,
247,
5203,
323,
1016,
7629,
604,
253,
954,
8131,
5203,
310,
4503,
281,
253,
581,
8131,
407,
253,
1566,
323,
253,
3236,
3410,
840,
253,
2505,
369,
417,
40634,
5010,
4754,
253,
3410,
310,
271,
48960,
581,
50275,
783,
2929,
310,
973,
3542,
285,
253,
4081,
1332,
310,
3576,
352,
651,
320,
4722,
281,
923,
352,
275,
643,
8892,
25761,
891,
1804,
368,
1611,
281,
13469,
253,
256,
4373,
19484,
970,
247,
4229,
873,
273,
3523,
12113,
417,
7976,
407,
253,
3530,
50276,
783,
403,
760,
767,
2022,
14855,
50275,
3529,
253,
2127,
310,
417,
2130,
50276,
4674,
495,
253,
16038,
310,
1754,
760,
327,
9098,
3530,
432,
581,
10895,
2139,
858,
368,
417,
3597,
327,
1142,
3530,
432,
1027,
15302,
50276,
977,
690,
27819,
2792,
50276,
4674,
4567,
50276,
338,
368,
8989,
12421,
849,
513,
368,
871,
326,
310,
253,
2740,
273,
15577,
5016,
273,
3000,
326,
5644,
281,
841,
1543,
275,
253,
48960,
3530,
5046,
368,
34741,
253,
48960,
3000,
285,
436,
5459,
253,
7200,
247,
1355,
1783,
273,
436,
1083,
651,
452,
644,
4722,
50276,
6275,
12494,
273,
2593,
4567,
1014,
275,
436,
1083,
50276,
5430,
812,
368,
1333,
326,
253,
16226,
7756,
310,
1955,
281,
19137,
273,
8704,
3000,
1675,
281,
48960,
3159,
390,
273,
253,
48960,
3159,
3139,
310,
2590,
326,
253,
5912,
273,
40944,
247,
8704,
3159,
310,
3012,
2169,
50276,
2858,
247,
625,
801,
554,
394,
1783,
651,
452,
644,
4722,
50276,
4674,
5922,
323,
253,
2561,
273,
2753,
2421,
983,
2139,
417,
897,
247,
21165,
3169,
5933,
275,
436,
1083,
368,
403,
2761,
2119,
281,
1379,
247,
2753,
7983,
1223,
970,
253,
21496,
4197,
407,
247,
11454,
2990,
812,
417,
320,
4999,
50276,
4674,
5922,
268,
310,
247,
4229,
1180,
390,
247,
5912,
275,
20320,
337,
285,
275,
253,
1551,
273,
253,
2929,
352,
3133,
247,
5912,
285,
436,
2789,
3282,
2299,
1060,
368,
1333,
359,
12421,
3410,
268,
3000,
436,
310,
3164,
247,
10551,
50276,
7152,
33032,
18,
186,
783,
16038,
310,
5272,
407,
40944,
3000,
342,
2753,
2421,
983,
5547,
8104,
403,
7520,
285,
954,
273,
253,
673,
253,
10554,
1543,
3464,
19965,
374,
186,
783,
1332,
310,
2969,
285,
3576,
285,
253,
9759,
273,
253,
1332,
310,
2590,
495,
186,
783,
5661,
1543,
921,
253,
12510,
273,
253,
1332,
50276,
18,
186,
2967,
87,
15693,
465,
1099,
17438,
285,
19241,
253,
13964,
1566,
11794,
323,
581,
5481,
1014,
323,
465,
740,
247,
1318,
281,
4044,
6474,
3045,
253,
2105,
310,
247,
2372,
1029,
275,
4499,
7644,
285,
269,
72,
8819,
15042,
21690,
17438,
3587,
534,
943,
320,
1199,
7938,
923,
1840,
2490,
187,
4118,
18435,
27,
13518,
2278,
323,
253,
954,
629,
30628,
3543,
253,
4081,
1332,
369,
11464,
281,
320,
3576,
285,
326,
253,
2929,
16988,
247,
4217,
7680
] |
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper surveys various adversarial defense methods on their performance when the perturbation distortion epsilon is increased the author argues that robustness for a specific epsilon may not be enough and suggests robustness curves as an alternative i think in general this paper provides some interesting empirical studies my detailed comments are as follows 1 the overfitting of specific epsilon value is expected but interesting to see and i think this is one of the main reasons why a robustness curve is necessary however in this adversarial robustness community the status quo is that researchers compare with each other on some specific datasets with some specific epsilon for example 03 for mnist 8255 for cifar i think one reason for choosing these values is that studying robustness under perturbation with larger distortion is kind of unnecessary because then the noise added is no longer imperceptible which is at odds with adversarial examples definition i think the authors may need to provide more discussion on why studying robustness with epsilon06 for mnist is necessary since we may misclassify those images as human 2 the authors suggest a robustness curve as an evaluation metric however i havent seen any works on improving the robustness for all epsilon values globally one possibility is that there is a tradeoff between small epsilon performance and large epsilon performance similar to the tradeoff between robustness and accuracy epsilon0 performance my suggestion is that the authors may define an area under the curve value just like in roc curves for better comparison 3 minor please refrain from only using color to distinguish curves in figures as it may not be friendly to readers with color blindness docsepsummary the authors advocate for the use of robustness curves plotting the adversarial accuracy as a function of the size of the neighbourhood region of allowed perturbation the problem that they identify is that if you only evaluate adversarial accuracy at some numbers of threshold you might conclude that some models and the method that was used to train them are more robust than others while it would be incorrect at other thresholds general comment the paper clearly describes the problem it deals with and reads easily on the other hand i am not entirely convinced by the importance of the problem if you have a particular specification that you care about robustness against linf attack for eps01 you can just verify that on the other hand if your goal is i want my network to be robust then its not properly defined so of course its hard to evaluate robustness curves will help there but there is still the problem that you might want to be robust to linfinity l1 l2 brightness difference wasserstein difference changes of small patches and then the robustness curves will not help you unless of course you compute one for each difference at the same time they are quite a bit more costly to compute that simple point measures while i agree that you are going to get more information if you compute a full robustness curve than if you sample it at a bunch of points im not convinced that it is worth the effort one thing that i would recommend the authors is to make clearer the distinction between robustness curves as they described them based on finding the closest adversarial example vs plotting on a robustness curve the pointwise measures and interpolating through them this would be a much cheaper solution and is essentially what reporting experimental results for a few chosen eps achieves 
for example the results the authors give in their table 1 based on point wise measures wessentially achieves what the authors want to show no defense strictly dominate the others specific comments the toy dataset example presented in figure 1 is great and provides a great explanation of the problem that the authors identify the constructive proof in appendix a is also quite interesting and really drives the point that the authors want to make the authors argue that robustness curves allow to compare global robustness properties and their dependence on a given classifier distribution and distance function in practice does it really give insights global robustness property if i look at the linfinity robustness curves it does not tell me much about the robustness to l2 perturbations i dont understand how the robustness curves are generated for the linfinity case if pgd is used to find adversarial examples its not likely to be the closest adversarial examples that is going to be found in all likelihood its going to be one that matches the epsilon given as input to the pgd attack due to the projection there is nothing that is even encouraging the sample to be close to the input beyond the constraints used for projection even for the l2 distance there is still the usual problem that although the cw attack encourages to find the closest sample there is no guarantee that it will and the effectiveness it will have at doing so might depend from model to model as a result its hard to decouple the robustness curve from the attack that it used internally minor notes typos to improve the look of the paper it should be possible to include manual linebreaks in the title so that its not broken in every line the authors talk in the introduction about recently proposed robustness curves and cite a paper from 2020 for them but it seems like those curves were already in use before that on the effectiveness of interval bound propagation for training verifiably robust models gowal et al had some in 2018 provable defenses against adversarial examples via the convex outer adversarial polytope had some transposed in 2017 at the end of the introduction the author say it is our belief that the continued use of single perturbation thresholds in the adversarial robustness literature is due to a lack of awareness of the shortcomings of these measures this seems overtly harsh you could make the same point about training algorithms and say that authors only reporting on only a few datasets due it just out of lack of awareness of the fact that the relative performance of different algorithms will vary depending on the dataset given that computing robustness curves needs computing the closest adversary to a point this is much more expensive so maybe computational cost might be the differentiating factor rather than lack of awarenessdocsepthis paper presents a theoretical scenario where pointwise measure of adversarial robustness falls short in comparing model robustness then conduct experiments to show that robustness curve is a more meaningful evaluation metric from a global perspective pros the motivation is well explained i mainly agree with the authors on the argument that pointwise measurement of robustness may be insufficient in explaining model robustness computing and visualizing robustness curves seems to be more meaningful and rigorous from a security perspective relating the choice of the perturbation strength to the underlying property of the data distribution is useful the interclass distances demonstrated in 
table could potentially be used as a reference on determining the right scale of perturbation strength cons the robustness results presented in table 1 seems far below the stateoftheart robustness for instance in the last row epsilon8255 the robust test error of at is 092 which is much higher than the reported statistics in madry et al 2018 the author uses a very small 4layer convolutional neural network for cifar10 experiments whereas the stateoftheart robustness results are achieved using a much larger network such as a resnet architecture or a wideresnet architecture refer to 1 for the current best robustness results on cifar10 thus i recommend authors to rerun these experiments using a larger network similar architecture is used for the robustness curves in figures 3 and 4 this suboptimal choice of network architecture makes the argument it contains multiple intersections in the robustness curve unconvincing other questions or comments 1 most of the existing defenses against adversarial examples are typically trained using a specificallychosen perturbation strength if adopting robustness curves or global robustness as the evaluation criteria instead of pointwise robustness how will this affect the existing adversarial training procedure 2 what does the distance statistics presented in table 2 suggest for the typical choice of perturbation strength used in existing literature 3 the global robustness considered in this paper is robustness for varying perturbation strength is there a way to define the perturbation strength for different input locations based on your computed interclass statistics 4 the bibliography style of the reference is not standard check if you are using the correct file 1 reliable evaluation of adversarial robustness with an ensemble of diverse parameterfree attacks francesco croce and matthias hein icml 2020 docsepsummary the paper showed that pointwise measures fail to capture important properties that are essential to compare the robustness of different classifiers the authors introduced the recently proposed robustness curves to provide a global perspective including the scale as a way to distinguish small and large perturbations pros 1 how to compare the robustness is a very important question for current machine learning models the author introduced a better criterion for this important question 2 the paper is well written and easy to read the experiments and its discussion are strong proofs to support that the robustness curve should be the better criterion 3 the authors released code to reproduce all the experiments for the current popular frameworks it will be very helpful for the researchers to the advantages of this curve over pointwise measures cons overall this paper is impressive the only concern the reviewer has is the contribution compared to the previous work who proposed the robustness curve c gopfert et al 2020 it seems this papers contribution is highly based on the proposal of robustness curve and providing more explanations and discussions but this will not affect the importance of this paper
### Summary:
the authors study robustness curves which are plots of the robust error versus the radius used in the corresponding lpball threat model pro i completely agree with the authors that the current evaluation purely based on evaluation for a single radius is insufficient and one should report the complete curve con the authors are overclaiming that they have come up with robustness curves very early papers eg even in the adversarial training paper of madry there are plots of robust accuracy versus chosen threshold moreover i agree with one of the reviewers that using pgd for the purpose of a robustness curve is inaccurate and in particular inefficient as several attacks for different radii have to be done there have been several attacks developed which aim to find the adversarial sample with minimum norm and thus compute the robustness curve in one run the additional insights eg intersection of robustness curves are partially to be expected and i dont find them sufficient to move the paper over the bar for iclr as these insights are additionally only shown for relatively small models which seem far away from the state of the art it is unclear if they generalize however i encourage to follow some of the reviewers suggestions to improve the paper
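To make the curve that the reviewers and the meta-review describe concrete, the sketch below computes a robustness curve and an area-under-curve summary from per-example minimum-norm adversarial distances, i.e. the "one run" approach mentioned above. It is only an illustrative sketch: it assumes a minimum-norm attack (for example a Carlini-Wagner-style L2 attack) has already been run for every test point, and all function and variable names are made up for the example rather than taken from the reviewed paper.

```python
import numpy as np

# Illustrative sketch only; not code from the reviewed paper.

def robustness_curve(min_adv_dist, correct, radii):
    """Robust error as a function of perturbation radius.

    min_adv_dist : per-example norm of the smallest adversarial
                   perturbation found by a minimum-norm attack
                   (np.inf if the attack never flipped the label).
    correct      : boolean array, clean prediction correct or not.
    radii        : radii at which to evaluate the curve.
    """
    min_adv_dist = np.asarray(min_adv_dist, dtype=float)
    correct = np.asarray(correct, dtype=bool)
    # A point counts as a robust error at radius eps if it is already
    # misclassified or an adversarial example exists within distance eps.
    errors = [np.mean(~correct | (min_adv_dist <= eps)) for eps in radii]
    return np.asarray(errors)

def robustness_auc(min_adv_dist, correct, radii):
    # Single-number summary of the curve (analogous to the ROC-AUC
    # suggestion made in the first review), via the trapezoid rule.
    curve = robustness_curve(min_adv_dist, correct, radii)
    auc = np.sum((curve[1:] + curve[:-1]) / 2 * np.diff(radii))
    return auc / (radii[-1] - radii[0])

# Toy usage with made-up numbers.
radii = np.linspace(0.0, 1.0, 101)
dists = np.array([0.3, 0.7, np.inf, 0.05])
clean_ok = np.array([True, True, True, False])
curve = robustness_curve(dists, clean_ok, radii)
```

Plotting the returned errors against the radii gives the robustness curve; comparing whole curves, or their AUC as the first review suggests, avoids committing to a single threshold such as 8/255.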
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the manuscript improves the equivalent graph neural network gnn to tackle the possible inefficiency when the model predicts the dynamics of a physical system when the system symmetry is partially broken by an external force such as gravity the subequivariant gnn sgnn proposed in this work introduces the hierarchical architecture in terms of the particles and objects where the subequivariant message passing is incorporated into the model to deal with complicated object interactions the approach with the additional freedom can be more accurate in the task to evaluate the physical dynamic of objects from vision and achieve an impressive generalization compared with gns model etc originality the work introduces the concept of subequivariance based on the hierarchical network to extend the capability of gnn in the task of learning the physical dynamics from vision information the work has very good originality quality the model is developed rigorously based on the understanding of the rotation equivariant group the design and test of sgnn with physion dataset are thoughtful and convincing clarity the presentation of the manuscript is clear however the background and the motivation are confusing it is a bit hard to understand the challenge of the problem until i read the online introduction of physion dataset significance though i may not consider this work as a breakthrough of gnn model development in the focused task the fresh idea such as the subequivariance described in the manuscript indeed helps the neural network achieve better accuracy and generalization the work can bring people to pay attention to the importance of developing a model with appropriate physical constrain for a similar task yes this is a work dedicated to developing the machine learning model for science there would be no potential negative societal impact docsepthis paper targets an interesting question of embedding equivariance into graph neural networks for physical dynamics recovery the main focus is to 1 relax the strict constraint for cases with gravity and 2 consider the differences in self and mutual interactions during learning this work targets the learning of physical system dynamics which is an important topic and needs more attention in the community the authors provide a clear illustration of the relevance to existing works on 1 graph neural networks to model interactions among particlesobjects and 2 using physical constraints as an inductive bias to improve generalizability the authors mainly propose subequivariance against the case with gravity and design multistage modeling to account for differences in object properties such as shape to the best of the reviews knowledge this work is new please see the questions above docsepin this work authors present a new formulation of the equivariant graph neural network namely subequivariant gnn sgnn which allows the modeling of systems with symmetry breakage such as gravity in addition to model systems with different shapes and geometry an additional feature that represent the object type is added so as to distinguish the intraobject elasticityrigidityplasticity and interobject interactions collisionrepulsionweak attraction a hierarchical modeling approach is implemented to address particle and objectlevel interactions separately by considering different physical scenarios for instance collision and contact prediction the superior performance of sgnn is demonstrated overall the work is wellwritten and clearly presented it builds on the equivariant gnns and modifies 
it to present the idea of sgnn which can include directional symmetry breaking such as gravity tsome of the main comments regarding the work are as follows 1 while the idea is useful the implementation and proofs are fairly straightforward extension from the equivariant case also no additional inductive biases to preserve the physics as in the case of hamiltonian or other physicsinformed gnns are implemented this raises a question on the validity of the trajectory predicted specifically no comments on whether the trajectories represent a physically feasible realization is not discussed this is important because one of the major advantages the authors claim for sgnn is the ability to learn the dynamics 2 authors refer to previous works such as hamiltonian gnns and mention that they do not consider rotational equivariance this is incorrect in hamiltonian gnns the edge embeddings can be modified to have the distance l2 norm of the difference in positions instead of giving simply the vectorial difference since the functional form to be learned in the case of a hamiltonian is scalar this approach also works very well and is both translationally and rotationally invariant 3 the examples demonstrated in the work are those where contact seems to be of main interest it is not clear how much better the model would perform in other cases where contact is not necessarily the primary interest but dynamics is now if the main focus of the work is simulate contact then there has been several works which attempted to do this for instance zhong yd dey b and chakraborty a 2021 extending lagrangian and hamiltonian neural networks with differentiable contact models advances in neural information processing systems 34 pp2191021922 which employ a similar to idea of cutoffbased contact detection indeed they do not employ a graphbased approach however in the present case although the particles are considered there do not seem to be any deformation simulated and hence these approaches referred should also stand equally valid as sgnn 4 also the experiments chosen seem to be favorable for sgnn in comparison to the other baselines for instance implementing egnn and gmn exactly as they are with gravity is expected to yield poor performance as the architecture expects the data to be rotationally invariant similarly gns and dpi learns purely from data and hence are unaware of symmetry unless trained specifically for it more interesting examples where the situations where gns and egnn have been shown to yield sota performance could give a more realistic representation of how better sgnn is in comparison to these models there are several limitations for the present work which the authors should consider 1 although a particlebased approach is employed unlike previous works such gns there are no discussions on deformable systems it is not clear how much improvement sgnn can give for deformable systems 2 as mentioned in the questions discussions on selfcontact and how this is incorporated is missing 3 performance of sgnn on systems with drag and other dissipative forces are lacking 4 in the case of a deformable system consider a scenario where a particle from a given object breaks away gets reflected by the wall and then comes back to interact with the same object in such case if the particle comes with the varepsilon cutoff distance does sgnn model this as a contact or as a particle of the object in other words does the particle heal with the object or not if yes this is unphysical it is not clear if the model can address 
this issue docsepthe paper introduces a new model for learning physical dynamics it introduces a concept that is termed subequivariance which appears similar to restricted representations from group theory the paper also introduces a hierarchical message passing to better enable long range interactions between particles the paper also expands the input feature space by computing geometric features of the input particles and their objects the paper experiments on a range of dynamics datasets namely physion and rigidfall strengths the paper adds a binary operation which is the difference in position between adjacent features which adds a translation invariant representation to the input feature space although the benefits and justification for this are not adequately detailed the experiments seem thorough and multiple baselines are used although i am not famailir with these baselines so i cannot comment on whether any relevant baselines have been missed weaknesses major a key concept of the paper is characterising subequivariance as a relaxation of equivariance focusing on the group e3 although this concept has already been considered previously in 12 by using restricted representations of the group rather than the regular representation a criticism of gnns made in the paper is that they seldom explicitly involve the object information into the message passing but this feels like a specific decision choice of the input features rather than some fundamental issue of gnns which a paper can solve also there are papers which consider specific object information such as 3 the concept of having different edges which pass different information appears strongly related to a relevant subfield of gnns exploring automorphism equivariance which has been ignored in this paper 456 in the experiments only rotations around the gravity axis are used surely all rotations should be possible and all the models which do not correctly account for the break in symmetry caused by gravity should produce unrealistic results while if this model works correctly it should break the symmetry and produce realistic results i feel like this would be a far stronger result proving the correctness of the model minor multiple typos grammatical mistakes makes reading the paper difficult l44 59 the object features appear to be the pooled features of its particles which is used to create a feature space by subtracting these from each of the particles features this seems to be a complexly worded way of saying subtracting the mean from the particle features which is a common approach in ml 1 weiler m and cesa g 2019 general e 2equivariant steerable cnns advances in neural information processing systems 32 2 cesa g lang l and weiler m 2021 september a program to build e nequivariant steerable cnns in international conference on learning representations 3 pfaff t fortunato m sanchezgonzalez a and battaglia pw 2020 learning meshbased simulation with graph networks arxiv preprint arxiv201003409 4 de haan p cohen ts and welling m 2020 natural graph networks advances in neural information processing systems 33 pp36363646 5 thiede e zhou w and kondor r 2021 autobahn automorphismbased graph neural nets advances in neural information processing systems 34 pp2992229934 6 mitton j and murraysmith r 2021 local permutation equivariance for graph neural networks arxiv preprint arxiv211111840 na
### Summary:
|
overall this is an interesting paper it proposes a new formulation of the equivariant graph neural network the subequivariant gnn reviewers agree that the proposed idea could be useful to the community albeit with perhaps a small application scope so on the novelty side this paper is okay the biggest concern among the reviewers is about the experiments ie mostly the fairness of the comparison i feel the authors did a reasonable job of explaining why the current baselines were chosen and provided additional experimental evidence the authors could take into account the comments from the reviewers to improve the overall presentation of the paper
|
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper studies the problem of private data analysis in the adaptive-streaming setting more specifically the authors provide a matrix-factorization-based method in this setting and apply it to sgd the proposed method seems to be novel and the empirical results look promising strengths 1 the proposed matrix-factorization-based method seems novel and can recover many existing methods for sgd 2 the authors provide an efficient algorithm to estimate the optimal factorization weaknesses 1 it is unclear how the general factorization will affect the privacy and utility tradeoff when applying it to sgd 2 the method seems to be limited to the case of only one data pass 3 the extension of the current method to fedavg in the experiments is unclear yes docsepthe work analyzes the matrix mechanism in the adaptive streaming setting shows that the gaussian noise addition mechanism can be private in the adaptive streaming setting and designs a parameter-free fixed-point algorithm to compute optimal factorizations for summation-style problems furthermore the work provides a user-level privacy experiment to see the improvement obtained via the approach 1 adaptive privacy analysis is essential and inevitable while most works on continual release analyzed privacy only in a nonadaptive setting 2 theoretical results are solid and demonstrate the efficiency and convergence of the proposed algorithm on the matrix mechanism and dp in the adaptive streaming setting the paper is well written but not easy to understand since its structure is not very clear docsepthis work studies the matrix mechanism which guarantees differential privacy over adaptive streams the authors prove that the gaussian mechanism and (epsilon, 0)-dp mechanisms that are private for nonadaptive inputs are also private for adaptive inputs the authors further characterize the expression for the optimal matrix factorization based on these theoretical intuitions the authors design a matrix mechanism for sgd which has superior performance compared to previous algorithms strengths designing private algorithms for adaptive streaming is an important problem in differential privacy the authors prove fundamental theoretical results for the matrix mechanism that lead to a practical algorithm for sgd which empirically outperforms existing methods the paper is well structured and clearly written weaknesses some algorithms do not have formal theoretical guarantees namely a convergence guarantee for the fixed-point optimization algorithm used in the experiments the effect of a suboptimal matrix factorization on algorithm performance and a convergence rate for the dp factorization for sgd algorithm 1 limitations and potential negative societal impact are adequately addressed
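for reference, the generic matrix mechanism that these reviews refer to can be sketched in a few lines: to release the prefix sums A x of a gradient-like stream x, one factors A = B C and publishes B (C x + z) with gaussian noise z scaled to the sensitivity of C. the trivial factorization below is only meant to fix notation, it is not the papers optimized or adaptive construction.

```python
import numpy as np

def matrix_mechanism(x, B, C, noise_multiplier, rng):
    # release B (C x + z); the per-step L2 sensitivity of C x is the largest column norm of C
    sensitivity = np.linalg.norm(C, axis=0).max()
    z = rng.normal(0.0, noise_multiplier * sensitivity, size=C.shape[0])
    return B @ (C @ x + z)

n = 8
A = np.tril(np.ones((n, n)))   # prefix-sum workload, the summation-style problem behind sgd
B, C = A.copy(), np.eye(n)     # placeholder factorization A = B C (plain input perturbation)
rng = np.random.default_rng(0)
x = rng.normal(size=n)         # stand-in for scalar per-step gradient contributions
print(matrix_mechanism(x, B, C, noise_multiplier=1.0, rng=rng))
```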
### Summary:
|
the paper applies the idea of the matrix mechanism to the problem of dp learning which releases sequences of adaptively chosen gradients this can be viewed as a generalization and refinement of the dp-ftrl approach the approach is interesting and the empirical results are stronger than existing alternatives such as dp-sgd and dp-ftrl the idea is similar to a recent work fhu22 https://arxiv.org/abs/2202.11205 on the problem of continual release of counting queries but the authors have claimed independence and carefully explained the differences all reviewers are supporting acceptance and so do i
|
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper considers the online facility location problem with predictions in this problem there is a sequence of points which arrive online and the algorithm must either open a facility to serve each point paying the facility opening cost plus the distance to the opened facility or it can assign the point to an already open facility paying the distance only additionally each arriving point is given a prediction of a facility which serves it in an optimal solution the goal is to be competitive with the offline optimal solution and the competitiveness is parameterized by the error in the predictions here the prediction error is the maximum distance between a predicted facility and the facility in the optimal solution which serves a point the main result is an algorithm which is $O(\min\{\log n, \log \frac{n\eta_\infty}{\mathrm{OPT}}\})$-competitive where $\eta_\infty$ is the max error and opt is the optimal cost this improves over worst case lower bounds when $\log \frac{n\eta_\infty}{\mathrm{OPT}} = o(\frac{\log n}{\log\log n})$ and nearly matches worst case bounds otherwise at a high level the algorithm first follows meyersons algorithm then opens some additional facilities by taking into account the predictions in order to get the stated bound the predictions are calibrated so that $\eta_\infty = O(\mathrm{OPT})$ so that $\log \frac{n\eta_\infty}{\mathrm{OPT}} = O(\log n)$ thus it suffices to only prove the bound of $O(\log \frac{n\eta_\infty}{\mathrm{OPT}})$ a lower bound is also given showing that the result is nearly tight additionally an experimental validation is carried out comparing the proposed algorithm with meyersons algorithm and a naive algorithm which follows the predictions the experiments are performed on datasets where the underlying metric is either euclidean or a graph metric the predictions are simulated to have a given level of error while the proposed method underperforms against blindly following the predictions when the error is small it is more robust to large prediction errors and often comes close to meyersons algorithm when the prediction error is large this paper follows a recent line of work in considering online algorithms with error-prone predictions the main results are an online algorithm for online facility location in this setting and a nearly matching lower bound which are interesting the experimental evaluation is reasonable but not overly exciting since the improvement over meyersons algorithm seems to be very narrow in some cases the presentation is mostly clear but some improvements could be made especially to the analysis overall i would be okay with accepting this paper see below for some further comments and suggestions this paper considers the predictions as coming from an oracle recent work has started to explore the question of how to construct predictions for online algorithms from past data eg 1 2 3 below it would be interesting to explore how to construct predictions for online facility location from past data especially for an empirical evaluation 1 customizing ml predictions for online algorithms anand et al icml 2020 2 learning online algorithms with distributional advice diakonikolas et al icml 2021 3 learnable and instance robust predictions for online matching flows and load balancing lavastida et al esa 2021 some suggestions for the related work section which could be fleshed out more purohit et al 2018 also considers nonclairvoyant scheduling on a single machine secretaries with advice dütting et al ec 2021 also considers the secretary problem with predictions flow time scheduling with uncertain processing time azar et al stoc 2021 and online scheduling via learned
weights lattanzi et al soda 2020 consider flow time scheduling and online load balancing respectively in settings with errorprone predictions minor commentssuggestions at the end of page 1 such as experts advices such as expert advice in the calibrating predictions paragraph second last second to last in the calibrating predictions paragraph fracn etainftyopt olog n should probably be fracn etainftyopt on in the proof of lemma 36 the fact that we does not the fact that we do not paragraph before lemma 38 let barell be the integer that let barell be the integer such that in the experiments the four plots seem to have different scales on the xaxis this seems to be due to the differing distance scales across each dataset it might be helpful for the presentation to either explain this in the text or maybe present the plots with something like netainfty opt on the xaxis edit after reading author responses the authors responses have cleared up my questions i appreciate the experiments with the simple predictor which seems reasonable to implement in practice and performs well on the considered datasets i have raised my score to accept this paper considers a fundamental online problem in the presence of predictions with a nearly tight result for this setting the experimental validation is reasonable some improvements could be made to the presentation but it is currently clear docsepthis paper studies the classical online facility location problem in a metric space given a new demand in this space the algorithm will either open a new facility by paying an opening cost wf and assign the demand node to this facility or it will assign the demand node to an existing open facility the cost incurred by the algorithm is the sum of all the opening costs and assignment costs in this paper the authors consider the setting where along with every demand node we are also given a prediction on which facility it gets assigned to in the optimal such predictions can potentially be acquired through historical data their main contribution is an online algorithm for facility location with predictions that has a nearoptimal competitive ratio the competitive ratio relies on a parameter etainfty which is simply the largest prediction error where prediction error for any demand point is the distance between the predicted facility and the facility that is assigned to the demand in the optimal solution intuitively the authors show that if the predictions are accurate ie etainfty is small then their algorithm is o1competitive on the other hand if the predictions are highly inaccurate the algorithm will become olog n competitive the algorithm relies on an olog n competitive worstcase optimal randomized online algorithm by meyerson the idea becomes very simple when facilities have uniform opening costs suppose meyersons algorithm opens a facility one can simply also open a facility at the predicted location by doing so the cost incurred will not be more than twice that of meyersons algorithm which is olog ncompetitive however when the predictions are accurate opening the facilities at predicted locations lead to better future assignments leading to o1competitive algorithm they extend this approach in a nontrivial way to the case with nonuniform opening costs strength of this paper the paper gives a natural relation between prediction error and competitive ratio for a classical online problem furthermore the lower bound presented in the paper makes the competitive ratio of their algorithm nearoptimal due to this i find overall 
result in this paper to be a nice one on the technical side the authors present a finegrained analysis of meyersons algorithm parameterizing the competitive ratio on the prediction error although it appears somewhat straightforward it seems to be a nice contribution weakness i have three sets of concernscriticisms a their algorithm uses the worstcase algorithm by meyerson as a black box to guide its decisions a more natural approach would have been to incorporate predictions within meyersons algorithm their experiments seem to suggest that even with perfect predictions the empirical gains over meyersons algorithm seem only minor i suspect that relying on meyersons cost as a black box may be the main reason for this indeed the authors explicitly state that they do not make any attempt to reduce the hidden constant within the bigo notation of the competitive ratio so there may be other algorithms that have a similar theoretical tradeoff but do significantly better in practice b there are several presentation issues in the paper these include undefined notations typos and minor inaccurate lemmatheorem statements i point some of them out below these issues make it challenging to fully verify and appreciate the theoretical claims especially in a short time frame experimental result section is somewhat confusing as well the authors do not explain their choice of data sets and what makes these data sets real for the facility location problem also how are the nonuniform costs assigned to facilities for the nonuniform data set c finally i do not see why this prediction model is natural for this problem how can we use historical data to generate predictions with small eta value presentation issues are listed below 1 opt is defined to be a value sometimes and a set of facilities on some other occasions for instance in the statement of theorem 11 2 can etainfty be 0 if so how should we interpret your competitive ratio 3 section 12 paragraph 2 there is a missing log is the competitive ratio of meyersons algorithm 4 section 13 determined deterministic 5 fopen is not initialized in the procedure pred l23 of the algorithm uses it without any initialization 6 a short overview of meyersons algorithm in the preliminaries can be very helpful 7 lemma 35 xf is undefined 8 theorem 12 should it be for etainfty 0 9 page 6 we does does not seem correct 10 page 6 line 1 that such that this seems to be a recurring issue 11 there are several other typos a careful proofreading of the paper is necessary for it to be of publishable quality due to the weaknesses mentioned above i do not believe the paper is ready for publication update thank you for addressing my concerns after reading the new version i am happy to update my score docsepthe paper studies a variant of the online facility location problem where each demand point arrives with a prediction of which facility it is assigned to in an optimal solution in line with recent work on learningaugmented algorithms the paper designs an algorithm whose performance degrades gracefully with the quality of the predictions and yet retains an almost optimal worstcase guarantee overall the paper is very wellwritten and does a good job of explaining the key ideas behind the algorithm and proof techniques the algorithm itself is quite natural and also likely to be useful to practitioners essentially the proposed algorithm is to use myersons algorithm for online facility location and then open an additional facility at the predicted location whenever myerson decides to open one 
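this rule can be sketched concretely for the uniform opening cost case; the following is a minimal illustration under simplifying assumptions (points as 2d coordinates, simulated demands and predictions) and not the authors implementation.

```python
import math
import random

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def augmented_meyerson(demands, predictions, f, seed=0):
    # meyersons randomized rule plus the augmentation described above:
    # whenever a facility is opened, also open one at the predicted facility
    rng = random.Random(seed)
    open_facilities, cost = [], 0.0
    for point, predicted in zip(demands, predictions):
        d = min((dist(point, q) for q in open_facilities), default=float("inf"))
        if rng.random() < min(d / f, 1.0):         # open with probability d / f, capped at 1
            open_facilities.append(point)          # this client is then served at distance 0
            cost += f
            if predicted not in open_facilities:   # augmentation: trust the prediction as well
                open_facilities.append(predicted)
                cost += f
        else:
            cost += d                              # connect to the nearest open facility
    return cost, open_facilities

demands = [(0.0, 0.0), (0.1, 0.2), (5.0, 5.0), (5.1, 4.9)]       # simulated demand points
predictions = [(0.0, 0.1), (0.0, 0.1), (5.0, 5.0), (5.0, 5.0)]   # simulated predicted facilities
print(augmented_meyerson(demands, predictions, f=1.0))
```

with accurate predictions the extra facilities opened this way sit close to the optimal ones, which is what drives the constant-competitive behaviour described in these reviews.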
the key technical contributions are for the variant when facilities have nonuniform costs in which case the additional facilities are opened near the predicted location the proposed algorithm obtains a competitive ratio of olog n etainftyopt where etainfty is the maximum prediction error so the algorithm only improves upon the worstcase setting as long as each prediction is opt the authors also demonstrate a matching lower bound though and in particular show that the dependence on n etainfty is unavoidable q is it easy to compute the consistency of the algorithm ie the competitive ratio when the predictions are perfectly accurate can one try to obtain a 1epsilonconsistent but also robust algorithm q can the robustness bound be improved to olog n log log n minor comments abstract the last sentence of the first paragraph needs some punctuation page 4 last line of section2 0leq k leq l 1 leq k leq l page 5 last line in by lemma 35 is by the statement of lemma 36 bounds costm but the proof only bounds costp it will be good to remind readers that the expected cost is the same overall the paper is a nice addition to the new area of learningaugmented algorithms the proposed algorithm is practical and almost optimal though it would be nice to obtain the optimal robustness bound of olog n log log n the paper seems to be a good fit for iclr docsepthe paper studies the classic problem of online facility location augmented with a prediction model the main contribution is a logarithmic competitive ratio algorithm for the case where facility opening costs are nonuniform strengths of the paper the paper is reasonably well written for most parts except perhaps please provide a bit more description of the algorithm instead asking users parse through the pseudocode the nonuniform case requires a nontrivial amount of analysis and may be of interest to the community weaknesses of the paper while theoretically the problem is interesting i am not convinced of the practicality of the model how will you construct a model that can predict what facilities an optimal solution would pick it depends on the other facilities that have already been chosen so what dataset would you use to train such a model in the real datasets based experiments this important aspect is simulated using a 3approximation algorithm to find the optimal solution and then randomized in accordance with prediction error ninfty this needs to be clarified while the nonuniform case is interesting as opposed to the uniform case which is down right trivial and needs a good bit of work to prove the idea is quite straightforward and predictable indeed the doubling trick used because we do not need ninfty is quite common place in the field of approximation algorithm skirental problem purohit et al for example uses a similar trick lastly as a pedantic issue i would really question the suitability of this work in iclr there is no mlai if one discounts the vague motivation which i have issues with as well the key issue that needs to be addressed is the motivation for assuming that new facility locations can be predicted using a machine learning model i feel this is quite unrealistic and significantly diminishes the value of the paper it would be great if the authors could address this issue in the rebuttal phase otherwise i think theoretically the paper is worth publishing albeit at a more theoretical computer science such as esa or approx docsepthis paper presents an algorithm for online facility location with predictions the prediction model is that on 
the arrival of each client the prediction provides a facility to connect this client to the quality of the prediction is quantified in the result by the maximum prediction error ie the maximum distance between the locations of the predicted facility and a facility serving the same client in a fixed optimal solution the main result is that the competitive ratio is constant which is the best offline ratio when the maximum prediction error is small eg when it is a 1nfraction of the optimal cost which of course if true when the predictions are precisely correct and olog n which is the best online ratio ignoring lower order terms when the prediction error is large the algorithm is an augmentation of an online algorithm for the problem by meyerson that obtains a competitive ratio of olog n where in each step in addition to running meyersons algorithm the algorithm uses a second copy of the incremental cost in meyersons step to preemptively open more facilities close to the predicted facility for the current client in addition to the theoretical analysis the paper includes an experimental evaluation of the algorithm comparing it to baselines produced by an algorithm that blindly trusts the prediction and meyersons algorithm strengths 1 the paper is technically solid and the analysis although it builds heavily on meyersons analysis is nontrivial i would say that from a purely technical perspective the paper passes the bar for the conference 2 the facility location problem is an important one while i am generally inclined more toward papers that develop general purpose tools for the online algorithms with predictions area instead of giving very specific algorithms for particular problems i think online facility location is a problem that is fundamental enough that it merits study in this framework as the paper mentions this is not the only recent paper on this very problem although the results are in this paper are more general in that they apply to onuniform opening costs weaknesses 1 i am really not convinced by the error model in this paper it bases the error on the worst prediction which means a large error even with one bad prediction and every other prediction is precisely correct the paper tries to justify this by showing that the worst case error cannot be replaced by the sum of prediction errors but to me the right solution to this would be to have two components of the error one component that counts the number of mispredictions where the distance by which the prediction missed is irrelevant and the other that captures the ell1 distance for the correct predictions that might still not be exact of course what is counted as a misprediction and what is counted as a correct prediction is not canonical but the algorithmic guarantee can hold for any such choice this would naturally interpolate between robustness where all predictions are counted as mispredictions and consistency where all predictions are correct predictions and the error distance is 0 instead this paper replaces ell1 error with a very loose upper bound of n times the ellinfty error which as i mentioned above wouldnt give anything better than an online algorithm without predictions even if one prediction is badly off 2 my second concern and question to the authors is the dependence of this algorithm on meyersons algorithm the way the algorithm is set up it takes the solution produced by meyersons algorithm and uses the budget provided by this algorithm to essentially reduce future costs by buying facilities near the predicted 
facility what if we replaced meyersons algorithm with another competitive facility location algorithm if the algorithm can use any competitive facility location algorithm that has multiple benefits a the algorithmic framework being designed is more flexible and can be applied to other similar problems instead of designing another algorithm from scratch and b much of the analysis which at the moment reprove properties of meyersons algorithm can be eliminated and simplified as stated the current algorithm seems to rely on specific properties of meyersons algorithm but is that really necessary 3 one minor gripe is the way the algorithm is presented in this paper could you please include a text description perhaps in addition to the pseudocode so that one doesnt have to understand the algorithm solely from the pseudocode this is a credible paper on online facility location with predictions the problem considered is important and relevant and the results obtained are tight in the sense that consistency and robustness results in the right ballpark but i have concerns about the error metric which is the most important parameter here with the current error metric the very realistic possibility of having a single bad prediction is also going to relegate the guarantees of the algorithm to those of having no prediction at all i would also have been happier with an algorithm that can use an online algorithm for the problem in a more black box manner this would make the framework more generally applicable and i suspect this is possible even with the current framework or some modification of it because i dont see the framework as conceptually using anything specific about meyersons algorithm although i might be wrong on that last point docsepthe paper considers the online facility location problem with predictions in this problem there is a set of facilities given offline on a metric and a sequence of demand points appearing online when a demand point arrives we must either route it to a facility we have previously opened or open new facilities and route the demand to one of the new facilities the total cost of a solution is the total cost of facilities opened plus the total distance between each demand point and the facility it was routed to in the learningaugmented setting when each demand point arrives we are also given a prediction in the form of a facility we believe it is routed to in an optimal solution let etainfty be the maximum distance between any the facility we predict a demand point is routed to in the optimal solution and the facility it is actually routed to in the optimal solution the authors give a olog fracn etainftyoptcompetitive algorithm where n is the number of demand points and opt is the cost of the optimal solution without loss of generality we can assume etainfty oopt so this is at worst an olog ncompetitive algorithm they also show that this is the optimal competitive ratio up to olog log n factors even if eta1 the total distance between predictions and opts facilities rather than the maximum is constant the authors also give experiments showing that on realworld data sets with synthetic predictions that have a bounded etainfty value their algorithm outperforms meyersons when the prediction error is low outperforms following the predictions when the prediction error is high and is not too much worse than meyersons when the prediction error is high the algorithm the authors use first uses the algorithm of meyerson for the setting without predictions in this algorithm for each 
power of 2 2k we consider opening the closest facility to the demand point whose cost is at most 2k or that is already in our set of open facilities let deltak be the distance from the demand point to this facility we open this facility with roughly speaking probability proportional to fracdeltak1 deltak2k intuitively we will never open a facility that is further from the demand point from a previously opened facility and the probability we open a facility closer than the best previously opened facility is roughly proportional to the decrease in connection cost from opening this facility versus the best facility costing half as much divided by this facilitys opening cost following a round of meyersons algorithm we use a prediction step to possibly open more facilities whose expected cost is at most the increase in facility and connection cost in the round of meyersons algorithm to do so we consider facilities in a ball around the predicted facility whose radius is proportional to the distance from the predicted facility to the nearest facility opened in a previous prediction step and consider opening the cheapest one we will definitely open it if doing so would not cause the cost of the prediction step to exceed the cost of the round of meyersons algorithm otherwise we open it with some probability such that the prediction steps expected cost is exactly the cost of the round of meyersons algorithm the main contribution of the paper is giving a nearoptimal result for online facility location with predictions along with a nearlymatching lower bound the only other result is the fotakis et al paper the authors discuss in the intro which only considers the setting where all facilities have opening cost 1 whereas this paper handles nonuniform opening costs even ignoring the correctness issues of the fotakis et al paper brought up by the authors the algorithms prediction step is to the best of my knowledge a novel one ie it is not simply lifting a technique from another paper and the difference between the weighted and unweighted setting is a key technical challenge in their results so i would view this as a substantial improvement over the fotakis et al paper even if the fotakis et al paper is correctgets a better competitive ratio in the setting where eta1 is small although i agree with the authors that the bound claimed in the fotakis et al paper is somewhat fishy due to the fact that it could be negative when eta1 is large compared to etainfty and was not able to discern from reading that paper where the catch is furthermore the design of the prediction step is fairly nontrivial to me but after reading the paper i felt i had some good intuition for why this was the right way to use the predicted facilities ie that the prediction step had a nice intuitive explanation the experiments also nicely supplement the theoretical results while the lower bound construction is not particularly novel by itself i believe it is a modification of a lower bound proof for ofl without predictions the bound itself nicely complements the upper bound as well my main criticism of the paper is the presentation of the lower bound in particular the claim that the lower bound shows that a bound on eta1 instead of etainfty is unlikely to offer substantial improvements the authors state that their lower bound holds even when eta1 o1 however in the lower bound example the distances in the metric could be smaller than one ie the ratio eta1etainfty which is eg not changed by just rescaling the metric and is in my view the 
right quantity to look at might still be quite large so this doesnt say anything about the relation between eta1 and etainfty ie this ratio can vary depending on the setting of etainfty in their example roughly eta1 is fixed but etainfty can be varied i think if etainfty is sufficiently small in their lower bound one gets eta1 oetainfty which shows that in general no ofraclog neta1optlog log ncompetitive algorithm is possible which would eg say that one cant get a better asymptotic ratio by replacing etainfty with eta1 but im not completely sure if this is the right takeaway and the authors dont spell out any argument like this i would like to see the relationship between eta1 and etainfty stated more clearly in this lower bound as well as a discussion of what this relationship implies about bounds depending on eta1 if the authors wish to make claims that eta1 ll etainfty is unlikely to help like they do in the introduction also i dont believe it is mentioned anywhere that opt does not have to be the offline optimal solution in the analysis which is implicitly being used in the lower bound in this lower bound etainfty is the distance between the predictions and a solution which may be optimal for claritys sake it would be good to either mention this somewhere or the lower bound construction should prove the onefacility solution is optimal if anything the former should strengthen the upper bound result by making it more general and it would be more in line with the empirical results which dont find an exact solution to base predictions off lastly less so a weakness than a lack of strength while online facility location is a fairly fundamental question in the online algorithms space i dont think that compared to eg the primaldual method for learning augmented algorithms by bamas et al this papers techniques are fundamentally breaking new technical ground in the learningaugmented space that could offer insights into other problems again i wouldnt view this a weakness just a reason why i dont feel i can rate this paper highly despite enjoying reading it overall i think the paper is a weak accept if the aforementioned concerns with the presentation of the lower bound construction can be resolved if they are not resolved in the rebuttal period i would rate at a weak reject i think the algorithm and its analysis are fairly nontrivial but intuitive and enjoyable to read about the problem being addressed is one of interest and the lower boundempirical results nicely complement the upper bounds however it is not clear to me if this paper stands out in particular amongst other papers in the learningaugmented online algorithms literature and there are some concerns with the presentation of the lower bound i wish to see addressed before publication
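as a rough illustration of the meyerson-style opening rule discussed in the reviews above, the following python sketch shows one round of the probabilistic rule. the helper names, the cap on powers of two, the initialization of the previous distance, and the tie-breaking are assumptions made for illustration only; they are not the authors' implementation.

```python
import math
import random

def meyerson_style_step(demand, open_facilities, candidate_facilities, dist, cost):
    """One round of a Meyerson-style rule as described in the review:
    for each power of two 2^k, look at the closest facility of opening cost
    at most 2^k (or an already-open one) and open it with probability roughly
    proportional to the drop in connection distance divided by 2^k.
    `open_facilities` is assumed to be a set of facility ids (illustrative sketch)."""
    # distance to the nearest already-open facility (infinite if none is open yet)
    best_open = min((dist(demand, f) for f in open_facilities), default=math.inf)

    k_max = 30                 # assumed cap on the powers of two considered
    delta_prev = best_open     # assumed initialization of the previous distance
    for k in range(k_max + 1):
        budget = 2 ** k
        # candidates whose opening cost fits within the current budget
        affordable = [f for f in candidate_facilities if cost(f) <= budget]
        if affordable:
            delta_k = min(min(dist(demand, f) for f in affordable), best_open)
            # probability ~ decrease in connection cost divided by the 2^k budget
            p = min(1.0, max(0.0, delta_prev - delta_k) / budget)
            if random.random() < p:
                chosen = min(affordable, key=lambda f: dist(demand, f))
                open_facilities.add(chosen)
                return chosen
            delta_prev = delta_k
    # otherwise connect the demand to the nearest already-open facility
    if open_facilities:
        return min(open_facilities, key=lambda f: dist(demand, f))
    return None
```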
### Summary:
|
this paper considers the recent line of work on algorithms with predictions they give new results on the online facility location problem overall the reviewers felt the topic was of interest to the community there were some concerns about the error metric used and the overall framework however the majority of reviewers still felt the paper was interesting and i think the paper can be accepted
|
Below is given review of a research paper from conference journal. Please write a summary of the review.
### Review:
in order to eliminate the need for large training sets one can consider a transition from 1 fullysupervised to 2 selfsupervised methods and then from 2 selfsupervised methods to 3 singleinstance reconstruction methods in the context of accelerated mri reconstruction which is considered in this work the above categories translate into models that are trained based on 1 having access to a fullysampled dataset 2 having access to a dataset of undersampled measurements and 3 having access to only one undersampled measurement the paper targets 3 that is proposing a zeroshot learning approach for accelerated mri reconstruction the algorithm is based on the idea proposed in paper 1 combined with building a dataset for the given sample in order to eventually perform selfsupervised training on the synthesized dataset in order to prevent overfitting the authors propose a way to do automatic early stopping 1 yaman burhaneddin et al selfsupervised physicsbased deep learning mri reconstruction without fullysampled data 2020 ieee 17th international symposium on biomedical imaging isbi ieee 2020 strengths overall this is a very interesting paper with impressive experimental results the proposed algorithm yields highquality reconstructions and the rationale behind designing such an algorithm is very wellmotivated by the authors indeed it is important to come up with reconstruction algorithms that do not require large training sets especially in medical imaging where creating datasets is a tedious task the idea of self validation is also interesting since it mimics the presence of a validation set for training and prevents the network from overfitting the robustness investigations are appealing and very related to the problem considered moreover from the robustness perspective any effort toward singleinstance reconstruction is of great value provided that it results in more robustness regarding robustness evaluations please see the reviewers comment below weaknessescomments the paper states several times that zsssltl significantly reduces convergence time yet what is missing is a computational comparison among singleinstance reconstruction methods such as traditional sparsitybased dipbased zsssl zsssltl this is important since one of the critical challenges of singleinstance reconstruction algorithms is their inefficiency at the inference adding such an experiment would also make the paper even stronger table 2 in the appendix contains very interesting results however there are several surprising scores achieved by zsssltl for instance how come for trained on brain tested on knee zsssltl achieves 404 dbs yet according to table 1 trained on knee tested on knee achieves 401 dbs this is counterintuitive in that it suggests pretraining a model on brains is marginally better than pretraining it on knees if one wants to get a good psnr on knees page 7 section 45 paragraph 2 last sentence zsssl has no prior domain information and is inherently not susceptible to such generalizability issues is there any evidence to suggest that this is the case for example dipbased methods have been suggested to be susceptible to distribution shifts although theres no prior domain information involved but their hyperparameters are tuned on a specific domain doesnt zsssl rely on any tunable hyperparameters that differ from one domain to another the reviewer perceives the point made in figure 2 regarding automatic early stopping however there are the following two comments in terms of the consistency of the results second plot from 
the left can the authors provide intuition on why the curves are uncorrelated wrt k specifically at the convergence epoch 300 the curve for k10 is on top of k1 the curve for k25 is on top of k10 but then somehow the curve for k50 is below k25 moreover the breakout point of k50 has a sudden drop from the breakout point of k25 does this mean k100 would go down even further the rightmost plot the reviewer is unsure what the authors mean by zsssl with tl converges faster compared to zsssl in the caption in that the validation loss implies no benefit comes with combining zsssl and tl and its not a matter of convergence table 1 suggests that diptl performs strangely poorly compared to pgdlr and zsssltl does this mean that the pretrained network used for diptl performs poorly and dip has not been able to improve its performance and thus the low score has nothing to do with the dip itself decision accept 6 the reviewer finds the paper of great interest to the community and the thorough experimental analysis of the proposed algorithm is the main motivation for accepting the paper however 1 several concernscomments mentioned above and 2 the fact that the major difference between the proposed algorithm and the prior work 1 is the selfvalidation step combined with dataset synthesis prevent the reviewer from giving a higher score 1 yaman burhaneddin et al selfsupervised physicsbased deep learning mri reconstruction without fullysampled data 2020 ieee 17th international symposium on biomedical imaging isbi ieee 2020 docsepthis article propose a zeroshot method for mri reconstruction which is a wellstudied inverse problem the method is based on the ideas of deep image prior ie the ability of correctly sized neural networks to learn about the structure of a single image sufficiently well for denoising tasks this selfsupervised learned network can then be used in a plugandplay architecture to solve inverse problem with a variational approach ie as if the learned denoiser were a total variation tv minimiser their denoiser can be improved in a transfer learning approach to benefit from a more complex network trained on similar images than those at hand and finedtuned on the image to be reconstructed the authors go on to apply their plugandplay architecture to solve the mri reconstruction problem they provide experiments and comparisons the paper is well written and clear with great illustrations and only a few typos it applies the combined principles of zeroshot learning 1 plugandplay 2 iterative architectures and transfer learning to solve the mri reconstruction problem on single images the idea of zeroshot learning is not new in the particular context of imaging it has long been known that the denoising problem could be solved by training on the very same image one would seek to denoise 34 plugandplay architectures combine a generalpurpose denoiser with a tikhonovlike leastsquare solver to solve arbitrary inverse problems mri is often regarded as a classical inverse problem in the same range as ct reconstruction or deblurring the range of solutions provided to solving inverse problems in medical imaging is too large to mention 5 from the methodological point of view i see very little novelty in this article there are many aspects to using zeroshot learned approaches for denoising in particular the fact that their regularity is not established and so they can only be applied for a limited number of iterations and the need to use earlystopping one contribution of this article is their proposed condition for 
early stopping but it is not studied mathematically 1 romeraparedes b torr p an embarrassingly simple approach to zeroshot learning ininternational conference on machine learning 2015 jun 1 pp 21522161 pmlr 2 s venkatakrishnan c a bouman and b wohlberg plugandplay priors for model based reconstruction ieee global conference on signal and information processing pp 945948 2013 3 huang da kang lw wang yc lin cw selflearning based image decomposition with applications to single image denoising ieee transactions on multimedia 2013 oct 71618393 4 liu c szeliski r kang sb zitnick cl freeman wt automatic estimation and removal of noise from a single image ieee transactions on pattern analysis and machine intelligence 2007 dec 18302299314 5 senouf o vedula s weiss t bronstein a michailovich o zibulevsky m selfsupervised learning of inverse problem solvers in medical imaging indomain adaptation and representation transfer and medical image learning with less labels and imperfect data 2019 oct 13 pp 111119 springer cham the authors make a big deal of using tl 0shot and plugandplay to achieve good results on mri reconstruction all of these elements are known and have been applied before to this problem they do achieve better results than senouf et al 5 thanks to a better tl regimen the paper is correct overall the early stopping condition is debatable and not studied theoretically overall the paper is pretty good but i think it does not clear the bar for acceptance at iclr due to the lack of technical novelty docsepthis submission deals with mr images reconstruction in a context where the raw data is undersampled this data which is in the fourier domain can be undersampled to accelerate the imaging exam and thus improve clinical workflow and it is thus a very relevant research topic the submission relies on recent deep learning models that explicitly incorporate some physical aspects of mr image acquisition multiple coils coil sensitivity and fourier transform and achieve mri reconstruction based on supervised training examples the training examples do not need to be undersampled fully sampled pairs indeed some entries of undersampled data can be deleted to create an input for which correct reconstruction of the deleted entries can be required through a computable loss term while the paper is pleasant to read and rigorous it does not offer a significant improvement over ref yaman et al 2020 which by the way is regrettably not included in the set of benchmark methods to which the submission is compared in the result tables of the experimental section i am not sure to understand the argument wrt transfer learning and single dataset reconstruction the proposed approach also retrains the model on a new dataset with patient specific properties i think that what the authors mean is that this dataset does not need to be supervised but this is a builtin property of the model from yaman et al 2020 from which the authors are building upon the proposed approach as well as supervised pgdlr seem to fail to reconstruct texture feature visible in ground truth images the reconstructed ones are smoother this might be a problem for diagnosis or for other trained ml based models that rely on texture features such as radiomics and take mr images as inputs is this a known issue in mri reconstruction in general or in dl based reconstruction pros the paper is well written and ideas are clearly explained and stated the paper is technically sound cons the contribution is too incremental compared to yaman et al 2020 as it 
consists in reusing the fourier domain partitioning idea in order to have a validation loss allowing to detect overfitting and stop learning patient specific training demands more computation time and resources than usual clinical workflow
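as a rough illustration of the measurement-partitioning and self-validation idea discussed in the reviews above, the following python sketch shows one way to split a single scan's acquired k-space indices into disjoint sets and to monitor a held-out loss for early stopping. the split fractions, function names, and stopping criterion are assumptions made for illustration only and are not the papers' exact procedure.

```python
import numpy as np

def partition_kspace(acquired_idx, val_frac=0.2, loss_frac=0.4, seed=0):
    """Split the acquired k-space indices of a single scan into three disjoint sets:
    a held-out self-validation set used only to monitor overfitting, and a training
    pair where the network sees one subset and is asked to predict the other.
    Fractions and names are illustrative assumptions."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(np.asarray(acquired_idx))
    n_val = int(len(idx) * val_frac)
    val_set = idx[:n_val]            # self-validation measurements (never trained on)
    rest = idx[n_val:]
    n_loss = int(len(rest) * loss_frac)
    loss_set = rest[:n_loss]         # frequencies the network must predict (loss targets)
    input_set = rest[n_loss:]        # frequencies fed to the network as input
    return input_set, loss_set, val_set

def should_stop(val_losses, patience=10):
    """Simple early-stopping rule on the self-validation loss: stop once it has not
    improved for `patience` epochs (an assumed criterion; the paper's automatic
    stopping rule may differ)."""
    if len(val_losses) <= patience:
        return False
    best_earlier = min(val_losses[:-patience])
    return min(val_losses[-patience:]) >= best_earlier
```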
### Summary:
|
the paper considers the problem of accelerated magnetic resonance imaging where the goal is to reconstruct an image from undersampled measurements the paper proposes a zeroshot selfsupervised learning approach for accelerated deep learning based magnetic resonance imaging the approach partitions the measurements from a single scan into two disjoint sets one set is used for selfsupervised learning and one set is used to perform validation specifically to select a regularization parameter the set that is used for selfsupervised learning is then again split into two different sets and a network is trained to predict the frequencies from one set based on the frequencies in the other set this enables accelerated mri without any training data the paper evaluates on the fastmri dataset a standard dataset for deep learning based mri research and the paper compares to a trained baseline and an untrained baseline dip the paper finds their selfsupervised method to perform very well compared to both and shows images that indicate excellent performance it would have been even better to compare the method on the test set of the fastmri competition to have a proper benchmark comparison here is how the discussion went reviewer pt6r is supportive of acceptance but notes a few potential irregularities such as the method pretrained on brain and tested on knees performing better than the method pretrained on knees and tested on knees and not providing a comparison of the computational cost the authors added a table to the appendix revealing that the computational costs are very high much higher than for dip even the reviewer was content with the response and raised the score reviewer mbmk argues that the contribution is too incremental compared to prior work in particular relative to the results of yaman et al 2020 and also argues that the idea of partitioning the measurements is not new the authors argue in response that their approach of partitioning the measurements is new and the reviewer was inclined to raise the score slightly but still thinks that the novelty on the technical ml side remains limited and doesnt want to back the submission too much and did not raise the score at the end in the system reviewer 19v3 has the concern that the all elements used transfer learning plugandplay etc are well known techniques and have been applied before to mri and therefore thinks that the paper does not clear the bar for acceptance the paper points out that while those ideas might be applied for the first time to mri they have been used before in other image reconstruction problems in particular denoising ive read the paper in detail too and am somewhat on the fence i think its very valuable to see that a clever application of selfsupervised learning works so well for mri i agree with the reviewers that the technical novelty is relatively small but on the other hand this is the first time that i see selfsupervised learning being applied that successfully to mri i dont share the concern about novelty yes the papers approach builds on prior work but its not clear from the literature how well such a well tuned selfsupervised learning approach would work what i would have liked to see in addition to the experimental results is a proper evaluation on the fastmri dataset an advantage of the fastmri dataset is that it provides a benchmark and if researchers evaluate on that benchmark on the testsetvalidation set we can compare different methods well the paper under review doesnt do that it only evaluates on 30 test 
slices and thus its hard to benchmark the method also the paper would benefit from more ablation studies in conclusion i would be happy to discuss this paper at the conference and think that other researchers in the intersection of deep learning and inverse problems would be too and therefore recommend acceptance
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper focuses on the setting of federated learning where the two agents are attempting to perform kernel regression using different kernels and hence have different models their study yields an algorithm of using alternating knowledge distillation akd imposes overly strong regularization and may lead to severe underfitting their theory also shows an interesting connection between akd and the alternating projection algorithm for finding the intersection of sets leveraging this connection they propose an algorithm that improves upon akd strengths this paper introduced the federated kernel regression framework where they formalized notions of both model and data heterogeneity which can be useful for developing new algorithms for model agnostic federated learning the theoretical analysis for knowledge distillation including akd avgkd and ekd is very rigorous weakness the application of the proposed method is limited based on the problem setting the proposed method is only applicable for the case of two agents while the problem of federated learning is usually for multiagents more than two can this proposed method be extended to a more agents case the analysis of the experimental section is not clear enough or even missing ex1 for figure 3 is the loss of akd convergent or not this result cannot be directly observed from the figure this paper can give an analysis on this and also for the reason why the loss of akd increase ex2 the analysis for figure 4 5 6 is missing which can make the readers confused about these results minor problem some of the notations in the paper are a bit confusing to me especially those seem to have similar meanings i would suggest that if have the same meaning they can be represented by the same notation if not more explanations for their differences can be offered ex it seems that ht in algorithm akd have similar meaning as gt2 in algorithm avgkd because these two algorithms have similar schemes although this paper introduced the federated kernel regression framework which might be useful for developing new algorithms for model agnostic federated learning however the application of the proposed method is limited although the theoretical analysis part is rigorous the analysis of the experimental section is not clear enough or even missing docsepthis paper analyze knowledge distillation based model agnostic federated learning they consider simple two agent kernel regression scenario where each agent has its own dataset and predicting function they propose to train agent 1 model on dataset 1 and then use agent 1s model to make predictions on dataset 2 and agent 2 will train model using these predictions they analyze the dynamics of agent 1s model and show that alternating knowledge distillation will degrade the model prediction to 0 they provide another algorithm ensemble avgkd which can actually avoid this issue and converge to optimal solution the experiments also show that avgkd can converge to zero loss while other approaches do not work pros 1 this work gives the first theory analysis for model agnostic fl even though the setting is very simple they also use the negative result to motivate the improved algorithm ensemble avgkd the whole story is complete and clear 2 the experiments are sufficient and support their analysis cons 1 lin et al also have similar kd idea for model agnostic fl what is difference between avgkd and their work lin tao et al ensemble distillation for robust model fusion in federated learning arxiv preprint arxiv200607242 2020 2 as i 
mentioned before the setting is somewhat toy in practice there are multiple agents and the machine learning model is way complicated that may not have closed form solution 3 in addition the paper only consider optimization perspective but i am curious about the generalization ability of avgkd does it provably generalize better than purely local training overall model agnostic fl area is lack of theory results so this work has its own novelty however the setting is somewhat toy so i am afraid it may not be very helpful towards understanding model agnostic fl in practice eg multiple clients and complicated models that do not have closed form solution docsepthe paper analyzes the dynamics of optimizing two kernel regression models via codistillation in a distributed setup where local models may differ in the kernel used in each round each local client uses the other clients model to produce novel labels for its local dataset and then retrains its local model using these novel labels the paper analyzes three variants of this approach the vanilla variant a variant where novel labels are an average of the actual label and the one predicted by the other clients model and an ensembling approach that uses all model iterations to produce predictions the approaches are analyzed theoretically and empirically the paper thoroughly analyzes the dynamics of codistillation and the proposed variants in a limited setting with two clients the local models may differ in their kernel function so that model averaging is not possible while the constrained setup limits practical insights it allows to clearly analyze the behavior both theoretically and empirically to that end the paper details the dynamics of all three approaches and shows that the straightforward approach may degenerate and that the ensemble approach is optimal in the limit moreover it gives conditions under which the averaging approach degenerates these theoretical results are interesting and insightful the empirical evaluation confirms the theoretical findings and in addition shows that the behavior is similar when using neural networks instead of kernel regression this hints at the generality of this analysis the paper is wellwritten and clear the theoretical results are sound and to the best of my knowledge correct i really appreciate the theoretical contributions but i feel that at least the empirical evaluation is too limited while there is good reason to analyze only two clients theoretically it would be straightforward to apply the proposed methods on larger numbers of clients empirically using only two clients and only linear regression and mnist as experiments limits the contribution the paper would benefit greatly from a broader empirical evaluation detailed comments why is only linear regression used for the kernel experiments which kernels are used for the experiments i guess a linear kernel would be most suitable to the task but then both kernel functions would be the same how does codistillation compare to averaging kernel models 1 the approach in 2 similarly uses codistillation even though they use an unlabeled reference dataset it seems worthwhile to compare to it 1 kamp michael et al communicationefficient distributed online learning with kernels joint european conference on machine learning and knowledge discovery in databases springer cham 2016 2 bistritz ilai ariana mann and nicholas bambos distributed distillation for ondevice learning advances in neural information processing systems 33 2020 after rebuttal the authors have 
addressed my questions to my satisfaction while i agree with my fellow reviewers that this work is limited in its scope i do enjoy this novel theoretical take on distributed learning with different model types via knowledge distillation thus i vote for acceptance i have updated my score accordingly the paper presents an interesting theoretical analysis of codistillation for two different kernel regression models the empirical evaluation confirms the theoretical findings the constrained setting on the one hand allows for strong theoretical results but on the other limits the practical insights that can be drawn from them here a broader empirical analysis would have improved the significance of the contributions docsepthis paper introduce an interesting perspective on federated learning via knowlegde distillation which allows participating clients to have their own choices of models in particular the paper develops theoretical results for a twoclient federated regression scenario which demonstrates a the degeneration of an alternating knowledge distillation where iterative distillations ie a client model at each iteration is retrained based on unlabeled input prediction of the other clients model of the previous iteration gradually lose information over the iterations and eventually converge towards a vacuous model and b a new ensembling technique that aggregates intermediate models produced by both clients over the iterations such that in the limit ie when the no of iterations tends to infinity the aggregated model is the same as the oracle centralized model this developed intuition on ensembling intermediate models is then applied to more realistic 2client federated classification scenarios on mnist and cifar10 datasets novelty significance this paper uses a simple 2client federated regression scenario with simplified modeling choices ie kernelized linear regression with closedform solutions to develop novel insights which can potentially be translated into new modelagnostic fl framework with strong asymptotic guarantees the idea as i summarized above is novel to me however in terms of its practical significance i am not convinced because of there is too much gap between the theoretic results and realworld setting which is elaborated below 1 the theoretic setup started with 2client setting but has not been extended to multiagent setting this is problematic because the result derived in 2client setting is very specific to the iteration between the two clients and how they average between the local data the distilled prediction of the shared models if there are more than one shared models how would the client set up their training output for the next iteration furthermore the authors assume implicitly that all clients have the same amount of data which is clearly not true in practical setting how will it impact the averaging scheme in such cases 2 another peculiar restriction is that the presented result is tied to an obvious flaw in the way akd is set up in this paper from the 4 steps mentioned at the beginning of section 4 client 2 will never get to see true output i suspect this is a main cause that leads to akds degeneration given this the theoretic result developed in 42 is kind of moot can the authors revisit this result in case models at iteration t 1 of any clients are built on distillation of models from iteration t instead this is so client 2 can see its true training output at iteration 1 3 the authors also proposed a data reincorporation scheme ie avgkd which performs better than 
akd but given the 2nd point above it is not clear if this mechanism is necessary if the peculiar flaw in akd is fixed my point is if this new mechanism ends up fixing only the flaw above and not anything else then it is somewhat unnecessary because the above flaw can be fixed by allowing both client to share models concurrently rather than iteratively could the authors elaborate more on this 4 the fix in 3 is later abandoned and replaced with a better ekd fix via ensembling intermediate akd models but again this result is correct given the flawed setup of akd as pointed out in 2 above if this flaw is corrected can the authors revisit the asymptotic convergence result for ekd 5 the result is largely based on a formulation of regression model and the result is certainly tied to this specific regression form i am not sure if it is reasonable to anyhow impose the implication of the derived result on a very distant classification setting in the same vein of thought another minor restriction is that if we view this regression formulation from a probabilistic perspective then it appears the authors impose the same gaussian likelihood across all clients and that kind of clashes with the modelagnostic motivation 6 have the authors considered the communication cost beyond the 2client setting as the distillation requires access to local data every client would have to send models to every other clients the total communication cost is therefore n times for than the normal cost of fl and in addition there will also be extra distillation expense which is a lot more costly than aggregating model weights soundness the results appear correct to me but i have reservation about the theoretic implication of both avgkd and ekd as i pointed out above i am not sure if these proposals were meant to fix anything more than the obvious flaw in the presented akd setup eg client 2 never gets to see true training output furthermore if the akd setup were to be fixed by letting client 2 sees its true training output would the convergence result of ekd still hold clarity the paper is wellorganized with a nice narrative flow that is easy to follow but on a minor note there are a number of grammatical errors all over the paper experiment i find the experiment somewhat strange it implies the initial local model of client 1 is already on the same level with the centralized model which means distillation does not help at all in all 3 experiment settings not a single one shows that distillation is being helpful am i missing something here on another note the flaw setup on akd is observable from the reported results in all 3 settings client 2 always perform worse than client 1 in fact on the same model same data setting it is noticeable that client 2 is much worse than client 1 at the initial round this is clearly because client 2 never sees its true training output and this seems to be the case that with the flaw client 2 is initiating and leading the distillation degradation this paper presents interesting thoughts that were substantiated via a theoretical exercise on a simplified setting but unfortunately in both theory practice this is still pretty much a work in progress given the huge gap between the theoretic and realworld setup it is not clear whether one can apply the insight derived from the theoretic setting to another distant realworld setting please see my points in 15 above thus while this could be the beginning of an interesting theory more work is needed to complete the idea as a matter of fact the current 
experiment implies the proposed distillation is not working in addition i believe the authors have not considered the computecommunication cost of this distillation scheme
### Summary:
this manuscript proposes and analyzes a distillation approach to address heterogeneity in distributed learning the main paper focuses on a relatively simple twoagent kernel regression setting and the insights developed are extended and partially analyzed for a multiagent setting there are four reviewers all of whom agree that the method addresses an interesting and timely issue however reviewers are mixed on the paper score while all reviewers agree that the setting is somewhat stylized a subset of reviewers highlights that the results give some deep insight that might drive future analysis and implementation in the area other concerns raised include potential issues with the communication overhead and the simplicity of the kernel regression setting vs realworld deep learning there are initial concerns about whether the failure case is realistic which the authors address extensions to the multiagent setting and a partial analysis are also addressed by the authors and partially satisfy the reviewers nevertheless after reviews and discussion the reviewers are mixed at the end of the discussion the area chair finds first that the paper is much improved and much more applicable in the updated form than in the original version and indeed the insights from the simple model may be informative for practice however the concerns raised about the distance between theory and practice are valid the final opinion remains borderline authors are encouraged to address the highlighted technical concerns in any future submission of this work in particular the muti0agent setting should probably be central in the discussion of this work more ambitious empirical evaluation showing that the theory translates to practice even if there is a gap would also help
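the two-agent codistillation dynamics discussed in the reviews and the summary above can be mimicked in a few lines with closed-form kernel ridge regression; the sketch below is illustrative only — the kernels, the ridge penalty, the 0.5 averaging weight in the avgkd variant, and the number of rounds are assumptions, not the paper's actual choices.

```python
import numpy as np

rng = np.random.default_rng(1)

def fit_krr(X, y, kernel, lam=1e-2):
    # closed-form kernel ridge regression; returns a predict(Xq) function
    K = kernel(X, X)
    alpha = np.linalg.solve(K + lam * np.eye(len(X)), y)
    return lambda Xq: kernel(Xq, X) @ alpha

rbf = lambda A, B: np.exp(-((A[:, None, :] - B[None, :, :]) ** 2).sum(-1))
lin = lambda A, B: A @ B.T        # the two agents deliberately use different kernels

# two agents, each with its own local data drawn from the same underlying function
X1, X2 = rng.uniform(-1, 1, (30, 1)), rng.uniform(-1, 1, (30, 1))
f = lambda X: np.sin(3 * X[:, 0])
y1, y2 = f(X1), f(X2)

# alternating knowledge distillation (akd): agent 2 never sees its true labels,
# a point raised in the reviews above
y1_t = y1.copy()
for t in range(10):
    g1 = fit_krr(X1, y1_t, rbf)   # agent 1 trains on its current labels
    y2_t = g1(X2)                 # agent 2's labels are agent 1's predictions
    g2 = fit_krr(X2, y2_t, lin)   # agent 2 trains on distilled labels only
    y1_t = g2(X1)                 # and sends predictions back to agent 1
print("akd label norm after 10 rounds  :", np.linalg.norm(y1_t))

# avgkd variant: each agent averages its true labels with the distilled ones
y1_t = y1.copy()
for t in range(10):
    g1 = fit_krr(X1, y1_t, rbf)
    y2_t = 0.5 * (y2 + g1(X2))
    g2 = fit_krr(X2, y2_t, lin)
    y1_t = 0.5 * (y1 + g2(X1))
print("avgkd label norm after 10 rounds:", np.linalg.norm(y1_t))
```

with a positive ridge penalty each refit shrinks the distilled labels, so in this toy setup the plain akd labels typically collapse toward zero over rounds, while the averaged variant keeps reinjecting the true labels — the qualitative behaviour the reviewers describe.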
|
[
26724,
12955,
247,
12955,
835,
4460,
13301,
403,
271,
3388,
273,
253,
4588,
5203,
285,
253,
581,
8131,
407,
253,
643,
8548,
1566,
285,
271,
546,
35128,
2746,
326,
4648,
512,
1566,
25142,
281,
4711,
13650,
253,
7274,
403,
5867,
28055,
285,
45190,
253,
2929,
16575,
3537,
13505,
253,
8062,
273,
12738,
382,
21755,
285,
253,
4081,
11640,
275,
247,
3710,
4758,
342,
767,
8548,
253,
1980,
3210,
778,
9184,
275,
616,
10295,
1159,
594,
326,
1566,
25001,
310,
417,
1896,
1223,
253,
20793,
9978,
7787,
8542,
16039,
352,
4483,
281,
4518,
12106,
253,
3879,
1097,
28055,
285,
45190,
281,
326,
990,
253,
2929,
4278,
253,
8062,
273,
512,
1264,
7274,
285,
2722,
326,
253,
15246,
2746,
778,
29458,
285,
326,
253,
19862,
2746,
310,
8654,
275,
253,
2701,
25761,
352,
4245,
2515,
762,
534,
253,
25001,
2746,
25273,
684,
841,
10527,
1543,
403,
4722,
285,
47860,
253,
16774,
7103,
23849,
253,
10527,
4342,
285,
275,
1635,
2722,
326,
253,
3879,
310,
2074,
672,
970,
11454,
6928,
3185,
273,
10295,
9077,
436,
28145,
387,
253,
31376,
273,
436,
1783,
253,
2929,
310,
973,
15720,
285,
2590,
253,
10527,
1543,
403,
3590,
285,
281,
253,
1682,
273,
619,
3640,
3451,
50276,
74,
1663,
11435,
253,
10527,
9021,
533,
891,
1928,
326,
387,
1878,
253,
16774,
7103,
310,
1512,
3710,
1223,
627,
310,
1175,
1921,
281,
12106,
760,
767,
8548,
28055,
352,
651,
320,
15246,
281,
4647,
253,
4081,
3082,
327,
4067,
3904,
273,
8548,
45190,
970,
760,
767,
8548,
285,
760,
4872,
9077,
285,
278,
79,
382,
347,
4679,
7787,
253,
7680,
253,
2929,
651,
5649,
10260,
432,
247,
16055,
16774,
7103,
50275,
5992,
7193,
5701,
50276,
22309,
310,
760,
4872,
9077,
908,
323,
253,
10295,
4679,
50275,
4609,
34501,
403,
908,
323,
253,
4679,
891,
5476,
247,
4872,
10295,
651,
320,
954,
7470,
281,
253,
4836,
533,
840,
1097,
10295,
3470,
651,
320,
253,
1072,
50276,
5430,
1057,
12738,
382,
21755,
7277,
281,
25001,
10295,
3210,
337,
50276,
783,
2746,
275,
374,
12014,
4648,
12738,
382,
21755,
1014,
2167,
597,
897,
271,
440,
22027,
3806,
10895,
352,
3133,
32811,
281,
7277,
281,
352,
50276,
18,
465,
1301,
278,
44023,
1162,
355,
5511,
20246,
5939,
3909,
4715,
342,
34501,
6036,
19454,
266,
8059,
327,
5145,
4715,
285,
3640,
8900,
275,
16634,
7203,
254,
45909,
4022,
374,
270,
382,
32071,
4164,
2284,
247,
363,
3230,
637,
79,
285,
295,
469,
16328,
270,
1369,
375,
5939,
940,
21755,
323,
327,
10933,
4715,
16424,
275,
11454,
1491,
5162,
2718,
5922,
9169,
50274,
6438,
30080,
22559,
50275,
783,
4477,
452,
9713,
619,
3533,
281,
619,
13212,
1223,
891,
5194,
342,
619,
7715,
30628,
326,
436,
789,
310,
3710,
275,
697,
7990,
891,
513,
4264,
436,
4460,
10527,
1379,
327,
5939,
4715,
342,
1027,
1566,
3510,
3066,
3640,
940,
21755,
3021,
891,
6273,
323,
14924,
891,
452,
9300,
619,
4868,
15672,
50275,
783,
2929,
10262,
271,
4722,
10527,
1783,
273,
12738,
382,
21755,
323,
767,
1027,
10295,
9077,
3210,
253,
16774,
7103,
23849,
253,
10527,
4342,
253,
20793,
4758,
327,
253,
581,
1133,
4483,
323,
2266,
10527,
1543,
533,
327,
253,
643,
7787,
253,
8542,
16039,
326,
476,
320,
8392,
432,
731,
1060,
247,
16055,
16774,
1783,
651,
452,
5520,
253,
8453,
273,
253,
9021,
5474,
33032,
2520,
2929,
9569,
271,
4722,
8668,
327,
10208,
12072,
4715,
3066,
871,
1851,
615,
940,
21755,
534,
4483,
15299,
8548,
281,
452,
616,
1211,
10165,
273,
3210,
275,
1798,
253,
2929,
24357,
10527,
1543,
323,
247,
2500,
15436,
850,
10208,
12072,
9077,
10076,
534,
14371,
247,
253,
27939,
273,
271,
28035,
3640,
940,
21755,
835,
34560,
940,
408,
569,
26332,
247,
5268,
1566,
387,
1016,
19502,
310,
851,
11273,
1754,
327,
440,
22027,
3280,
50276,
12787,
2474,
273,
253,
643,
8548,
1566,
273,
253,
2045,
19502,
13237,
7168,
1491,
689,
253,
25142,
285,
6524,
29623,
4404,
247,
5809,
3472,
1566,
285,
270,
247,
747,
546,
35128,
5853,
326,
29111,
10444,
3210,
4197,
407,
1097,
8548,
689,
253,
25142,
824,
326,
275,
253,
2701,
26332,
672,
253,
642,
273,
25142,
14280,
281,
23579,
253,
40006,
1566,
310,
253,
1072,
347,
253,
42295,
36409,
1566,
436,
3715,
30328,
327,
546,
35128,
10444,
3210,
310,
840,
3732,
281,
625,
15958,
374,
8780,
10208,
12072,
9162,
15216,
327,
278,
79,
382,
285,
260,
338,
274,
740,
15302,
38135,
50276,
9188,
40348,
50276,
2520,
2929,
4648,
247,
2969,
374,
8780,
10208,
12072,
9077,
10076,
342,
21010,
14053,
10165,
26332,
10295,
1025,
4872,
9077,
342,
4581,
630,
5482,
281,
1287,
4460,
16039,
534,
476,
7826,
320,
15786,
715,
747,
1566,
1530,
6932,
892,
7792,
342,
2266,
20185,
23632,
253,
2934,
347,
891,
17903,
1840,
310,
4460,
281,
479,
2299,
275,
2426,
273,
697,
8542,
8453,
891,
717,
417,
13762,
984,
273,
627,
310,
1512,
1199,
8037,
875,
253,
253,
30325,
1543,
285,
1524,
10186,
4758,
534,
310,
50221,
2708,
50276,
18,
253,
253,
30325,
9978,
3053,
342,
374,
8780,
4758,
533,
556,
417,
644,
6508,
281,
4471,
12788,
4758,
436,
310,
20276,
984,
253,
906,
6012,
275,
374,
8780,
4758,
310,
1077,
2173,
281,
253,
19502,
875,
253,
767,
8548,
285,
849,
597,
3388,
875,
253,
1980,
941,
50276,
783,
35755,
10554,
273,
253,
6096,
3210,
604,
627,
403,
625,
685,
581,
6096,
3210,
849,
651,
253,
5268,
873,
598,
616,
3733,
3453,
323,
253,
1735,
19502,
33810,
253,
4477,
5467,
29688,
326,
512,
8548,
452,
253,
1072,
2408,
273,
941,
534,
310,
4518,
417,
2032,
275,
8542,
4758,
50276,
5430,
588,
352,
3486,
253,
25001,
6974,
275,
824,
2219,
50276,
19,
1529,
19532,
12400,
310,
326,
253,
3559,
906,
310,
12331,
281,
271,
4755,
19652,
275,
253,
1039,
29507,
69,
310,
873,
598,
275,
436,
2929,
432,
253,
577,
5018,
5393,
387,
253,
5068,
273,
2593,
577,
5268,
374,
588,
1620,
755,
281,
923,
2032,
3453,
891,
9101,
436,
310,
247,
2022,
2847,
326,
5644,
281,
29507,
1397,
27939,
1677,
436,
253,
253,
30325,
906,
3715,
275,
5976,
310,
2238,
273,
31866,
476,
253,
4477,
45735,
436,
906,
275,
1083,
3210,
387,
19502,
246,
50276,
18,
273,
667,
8548,
403,
4270,
327,
940,
21755,
273,
3210,
432,
19502,
246,
3185,
50276,
2520,
310,
594,
5268,
374,
476,
923,
697,
2032,
3733,
3453,
387,
19502,
337,
50276,
20,
253,
4477,
671,
4081,
247,
941,
294,
1763,
24993,
318,
6974,
26332,
1323,
72,
76,
69,
534,
17923,
1805,
685,
29507,
69,
533,
1677,
253,
374,
2109,
1127,
1840,
352,
310,
417,
2590,
604,
436,
5122,
310,
3309,
604,
253,
19532,
19652,
275,
29507,
69,
310,
4229,
619,
1127,
310,
604,
436,
747,
5122,
7637,
598,
18505,
760,
253,
19652,
1840,
285,
417,
2712,
2010,
840,
352,
310,
8489,
15279,
984,
253,
1840,
19652,
476,
320,
4229,
407,
6941,
1097,
5268,
281,
3894,
3210,
35046,
2581,
685,
10040,
3146,
812,
253,
4477,
21184,
625,
327,
436,
50273,
21,
253,
4993,
275,
495,
310,
1996,
13966,
285,
7932,
342,
247,
1805,
34978,
69,
4993,
3066,
546,
35128,
10444,
29507,
69,
3210,
533,
969,
436,
906,
310,
3451,
1677,
253,
33657,
9978,
273,
29507,
69,
347,
8042,
562,
275,
374,
1840,
604,
436,
19652,
310,
15045,
476,
253,
4477,
45735,
253,
20185,
14940,
906,
323,
34978,
69,
50276,
22,
253,
906,
310,
8127,
1754,
327,
247,
15895,
273,
9077,
1566,
285,
253,
906,
310,
5604,
12331,
281,
436,
2173,
9077,
830,
891,
717,
417,
2119,
604,
352,
310,
5272,
281,
667,
5430,
16209,
253,
27570,
273,
253,
6012,
906,
327,
247,
1077,
13392,
9162,
4758,
275,
253,
1072,
17716,
273,
1869,
1529,
5884,
12400,
310,
326,
604,
359,
1859,
436,
9077,
15895,
432,
247,
37851,
8668,
840,
352,
4620,
253,
4477,
16209,
253,
1072,
305,
12064,
12177,
2439,
512,
8548,
285,
326,
2238,
273,
45880,
342,
253,
1566,
1530,
6932,
16038,
50275,
23,
452,
253,
4477,
2783,
253,
5511,
2105,
4457,
253,
374,
8780,
4758,
347,
253,
940,
21755,
4419,
2289,
281,
1980,
941,
1046,
5268,
651,
452,
281,
5007,
3210,
281,
1046,
643,
8548,
253,
2264,
5511,
2105,
310,
3103,
295,
2069,
323,
685,
253,
2622,
2105,
273,
892,
285,
275,
1635,
627,
588,
671,
320,
4465,
940,
21755,
14247,
534,
310,
247,
2257,
625,
19983,
685,
9406,
839,
1566,
13461,
50276,
27962,
1255,
50276,
783,
1543,
3176,
3451,
281,
479,
533,
891,
452,
28930,
670,
253,
253,
30325,
27570,
273,
1097,
1323,
72,
76,
69,
285,
34978,
69,
347,
891,
8042,
562,
1840,
891,
717,
417,
2119,
604,
841,
18595,
497,
5486,
281,
4993,
2712,
625,
685,
253,
4755,
19652,
275,
253,
3559,
29507,
69,
9978,
24088,
5268,
374,
1620,
4850,
281,
923,
2032,
3733,
3453,
33810,
604,
253,
29507,
69,
9978,
497,
281,
320,
4229,
407,
13872,
5268,
374,
11403,
697,
2032,
3733,
3453,
651,
253,
14940,
906,
273,
34978,
69,
1335,
2186,
50276,
498,
15752,
50276,
783,
2929,
310,
973,
34092,
342,
247,
5322,
14511,
2685,
326,
310,
3477,
281,
956,
533,
327,
247,
5884,
3877,
627,
403,
247,
1180,
273,
47412,
474,
6332,
512,
689,
253,
2929,
50276,
16217,
2092,
50276,
74,
1089,
253,
3368,
8489,
8921,
352,
8018,
253,
3302,
1980,
1566,
273,
5268,
337,
310,
2168,
327,
253,
1072,
1268,
342,
253,
36409,
1566,
534,
2097,
940,
21755,
1057,
417,
1361,
387,
512,
275,
512,
495,
3368,
7533,
417,
247,
2014,
581,
2722,
326,
940,
21755,
310,
1146,
9371,
717,
891,
5816,
1633,
1060,
50275,
251,
1529,
3877,
253,
19652,
9978,
327,
29507,
69,
310,
24802,
432,
253,
2361,
1543,
275,
512,
495,
7533,
5268,
374,
1900,
1347,
7197,
685,
5268,
337,
275,
958,
327,
253,
1072,
1566,
1072,
941,
4758,
352,
310,
28629,
326,
5268,
374,
310,
1199,
7197,
685,
5268,
337,
387,
253,
3302,
3790,
436,
310,
4518,
984,
5268,
374,
1620,
11403,
697,
2032,
3733,
3453,
285,
436,
3133,
281,
320,
253,
1083,
326,
342,
253,
19652,
5268,
374,
310,
36300,
285,
4283,
253,
940,
21755,
11961,
50276,
2520,
2929,
10262,
4722,
7906,
326,
497,
4326,
4215,
3066,
247,
10527,
5763,
327,
247,
21010,
4758,
533,
19235,
275,
1097,
3762,
50276,
29105,
436,
310,
1335,
3965,
1199,
247,
789,
275,
4780,
1677,
253,
5699,
8037,
875,
253,
253,
30325,
285,
1524,
10186,
9978,
352,
310,
417,
2590,
1880,
581,
476,
4647,
253,
12288,
6012,
432,
253,
253,
30325,
4758,
281,
1529,
13392,
1524,
10186,
4758,
50276,
32897,
923,
619,
2792,
275,
1458,
1840,
3021,
1223,
436,
812,
320,
253,
5068,
273,
271,
4722,
3762,
625,
789,
310,
3058,
281,
3426,
253,
2934,
347,
247,
2647,
273,
958,
253,
1655,
3368,
8018,
253,
4081,
940,
21755,
310,
417,
2444,
275,
1635,
891,
2868,
253,
4477,
452,
417,
2783,
253,
11897,
35437,
2105,
273,
436,
940,
21755,
6974,
2490,
187,
4118,
18435,
27,
2520,
7714,
29328,
285,
3537,
13505,
247,
940,
21755,
2746,
281,
2953,
19331,
275,
5939,
4715,
253,
2022,
2929,
16633,
327,
247,
4942,
2969,
767,
12788,
10295,
9077,
4758,
285,
253,
16039,
3715,
403,
6508,
285,
10571,
5867,
323,
247,
4471,
12788,
4758,
50275,
9088,
403,
1740,
30628,
512,
273,
5207,
5194,
326,
253,
1332,
12453,
271,
4722,
285,
14793,
2523,
2299,
30628,
403,
6804,
327,
253,
2929,
4868,
1223,
512,
30628,
5194,
326,
253,
4758,
310,
8489,
17521,
1025,
247,
8578,
273,
30628,
16681,
326,
253,
1543,
1918,
690,
3676,
12288,
326,
1537,
4446,
2852,
1783,
285,
7092,
275,
253,
2170,
643,
7350,
5439,
2486,
2442,
3374,
342,
253,
5511,
18332,
285,
253,
17647,
273,
253,
10295,
9077,
4758,
4632,
1524,
10186,
3676,
4715,
627,
403,
3302,
7350,
670,
1880,
253,
4433,
1083,
310,
15958,
534,
253,
4477,
2953,
18149,
281,
253,
4471,
12788,
4758,
285,
247,
7898,
1783,
403,
671,
9713,
407,
253,
4477,
285,
10571,
10517,
253,
30628,
17837,
846,
10123,
285,
5955,
253,
30628,
403,
6804,
387,
253,
990,
273,
253,
5955,
50275,
783,
2170,
6951,
9010,
806,
326,
253,
2929,
310,
1199,
5520,
285,
1199,
625,
7763,
275,
253,
9300,
830,
685,
275,
253,
3236,
2715,
285,
6296,
253,
16039,
432,
253,
2969,
1566,
778,
320,
27096,
323,
3946,
2299,
253,
7350,
5439,
670,
253,
4181,
875,
3762,
285,
3946,
403,
3588,
253,
2457,
4743,
4558,
45210,
4477,
403,
14659,
281,
2953,
253,
16318,
7681,
7350,
275,
667,
2852,
19529,
273,
436,
789,
275,
1798,
253,
2873,
74,
17,
12788,
4758,
943,
3164,
320,
4275,
275,
253,
5955,
273,
436,
789,
625,
24683,
16774,
7103,
4645,
326,
253,
3762,
30376,
281,
3946,
1014,
604,
627,
310,
247,
8037,
651,
671,
1361
] |
[ (attention_mask array for this row elided — every value is 1) ] |
[ (labels token-id array for this row elided — same length as input_ids, one integer per line in the original dump) ] |
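The three numeric columns elided above — input_ids, an all-ones attention_mask, and labels — are the usual output of a subword tokenizer run over the row's text; in this dump the labels arrays appear to mirror the input_ids (their final entries coincide), which is the standard setup for causal language-model fine-tuning. The sketch below is only an illustration of that preprocessing step: the tokenizer checkpoint, the concatenation scheme, and the function names are assumptions, not details recovered from the data.

```python
# Hypothetical reconstruction of how the elided columns are usually built.
# Requires the Hugging Face `transformers` package; "gpt2" is a placeholder checkpoint.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")

def build_row(prompt: str, summary: str) -> dict:
    # Concatenate prompt and reference summary into one training sequence.
    enc = tokenizer(prompt + " " + summary)
    return {
        "input_ids": enc["input_ids"],            # integer token ids, as elided above
        "attention_mask": enc["attention_mask"],  # all 1s when no padding is applied
        "labels": list(enc["input_ids"]),         # causal-LM style: labels mirror input_ids
    }

row = build_row("Below is given a review ... ### Summary:", "the paper gets divergent reviews ...")
print(len(row["input_ids"]), set(row["attention_mask"]))
```

Because no per-example padding is applied, the attention_mask is a run of 1s the same length as input_ids, which matches the all-ones block shown above.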
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper proposed a new multimodal signature and digit string msds dataset including two subsets msdschs chinese signatures and msdstds token digit strings which is the largest publicly available chinese signature dataset the authors also provided the baseline experimental results using the sota methods and found the interesting result that the models utilizing token digit string perform better than the ones on handwritten chinese signature 1 this paper proposed a new dataset for an important task handwriting verification 2 this paper provided the sufficient details eg the process of collecting this dataset the sota baseline experimental results 1 the size of collected dataset is small 402 20 for the time series and image tasks the current sota neural networks need the large size of datasets to train and can then make some convincing conclusions if the cost is not high the authors are suggested to collect more data but i can change my opinion if stronger arguments are provided by the authors 2 this paper needs more polishing the details are provided in the clarity part docsepthis paper introduced a new handwriting verification benchmark dataset named msds it consists of two subsets msdschs for chinese signatures msdstds for token digit strings this dataset was contributed by 402 data contributors with 20 genuine samples and 20 skilled forgeries per user in each subset the authors verified the usefulness of the proposed dataset with extensive experiments msdschs is the largest publicly available handwritten chinese signature dataset msdstds novelly covers the actual phone numbers of users which has not yet been investigated thorough and extensive benchmark to provide baseline results both subsets are provided in both online and offline modalities accessibility of the data the msds dataset can only be used for noncommercial research purposes the authors require the potential users to submit an application form and provide a recent publication list in order to obtain the decompression password which may potentially introduce entry barriers for young researchers researchers with conflict of interest etc what is the motivation of requiring this verification if this verification is indeed necessary for any reasons the authors should elaborate more on the process about how they plan to approve the application forms in a transparent and objective way the authors should also promise how much time this verification service will be maintained the dataset contains signatures thus names of the data contributors which is personally identifiable information and can reveal identities the authors should discuss the privacy issues how will the data contributors identities be protected did data contributors provide their consent to use or share the data did they explicitly know and the purpose of the collection of such data and potential risks will the misuse of such data lead to any potential negative societal impact if yes do the authors have any measures to fight against that will individuals right to be forgotten removed from the dataset gdpr be ensured the neurips checklisthttpsneuripsccpublicguidespaperchecklist is missing docsepthis paper presents a new chinese handwriting verification dataset containing handwritten chinese characters and digits it is the largest publicly available chinese signature dataset the authors performed experiments on both msdschs and msdstds showing that the handwritten token digit string is a better biometric indicator for handwriting 1 the paper is wellorganized and the 
proposed dataset is large in scale containing both genuine and skilled forgeries samples 2 the paper sheds light on the importance of the token digit string instead of separate digits and can assist future verification research on the writers intersession variation 3 the authors have performed thorough experiments on the proposed dataset and bring a new perspective on the potential use of the token digit string as a better biometric for handwritten verification 4 the dataset has multiple modalities including static images coordinates pressure and time stamps i am not an expert on handwritten signature verification but i have some major concerns about this work 1 i dont see whether this topic is suitable for neurips track datasets and benchmarks maybe venues like biometric authentication where previous works were published are better for this work 2 where is the checklist 3 ethical concerns as the authors claim this dataset contains the users real names and phone numbers i dont see any explicit discussion about the ethical considerations of the collection process for instance did people provide their consent on the collection of such data general ethical conduct docsepthis paper presents a new dataset msds that contains two parts chinese signatures and their forgeries and token digit strings and forgeries the dataset is the largest of its type by far and also provides both the online and offline modalities of each signature and token digit string the authors comprehensively benchmarked the datasets with existing sota models 1 this paper presents a new dataset of chinese signatures and token digit strings and it is the largest of its type this dataset is going to be useful for research in forgery prevention via machine learning models 2 the authors comprehensively compared their new dataset with existing ones and clearly listed the novelties or advantages of their dataset in table 2 and 3 3 the authors benchmarked their dataset with sufficient number of recent machine learning models on signature forgery detection and have the surprising finding that the models were able to better detect forgery from token digit strings than chinese signatures moreover they conducted additional experiments to show that including both online and offline modalities help and they also crossvalidated their dataset with deepsigndb 1 there lacks documentation on a few key parts of the data collection process how are the participants selected how were consent obtained from the participants since the data collected involves their names and phone numbers and is released publicly docsepduring this paper the authors has presented a new signature and token digit string dataset collected from the same 402 chinese participants during two sessions spaced by an interval of 21 days this dataset has the advantage of combining both online and offline modalities in such a large quantity that make it ones of this kind this dataset was collected in a controlled context in such a way that intersession variability can be measured authors also evaluate this dataset performance using classical and stateoftheart verification approaches in various scenarios for highlighting the limits of them in more realworld cases they also prove the effectiveness of using other type of biometric such as handwritten token digit strings from users phone numbers or id card numbers for a more accurate identity verification to conclude they apply crossdataset validation with the western signature dataset deepsigndb to understand if such dataset can be 
more effective in specific scenario like random forgery attacks this publicly available dataset is by its size and level of granularity the biggest one available for chinese signature at the time i am writing this its also including multiples modalities which are very useful for modern online verification tools like the ones presented in the uptodate approaches section the way how the dataset as been collected make it even more unique of its kind the number of users involved in the campaign hardware variety technically and financially the fact that you make the users switch their writing hand for three forged samples and the delay between each session is way beyond the average datasets currently available on the market the data distribution and collection across both tasks is perfect and open new field of research for identity verification based on handwriting outside the chinese domain authors also performs appropriate stateoftheart experimentations that define a good baseline for future works on this dataset i found it very appropriate to see the modality fusion and was surprised of the outcomes we generally can expect higher performances from online architecture but never see other papers combining both at the same time for the task we dont have any information about the users genders ages level of education permanent disease main hand or even if they wear glasses such information can be extremely useful for others tasks and doesnt cost very much to get it can also put in evidence some bias during the data collection about hardware it could be nice to have the identifier of the device used to collect a specific signature or tds it will allow studying inter and intradevice variability also maybe some users are already familiar with one of the two tablets before the event and have an ergonomic advantage compare to others we dont have any information about if the users have to switch between devices during data collection or if everybody went through the same devices in the same order if users have been changed of tablets models between the first and second session this can among other things explain the intersession variability the screen real estate isnt exactly the same and stylus ergonomic differ from one to another one missing reference at line 173 obvious margins you can also add a reference for the dtw according to the wikipedia page of the dtw one of the first reference is vintsyuk t k 1968 speech discrimination by dynamic programming kibernetika 4 8188
### Summary:
|
the paper gets divergent reviews initially the reviewers appreciate the largest publicly available scale of the dataset for handwriting verification novel data collection process extensive experimental validation on baselines and modality fusion and findings on the importance of token digit string the main concerns include ethics due to personally identifiable information incomplete information on users and devices unavailable license missing checklist and unclear presentation the rebuttal is successful at addressing many of these concerns majority of the reviewers are satisfied and support acceptance postrebuttal the main remaining concern is on the ethics the authors have taken extra care and measures to ensure the personally identifiable information is not wrongly used however the limited ageregional diversity remains a serious issue the acs agree with the majority assessment on the contributions of this paper and take into account the unique challenge of collecting handwriting verification data at scale and thus recommend acceptance in the final version the authors should add explicit discussion on the datasets limitations on ageregional diversity and improve clarity on other issues raised by the reviewers
|
[ (input_ids token-id array for this row elided — several thousand integer entries, one per line in the original dump) ] |
[ (attention_mask array for this row elided — every value is 1) ] |
[ (labels token-id array for this row elided — same length and same ending entries as the row's input_ids) ] |
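The MSDS reviews above repeatedly discuss online signatures and token digit strings as time series of coordinates, pressure and timestamps, and one reviewer asks for a dynamic time warping (DTW) reference, DTW being the classical matching baseline for such trajectories. Below is a minimal, illustrative DTW distance between two variable-length sequences; the feature layout, the placeholder data, and the idea of thresholding the distance against enrolled references are assumptions made for the example, not the benchmark code used with MSDS.

```python
# Minimal DTW distance between two online-handwriting sequences,
# e.g. arrays of shape (T, d) holding x, y and pressure per timestamp.
# Illustrative only; feature choice and normalisation are assumptions.
import numpy as np

def dtw_distance(a: np.ndarray, b: np.ndarray) -> float:
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = np.linalg.norm(a[i - 1] - b[j - 1])   # local distance between frames
            cost[i, j] = d + min(cost[i - 1, j],      # insertion
                                 cost[i, j - 1],      # deletion
                                 cost[i - 1, j - 1])  # match
    return float(cost[n, m])

# A genuine/forged decision would threshold the distance to enrolled references.
ref = np.random.randn(120, 3)    # enrolled genuine sample (placeholder data)
query = np.random.randn(110, 3)  # questioned sample (placeholder data)
print(dtw_distance(ref, query))
```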
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
unlike most of the continual learning approaches in the literature that perform supervised training at each learning stage the authors propose to perform unsupervised representation learning on the sequence of incoming data and then classify the samples at each stage using knn experiments on standard cifar10100 and tinyimagenet shows that the proposed method alleviates catastrophic forgetting and generalizes better in different scenarios strengths the submission is well written and easy to follow the proposed concept is well motivated with various quantitative and qualitative justifications while many unsupervisedselfsupervised training approaches require pretraining on massive unlabeled data the proposed method here works well with the help from mixup and does not require additional pretraining set largely making it more applicable to realworld use cases i especially enjoy reading sec 53 regarding the analyses on feature similarity between different learning approaches visualization of feature space and loss landscape visualization this section provides additional justification besides the absolute accuracy improvement over the supervised continual learning counterparts weaknesses i think the main limitation of using the proposed pipeline in practice is runtime and memory constraints to run knn in a lifelong learning setting is challenging due to evergrowing storage requirements for storing samples coming from different stages during inference the algorithm also needs to compute distances between the query and many stored data points can the authors shed some light about the runtime and memory comparison between the proposed method and supervised counterparts it would be great to also compare with other recent supervised continual learning approaches as well such as a1 a2 a1 zhao et al maintaining discrimination and fairness in class incremental learning cvpr 2020 a2 liu et al mnemonics training multiclass incremental learning without forgetting cvpr 2020 overall a good quality submission with novel and interesting ideas it would be great for the authors to address the weaknesses mentioned above docsepthis paper studies the problem of representation learning in an unsupervised continual learningucl setting it shows that the representation learned with ucl is more general than the one learned with supervised cl scl and investigates why ucl is more robust to catastrophic forgetting than scl by analyzing the similarity of learned features and visualizing loss landscape the authors also propose to apply mixup technique to ucl setting and present a lump algorithm to further improve the performance of clthis paper studies the problem of representation learning in an unsupervised continual learningucl setting it shows that the representation learned with ucl is more general than the one learned with supervised cl scl and investigates why ucl is more robust to catastrophic forgetting than scl by analyzing the similarity of learned features and visualizing loss landscape the authors also propose to apply mixup technique to ucl setting and present a lump algorithm to further improve the performance of cl strong points the paper takes one of the most import issues in cl learning robust representation in unsupervised setting for me the problem itself is real and practical the paper provides comprehensive experiments including both qualitative analysis and quantitative results to show the effectiveness of ucl and the proposed lump algorithm over scl methods overall the paper is well written in 
overall, the paper is well written; in particular, the related work section has a nice flow and puts the proposed method into context. despite the method having limited novelty, it has been well motivated by pointing out the limitations of sota methods. the authors provide code for reproducing the results in the paper. weak points: the proposed lump algorithm is adopted from the supervised mixup technique (zhang et al 2018), so the novelty is limited. the authors conducted extensive experiments in the task-incremental setting; it would be interesting to see how ucl and the proposed method perform in class-incremental and task-agnostic cl settings. fig 3: in general, higher layers have lower feature similarity than lower layers, and the similarity between ucl models is higher than that of scls; however, there is an exception in layer 4 of the der method, where the similarity of scl is higher than that of ucl. it is worth some discussion on this exception. overall i vote for marginally accepting. i like the idea of unsupervised continual learning and handling it by the proposed lump method; my major concerns are the limited novelty of the proposed method (adopted from mixup in supervised learning) and the missing experiments on class-incremental or task-agnostic settings (see weaknesses above). hopefully the authors can address my concerns in the rebuttal period. after rebuttal: the authors addressed most of my concerns, so i would like to raise my score. docsepthe paper proposes to tackle the continual learning problem in an unsupervised setting. it shows that recent selfsupervised learning methods are efficient tools to learn image representations with less catastrophic forgetting. two recent selfsupervised methods are evaluated, simsiam and barlow twins, and in both cases the superiority of unsupervised features is demonstrated. the widely used mixup method is also adapted to the ucl problem straightforwardly: current images are mixed with images of the past tasks sampled from the replay buffer. strengths: first, the paper is well written and easy to read. the experiments are rich and well chosen to better understand the superiority of unsupervised representations in the context of cl; i especially appreciated the experiments in fig 2 that investigate the impact of the size of the training dataset. the conclusions are enlightening and will be very helpful to design new supervised or unsupervised cl methods. the code is publicly available and looks clean and easy to use. weaknesses: some details are unclear. sec 5.1, knn classifier: which set is used for the nearest neighbours, the replay buffer or the validation set? the knn is used both for supervised and unsupervised experiments, right? sec 5.3: more explanation about cka is required for a reader who is not familiar with this measure. i recommend changing the title; rethinking the representational continuity is much too strong. the conclusions of the paper are great, but it does not provoke a real rethinking of the problem. i am not really convinced by the visualization in fig 4; it seems that lump has sparser activations and the shapes of the objects are more clearly visible in its feature map, but does it simply mean that it learns lower-level features similar to an edge detector? maybe a tsne visualization would help to see how the features of old tasks are affected when learning a new task. in fig 5, we can notice that the range of values gets smaller in t19 (from 4656 for t0 to 4446); any idea why? the paper shows interesting results that confirm the potential of selfsupervised learning methods in the context of continual learning. the technical novelty may look incremental, but the experimental conclusions are very interesting for the community; the paper is clear and well written.
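the "current images mixed with replayed images" step described in the review above can be pictured with the small sketch below; the beta parameter, the uniform buffer sampling and how the mixed batch feeds the unsupervised loss are assumptions for illustration, not the paper's exact recipe.

```python
# `current_batch` and `replay_buffer` are assumed to be numpy arrays of images
# with the same per-sample shape; alpha=0.4 and uniform sampling are guesses
import numpy as np

def lifelong_mixup_batch(current_batch, replay_buffer, alpha=0.4, rng=None):
    rng = rng or np.random.default_rng()
    lam = rng.beta(alpha, alpha)                       # mixing coefficient
    idx = rng.choice(len(replay_buffer), size=len(current_batch))
    past_batch = replay_buffer[idx]                    # samples from earlier tasks
    mixed = lam * current_batch + (1.0 - lam) * past_batch
    return mixed, lam                                  # feed `mixed` to the unsupervised loss
```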
docsepthis paper attempts to bridge the gap between unsupervised representation learning and continual learning by extending various supervised continual learning methods to the unsupervised learning framework. it builds upon two recent unsupervised feature learning techniques, simsiam and barlow twins, and a powerful data augmentation technique, mixup. improved performance has been demonstrated in various experimental settings, together with comprehensive feature visualizations. strengths: rethinking continual learning with unsupervised representation learning is interesting, and the empirical results indicate that most supervised continual learning methods can be improved by the proposed approach. a bunch of experiments have been conducted to demonstrate the effectiveness of the proposed approach in various settings, and several visualizations have also been included for a better understanding of the learned features. weaknesses: missed comparison with curl. though the authors criticised the continual unsupervised representation learning framework (curl) as being limited to digit-based grayscale datasets, no direct comparison with curl is done by following its evaluation protocol; adding this result could better reveal the difference between the proposed method and curl in terms of effectiveness. besides, the evaluation of cluster quality used in curl seems to be an important evaluation metric in unsupervised continual learning, which has not been used in the paper. degraded performance of der and multitask: in table 1 we see that the proposed unsupervised continual learning can improve all baseline methods except der and multitask; a clear explanation of this performance drop should be added. qualitative analysis: why does the visualization of feature maps stop at task t13 while the loss landscape visualization continues to t19? and in figure 5 the difference between scl-der and lump is hard to interpret. why not directly use the unsupervised learned representations: since an important purpose of unsupervised representation learning is to learn a powerful embedding space that can be quickly finetuned for later downstream tasks, why dont we consider a baseline where the feature backbone is initialized with simsiam or barlow twins and directly finetuned on a sequence of tasks? this is probably not considered in standard continual learning, but the results of this baseline could be informative to the community of both domains. this paper attempts to rethink standard continual learning from a new point of view by considering unsupervised representation learning methods. this purpose is interesting, but the major difference between scl (supervised continual learning) and ucl (unsupervised continual learning) seems to be just adding an unsupervised representation loss at the backbone and freezing it in the second stage of prediction-head finetuning. moreover, some parts of the empirical results are not clearly presented or explained; an improved version of the experimental results could be helpful for better validating the contributions.
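since two of the reviews above lean on cka feature similarity (fig 3, and the request for more explanation of the measure in sec 5.3), a compact version of linear cka is sketched below; this is the standard definition rather than code taken from the paper under review.

```python
# standard linear cka between two activation matrices for the same n inputs;
# not code from either paper, just the textbook formula
import numpy as np

def linear_cka(x, y):
    # x: (n, d1) activations, y: (n, d2) activations
    x = x - x.mean(axis=0, keepdims=True)   # centre each feature dimension
    y = y - y.mean(axis=0, keepdims=True)
    cross = np.linalg.norm(x.T @ y, "fro") ** 2
    return cross / (np.linalg.norm(x.T @ x, "fro") * np.linalg.norm(y.T @ y, "fro"))
```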
### Summary:
exciting work at the intersection of continual learning and representation learning. the reviewers have all commented that the proposed work addresses a number of issues related to catastrophic forgetting, which is very encouraging. the work also shows that the representation learned with the proposed method is more general than the one learned with supervised cl. the reviewers have praised the work as being well written and with thorough experiments. there was a robust back and forth between the reviewers and the authors during the rebuttal period, in which the authors appear to have addressed most of the concerns. given the insights, results and potential impact of this work, i think this work definitely should be published at iclr.
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the proposed work looks at the task of automated selfsupervised learning (ssl) on graphs by using pseudohomophily as a surrogate objective, combined with a search strategy, for the proposed approach autossl. homophily is defined as the average sameness of labels over pairs of connected vertices; pseudohomophily is computed by assigning labels based on kmeans clustering. the work theoretically shows that maximizing pseudohomophily maximizes an upper bound on the mutual information between pseudolabels and downstream labels. as cluster assignments are not differentiable and need search over a large space, the work looks at using evolutionary strategies (autossl-es) and differentiable search through soft cluster assignments using a gaussian mixture model (autossl-ds). experiments are performed on 5 ssl tasks and 8 datasets; autossl-based approaches outperform several other unsupervised baselines and perform comparably to supervised baselines when measured using normalized mutual information, accuracy of node classification, and pseudohomophily. strengths: 1. the paper is well written and easy to follow. 2. pseudohomophily is shown to be a good surrogate measure for mutual information, and combining it with search strategies proves effective on several benchmark datasets and tasks. weaknesses: 1. the method is somewhat simple and intuitive. 2. the method underperforms significantly on the clustering task for the photos dataset. the method is well motivated and quite intuitive, and the paper is easy to follow; extensive experimental results are provided on several tasks and datasets, but the statistical significance of the improvements is missing, which is needed given the large confidence intervals reported. docsepthis paper mainly studies the problem of automatically weighting multiple selfsupervised tasks on a graph, without information about the true labels, in order to make downstream node-level prediction tasks perform better. to overcome the challenge of missing ground truth in pretraining tasks, the authors propose to use a property of the graph named homophily to create pseudo labels; they then propose two algorithms to automatically adjust weights among different tasks, one based on evolutionary algorithms and another based on gradient descent. extensive experiments on different realworld datasets show the improvement of the proposed methods compared with a single pretext task. strengths: the paper is well written and easy to follow; i can easily understand the idea of this paper. this paper conducts extensive experiments on 7 different datasets from different domains to show the improved performance, which is very reasonable and convincing to me. weaknesses: the main concern i have towards this paper is its novelty. i can see there is much effort in this paper to make adjusting different pretext tasks on a graph work, and especially the authors bring up the important homophily property of graphs to create pseudo labels for selfsupervision. however, the essence of this paper is still dynamically adjusting weights for different losses; as the authors mention in the related work section on automated loss function search, there is much work on reweighting different loss functions, which has been well studied. though the authors mention that the problem of selfsupervised loss search for graphs remains rarely explored, i find that the problem solved in this paper is actually an old problem, which is limited in novelty and significance. the core contributions of this paper mainly include (i) making use of homophily to create pseudo labels for selfsupervision, (ii) proposing autossl-es based on an evolutionary strategy, and (iii) proposing autossl-ds based on metagradient descent.
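to make the pseudohomophily objective described at the top of this review concrete, here is a minimal sketch of edge homophily computed from kmeans pseudo-labels; the embedding source, the edge_index layout and the number of clusters are assumptions, not the paper's exact choices.

```python
# `embeddings` is an (n_nodes, d) array of learned node features and `edge_index`
# a (2, n_edges) array of edge endpoints; 10 clusters is an arbitrary placeholder
import numpy as np
from sklearn.cluster import KMeans

def pseudo_homophily(embeddings, edge_index, n_clusters=10):
    pseudo_labels = KMeans(n_clusters=n_clusters, n_init=10).fit_predict(embeddings)
    src, dst = edge_index
    # fraction of edges whose two endpoints receive the same pseudo-label
    return float(np.mean(pseudo_labels[src] == pseudo_labels[dst]))
```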
for (i), after checking the paper and the appendix, i think it is good and reasonable, but why should the authors propose two different methods to solve the problem? the authors mention that autossl-es requires evaluating a large population of candidate combinations, which is not practical, and that autossl-ds is much more efficient; so what are the advantages of autossl-es? the authors should give more explanation and analysis of the choice between these two methods; otherwise i would consider them simply a combination of existing works without much contribution. i observe that the improvement of autossl compared with the best result of the individual tasks is not that significant; moreover, the best results of the individual tasks come from par and dgi. this leads to the question of whether one really needs to weigh these 5 tasks; maybe considering the other 3 tasks is not that helpful. also, more recent works on graph selfsupervised learning can be considered and cited, for example: han x, huang z, an b, et al, adaptive transfer learning on graph neural networks, proceedings of the 27th acm sigkdd conference on knowledge discovery and data mining, 2021, 565-574. though this paper is well written and studies an important and popular problem in graph learning, i still have concerns in different aspects. docsepselfsupervised learning (ssl) methods for graphs take a given graph with node attribute information and construct various ssl tasks using structural and attribute information. these tasks provide selfsupervision for training graph neural networks without accessing any labeled data. existing studies show that different ssl tasks can lead to varying downstream performance across tasks, suggesting that the success of ssl tasks strongly depends on the dataset and the downstream task. this observation is important yet not surprising, as similar findings were made in other data modalities (text, vision). as a result, selecting the right ssl tasks can be important, which serves as a motivation for this study. this study develops an approach called autossl for combining multiple ssl tasks for unlabeled representation learning. to this end it defines a pseudohomophily metric to measure the quality of learned representations; based on the pseudohomophily, the study describes two techniques to search for and combine ssl tasks, one using an evolutionary algorithm and the other using metagradient descent. the autossl approach is evaluated on eight datasets, considering combinations of five ssl tasks (one contrastive learning task and four predictive tasks). 1. performance gains are small or non-significant: closely examining tables 1-3, i see many scenarios where the best performing autossl model performs on par with the strongest baselines or offers minuscule improvements, that is, when taking into account the standard variation of performance across independent runs. for example, in table 2 the results for corafull are 61.10±0.68 vs 60.56±0.33 (on-par performance), and the results for physics are 95.57±0.02 vs 94.66±0.10 (a minuscule improvement), which we only see after considering the stronger autossl-es variant (autossl-ds performs at 95.13±0.36, which yields essentially no gain). there are many such scenarios in tables 1-3. 2. the autossl approach is concerned with finding an effective way to combine individual ssl tasks (problem definition in eq 1, where the objective h is set to the negative pseudohomophily); that is, it assumes that a set of n ssl tasks is already given and it will find a good combination of those tasks, formulated as a weighted addition of the individual
ssl tasks, each task associated with a learned task weight lambda_i. such a formulation seems quite restrictive: it does not allow for any interaction between the loss functions, and it also assumes that the user already knows what are potentially good ssl tasks. 3. it is unclear how the proposed strategy could be used in other contexts such as link prediction and graph classification. both autossl variants (the evolutionary strategy and the gradient descent version) seem ad hoc and straightforward combinations of existing tools; this is not a problem on its own, however the empirical gains are not convincing. docsepauthors use homophilous datasets and meta learning for selfsupervised learning; the authors use an evolutionary strategy. 1. "is homophily a necessity for graph neural networks" is a paper; in that paper it is stated that homophily is not a necessity. if a dataset is homophilous, the task becomes easy, and in this paper the authors use such easy datasets. 2. if the metalearning is the contribution, it is not a contribution of the paper but a tool. 3. the experimental results show it obtains better performance. 4. is the evolutionary strategy a contribution? homophilous datasets are used and the authors show a theoretical analysis.
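the "weighted addition of individual ssl tasks" that several reviews above question can be written as a one-liner; the sketch below is only a toy version of that formulation, and how the weights lambda_i are constrained or normalised is left open because the reviews do not specify it.

```python
# `task_losses` is a list of scalar loss tensors (one per pretext task) and
# `lambdas` a learnable weight vector; any normalisation of the weights is omitted here
import torch

def combined_ssl_loss(task_losses, lambdas):
    return sum(lam * loss for lam, loss in zip(lambdas, task_losses))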
### Summary:
strengths: a strong empirical study across multiple datasets, although the gains are not as impressive as for other pretraining domains such as text or images; an interesting formulation of pseudohomophily as an objective to optimize in the selfsupervision stage; a well written paper. weaknesses: novelty may be limited by the fact that the method is essentially learning or searching for a weighted average of selfsupervised training objectives; in that case, while the pseudohomophily angle is interesting, there may be other appropriate baselines for yielding this weighted combination of tasks that are not explored. there is concern about the degree of empirical improvements on certain datasets.
9162,
285,
3714,
2107,
1368,
297,
2689,
1031,
50276,
296,
3755,
20556,
337,
2929,
310,
973,
3542,
285,
3477,
281,
956,
374,
10585,
1368,
297,
2689,
1031,
310,
2011,
281,
320,
247,
1175,
35701,
2557,
323,
15577,
1491,
285,
16248,
342,
3186,
8130,
19539,
3576,
327,
2067,
22791,
15302,
285,
8892,
50275,
20881,
1255,
265,
337,
1332,
310,
8489,
2969,
285,
27350,
374,
1332,
762,
468,
13015,
3012,
327,
17524,
4836,
323,
7963,
10895,
50276,
783,
1332,
310,
973,
17194,
285,
3240,
27350,
253,
2929,
310,
3477,
281,
956,
9470,
5661,
1543,
403,
2530,
327,
2067,
8892,
285,
15302,
533,
7605,
8453,
273,
11701,
310,
5816,
534,
310,
3058,
1677,
1781,
7162,
11508,
2361,
50276,
7152,
33032,
2520,
2929,
7194,
2175,
253,
1895,
273,
8356,
42428,
2709,
1881,
35421,
8892,
327,
247,
4216,
1293,
1491,
273,
2032,
13301,
275,
1340,
281,
1056,
15450,
4666,
5251,
10554,
8892,
1347,
1805,
281,
11399,
253,
5691,
273,
5816,
3216,
5083,
275,
3215,
26208,
8892,
253,
4477,
12661,
281,
897,
247,
2867,
4907,
2860,
2689,
1031,
273,
253,
4216,
281,
2794,
17927,
13301,
840,
597,
12661,
767,
11333,
281,
8356,
4575,
13461,
2190,
1027,
8892,
581,
310,
1754,
327,
5606,
11333,
1529,
310,
1754,
327,
11786,
18499,
9470,
4679,
327,
1027,
1524,
10186,
15302,
921,
253,
7756,
273,
253,
4081,
3082,
2429,
342,
247,
2014,
39543,
4836,
20544,
253,
2929,
310,
973,
15720,
285,
3477,
281,
956,
891,
476,
4354,
2096,
253,
2934,
273,
436,
2929,
50276,
2520,
2929,
2589,
84,
9470,
4679,
327,
818,
1027,
15302,
432,
1027,
10625,
281,
921,
616,
5520,
16226,
534,
310,
1077,
5272,
285,
21414,
281,
479,
50276,
20881,
1255,
265,
253,
2022,
4468,
891,
452,
4404,
436,
2929,
310,
697,
38135,
891,
476,
923,
627,
310,
1199,
3434,
273,
436,
2929,
281,
1056,
19427,
1027,
39543,
8892,
327,
247,
4216,
3340,
253,
4477,
3324,
598,
253,
1774,
2860,
2689,
1031,
2867,
273,
4216,
281,
2794,
17927,
13301,
323,
1881,
12185,
4694,
2299,
253,
17718,
273,
436,
2929,
310,
1335,
23043,
19427,
13461,
323,
1027,
11655,
347,
253,
4477,
5393,
275,
253,
2593,
2905,
789,
273,
16644,
2957,
1159,
3186,
627,
310,
594,
1199,
789,
327,
294,
6712,
272,
1027,
2957,
3470,
534,
452,
644,
973,
5421,
2167,
253,
4477,
5393,
326,
253,
1895,
273,
1881,
35421,
2957,
3186,
323,
14580,
4558,
11766,
14859,
891,
1089,
562,
253,
14042,
1895,
275,
436,
2929,
310,
2686,
271,
1711,
1895,
534,
310,
3710,
275,
38135,
285,
8453,
50276,
783,
5161,
9021,
273,
436,
2929,
7194,
2486,
891,
2403,
897,
273,
2860,
2689,
1031,
281,
2794,
17927,
13301,
323,
1881,
12185,
4694,
21255,
36636,
1125,
1730,
868,
1754,
16483,
5700,
37685,
36636,
1125,
1730,
392,
84,
1754,
327,
1313,
356,
4614,
850,
18499,
323,
891,
846,
12669,
253,
2929,
285,
30762,
891,
1158,
352,
310,
1175,
285,
5272,
533,
2139,
943,
253,
4477,
12661,
767,
1027,
3082,
281,
8415,
253,
1895,
253,
2488,
25957,
1125,
1730,
868,
4419,
16344,
247,
1781,
3072,
273,
7431,
13553,
534,
310,
417,
8542,
285,
1125,
1730,
392,
84,
310,
1199,
5919,
594,
752,
403,
253,
11361,
273,
1125,
1730,
868,
253,
4477,
943,
1918,
625,
22909,
285,
1783,
4404,
13887,
841,
767,
3082,
5010,
891,
651,
1908,
597,
403,
3365,
247,
5019,
273,
5368,
2987,
1293,
1199,
7680,
50276,
74,
10018,
326,
253,
7756,
273,
1125,
1730,
77,
2429,
342,
253,
1682,
906,
273,
2060,
8892,
310,
417,
326,
1534,
25761,
253,
1682,
1543,
273,
2060,
8892,
403,
432,
1061,
285,
277,
7311,
436,
5644,
281,
247,
1953,
326,
1057,
310,
1663,
3198,
281,
14357,
841,
608,
8892,
5046,
7296,
253,
643,
495,
8892,
310,
417,
326,
9371,
50276,
12563,
625,
3332,
2987,
273,
4216,
1881,
35421,
4715,
476,
320,
2783,
285,
11106,
323,
1650,
15761,
1269,
30287,
606,
1182,
271,
270,
1162,
355,
17825,
3700,
4715,
327,
4216,
11454,
6928,
68,
856,
22868,
273,
253,
3435,
394,
913,
78,
9788,
76,
1678,
8059,
327,
3640,
8900,
50276,
2203,
15067,
43425,
46472,
38503,
2167,
436,
2929,
310,
973,
15720,
285,
2175,
271,
1774,
285,
4633,
1895,
275,
4216,
4715,
891,
1335,
452,
7350,
275,
1027,
7794,
5474,
339,
793,
813,
35421,
4715,
256,
3433,
3082,
323,
14580,
1379,
247,
1677,
4216,
342,
4666,
11104,
1491,
285,
3989,
2710,
256,
3433,
8892,
970,
8350,
285,
863,
2382,
1491,
841,
8892,
2085,
1881,
12185,
4694,
323,
3733,
4216,
11454,
6928,
1293,
24497,
667,
13130,
941,
5368,
2175,
921,
326,
1027,
256,
3433,
8892,
476,
1421,
281,
11962,
15450,
3045,
2439,
8892,
7738,
326,
253,
2323,
273,
256,
3433,
8892,
7052,
7024,
327,
253,
10895,
285,
253,
15450,
4836,
436,
8310,
310,
1774,
2568,
417,
10084,
347,
2074,
4342,
497,
1160,
275,
643,
941,
33433,
2505,
8113,
347,
247,
906,
17221,
253,
987,
256,
3433,
8892,
476,
320,
1774,
534,
11029,
347,
247,
16038,
323,
436,
1263,
50275,
2520,
1263,
24357,
271,
2746,
1925,
1125,
1730,
77,
323,
16248,
2709,
256,
3433,
8892,
323,
440,
22027,
6779,
4715,
281,
436,
990,
352,
13067,
247,
10585,
1368,
297,
2689,
1031,
7982,
281,
2557,
253,
3290,
273,
6311,
14237,
1754,
327,
253,
10585,
1368,
297,
2689,
1031,
253,
1263,
8631,
767,
5609,
281,
3186,
50276,
17890,
460,
256,
3433,
8892,
581,
970,
271,
16483,
5933,
285,
253,
643,
970,
1313,
356,
4614,
850,
18499,
253,
1125,
1730,
77,
2746,
310,
6760,
327,
4314,
15302,
7296,
13553,
273,
2620,
256,
3433,
8892,
581,
4499,
422,
4715,
4836,
285,
1740,
15970,
8892,
50275,
18,
3045,
15988,
403,
1355,
390,
14122,
525,
13907,
8244,
17565,
7180,
2145,
891,
923,
1142,
15216,
835,
253,
1682,
9591,
1125,
1730,
77,
1566,
17923,
327,
1061,
342,
253,
19508,
1666,
25379,
390,
6131,
19734,
11047,
11701,
326,
310,
672,
3192,
715,
2395,
2629,
7629,
273,
3045,
2439,
3907,
6613,
323,
1650,
275,
2829,
374,
1543,
323,
944,
2320,
962,
403,
721,
37965,
2358,
4632,
46436,
1549,
1610,
50276,
251,
1148,
3045,
1543,
323,
12057,
403,
898,
38032,
4699,
4632,
11107,
2526,
9104,
50276,
10420,
11047,
7756,
534,
359,
760,
923,
846,
7296,
253,
10046,
1125,
1730,
868,
12955,
1125,
1730,
392,
84,
17923,
387,
5325,
11246,
1812,
534,
5644,
281,
9093,
642,
3045,
627,
403,
1142,
824,
15216,
275,
7180,
2145,
50275,
19,
253,
1125,
1730,
77,
2746,
310,
7514,
342,
4560,
271,
3576,
1039,
281,
13398,
2060,
256,
3433,
8892,
1895,
5426,
275,
16186,
337,
835,
14168,
1179,
73,
310,
873,
281,
253,
4016,
10585,
1368,
297,
2689,
1031,
326,
310,
352,
19584,
326,
247,
873,
273,
295,
256,
3433,
8892,
403,
2168,
1677,
285,
352,
588,
1089,
247,
1175,
5019,
273,
1110,
8892,
26115,
347,
247,
17375,
1635,
273,
2060,
256,
3433,
8892,
1016,
4836,
2330,
342,
247,
6311,
4836,
2801,
29331,
74,
824,
15895,
3133,
3240,
29190,
352,
1057,
417,
1581,
323,
667,
5016,
875,
253,
2957,
3470,
2007,
352,
671,
19584,
326,
253,
2608,
2168,
6057,
752,
403,
7826,
1175,
256,
3433,
8892,
50275,
20,
352,
310,
12744,
849,
253,
4081,
5700,
812,
320,
908,
323,
643,
22349,
824,
347,
3048,
10554,
285,
4216,
9162,
1097,
1125,
1730,
77,
11640,
16483,
5700,
285,
253,
11786,
18499,
2715,
1646,
519,
37806,
285,
15246,
13553,
273,
5368,
5657,
436,
310,
417,
247,
1895,
327,
697,
1211,
2299,
253,
16774,
15988,
403,
417,
21414,
50276,
7152,
33032,
43355,
897,
2860,
2689,
1031,
15302,
285,
11419,
4715,
323,
1881,
35421,
4715,
4477,
897,
16483,
5700,
50276,
18,
310,
2860,
2689,
1031,
247,
15504,
323,
4216,
11454,
6928,
310,
247,
2929,
275,
326,
2929,
352,
3054,
2860,
2689,
1031,
310,
417,
15504,
604,
10895,
310,
2860,
2689,
1031,
253,
4836,
3249,
281,
3477,
275,
436,
2929,
4477,
897,
824,
3477,
15302,
374,
604,
253,
5148,
613,
920,
310,
253,
7680,
352,
310,
417,
247,
7680,
273,
253,
2929,
533,
310,
247,
4968,
495,
253,
5661,
1543,
921,
352,
31326,
1805,
3045,
577,
310,
253,
16483,
5700,
247,
7680,
50274,
12856,
2689,
1031,
15302,
310,
908,
285,
4477,
2722,
10527,
1783,
2490,
187,
4118,
18435,
27,
296,
3755,
20556,
50276,
9072,
16774,
1263,
2439,
2709,
15302,
2299,
253,
15988,
403,
417,
347,
13943,
347,
323,
643,
3215,
26208,
10625,
824,
347,
2505,
390,
3888,
50276,
47606,
15895,
273,
10585,
1368,
297,
2689,
1031,
347,
271,
8103,
281,
22318,
275,
253,
1881,
12185,
4694,
3924,
50276,
4714,
15720,
2929,
50276,
20881,
1255,
265,
50276,
2369,
652,
555,
778,
320,
3710,
407,
253,
958,
326,
253,
1332,
310,
9093,
4715,
390,
12203,
323,
247,
17375,
3388,
273,
1881,
35421,
3733,
16566,
50276,
249,
326,
1083,
1223,
253,
10585,
1368,
297,
2689,
1031,
6907,
310,
4722,
627,
778,
320,
643,
4569,
1666,
25379,
323,
27012,
436,
17375,
5019,
273,
8892,
326,
403,
417,
14859,
50276,
9088,
310,
4468,
670,
253,
4248,
273,
16774,
11701,
327,
2176,
15302
] |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the authors propose a graph autoencoder architecture using what they term as neighborhood wasserstein reconstruction nwr they show experimentally that this reconstruction loss improves the embedding performance in structureoriented graph mining tasks strengths especially clear descriptive figures and writing experimental benefits on both toy and real world datasets and additional exploration as to why these improvements might be occurring well motivated and well placed in the literature the empirical comparisons are extensive the idea of nwr is seems novel and useful to me weaknesses the notation is a bit cumbersome with many subscripts and superscripts that are difficult to keep track of some of the choices in matching neighborhoods seem a bit arbitrary and not sufficiently justified see questions below although this is somewhat understandable as there are large number of tricks used in this paper limited theoretical contribution with some small inconsistencies questions how much does the neighborhood wasserstein reconstruction improve over a naive reconstruction of something like the mean of the neighborhood is it the wasserstein distribution matching that is important or just that it helps to reconstruct something about the neighborhood of each node theorem 31 seems a bit out of place given that you dont actually want a universally approximating network here if you had a large enough network then the nwr loss would have no benefit right as the fnn psipi could always approximate the neighborhood distribution from any initial mu sigma input do you use w22 as in eq 6 figure 3 or w2 as in eq 1 not that it matters much but would be good to clarify small notes it would be slightly better to prove approximation in the w2 metric rather than w1 in theorem 31 as this is what is used in the empirical results and the rest of the paper this seems trivially true to me theorem 31 you are missing a closing paren in w1p ug this paper was an interesting read and provided insightful and clear justifications for the results there remain some questions as to the theory and some particular loss choices made but these are relatively minor points docsepthis paper studies the problem of graph representation learning with graph autoencoder the paper argues that most gnns are designed for semisupervised learning and cannot learn taskagnostic embedding as a result the paper proposes a graph autoencoder architecture that trains the gnn in an unsupervised manner the key idea is to develop a decoder to reconstruct both the node degree and feature distribution experimental results show that the results outperform existing autoencoder baselines in several datasets writing 1 while most of the paper is well written it can be still improved for instance the paper mentions that existing approaches are either structureoriented or proximityoriented approaches and they cannot distinguish certain node pairs however the concept of the structure or proximity information of a graph is not defined and introduced well at all in the introduction making the paper hard to follow at the very beginning 2 figure 1 tries to illustrate the disadvantages of existing approaches it is unclear why there are two nodes with the same numbers such as 5 and 4 are they the node labels or ids in the graph method the time complexity of the proposed method looks very high the paper briefly describes the time complexity of eq 8 it would be good to know the overall time complexity of the proposed algorithm experiments 1 the idea of training a graph 
encoder in supervised manner links to the early graph embedding and the recent selfsupervised learning ssl for graph data it is good to see that both random walk based approaches as well as ssl based approaches are compared in the experiments but it is unclear why graphcl and mvgrl are only compared on the reallife datasets but are missing in the synthetic datasets in table 1 2 the proposed algorithm is significantly worse than the ssl based approaches such as dgl and mvgrl on cora citeseer and pubmed these datasets are identified as proximityoriented datasets in table 2 more detailed explanation would be expected 3 all datasets are relatively smaller in this paper it is unclear the scalability of the proposed method the paper describes a new graph autoencoder approach that could encoder more information into the latent space however the scalability of the proposed method is questionable also the proposed method did not outperform baselines on three datasets docsepthe paper proposes a novel approach to graph representation learning in particular a graph autoencoder is proposed that aims to better capture the topological structure by utilising a neighbourhood reconstruction and a degree reconstruction objective an optimaltransport based objective is proposed for the neighbourhood reconstruction that optimises the 2wasserstein distance between the decoded distribution and an empirical estimate of the neighbourhood distribution an extensive experimental analysis is performed highlighting the benefits of the proposed approach on a range of synthetic datasets to capture structure information the experimental results also highlight its robustness across 9 different realworld graph datasets ranging from proximityoriented to structureoriented datasets strengths 1 the method is intuitive and the way that the neighbourhood information is reconstructed appears novel 2 the empirical evaluation is extensive and highlights the models benefit across a range of different datasets when compared to several categories of baseline approaches covering both structurebased graph autoencoder based and contrastive learning approaches 3 the paper is mostly well written concerns 1 how does an increase in k affect the model 2 delta in the expression for pv0 does not seem to be defined which impacts the clarity of how the distribution is computed 3 a loss is introduced to predict the degree based on the node feature however it is not explicitly used in the neighbourhood reconstruction process why not sample q according to or proportional to the degree how is the neighboursampling handled when a node has less than q neighbours minor for improved clarity i suggest to include in table 2 the heading that indicates which datasets are structureoriented or proximityoriented etc as is done in table 4 in the appendix the second sentence in section 32 seems to be incomplete in the second sentence of section 43 you mention that dgi is the best performing model should it be mvgrl overall the paper is well written and presents an interesting and efficient approach to graph representation learning i lean towards accepting the paper if the authors address the abovementioned concerns and questions docsepthe paper proposes a new loss in graph autoencoder gae model for unsupervised learning composing of degree prediction and wasserstein distance which helps to identify structure information better vs the original loss extensive experiments demonstrate the advantages of the proposed loss the appreciated side of this paper is the 
correctness of the method detailed illustration of implementation and the thorough experiment comparison the proposal to adopt degree decoder wasserstein distance and its approximation into gae looks reasonable to me and the empirical examination verifies this the disadvantage is on the novelty side considering that i enforcing awareness of the context of nodes to highlight structure information is not new in the graph field httpsarxivorgabs190512265 etc ii the employment of the ot theory into networks existed in even a more fancy way httpsarxivorgpdf200303892pdf httpsopenreviewnetpdfidatuh28lnsuw etc the benefits of individual components were established and therefore directly combining them together further boosting is not surprising i would recommend digging further into either theory or ablations to strengthen this interesting work to characterize the real structure information captured by the proposed method compared with literature probably following the thought as 1 defining structure information si and its quantitative assessment metric 2 showing the proposed method can capture certain si but the existing methods cant 3 showing ot can capture better i am satisfied with the solidness of the paper including the methods part and experiments while i feel it is limited in novelty
### Summary:
the paper proposes a novel approach to graph representation learning in particular a graph autoencoder is proposed that aims to better capture the topological structure by utilising a neighbourhood reconstruction and a degree reconstruction objective an optimaltransport based objective is proposed for the neighbourhood reconstruction that optimises the 2wasserstein distance between the decoded distribution and an empirical estimate of the neighbourhood distribution an extensive experimental analysis is performed highlighting the benefits of the proposed approach on a range of synthetic datasets to capture structure information the experimental results also highlight its robustness across 9 different realworld graph datasets ranging from proximityoriented to structureoriented datasets strengths the problem studied is well motivated and the method proposed is well placed in the literature the method is intuitive and the way that the neighbourhood information is reconstructed appears novel the empirical comparisons are extensive weaknesses some of the choices in matching neighborhoods seem a bit arbitrary and not sufficiently justified the scalability of the proposed method is questionable the method has a high complexity of ond3 where n is the number of nodes and d is the average node degree the authors address this problem by resorting to the neighborhood sampling method without citing the prior art which is only very briefly discussed in the paper the reviewers have also expressed concerns about the fixed sample size q the question of how the neighboursampling is handled when a node has less than q neighbours remains unanswered
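As a rough numerical illustration of the neighbourhood reconstruction objective this summary describes, the sketch below computes the squared 2-Wasserstein distance between a decoded point set and a set of sampled neighbour embeddings. It is a minimal stand-alone example rather than the paper's implementation: the names decoded, neighbours, q and d are placeholders, the data is random, and for two equal-size uniform point sets the exact distance reduces to a minimum-cost matching.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def w2_squared(pred, target):
    """Exact squared 2-Wasserstein distance between two uniform empirical
    measures with the same number of points: optimal transport reduces to a
    minimum-cost perfect matching on pairwise squared distances."""
    cost = ((pred[:, None, :] - target[None, :, :]) ** 2).sum(-1)  # (q, q) squared distances
    rows, cols = linear_sum_assignment(cost)
    return cost[rows, cols].mean()

rng = np.random.default_rng(0)
q, d = 5, 8                              # q sampled neighbours, d-dimensional embeddings
decoded = rng.normal(size=(q, d))        # placeholder for the decoder's predicted neighbour set
neighbours = rng.normal(size=(q, d))     # placeholder for sampled true neighbour embeddings
print(w2_squared(decoded, neighbours))   # per-node reconstruction term, to be averaged over nodes
```

In a full model this term would be averaged over nodes and combined with the degree-prediction loss mentioned in the reviews; the Hungarian matching here is exact but not differentiable, which is presumably why the paper works with an approximation of the Wasserstein distance in practice.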
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper considers the problem of tracking a curve path for a vehicle model the authors apply the neural lyapunov function approach to a straight line tracking problem given the learned lyapunov function the authors compute the largest curvature that the lyapunov function can tolerate the idea of this paper is straightforward but the authors evaluated it on hardware strengths 1 experiments on hardware weakness 1 theoretical contribution is minor 2 some presentation issues docsepthis paper applies existing techniques on learning control lyapunov function clf to the application of path following of ground vehicles the main contribution of the paper is the ability to safely use a clf designed for low curvature paths to paths with higher curvatures without losing the correctness guarantees of the clf the paper provides sufficient experimental evidence to support the proposed approach strengths the paper applies the proposed framework to a realworld example with interesting experimental results weaknesses the papers contribution is limited to applying existing algorithms ie neuralbased clf learning and verification with minor modifications control lyapunov functions for ground vehicles have been extensively studied and several manually designed clfs have been discussed in the literature the paper lacks a clear justification of the motivation behind using a neuralbased clf for this setup the paper lacks a comparison against other techniques to understand the advantages and disadvantages of using the proposed framework docsepthis paper proposes a framework to search for the best control for path tracking problems using neural control lyapunov functions the authors show the proposed method can ensure a bounded deviation of a target path and demonstrate the effectiveness of their algorithm on a real robot strengths this paper is wellwritten and easy to read the framework provides a general and simple way to extract a stabilizing feedback controller weaknesses in the experiment section there is lacking comparison between the proposed method and existing path tracking methods it would be also interesting to see the performance comparison in controllers extracted by polynomial clfs and neural clfs docsepthis paper proposes a new way to do path following using neural network based control lyapunov functions clf for a given vehicle dynamic with straightline reference paths the authors construct the neural clf to satisfy the lyapunov conditions next they derive the curvature bounds to ensure the clf controller stay valid hardware experiments are conducted to showcase the proposed approachs realtime performance strengths theoretical guarantees for the clf converging behavior and tolerant reference curvature bound hardware implementation to showcase the proposed controller can work in realtime weaknesses this work is incremental since there are already so many papers in using nn to construct clf and no baselines used in the experiments to demonstrate what is the benefit of this new nn clf method and they only use this clf controller in a relative lowdimension experiment the verification process can only converge within a small roa
### Summary:
the paper focuses on path following of a ground vehicle with the use of nnbased clf it presents appealing demonstrations in hardware however it lacks empirical comparisons to prior works the paper is generally wellwritten and technically clear but the theoretical novelty is limited as the approach relies on minor modifications to existing methods clearer justification of the significance is needed main pros promising hardware experiments the paper is wellwritten main cons lack of comparison to baselines in the experiments limited theoretical novelty postrebuttal update the revised version of the paper has offered comparisons to baselines the authors also provided further explanations regarding the theoretical novelty it is important that the final version of the paper will show also experiments with curved path following as the authors discussed with reviewer mrsr
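For readers unfamiliar with how a neural control Lyapunov function of the kind discussed above is typically trained, the sketch below shows the sort of empirical Lyapunov-condition loss such methods minimise. It is an illustrative assumption, not the paper's setup: V is a small placeholder MLP, f is a stable linear system standing in for the closed-loop path-tracking error dynamics, and the sampling region and 0.1 decrease margin are arbitrary; a genuine CLF formulation would also optimise over the control input, and the learned V would still need to be formally verified over its region of attraction.

```python
import torch

# Placeholder candidate Lyapunov function V(x) for a 2-d tracking-error state.
V = torch.nn.Sequential(torch.nn.Linear(2, 16), torch.nn.Tanh(), torch.nn.Linear(16, 1))

def f(x):
    # Stand-in closed-loop error dynamics: a stable linear system (not the vehicle model).
    A = torch.tensor([[-1.0, 2.0], [-2.0, -1.0]])
    return x @ A.T

opt = torch.optim.Adam(V.parameters(), lr=1e-3)
for step in range(2000):
    x = 4.0 * torch.rand(256, 2) - 2.0        # sample states from the region of interest
    x.requires_grad_(True)
    v = V(x)
    grad_v = torch.autograd.grad(v.sum(), x, create_graph=True)[0]
    v_dot = (grad_v * f(x)).sum(dim=1, keepdim=True)   # dV/dt along the dynamics
    loss = (torch.relu(-v).mean()                       # V should be non-negative on sampled states
            + torch.relu(v_dot + 0.1 * v).mean()        # decrease condition: Vdot <= -0.1 V
            + V(torch.zeros(1, 2)).pow(2).mean())       # anchor V(0) = 0
    opt.zero_grad()
    loss.backward()
    opt.step()
```

Roughly speaking, the curvature tolerance discussed in the reviews would then come from asking how large a perturbation of the straight-line reference the decrease condition can absorb before it is violated.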
[
30003,
310,
1677,
2278,
273,
247,
2561,
2929,
432,
260,
2369,
1793,
6698,
15,
7764,
3630,
247,
6010,
253,
2278,
15,
187,
4118,
8439,
27,
187,
2520,
2929,
19401,
253,
1895,
273,
12544,
247,
6970,
1854,
323,
247,
4958,
1566,
253,
4477,
4647,
253,
11454,
12865,
522,
43772,
1159,
2746,
281,
247,
4951,
1386,
12544,
1895,
1677,
253,
6311,
12865,
522,
43772,
1159,
253,
4477,
11897,
253,
6253,
16841,
326,
253,
12865,
522,
43772,
1159,
476,
31259,
253,
2934,
273,
436,
2929,
310,
15246,
533,
253,
4477,
6760,
352,
327,
10309,
20544,
337,
4679,
327,
10309,
50276,
20881,
1255,
337,
10527,
7680,
310,
5884,
374,
690,
9759,
3374,
5474,
33032,
2520,
2929,
10384,
5368,
5609,
327,
4715,
1453,
12865,
522,
43772,
1159,
502,
71,
281,
253,
2898,
273,
1854,
1563,
273,
3216,
9411,
253,
2022,
7680,
273,
253,
2929,
310,
253,
3745,
281,
15792,
897,
247,
502,
71,
4158,
323,
1698,
16841,
11865,
281,
11865,
342,
2169,
15340,
2478,
1293,
10305,
253,
36594,
23632,
273,
253,
502,
71,
253,
2929,
3400,
4209,
5661,
1941,
281,
1329,
253,
4081,
2746,
20544,
50276,
783,
2929,
10384,
253,
4081,
7792,
281,
247,
1524,
10186,
1650,
342,
4722,
5661,
1543,
50276,
20881,
1255,
265,
50276,
783,
9380,
7680,
310,
3710,
281,
9433,
5368,
11333,
26332,
11454,
3169,
502,
71,
4715,
285,
21999,
342,
5884,
14586,
50275,
8519,
12865,
522,
43772,
3470,
323,
3216,
9411,
452,
644,
18171,
5421,
285,
2067,
13542,
4158,
502,
3671,
452,
644,
5469,
275,
253,
6239,
253,
2929,
19756,
247,
2590,
22861,
273,
253,
16038,
3212,
970,
247,
11454,
3169,
502,
71,
323,
436,
9978,
50274,
783,
2929,
19756,
247,
5301,
1411,
643,
5609,
281,
2096,
253,
11361,
285,
23797,
273,
970,
253,
4081,
7792,
50275,
7152,
33032,
2520,
2929,
29328,
247,
7792,
281,
3186,
323,
253,
1682,
1453,
323,
1854,
12544,
3237,
970,
11454,
1453,
12865,
522,
43772,
3470,
253,
4477,
921,
253,
4081,
1332,
476,
5416,
247,
11542,
11254,
273,
247,
2303,
1854,
285,
7568,
253,
12510,
273,
616,
5933,
327,
247,
1524,
15688,
20544,
50275,
2520,
2929,
310,
973,
15720,
285,
3477,
281,
1239,
50274,
783,
7792,
3400,
247,
2087,
285,
2969,
1039,
281,
4908,
247,
41427,
8680,
9763,
50273,
20881,
1255,
265,
50275,
249,
253,
3368,
2593,
627,
310,
14999,
5301,
875,
253,
4081,
1332,
285,
5368,
1854,
12544,
3082,
352,
651,
320,
671,
4722,
281,
923,
253,
3045,
5301,
275,
27765,
10375,
407,
14189,
502,
3671,
285,
11454,
502,
3671,
50275,
7152,
33032,
2520,
2929,
29328,
247,
747,
1039,
281,
513,
1854,
1563,
970,
11454,
2990,
1754,
1453,
12865,
522,
43772,
3470,
502,
71,
323,
247,
1677,
4958,
7870,
342,
4951,
1282,
3806,
11865,
253,
4477,
3989,
253,
11454,
502,
71,
281,
10517,
253,
12865,
522,
43772,
2515,
1735,
597,
15313,
253,
16841,
14493,
281,
5416,
253,
502,
71,
9763,
3297,
3588,
10309,
4679,
403,
5196,
281,
34647,
253,
4081,
2746,
84,
1524,
2606,
3045,
20544,
10527,
23632,
323,
253,
502,
71,
5975,
3390,
3879,
285,
41842,
3806,
16841,
3033,
50276,
10984,
1935,
7092,
281,
34647,
253,
4081,
9763,
476,
789,
275,
1524,
2606,
50276,
20881,
1255,
265,
436,
789,
310,
32809,
1580,
627,
403,
2168,
594,
1142,
9380,
275,
970,
48257,
281,
3989,
502,
71,
285,
642,
1666,
25379,
908,
275,
253,
4679,
281,
7568,
752,
310,
253,
5649,
273,
436,
747,
48257,
502,
71,
1332,
285,
597,
760,
897,
436,
502,
71,
9763,
275,
247,
4103,
1698,
39120,
3368,
253,
21999,
1232,
476,
760,
29623,
1561,
247,
1355,
687,
66,
2490,
187,
4118,
18435,
27,
783,
2929,
16633,
327,
1854,
1563,
273,
247,
3216,
4958,
342,
253,
897,
273,
48257,
3169,
502,
71,
352,
10262,
23176,
32367,
275,
10309,
2299,
352,
19756,
16774,
14023,
281,
2720,
2987,
253,
2929,
310,
3839,
973,
15720,
285,
22335,
2590,
533,
253,
10527,
38135,
310,
3710,
347,
253,
2746,
15771,
327,
5884,
14586,
281,
5368,
3082,
30909,
22861,
273,
253,
8453,
310,
3058,
50276,
7265,
5847,
50276,
13382,
2182,
10309,
4679,
50276,
783,
2929,
310,
973,
15720,
50276,
7265,
772,
50276,
77,
471,
273,
5301,
281,
1666,
25379,
275,
253,
4679,
50276,
15870,
10527,
38135,
50276,
5996,
250,
2858,
22559,
5731,
253,
17265,
2715,
273,
253,
2929,
556,
5907,
14023,
281,
1666,
25379,
253,
4477,
671,
2530,
2007,
22909,
5001,
253,
10527,
38135,
352,
310,
1774,
326,
253,
2457,
2715,
273,
253,
2929,
588,
921,
671,
4679,
342,
22627,
1854,
1563,
347,
253,
4477,
5469,
342,
37317,
278,
2967,
83
] |
[ attention_mask column omitted — a list of 1s, one per token ] |
[ labels column omitted — a token-ID list mirroring the input_ids ] |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper considers the problem of maximizing a monotone submodular function under matroid constraints when the value queries have independent random noise drawn from generalized exponential tail distributions the first result is for uniform matroid constraints aka cardinality constraints here the new result improves on existing work in that the cardinality range over which it applies is larger than that of previous work while preserving polynomial dependence of query complexity on the approximation parameter epsilon prior work either applied only to larger cardinality values omegalog log nepsilon2 against omega1epsilon or required exponential dependence of query complexity on epsilon the second result is for general matroid constraints and obtains a constant approximation prior work had not considered general matroid constraints in maximizing submodular functions in the noisy setting the main technique is to use local search where each step swaps a single element prior work for noisy submodular maximization mostly used variants of the greedy algorithm the local search algorithm is an adaptation of prior work by filmus and ward for exact queries and the optimization function used for local search is also a robust version of the function used by fw strengths 1 submodular function maximization is an important classical problem with applications spanning many areas 2 the paper gives the first results for submodular maximization under general matroid constraints in the noisy setting without noise this is a standard setting in which submodular maximization is studied so this is a good research direction in general the approximation constant doesnt appear very impressive but as the first paper considering this problem it provides a good start 3 the adaptation of the local search technique to the noisy setting might be useful in general local search is used in many contexts and constructing a robust version in the presence of noise might have value beyond the results in this paper weaknesses 1 the techniques employed in this paper are not very novel the adaptation of the fw algorithm is fairly straightforward and as such the paper does not contribute much in terms of new techniques 2 the noise model comes across as quite limited and the paper does not include a discussion on it it is true that a couple of previous papers also considered the same noise model but i feel it would be useful to discuss the contexts in which this type of noise distribution is relevant the paper includes a discussion of limitations at the end but it seems largely perfunctory as i mentioned above it would be interesting to have a longer discussion about the noise model with its application scenarios and limitations for instance also are the cardinality bounds in this paper tight if not what are the challenges in removing this constraint altogether discussions of this sort would help make the paper a more engaging read docsepthis paper studies the problem of maximizing a submodular set function when function values are given by a noisy oracle the authors propose 2 novel auxiliary functions which can be queried efficiently in order to improve the algorithms candidate solutions via local search the approach for cardinality constraints is then extended to work with matroid constraints originality significance interesting theoretical contributions which lead to more practical ways to handle noisy function evaluation cites previous related work where appropriate quality smoothing surrogate auxiliary functions appear to be a 
novel elegant approach that leads to exponential improvement in query complexity compared to previous work clarity this paper is generally well organized however i have 2 small suggestions state the note after theorem 33 as a separate corollary showing intermediate steps in the appendix this is important because subsequent theorems appear to include an extra log r term the second paragraph of related work on noisefree optimization with more complex constraints is not relevant to the paper this could be moved to the appendix or removed altogether approximate ratio is used throughout the paper where approximation ratio is more standard this paper describes limitations as opportunities for future work but does not address any potential negative social impact docsepthe paper focuses on the problem of monotone submodular maximization subject to matroid constraints when the function is only accessible through a noisy oracle in particular they provide an algorithm that with high probability outputs a solution with nearoptimal 1 1e oepsilonapproximation guarantee notably they deliver this result with a polynomial computational complexity in 1epsilon while the current works in the literature have an exponential complexity strengths a the paper studies an interesting generic problem with broad applications in the data mining and machine learning areas b the paper is very wellwritten and easy to follow in particular the authors make the transition into their main results smooth by providing sufficient intuitive explanations and motivations moreover they justly credit the related the works and clearly distinguish their contributions from the existing methods or definitions c they provide strong results and considerable improvements over the stateoftheart weaknesses overall i find the results quite strong i just have few minor comments on the presentation of the results which i state under the questions yes
### Summary:
|
overall this paper studies an important problem and obtains strong results. there were some concerns raised by one reviewer regarding the novelty of the techniques and the lack of discussion regarding the noise model; please make sure to add the discussion about the noise model from the rebuttal in the camera-ready version of the paper
|
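The record above reviews a paper on noisy monotone submodular maximization solved by local search with single-element swaps. As an editorial aside, the sketch below is not taken from the reviewed paper; it is a minimal Python illustration of the generic swap-based local search the reviewer describes, assuming a plain cardinality constraint, a toy coverage objective, additive Gaussian oracle noise, and naive averaging of repeated queries as the smoothing surrogate — all of these choices are assumptions made for the example only.

```python
import random

def noisy_value(f, S, sigma=0.05):
    # Noisy oracle: true value plus zero-mean Gaussian noise (an assumed noise
    # model, simpler than the generalized exponential tails in the review).
    return f(S) + random.gauss(0.0, sigma)

def smoothed_value(f, S, repeats=25, sigma=0.05):
    # Crude surrogate: average repeated noisy queries to damp the noise.
    return sum(noisy_value(f, S, sigma) for _ in range(repeats)) / repeats

def local_search_cardinality(f, ground_set, k, sweeps=10):
    # Start from an arbitrary feasible solution of size k, then repeatedly try
    # single-element swaps that improve the smoothed noisy objective.
    S = set(random.sample(sorted(ground_set), k))
    for _ in range(sweeps):
        improved = False
        for out_elem in list(S):
            for in_elem in ground_set - S:
                T = (S - {out_elem}) | {in_elem}
                if smoothed_value(f, T) > smoothed_value(f, S):
                    S, improved = T, True
                    break
            if improved:
                break
        if not improved:
            break
    return S

if __name__ == "__main__":
    # Toy monotone submodular objective: coverage of a small universe.
    sets = {i: set(random.sample(range(50), 8)) for i in range(20)}

    def cover(S):
        covered = set()
        for i in S:
            covered |= sets[i]
        return len(covered)

    print(local_search_cardinality(cover, set(sets), k=5))
```

A real method of the kind the review discusses would replace the fixed repeat count with query budgets and confidence bounds chosen to certify each swap with high probability; the sketch only conveys the swap structure.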
[ input_ids column omitted — token-ID encoding of the review and summary text above ] |
[ attention_mask column omitted — all 1s ] |
[ labels column omitted — token-ID list ] |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the paper presents the languagecomplete arc to study how human language can affect abstraction and reasoning capability it collects the natural program described by human annotators and applies program synthesis techniques to analyze the gap between communicating to humans and machines the motivation is sound and the contribution is solid with the effort of annotating larc and benchmarking the program synthesis approaches strengths 1 the paper is wellmotivated with the goal of studying the gap between communicating natural programs to humans and machines the existing reasoning benchmark shows there still exists a large gap in reasoning and problem solving between them its a critical problem to be addressed 2 the larc dataset construction is by collecting verifiable instructions from a describer and builder which ensures the task finish rate the dataset is valuable for the community to study how natural program boosts the abstraction and reasoning process the linguistic analysis shows how concepts differ in computer and natural programs 3 it uses existing program synthesis techniques to study how to execute the natural programs as humans showing solid qualitative and quantitative findings weaknesses 1 the technical contribution of this paper is weak since it mainly uses previous approaches to analyze the program synthesis results i would like to see a new learning method to tackle this challenge 2 the improvement from 16183 to 22183 is not trivial but still very poor one main reason is the dslbased program synthesis is limited by the scale and range of the dsl the analysis of how the scale and coverage range of dsl affects the performance is required for a better understanding of how to close the gap i like the idea and solid contribution of this paper the main concern i have is the technical contributions please refer to the above for details i would consider raising the score if the concern is properly addressed while i am not totally familiar with all the literature my assessment of the novelty and originality might not be accurate enough docsepthe paper extends the arc dataset by adding humanwritten instructions that they term as natural programs the extended dataset is called larc they draw parallels and cite differences between computer programs and natural programs they draw some linguistic insights based on the natural programs and finally study the performance of a program synthesis model on the larc dataset strengths the paper addresses an important problem creating a benchmark aimed towards creating machine learning models that can help understand and reason concepts adding language annotations which is the way humans communicate to arc is the right direction to progress especially keeping in view the surge of large langauge models for both natural language and code the paper is well written in most parts especially introduction and motivation however there are significant improvements that can be done in terms of improving the clarity see comments below weaknesses even though the motivations are well set i think just adding langauge to an existing dataset might be considered less novel the linguistic analysis is interesting but i would have liked to see the results of more program synthesis systems especially the large models like codex 1 gptneo 2 gptj 3 i feel the addition of language naturally makes the set of tasks in larc more suitable for such language models that makes it essential to have this comparison using a program synthesis system that is not catered towards 
accepted natural langauge instructions the system used in the paper will not lead to significant improvements i felt that the clarity of the paper needs to be improved see concrete suggestions below on a highlevel including more examples will really help in understanding the ideas put forth in the paper suggestions comments 1 an example in each category of the tag will be useful just describing the results with these tags used directly in the sentences seemed a bit abrupt to me 2 more details should be provided about the exact program synthesis model employed 3 i felt the bandit algorithm for data collection was interesting and if more details about it are placed in the main paper it will definitely add much value 4 more description of how psedoannotations are produced along with examples 5 more description of how exactly the distant supervision algorithm works 6 i found the brief section on suggestions quite interesting personally i would like if the authors could expand this section with more observations drawn from other program synthesis systems and evaluation settings 7 some sample cases showing in which cases the program synthesis system succeeds and why will be extremely useful in fact i will say it is needed to understand the role and contributions brought in by adding language it might be possible that the dsl itself is not defined properly or the language annotations or pseudoannotations are not good enough to accomplish the set of reasoning required to solve the particular task basically more analysis of why the systems fail so terribly through sample cases needs to be added 1 httpsopenaicomblogopenaicodex 2 httpsgithubcomhendrycksapps 3 httpsarxivorgabs200903393 the paper studies an important problem and the motivation is well laid however i feel the paper needs some work in terms of improving clarity as well as more extensive evaluation of large language models they also need failure case analysis elucidating some sample cases docsepauthors construct a dataset of natural language programs for solving arc tasks by using a population of mechanical turk workers to validate that a turker can correctly communicate how to solve an arc task to another turk purely via the natural language used the collected dataset is then analysed by the author and evidence is gathered for the richness of the programming primitives implied by natural language the use of refining language and the diversity of terms the authors use this analysis to motivate the need for an improved dsl for solving arc strengths the authors are attempting to tackle an important program in artificial intelligence how to tackle openended puzzle learning communication and resolution past approaches have selected fixed goal posts via dsls implying there exists a known solution to the problem while here the authors build off of arc which has no dsl and thus could prove to be unreachable via any standard dsl fortunately the authors show that in 88 of the cases human descriptions of a solution can correctly communicate a solution to another human this single observation is one of the key achievements of the paper and is interesting enough in that it constructs a new artifact that machines could try to approximate instead of tackling arc directly the author methodology for obtaining the samples is cost efficient and provides good coverage over the dataset the proposed approach is original and could prove valuable to researches in ai communication planning knowledge representation linguistics and meta or multimodal learning 
weaknesses while the contribution of a natural language caption over the arc dataset is compelling is it not yet clear whether this data can prove useful for others downstream the inclusion of a seq2seq model accuracy at predicting these descriptions or some other form of evaluation would help highlight whether these annotations are within reach of current approaches or too diverse or unpredictable to be effectively captured by present conditional language models existence proof of the usability of a description certainly proves that some humans can save a task but does not demonstrate the robustness of a description to all human receivers in this sense it might be useful to either describe this limitation more extensively provide an example or a counterexample of this shortcoming paper presents a useful dataset complementary to the arc effort for understanding the limits of dsls and capabilities of natural language for expressing programs the dataset appears to be adequately constructed however a lack of downstream applications and proof of utility make the effort harder to contextualise and the dataset can be seen as incremental over the original arc effort
### Summary:
|
the paper presents the language-complete abstraction and reasoning corpus (larc), a collection of natural language descriptions by a group of human participants who instruct each other on how to solve tasks in the abstraction and reasoning corpus (arc). overall the reviewers found the larc benchmarks to be well-motivated; however, there were concerns about whether the dataset would prove valuable to downstream tasks. results from additional program synthesis systems like codex and gpt-neo would also make the paper stronger. i agree with these objections and am recommending rejection this time around; however, i encourage the authors to continue pursuing this line of work and resubmit after incorporating the feedback from this round
|
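The larc record above turns on whether a natural-language description (or a synthesized program) verifiably solves an arc-style grid task. As an editorial aside, the snippet below is a hypothetical illustration of that verification step only — it is not code from the paper, the larc benchmark, or any particular program-synthesis system, and the grid representation and function names are assumptions.

```python
from typing import Callable, List, Tuple

Grid = List[List[int]]  # assumed representation: a 2-D list of colour indices

def solves_task(program: Callable[[Grid], Grid],
                examples: List[Tuple[Grid, Grid]]) -> bool:
    # A candidate program (hand-written, synthesized, or carried out by a
    # builder following a describer's instructions) counts as a solution only
    # if it reproduces every demonstration output exactly.
    for grid_in, grid_expected in examples:
        try:
            if program(grid_in) != grid_expected:
                return False
        except Exception:
            return False
    return True

if __name__ == "__main__":
    # Toy task, purely illustrative: "recolour every 1 to 2".
    examples = [([[0, 1], [1, 0]], [[0, 2], [2, 0]])]
    recolour = lambda g: [[2 if c == 1 else c for c in row] for row in g]
    print(solves_task(recolour, examples))  # True
```

In a describer/builder setup like the one the review summarizes, the same check would be applied to the grid the human builder produces rather than to a program's output.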
[ input_ids column omitted — token-ID encoding of the review and summary text above ] |
[ attention_mask column omitted — all 1s ] |
[ labels column omitted — token-ID list ] |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the authors propose an approach for stereo depth estimation that is robust to adversarial attacks due to its use of traditional image features instead of cnnbased features positive the results appear to be quite good the use of nondnn features is quite interesting and the experimental section is reasonably good at demonstrating their value negative while i am not an expert in attacks of stereo networks the way the authors study the brittleness of cnn feature backbones feels unfair they select a neural net attack strategy and find an approach noncnnbased where this attack does not work as well they do not however say if how the method would behave with a noncnnbased attack strategy or indeed one designed for their matcher tables are difficult to read eg tables 1 and 2 show various columns eg epe bad 10 etc without explaining what these metrics represent or what units they use these might be somewhat standard in stereo but nevertheless should be explained with those tables instead of just towards the end of the paper in section 4 the cost aggregation stage does not seem to be described much the authors state that in certain configs their approach is will be unattackable since the gradient flows are completely blocked this should be explained more since id expect differentiable cost aggregation stages to still be attackable the paper is a bit confused about its own objectives is it about adversarial robustness or is it about generalizability the limitations of the approach are not discussed much docsepthe paper considers the problem of adversarial on binocular stereo matching systems it shows that an adversarial attack based on pgd that is photometrically consistent and consistent with stereo constraints significantly affects the performance of learned features for stereo matching to counter the attack the paper proposes to not rely on learned features but rather on handcrafted features here the census transform is used in order to compensate for using weaker features the cost aggregation stage is learned and optionally a network is used to provide higherlevel context the paper shows that this approach has multiple advantages 1 as lowlevel features are fixed and not learned learning focuses on the higherlevel task of identifying valid structures in the cost volume it generalizes better from training purely on synthetic data to being used on real data 2 the proposed approach is more robust against the proposed adversarial attacks in addition when trained with adversarial training the drop in performance on unaltered images is significantly reduced the paper concludes that current networks for learning features for stereo matching do not properly learn to match strengths s1 the paper tackles an interesting problem of practical relevance namely robustness to adversarial attacks furthermore it asks an interesting theoretical questions do matching networks really learn to match and aims to answer this using adversarial attacks s2 the proposed adversarial attack is technically sound and its effectiveness is shown by evaluating multiple stateoftheart baselines on modified images furthermore the paper argues convincingly that the proposed attack is feasible in the real world although the drop in performance is significantly smaller in the scenario considered in the paper s3 the proposed counter to the attack is welldescribed and intuitively meaningful detailed experiments show that it performs well on standard benchmark datasets s4 the paper is overall easy to read and to follow and tells a 
consistent and interesting story weaknesses w1 the paper focuses on a variant of the stereo matching problem where both images are taken at the same time from relatively similar viewpoints eg binocular stereo for selfdriving cars in this setting the assumption of strong photometric consistency between stereo images holds there are other stereo scenarios eg as part of multiview stereo algorithms where there can be strong viewpoint and illumination changes as the images can be taken at different points in time and by different cameras in this scenario this assumption is violated and using handcrafted rather than learned features might significantly affect performance this limitation is currently not discussed w2 some of the results of the paper seem rather predictable to me as a result some of the claims made in the paper seem too strong a i would have been surprised if the proposed approach would not have generalized from synthetic to real data the used features are handcrafted and will thus not be affected by the domain gap learning cost aggregation is a higherlevel task that operates on feature similarities ie an abstract representation learnable cost aggregation methods have been shown to generalize eg seki pollefeys sgmnets semiglobal matching with neural networks cvpr 2017 propose a trainable version of the semiglobal matching framework from 1 and show that it generalizes from synthetic to real data for both learned and handcrafted features b given the sgmnets paper which uses a trainable cost aggregation strategy and has been evaluated with a nonparametric cost volume computed from handcrafted features i think that the statement that this paper proposes to rethink the role of dnns it presents a method that casts stereo matching as a cost aggregation problem solved by training a dnn over a nonparametric cost volume that truly focuses on matching with parametric contextual features is too strong c one central question the paper asks is how well learned stereo matchers actually learn to establish matches based on the adversarial attacks and the limited generalization from synthetic to real data the paper concludes that dnns may not learn to perform matching well in the sense that they should otherwise achieve potentially even better after stereoconstrained perturbations are introduced yet i am not sure how valid this conclusion is savinov et al matching neural paths transfer from recognition to correspondence search nips 2017 show that dnns trained for classification can be used for matching in particular they show interesting qualitative results on stylized versions of the stereo input images ie inputs that are heavily perturbed their results suggest that dnn features can be used to for robust matching d the paper states that 22 studied adversarial patch based attacks in optical flow which is inherently different to attacking stereo matching due to the underlyingly different matching nature of the problems i do not agree with this statement the binocular stereo matching problem considered in this paper where both images are rectified with respect to each other is a special case of the optical flow problem where the flow in the ydirection of the image is 0 as such the relevance of the adversarial attacks from 22 should be discussed in more detail w3 multiple times it is necessary to compare results from tables on different pages which makes is harder than necessary to follow the paper overall this is a solid paper i will consider raising my rating if the rebuttal successfully addresses 
the weaknesses listed above postrebuttal comments the rebuttal successfully addressed my concerns and i thus increase my rating as detailed above under w1 the limitations of the proposed approach to counter the adversarial attack are not fully discussed docsepthe paper addresses robustness to adversarial attacks and generalizability in stereo matching via a network architecture that operates on a cost volume generated by a multiscale census transform ct instead of learned image features this cost volume has superior robustness and generalization properties compared to recent stateoftheart stereo matching networks most likely because it is the input to a network that only aggregates information and is less susceptible to shortcuts experiments on several datasets using three strong baseline algorithms are presented in the paper strengths the proposed approach is successful in both aims it shows robustness to image intensity perturbation attacks as well as strong results in simtoreal transfer the paper provides useful insights on the behavior of modern network architectures their surprising failures in the presence of stereoconstrained attacks and feature extraction and matching rather than cost aggregation being the cause of the vulnerability of dnns applied outside their training domains the stereoconstrained projected gradient descent pgd attack method is an additional contribution but i consider it less important than the above experimental validation is extensive and the proposed network achieves high accuracy on several datasets see the technical comments below however weaknesses an approach for stereo matching relying on handcrafted features aggregated by neural networks has been published by c cai m poggi s mattoccia and p mordohai matchingspace stereo networks for crossdomain generalization 3dv 2020 it relies on multiple handcrafted features including ct and shows good generalization and simtoreal performance recent papers on generalization in stereo matching include the following f zhang x qi r yang v prisacariu b wah and p torr domaininvariant stereo matching networks eccv 2020 zhengfa liang yulan guo yiliu feng wei chen linbo qiao li zhou jianfeng zhang and hengzhu liu stereo matching using multilevel cost volume and multiscale feature constancy tpami 2019 zhelun shen yuchao dai and zhibo rao cfnet cascade and fused cost volume for robust stereo matching cvpr 2021 i find the role and effectiveness of the context extractor perplexing it seems to contradict the main argument of the paper that the feature extractors typically a siamese network used to construct the cost volume cause overfitting presumably by learning nonessential features of the training set the last phrase is mine not taken from the paper i wonder what protects the context extractor which is only applied on the left image from suffering from the same limitation i speculate that such a singleimage network would associate appearance with depth in ways that do not transfer across datasets sections 32 and 44 contain arguments that the advantage of ct is that it is not differentiable and this prevents gradients from propagating through it cleary gradient propagation can be stopped before any operation whether it is differentiable or not considering certain parts of the system unlearnable freezing them during training is the key characteristic technical comments i consider the following comments relatively easy to correct or clarify i will not repeat them in the questions section of the review disparity is typically 
defined as the difference between the left and right xcoordinates ie d xl xr lines 50 and 1 are based on the opposite definition 127 it is not clear to me that not perturbing occluded pixels is the correct approach 133135 this is only true if the patches have constant disparity 209211 are evensized windows also used where is the reference pixel placed within the window table 3 the middlebury dataset does not make a bad30 error available what is actually shown in the last column table 4 the bad30 results on clean images are way better than anything reported in the literature and the official leaderboard several error rates below 1 are shown including a minimum of 022 the top submission to the benchmark is at 149 on unoccluded pixels i am aware that these are results on the training set but the degree of difficulty of the training set is comparable to that of the test set is it possible that the networks have seen these images the same is true for the 100 bad30 error on kitti 2012 reported in table 6 table 8 contains reasonable results it is not surprising that the cost of robustness and generalization is somewhat reduced indomain accuracy there are tradeoffs between generalization and specialization this is a neutral comment placed here to complement the comments above section 5 among other topics discusses physically realizable attacks on stereo matching in the context of autonomous driving i am not convinced that such attacks are feasible over sequences of frames captured by mobile cameras attacks in the form of specular or textureless surfaces may be more dangerous to most current stereo matching systems minor comments 12 due to the data hungry of dnns should be corrected since hungry is not a noun 27 3 the crossproduct symbol should not be used for scalar multiplication 4145 concatenation is one of the options for constructing the cost volume 48 learn to perform matching 51 delete in before regardless 177 the performance drop on clean images seems wrong 393 provide correct publication venue checklist 4b the licenses of the used assets datasets and network implementations are not mentioned limitations are discussed briefly which is sufficient due to the nature of the paper docsepthe paper presented a new design for stereo matching which utilizes dnns to aggregateoptimize nonparametric cost volumes with parametric contextual features it harnesses the best of classic features multiscale census transform and endtoend trainable dnns for adversariallyrobust and crossdomain generalizable stereo matching the proposed method is motivated by the observation that dnnbased stereo matching methods can be deceived by a type of physically realizable attacks that entail stereo constraints in learning the perturbation strengths the paper proposes a new design for stereo matching by utilizing dnns to aggregateoptimize nonparametric cost volumes with parametric contextual features which shows significantly better adversarial robustness and improved crossdomain sim2real generalizability when nofine tuning is used it presents the stereoconstrained projected gradient descent pgd attack method which by design preserves photometric consistency to show the more serious vulnerabilities of stateoftheart dnnbased stereo matching methods weaknesses the cost aggregation problem perspective has been widely exploited the weak performance on the kitti 2015 leaderborder other related work in attacking stereo matching networks openworld stereo video matching with deep rnn eccv 2018 the authors partially addressed the 
limitations more analysis should be provided
### Summary:
|
this paper addresses the problem of robustness in stereomatching it has been reviewed by several knowledgeable reviewers with extensive experience in stereomatching and learning for stereo the majority consensus from the reviews was that the paper will be of interest to the community and should be accepted this metareview agrees and recommends acceptance however as noted by the reviewers there are some issues with the text that need to be fixed eg lack of focus in aspects of the presentation 3tml lacking descriptions of the method and evaluation 3tml lack of discussion of the multiview not time synchronized setting wpxi difficult to compare the results across tables wpxi missing discussion of cai et al 3dv 2020 rugj finally while hfdj was less supportive of the paper in their original review they did not provide references to address their claims that the the cost aggregation problem perspective has been widely exploited furthermore they did not engage in the discussion to further expand on their concerns given this less weight was placed on their comments
|
[ 2929, 1646, 2581, 28826, … (input_ids: token-ID encoding of the Input text above; intermediate values elided) …, 616, 5701, 50276 ] |
[ 1, 1, 1, … (attention_mask: all ones; repeated values elided) …, 1 ] |
[ 2929, 1646, 2581, 28826, … (labels: begins and ends with the same values as the input_ids above; intermediate values elided) …, 616, 5701, 50276 ] |
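For readers who want to see how rows like the ones above fit together, the sketch below reconstructs one row (prompt + review + summary plus the three token columns) with a standard HuggingFace-style tokenizer. The checkpoint name, the maximum length, the exact prompt wording, and the choice to mirror input_ids into labels are assumptions made for illustration, not details taken from this dataset.

```python
# Minimal sketch of how one row of this review -> summary dataset could be built.
# Assumptions (not taken from the dataset itself): the tokenizer checkpoint,
# the max length, and the decision to copy input_ids into labels unmasked.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")  # hypothetical tokenizer choice

PROMPT = (
    "Below is given a review of a research paper from a conference journal. "
    "Please write a summary of the review.\n### Review:\n"
)

def build_row(review_text: str, summary_text: str, max_length: int = 2048) -> dict:
    """Turn a (review, summary) pair into the five columns shown in this dump."""
    full_text = PROMPT + review_text + "\n### Summary:\n" + summary_text
    enc = tokenizer(full_text, truncation=True, max_length=max_length)
    return {
        "Input": PROMPT + review_text,
        "Output": summary_text,
        "input_ids": enc["input_ids"],
        "attention_mask": enc["attention_mask"],  # all ones when no padding is added
        "labels": list(enc["input_ids"]),         # labels mirror input_ids in this dump
    }
```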
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper addresses the problem of speeding up adaptation training by introducing an adversarial training algorithm that is compatible with large batch training previously while being able to significantly speed up adaptation training large batch training has been shown to harm the models ability to generalize adversarial training algorithm helps but the inner optimization steps brings a major computation overhead over other optimization algorithms motivated by these insights this paper proposes scala which 1 sticks to the adversarial solution but reduces gradient computation steps to 1 and 2 introduces a delay in perturbation injection in order to bring down the computation overhead experiments show that the proposed training regime is faster than previouslyproposed solutions like freelb and lamb while also improving the performance of the model which indicates stronger generalization capabilities strengths the research problem is wellmotivated by the related work survey and the preliminary analyses the presentation of the solution is mostly clear with the benefit of each part being clearly stated experiments show clear benefit over existing methods in terms of adaptation training speed while also showing strong performance weaknesses it is unclear to me if any of the modification here is specifically targeted to speedup the adaptation training process as opposed to pretraining there are a few nonideal presentation choices see below some more detailed comments eqn 2 the choice of using x and y as model parameters and perturbed embeddings is a very confusing one especially since they were used as instances from the training data in 1 l130 it would be nice to introduce what exactly with and without communication means or provide some citations for it i have trouble understanding how would the noncommunication case work algorithm 1 l9 i dont understand prodomega means here adequately addressed to the best of my knowledge docsepthis paper proposes a scala method to improve the computational efficiency of the finetuning of pretrained language models while preserving generalization scala uses a large batch size to lower the total training iterations and employs adversarial training to mitigate the generalization drop brought by using a large batch size this idea is quite counterintuitive to me as it aims to lower the computational overhead but introduces a more timeconsuming adversarial training part however by using techniques such as oneshot adversarial examples and delayed perturbation injection scala is able to accelerate the adversarial training part and thus lower the overall computational overhead which is empirically justified by experiments over a range of natural language understanding tasks strengths 1 the paper is wellwritten and the method is clearly presented 2 the empirical result looks strong weakness 1 the proposed method looks like a naive combination of two existing techniques and thus to some extent incremental 2 it would help if the authors can include a more indepth analysis of why large batch size does harm to generalization and how this issue can be addressed or even best addressed by adversarial training the current version focuses more on empirical validation after rebuttal thank the authors for the response after reading all the reviews and responses i decided to keep my rating unchanged i do not see any potential negative social impact docsepthis paper aims to improve the efficiency of finetuning pretrained large transformers through adversarial training it resorts 
to large minibatch training and proposes several methods to reduce the computational overhead and address the optimization challenges of large minibatch training the paper provides an extensive empirical analysis confirming similar challenges raised by recent works the proposed scala algorithm combines several optimization tricks such as delayed and oneshot perturbations on the blue benchmark scala substantially reduces the overhead of finetuning bert and roberta without hurting accuracy convergence analysis of the algorithm is provided the ablation study is sound the paper reads like a collection of techniques mostly proposed by previous works for example as the paper points out several key ideas are already established in computer vision applications and now are adopted to textual data by this work to be clear i have no doubt that such efforts are of great use in realworld applications of large models just that i am not sure this paper is a right fit for the neurips community after author response with the additional results and analysis as well as the experiments planned out the paper is definitely in a better shape i have increased my rating to 6 strengths addressing the challenges of large minibatch training is important and timely technical aspects of the paper are reasonably clear the proposed method is simple and effective weaknesses the paper heavily builds on the findings of existing works it is hard to argue for technical novelty i think the paper could be better organized the current presentation has several separate ideas floating around lacking a clear central theme besides i find the detailed discussion of previous work in the middle of the method section very distractive the paper should include experiments on other pretrained transformers as well as other datasets the paper does not explicitly discuss the limitations and potential negative societal impact
### Summary:
|
this paper presents scala a method to improve the efficiency of finetuning of a pretrained language model using adversarial training experiments confirm that scala allows faster finetuning compared to standard approaches without reducing accuracy i think this is a nice approach that can have a lot of impact for those who would like to finetune large language models however the reviewers have concerns regarding novelty of the paper and would like to see the proposed method to be tested for other larger scale pretrained models as well as more analysis i encourage the authors to incorporate these changes and resubmit to another conference
|
[ 30003, 310, 1677, 2278, … (input_ids: token-ID encoding of the Input text above; intermediate values elided) …, 281, 1529, 8059 ] |
Below is given review of a research paper from cnoference journal. Please write a summary the review.
### Review:
the paper provides two new generalization bounds for nonlinear metric learning with deep neural networks by extending results of bartlett et al 2017 to the metric learning setting the two bounds have been called the sparse and nonsparse bounds and differ in the norm used for the last layer experiments are performed where it is shown that either bound may dominate on real datasets strengths the paper exploits the structure of metric learning to extend the uniform generalization bound of bartlett et al 2017 and a new 2infinity bound for the metric learning setting experiments show both bounds are useful depending on datatraining weaknesses theorem 1 seems like straightforward application of bartlett et al 2017 novel technical insights for the new results eg theorem 2 could be more clearly stated eg the novel technical insight for theorem 2 is very briefly stated eg it would be nice to have informal or formal statement of lemma 9 in main paper the sparse and nonsparse regimes are defined based on when one bound dominates the other but it is less clear what data or training properties can result in one or the other eg what regime is expected when mnist is trained with regularization or 20newsgroups without or with less regularization more thorough experiments say with more datasets and degrees of regularization can potentially give interesting insights overall the paper gives interesting generalization bounds for the metric learning setting but does not seem ready for publication docsepthis paper tries to provide uniform guarantees for the dnn type metric embeddings strengths it is indeed a promising research direction to regard neural networks as a special nonlinear metric embedding weakness 1 this paper only uses metric embedding to tell a story for dnn models and does not provide the specific relationship between metric learning and dnns for example whether the feature transformation obtained by dnn meets the definition of metric or part of the definition and whether the perspective of metric embedding can bring new inspiration to the theory of dnns 2 the metric learning theory in this paper basically comes from the generalization theory of neural networks bartlett et al 2017 compared with the previous theoretical results the metric perspective analysis proposed in this paper does not give better results from the existing content of this paper the part of metric learning does not seem to work in a word i think this paper does not provide a new theoretical guarantee for nonlinear metric learning and its results are basically the same as the existing generalization theory of dnns docsepthe authors consider the setting of metric learning they prove bounds on the radermacher complexity of embeddingscomposedwithdistancefunctions for a particular architecture of neural network with different results for the dense and sparse regimes caveats i am not familiar with the literature in this area so cannot determine the novelty of the results or whether there are missing citations i did not read the proofs in detail so cannot vouch for the correctness of the mathematical results review in general i found this paper quite difficult to read this is in part because i am not an expert in the area of the paper but also because it is somewhat confusingly organised to be explicit after a brief introduction the paper states in section 11 the bounds which are the main results only partially defining the mathematical terms being used and without context explaining why these are interesting then in section 12 
there is a brief discussion of existing literature whose methods they build on in section 2 metric learning in general is once more discussed and in section 3 other literature is again discussed much of the crucial mathematical notation used at the beginning of the paper is only defined in section 4 where the results are formally stated i think the paper would benefit significantly from being reorganised i suspect that this could lead to some space saving as there is currently some repetition on the technical side it is not easy to pick apart the exact contributions of the paper i am not an expert in this area so anyway wouldnt be fully able to evaluate the novelty of the results for instance the results are all about proving bounds on radermacher complexity why is this actually interesting it is never stated explicitly in the paper the bounds from bartlett et al 2017 are mentioned multiple times what exactly is the difference between those bounds and the ones in this paper what is the connection between these results and the real world what are the limitations my review should be taken with a pinch of salt because i am not an expert in this area but in my opinion the paper is not ready for publication in its current form other detailed comments abstract i wasnt familiar with 21 norms and 2 infty norms if other reviewers also complain of this could you consider adding a couple of words to give more context maybe even just 21 matrix norm or something page 1 last paragraph i found this sentence a bit confusing consider rephrasing difference between the expected value of the loss and the average value of the loss on the train set did you mean expected value of the loss wrt the unknown population distribution vs empirical average over the samples in the train set 20newsgroups could you cite the dataset here on its first mentioning in the main text page 2 after eq 2 given that the main contribution of the paper is proving upper bounds on the radermacher complexity of this function class i think it wouldnt be a bad idea to talk in a little more detail about the importance of radermacher complexity just after equation 3 i think can be helpful to readers to connect the assumptions made with real settings could you maybe add a note saying which types of network architectures this framework does and does not encompass without thinking too hard i guess it does include convolutions and mlps without biases but not resnets and does include relus sigmoids are there any commonly used activations that are not lipschitz can biases somehow be included wlog before eq 4 could you define the spectraloperator norm in eq 4 i found this equation quite confusing for a while and thought there was a mistake with the subscript is it would probably make it a lot clearer if you define mathcalai i guess this should be the set of all ai and then make it supai in mathcalai eq 5 what does the o with a line on top mean could you please define it maybe this is standard in this part of the literature and im just unfamiliar with it here b is an upper bound on inputs im not sure what this means also by inputs do you mean features i presume you mean the norm of the features but please be explicit page 4 eq 12 as someone not so familiar with this area i wondered why the margin constants are called s and d page 6 can the assumption that phi00 be made wlog if not what generality is lost equation 16 in general i found the organisation of this paper a bit confusing and this is a good example why does this definition not appear 
somewhere near the beginning of the paper page 7 relu relu page 8 recall that the notion of dense in this paper refers to the specific condition that i think this is the first time that the definition is actually given in this way i think it would be clearer to properly define it the first time you use it on page 3 something has gone wrong with the formatting of citations for mnist and 20newsgroups datasets could you move the citations to the first time you mention the datasets paper is not fit for publication in its current form it should be structured more coherently and the results discussed in more context docsepin this paper the authors look at the rademacher complexity of the family of euclidean metrics learned on a data set via an l layer network where the activations functions are lipschitz the idea behind the proof is to use the bounds for epsilon net for the embedding network from barlett foster and telgarsky 2017 this net then provides a net for space of metrics with only requiring a change in the constants using this net the authors then use standard arguments to bound the rademacher complexity specifically for the metric learning setting they show that this bound can be improved strength the authors show that for the metric learning setting the bounds from bartlett foster and telgarsky can be improved by checking if the last layer of the network is dense the authors then show that this setting where the last layer is dense occurs in practice the authors show that their bound matches known bounds for the linear case weakness while the bound is an improvement i do not think the improvement is significant this is my main concern see the discussion in the questions section i would have liked more discussion on how the bounds translate to generalization error bounds specifically are they meaningful empirically questions 1 i am not sure how the bounds are dimension free the depth l explicitly occurs in the bound we have products and sums of l terms second as we increase the width unless we decrease the magnitude of each entries the matrix norms will increase as well hence both factors the width and depth play a factor in the bound 2 how big do we expect the matrix norms to be because that determines that size of the bound at initialization if we use something like lecun initialization we expect each column to have norm 1 due to the normalization and so we expect a21 sim k then from your lemma we expect aop sim sqrtk then if we have a square network so ki are all equal then sumi1l fracai2123aiop23 sim lk13 so ignoring the log factors assuming 1lipschitz activation maps we get that the bound from theorem 1 looks like fracb2sqrtnl32kl12 so first this very explicit depends on k and l second an improvement of sqrtk is an improvement however if l is large i dont think it is significant okay so this was at initialization but maybe during training the norms change significantly do the authors know if this occurs in practice my concern is if we are in the large width limit then the results from neural tangent kernel theory will tells us that training didnt change the norms 3 i do not fully follow the discussion about the quadratic dependence specifically are the authors claiming that the complexity must be lower bounded by pii aiop2 if so then the improvement in the bound is significant overall the paper presents a nice adaptation of the bounds from bartlett foster and telgarsky for the problem of metric learning i think understanding metric learning is an important problem in machine 
learning however i am not convinced that the improvement in the bounds is significant though my opinion on this can be changed
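The scaling argument near the end of the last review is hard to follow with the LaTeX stripped, so here is one plausible reconstruction of the inline expressions in standard notation. It assumes square width-k layers, depth l, weight matrices A_i, lecun-style initialization and 1-lipschitz activations, as in the reviewer's own setup; it is a reading of the reviewer's estimate, not a statement of the paper's theorem.

```latex
% Reviewer's back-of-the-envelope estimate at initialization (reconstructed):
\|A_i\|_{2,1} \sim k, \qquad
\|A_i\|_{\mathrm{op}} \sim \sqrt{k}, \qquad
\sum_{i=1}^{l} \frac{\|A_i\|_{2,1}^{2/3}}{\|A_i\|_{\mathrm{op}}^{2/3}} \sim l\,k^{1/3},

% so that, ignoring log factors, a Theorem-1-style bound scales as
\frac{b^{2}}{\sqrt{n}}
  \Big(\prod_{i=1}^{l}\|A_i\|_{\mathrm{op}}\Big)
  \Big(\sum_{i=1}^{l}\frac{\|A_i\|_{2,1}^{2/3}}{\|A_i\|_{\mathrm{op}}^{2/3}}\Big)^{3/2}
  \;\sim\; \frac{b^{2}}{\sqrt{n}}\; l^{3/2}\, k^{(l+1)/2},

% and the quadratic dependence raised in question 3 refers to a product of the form
\prod_{i}\|A_i\|_{\mathrm{op}}^{2}.
```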
### Summary:
|
the paper provides two new generalization bounds for nonlinear metric learning with deep neural networks by extending results of bartlett et al 2017 to the metric learning setting the main contribution of the paper is by extending the techniques of bartlett et al from a classification setting to the metric learning setting which has very different objectives and consider two regimes in the first regime the techniques are fairly similar but the second regime is more novel however the current version of the paper does not highlight the similarity and differences between the results and techniques with bartlett et al 2017 it also does not give sufficient intuition on how the metric learning setting is fundamentally different from the classification setting and how the paper leverage the difference to get improved bounds all the reviewers had some confusions to different degrees and the paper would be much stronger if it can explain the intuition and make more explicit comparisons
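Several of the reviews ask why bounding the Rademacher complexity matters at all; for reference, the standard textbook link to generalization error is sketched below (generic constants, stated without the paper's specific margin terms).

```latex
% Empirical Rademacher complexity of a function class F on a sample S = (x_1,...,x_n):
\hat{\mathfrak{R}}_S(\mathcal{F}) =
  \mathbb{E}_{\sigma}\Big[\,\sup_{f \in \mathcal{F}}
  \frac{1}{n}\sum_{i=1}^{n} \sigma_i f(x_i)\Big],
  \qquad \sigma_i \ \text{i.i.d. uniform on } \{-1,+1\};

% for losses bounded in [0,1], with probability at least 1-\delta, every f in F satisfies
\mathbb{E}[f(x)] \;\le\; \frac{1}{n}\sum_{i=1}^{n} f(x_i)
  \;+\; 2\,\hat{\mathfrak{R}}_S(\mathcal{F})
  \;+\; 3\sqrt{\frac{\ln(2/\delta)}{2n}}.
```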
|
[ ...integer token id sequence omitted... ] |
Below is given review of a research paper from cnoference journal. Please write a summary the review.
### Review:
the paper targets solving a challenging and interesting task in medical image analysis. the method was tested on multiple datasets and shows a decent improvement over ae and vae methods by a big margin.

1 the novelty of this submission is limited; it is an application of the existing works vqvae and transformers. 2 the paper didn't mention or compare to the state-of-the-art work in unsupervised anomaly detection, ie anogan and its family like f-anogan and vae-anogan (thomas schlegl et al, unsupervised anomaly detection with generative adversarial networks to guide marker discovery, ipmi 2017).

docsep

playing around the lower dimensional latent space is interesting and shows potential for further discussion, and its properties make tools designed for sequence data (transformer/lstm) applicable to high dimensional data. the illustration of the paper is clear and professional. personally, i am curious about how the latent codebook e1, e2, ..., ek is set up, which is not mentioned in the paper. secondly, in the evaluation part the real lesion segmentation dice score is poor for all baseline methods, with a lot of false positives; skull stripping and more advanced binarization methods might be able to help with the performance.

docsep

strengths: the idea of exploiting a resampling mask appears original and shows an improvement over simple autoencoder approaches. unsupervision offers practical advantages. use of ensembles for added robustness is welcome.

weaknesses: motivation of transformers and performers could be strengthened; as is, more details could be useful, and this may further highlight the differentiation of the proposed method from the vqvae. results from a vanilla vqvae appear very low when compared to the proposed enhancement of the method; this should be further verified.

docsep

1 the proposed approach utilizes the power of the transformer model, which is now state of the art in language modeling, for static neuroimaging data. transformers have seen limited adoption so far in the medical imaging domain, and given the success of these models in other domains their application in medical imaging should be encouraged. 2 the authors circumvent the computational difficulty of using transformer models by applying the vqvae first to reduce the input dimension and then also adopting performers, a new model with an efficient implementation of the attention mechanism. 3 the experimental validation is quite extensive, with 3 different experiments (2 on synthetic data, 1 on real data) using very large datasets (10000 images for synthetic training, 15000 from uk biobank for the real data experiments), and included evaluation on multiple datasets with actual brain abnormalities. for each of these experiments the authors reported substantial increases in anomaly detection/segmentation performance compared to recently published baseline methods. 4 the paper is very well written and easy to follow.

1 the evaluation of segmentation is measured using the highest-achievable dice score, where a search through residual thresholds was done and the best possible result was reported. i understand this is done for all the methods and thus is fair in that sense, but it would be nice to know what the accuracy would be like under normal use cases. given the large amount of data, i'm wondering why a validation set was not used to choose a threshold; also, one could maybe have used the auc of segmentation, like in the anomaly detection experiment, so that it is a threshold-free way of comparing the models. 2 i am curious as to whether the authors can explain why the different orderings may have an effect on the result, when the advantage of the transformer is that it is supposed to be able to learn complex dependencies across any range of sequences. 3 this is not a weakness per se but a thought: the anomaly detection is done in latent space using the likelihood of the latent encoding, while the anomaly segmentation is done using comparison in original image space. i'm wondering if the detection of anomalous regions could be done in the latent space first and then transformed back to image space, or whether the low dimensional encoding is prohibitive and loses spatial resolution.
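for concreteness, a minimal sketch of the threshold-free evaluation the reviewer suggests, assuming pixel-wise residual maps and binary lesion masks are available; the function name, array shapes, and the toy data are hypothetical, not from the paper:

```python
import numpy as np
from sklearn.metrics import roc_auc_score

def residual_auc(residuals, lesion_masks):
    """Threshold-free comparison of anomaly-segmentation quality.

    residuals: array of shape (n_images, H, W), absolute reconstruction error.
    lesion_masks: array of the same shape, binary ground-truth lesion labels.
    Returns the pixel-wise ROC AUC, so no residual threshold has to be chosen.
    """
    y_true = lesion_masks.reshape(-1).astype(int)
    y_score = residuals.reshape(-1)
    return roc_auc_score(y_true, y_score)

# hypothetical usage with random data, just to show the call signature
rng = np.random.default_rng(0)
res = rng.random((4, 64, 64))
gt = (rng.random((4, 64, 64)) > 0.95).astype(int)
print(residual_auc(res, gt))
```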
### Summary:
|
i share the positive views of the reviewers about this paper. i was especially pleased to see the improvements made to the paper during the rebuttal phase, such as the addition of the auc experiment.
|
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the paper proposes a method to determine the number of clusters k to use with kmeans. the method uses a policy-gradient-inspired method without states to predict the number of clusters to use. the idea of the paper and the presented results are promising and indicate that the idea has merit; however, there are several concerns with the paper as it stands.

the underlying assumption of the paper appears to be that there is a single correct number of clusters and that the used metric is capable of reliably telling good from bad clustering solutions apart. another aspect that is entirely ignored in the discussion and description of the method is that the training of the policy is performed in batches. this means that unless the data in each batch is well distributed over all classes present in the data, the metric would seem to be different for different batches, leading to an optimization problem with a moving target. from the text it is not clear to me how this challenge is addressed by the proposed method. in practice it is impossible to guarantee either balancedness or stratification of the data presented to an unsupervised algorithm, so it would seem unreasonable to require this. this is a crucial challenge for the proposed method which is sadly not addressed at all.

from a related work point of view, the paper focuses only on kmeans and ignores other clustering methods which determine the number of clusters on their own, eg affinity propagation or lda. while kmeans and spectral clustering are widely used, it would be good to include and compare to other methods that already are capable of determining the number of clusters directly from data. furthermore, the paper does not speak to or compare with the well established methods to determine k for kmeans, such as the bic or aic.

another aspect that the experiments don't really make clear is what the advantage of the proposed method is compared to simply increasing k, using the proposed metric for each value, and picking the best k, as was done for the baseline method. obviously there is a cost involved in running kmeans repeatedly, but training the policy is not free either, and guaranteeing convergence of a deep network to a reasonable solution would seem more challenging than just computing the metric value repeatedly. a discussion of this would be greatly appreciated.

the description of the method is at times a bit hard to follow. this is partly due to the paper intermixing description of the method and background information; an example of this is 3.1, where rl without states, with some variation on mdps, is introduced intermingled with the actual setup used for the proposed method. it would have been easier to read if first the entire mdp and bandit setting was presented before describing how this is used in the proposed method. certain aspects also never become clear: it appears that the network outputs a high-dimensional feature vector which is used to determine k, but figure 1 makes this output appear to be a scalar while the text makes it sound like a vector or distribution. overall this adds to the challenge that, as it stands, it would be hard to implement the proposed method; the general idea is conveyed, but many of the nuanced details remain unclear.

docsep

the paper uses policy gradients in a bandit setting to learn the optimal number of clusters k in kmeans clustering based on the silhouette score. finding the k that leads to the highest silhouette score is a more specific problem than what the paper title promises. the approach is well described and supported by experiments on simulated and real-world data. phrasing the task at hand as a hyperparameter optimization problem, it could be tackled with a variety of well known tools, eg gaussian process based hpo, grid search, random sampling, etc; the authors need to compare the policy gradient method to these existing approaches.

detailed comments: k values for competitors are not chosen properly. clearly written and presented. is the generation of synthetic data noise-free? what is the y-axis in fig 2? the phenomenon is called the curse of dimensionality, maybe better than "the underlying phenomenon". in eq 6, argmax should be formatted as \text{argmax}. please check spacing and parentheses around citations. "our approach can predict the number of clusters in most settings with an insignificant error": insignificant is vacuous here. eq 8 is not correct.

docsep

the reviewed paper presents a completely unsupervised framework, metak, for predicting the number of clusters. the approach advocated in the paper comprises two main parts: an autoencoder for feature extraction and a multilayer perceptron (mlp) for predicting the number of clusters. the autoencoder is used, if necessary, to decrease the dimensionality of the input data; the mlp is trained using a policy gradient optimization schema to predict the best (according to silhouette score) number of clusters k in the given dataset. overall, the authors show that their approach achieves near-optimum results on a number of synthetic datasets as well as on two well-known computer vision datasets, mnist and fmnist.

strong points: overall the paper is well written; the reading flow is smooth and clear without major disturbances, text and figures alike; especially figure 1 greatly facilitates the understanding of the paper. the review of the related literature seems to be very thorough and lists most of the relevant papers in the field; this section helps to put the current work into context and explain the limitations of approaches proposed by past research.

weak points, as well as reasons for the score: the main problem with the paper seems to be the apparent discrepancy between the performance and complexity of the proposed approach (metak) and the baseline method (silhouette score). while both methods are completely unsupervised, the former is a complex composite pipeline which includes two neural networks, one of which is trained using a reinforcement learning framework; the latter is a simple formula that can be calculated for as many clusters as necessary in a relatively short time. both methods, according to the authors, perform on par, with the baseline being marginally better on synthetic data. moreover, according to the paper, the metak framework is trained to find the best number of clusters by maximizing the silhouette score in the first place. unfortunately, the authors largely omitted the question of why their approach should be preferred over the baseline. only once, when commenting on results of the comparison with the baseline, did the authors note "although the baseline approach achieves similar or better results, it is not feasible to use this method when the range of possible ks is broad", not disclosing details as to what exactly about a broad range of ks makes the baseline method infeasible and why metak must be preferred in such circumstances. overall, the benefit of using a more complex method such as metak should be clearly stated in the paper and also extensively experimentally verified.

the lack of experimentation is the second most important concern and also a reason for the current score. metak has been used on several datasets, including a number of synthetic datasets generated using the sklearn package and also the well-known mnist and fmnist. while experiments on synthetic data are a great way to confirm the initial hypothesis about the model, superior performance on real-world data is what can be of interest to the community. although mnist and fmnist are good starting points, a lot more datasets exist where the number of clusters (ie classes) is known, eg cifar10, cifar100, imagenet, etc. the same can be said about the competitors, as currently only one approach (besides the baseline) is compared to metak. also, it is not made clear to the readers why policy gradient optimization was used for training the controller network rather than, for example, common sgd with a custom loss based on the silhouette score; the rl training in such circumstances, and without proper explanations, seems to be an excessive measure.

the paper can be better structured. section 3 might have been called "methods", where the authors could have described not only their proposed solution but also the competitor approaches that they decided to compare with. it might be a good idea to move some part of the policy gradient optimization discussion to supplementary materials, leaving more room for the experiments and results section. "experiments" might have been split into "experiments" and "results", with experiments focusing on the performed experiments and their setup, while results remain more focused on the obtained outcomes and relevant interpretation. large parts of the introduction are overlapping with or repeated in related work; it is not necessary to discuss relevant literature in the introduction, and this part should be moved to related work completely. on a number of occasions in the abstract, introduction, and related work, the authors point out a close relationship between their approach and the metalearning (or learning to learn) concept, hence the name of the approach, metak; however, nowhere in the paper do the authors seem to clarify which part of their method performs metalearning. while it might be understood that the authors refer to the fact that their approach trains the controller network using the unsupervised signal from the silhouette score, it might, however, be a good idea to make this connection explicit.

minor comments: training curves for the policy gradient optimization presented in section 4.4 are left with no interpretation, and to the reader it remains unclear why they were presented in the final section in the first place. the caption of figure 1 should ideally also explain what the x, x, z and other variables mean. the third item on the list of contributions interprets experimental results and can hardly be considered as a contribution of the paper. some of the terms related to reinforcement learning, such as environment transitions or finite-horizon undiscounted return, must be explained before use. figure 2 could have been made more clear by adding vertical lines that would correspond to the true value of the number of clusters. all acronyms, such as mlp, should be defined before being used.
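for reference, a minimal sketch of the silhouette-score baseline the reviewers compare the learned controller against (sweep k, fit kmeans, keep the best-scoring k); the function name, the k range, and the toy data are hypothetical:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

def silhouette_baseline(X, k_min=2, k_max=10, random_state=0):
    """Sweep k, fit k-means for each value, and return the k with the
    highest silhouette score (the simple baseline discussed above)."""
    best_k, best_score = None, -np.inf
    for k in range(k_min, k_max + 1):
        labels = KMeans(n_clusters=k, n_init=10,
                        random_state=random_state).fit_predict(X)
        score = silhouette_score(X, labels)
        if score > best_score:
            best_k, best_score = k, score
    return best_k, best_score

# hypothetical usage on random data, just to show the call
X = np.random.default_rng(0).normal(size=(200, 5))
print(silhouette_baseline(X))
```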
### Summary:
|
the reviewers were unanimous that this submission is not ready for publication at iclr. concerns were raised about clarity of the exposition as well as a lack of sufficient experiments comparing to related work.
|
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the paper presents a new deep architecture for segmentation of time series ie to find the subsequences inside of a time series corresponding to different classes and to determine the bounds of those subsequences the proposed architecture includes two core components a a kind of lstm with skip connections to deal with the multiscale problem of time series b an encoderdecoder module based on cnn and resnet the outputs of the two components are then used in a convolutional layer to provide a stepwise classification at each time step the approach is evaluated on two classical datasets wrt 3 baselines pros the paper addresses an important problem of time series ie accurate segmentation of a time series the philosophy of the presented approach stepwise classification seems promising using both lstm and cnn to treat the multiscale challenge of time series is interesting especially how it is done in the paper with two separate modules the sota section is accurate and easy to read having a single architecture able to deal with fast and low changing labels is clearly an important outcome of the paper neutral while the architecture is novel to the best of my knowledge most of the used modules are adapted from existing literature cons the technical section is very hard to read the appendixes are not used at the best to explain deeper the architecture the main problem is that there is no explanation of the rolethe idea behind each module what can achieve the msslstm that can not achieve the encoderdecoder why use amsp 1ddsresnet looking especially at the ablation study i wonder if the two modules are really useful and i can not judge because the evaluation is not sufficiently detailed see below the experimental part is insufficient to judge the relevance of the proposed architecture evaluating more datasets will be useful to assess the stability of the proposed approach wrt the sota as it is the results are very close and it is hard to assess the strength and weakness of the proposed approach wrt sota the core thesis of the authors is to provide a model able to do stepwise segmentation and to retrieve accurately the bounds of each subsequence unfortunately all results concern aggregated measures over the whole sequences like accuracy or fscore it seems very important to have some results on the bounds found by the model compared to sota maybe something like a deltatime between the ground truth and the inferred timestep the datasets are not well detailed for instance what is the proportion of each class the paper said that the problem is unbalanced but no further details the results are not well detailed for the ablation study only one single aggregated result is reported it is impossible to understand the real role of each component there is no indication about the training time and the inference time of each model and as the results are very close it is important to know the computational cost of each model the approach seems interesting and promising but the technical part is too hard to follow and requires more explanation on the reason for the use of each module the experimental part is incomplete missing at least an important experiment to show how well the proposed model answer to the problem of the accurate detection of the time series segments and details on results to understand the ablation study docsepthe paper presents a stepwise segmentation for time series data namely segtime contrary to the sliding window approach segtime takes the whole sequence as input and process it in two separate 
modules the msslstm network and the 1d encoderdecoder network outputs from the two separate networks are concatenated and taken as the input to the final convolutional layer to produce the final output which has class labels for each timestep segment this papers contributions can be summarized as follows 1 by predicting in a step level rather than a sliding window it works well on both fast and slowchanging labels 2 high parameter efficiency is achieved with depthwise separable convolution atrous convolution and skiplstms 3 longterm dependency is captured using msslstm and 1d convolutional layers pros compared to the sliding window approach the proposed segtime method can take the whole input sequence at once this can increase the possibility of capturing longterm dependencies the model combines dilated convolution and skiplstm for prediction which results in better prediction results compared to the case where only a single module is used cons the technical novelty of this paper is a bit limited the proposed method is based on the 1d version of deeplabv3 3 with an additional msslstm module but there are already other timeseries segment approaches that are based on deep convolution layers 1 2 moreover the effect of msslstm is not thoroughly analyzed from the experiment as segtime has a limited effect on both datasets compared to the model without msslstm segtime although the authors provide an ablation study table 4 for the effect of msslstm the results presented in the previous experiments tables 2 and 3 show minor differences between segtime and segtime the authors need to provide clarification for the ablation study to prove the effect of msslstm which is the main contribution of this paper there is insufficient evidence of the effect of considering longterm dependencies mainly msslstm and 1ddsresnet take longterm dependencies into account but the effect on the final prediction is not properly evaluated to consider this the authors may report the performance according to different input lengths furthermore empirical results on the reduction of computation cost need to be provided the authors argue that the model achieves computational efficiency by reducing parameters and computations however the paper does not provide appropriate experiments such as inference time for clarity i recommend the authors to correct the minor typos and grammar issues in the paper 1 francisco javier ordonez and daniel roggen deep convolutional and lstm recurrent neural networks for multimodal wearable activity recognition sensors 161115 2016 2 olaf ronneberger philipp fischer and thomas brox unet convolutional networks for biomedical image segmentation in international conference on medical image computing and computerassisted intervention pp 234241 springer 2015 3 liangchieh chen yukun zhu george papandreou florian schroff and hartwig adam encoderdecoder with atrous separable convolution for semantic image segmentation in proceedings of the european conference on computer vision eccv pp 801818 2018 the segtime method presented in the paper might be effective as the model takes the whole input sequence at once rather than a sliding window approach however this paper needs further experiments and evidence to properly support the authors claims moreover technical novelty is a bit limited therefore my evaluation of the paper is marginally below the acceptance threshold if all the issues mentioned are fully addressed i may reconsider my assessment of the paper docsepthis paper focuses on timeseries ts 
segmentation they claim that in the typical approach for this problem where you is to apply a model eg a temporal conv net over fixed windows in time in sliding window fashion it is challenging to predict precise breakpoints especially when the labels change frequently relative to the sampling rate of the input datasensors they also claim that these approaches ignore longterm dependences they introduce a network which they obviates the need for sliding windows and can precisely find breakpoints their claimed contributions are a conceptual framework for ts segmentation an architecture for solving ts segmentation problems an adaptation of deeplabv3 for ts segmentation problems state of the art results this paper brings up many interesting questions and does interesting analyses however i have concerns primarily from the broad conceptual claims and missing background from literature in this area a core conceptual insight in the paper is described in contribution 1 there is a claim that most ts segmentation methods operate as divideclassifyconcatenate however this isnt true most current papers in this area use temporal conversion net architectures which as the paper also mentions typically operate on a sliding window approach much of this work until a couple years ago was using lstms conditional random fields hmms or other temporal models which also did not operate under divideclassifyconcatenate the conceptual framework used by this paper is no different than the union of tcn an lstmbased approaches which has also been widely documented in the literature eg in the speech community there is also a claim that other works havent looked at labels operating over multiple timescales this is reiterated on page 3 no prior work has investigated time series segmentation in two scales labels of high frequency and low frequency change few works are dedicated a precise segmentation with steplevel accuracy these statements highlight that this paper is missing a large number of references from the computer vision and ml communities often under the guise of finegrained action recognition working to overcome many of these same problems the mstcn work below for example specifically addresses multiscale issues like this some good references i recommend doing a search for more in this direction lea et al temporal convolutional networks for action segmentation and detection cvpr 2017 bai et al an empirical evaluation of generic convolutional and recurrent networks for sequence modeling arxiv 2018 farha et al mstcn multistage temporal convolutional network for action segmentation cvpr 2019 and mstcn at tpami 2020 kahatapitiya et al coarsefine networks for temporal activity detection in videos cvpr 2021 re the stepwise segmentation module on one hand i understand why you would want to reduce the temporal resolution of your output space eg from input sampling rate of 3k hz to output of 30 hz this is commonly done in areas like speech where acoustic models are often downsampled to wordfragments at 30 or 100 hz however this seems antithetical to the premise of the paper where the goal is to have very precise timestamps re segtime while i like the simplify of the multiscale skip lstm idea i appreciate the ablations between the full tcnlstm model and the adaptation of deeplabv3 tcn while the lstmbased model does improve performance i wonder if making other modifications to the tcnside of the model could have the same impact ablations on other tcn architectures could be useful here its unclear what issues the lstmside 
prevents which couldnt alternatively be done with a modified tcn architecture while the authors clearly put a lot of time an energy into this work i think it would benefit from workshopping with others in the field as noted above there are many missing pieces from different parts of the literature which could be used to improve this work and better situate it with other progress in this area docsepthe paper presents a supervised method called segtime for time series segmentation that is based on stepwise time series classification the method avoids sliding windows which is the typical approach thus avoids the specification of window size and stride it also seems to be insensitive to the label changing frequency and this constitutes a major advantage over other approaches the network architecture is based on two core modules a novel multiscale skip lstm called msslstm that employs lstms with skip connections and a very deep cnn called 1ddsresnet several other modules are also included such as 1d depthwise separable and atrous convolutional layers and the atrous multiscale pooling module amsp the method is evaluated on two datasets one with fast changing labels and one with slow changing labels there are several concerns regarding the manuscript 1 the idea of lstm with skip connections is interesting and is borrowed from image segmentation models like deeplabv3 however it requires the proper adjustment of the skipping factors how one can adjust the skipping factors i wonder whether properly adjusting the skipping factor is equally difficult to properly adjusting the window size 2 in general the whole architecture seems too complex and requires a large number of hyperparameters as illustrated in the appendix the description is unclear in some points for example do you provide the whole sequence as input to the model and does not present the motivation behind architectural choices why so many modules are needed the ablation study table 4 indicates minor performance deterioration when a core module is omitted 3 moreover it is not easy to justify the novelty of the method since a qualitative comparison with other methods is not presented it seems that the proposed method is the only one that does not use the sliding window mechanism however if overlapping sliding windows are used differing only in one time step a stepwise classification is achieved 4 experimental evaluation can be considered as limited since only two datasets have been considered note that the parameters for the sleepedf dataset only are presented have the same parameters been used for the opportunity dataset to better illustrate the potential of the method the authors could also perform experiments using synthetic time series that include both fast and slow changing labels 5 i strongly suggest to include the term supervised in the title typically time series segmentation is considered as an unsupervised problem where class labels are not exploited alternatively the term stepwise classification better illustrates the problem that is solved the paper presents a rather complex model for stepwise time series classification the proposed approach relies on ideas borrowed from semantic image segmentation models like deeplab there several issues to be resolved related to motivation novelty clarity hyperparameter specification and experimental validation
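As a concrete reading of the architecture these reviews describe (a skip-connected LSTM branch plus a dilated, depthwise-separable 1D convolution branch whose concatenated outputs are classified at every timestep), here is a minimal PyTorch sketch. This is not the authors' implementation: the module names, the skip factor, the channel sizes, and the interpolation-based upsampling of the LSTM branch are assumptions made only for illustration.

```python
import torch
import torch.nn as nn

class SkipLSTMBranch(nn.Module):
    """Runs a bidirectional LSTM over a subsampled (skipped) copy of the sequence,
    then upsamples the hidden states back to full length (assumed skip scheme)."""
    def __init__(self, in_ch, hidden, skip=4):
        super().__init__()
        self.skip = skip
        self.lstm = nn.LSTM(in_ch, hidden, batch_first=True, bidirectional=True)

    def forward(self, x):                                   # x: (B, C, T)
        xs = x[:, :, ::self.skip].transpose(1, 2)           # (B, T//skip, C)
        h, _ = self.lstm(xs)                                # (B, T//skip, 2*hidden)
        h = h.transpose(1, 2)                               # (B, 2*hidden, T//skip)
        return nn.functional.interpolate(h, size=x.shape[-1],
                                         mode="linear", align_corners=False)

class DilatedConvBranch(nn.Module):
    """1D encoder built from depthwise-separable, dilated (atrous) convolutions."""
    def __init__(self, in_ch, ch, dilations=(1, 2, 4, 8)):
        super().__init__()
        layers, c = [], in_ch
        for d in dilations:
            layers += [nn.Conv1d(c, c, 3, padding=d, dilation=d, groups=c),  # depthwise
                       nn.Conv1d(c, ch, 1),                                  # pointwise
                       nn.BatchNorm1d(ch), nn.ReLU()]
            c = ch
        self.net = nn.Sequential(*layers)

    def forward(self, x):                                   # (B, C, T) -> (B, ch, T)
        return self.net(x)

class StepwiseSegmenter(nn.Module):
    """Concatenates both branches and predicts a class label for every timestep."""
    def __init__(self, in_ch, n_classes, hidden=64, ch=64):
        super().__init__()
        self.lstm_branch = SkipLSTMBranch(in_ch, hidden)
        self.conv_branch = DilatedConvBranch(in_ch, ch)
        self.head = nn.Conv1d(2 * hidden + ch, n_classes, 1)

    def forward(self, x):                                   # x: (B, C, T)
        feats = torch.cat([self.lstm_branch(x), self.conv_branch(x)], dim=1)
        return self.head(feats)                             # (B, n_classes, T)
```

Training such a model is then ordinary per-timestep classification, e.g. nn.CrossEntropyLoss() on logits of shape (B, n_classes, T) against integer labels of shape (B, T), which is what removes the window-size and stride hyperparameters of sliding-window baselines.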
### Summary:
|
this paper deals with segmentation of time series the paper has received quite detailed reviews and the approach seems to have several interesting aspects an interesting architecture choice a stepwise classification approach and the ability to capture long range dependencies however there is a consensus that the paper would definitely benefit from a further iteration before publication in iclr or in any other similar venue the authors in their final response have already identified the improvement points raised by the reviewers in addition to these i believe it would be helpful to put the contributions better into perspective with existing literature i think addressing all of these points would require a major rewrite and i encourage the authors to make a fresh submission in a future venue
|
[9380, 9021, 476, …] (input_ids: 2,048 token IDs encoding the review and summary above; full list omitted) |
[1, 1, 1, …] (attention_mask: 2,048 ones, i.e. no padding; full list omitted) |
[
9380,
9021,
476,
320,
17903,
347,
3637,
50276,
18,
186,
1615,
21565,
275,
247,
3213,
1268,
2581,
685,
247,
20661,
3497,
352,
2987,
973,
327,
1097,
3809,
285,
3468,
28276,
13301,
50276,
19,
186,
8656,
4764,
6733,
310,
6786,
342,
6864,
3020,
39690,
27311,
387,
18437,
27311,
285,
1629,
7070,
296,
983,
495,
186,
5056,
3945,
18925,
310,
10848,
970,
278,
28908,
296,
78,
285,
337,
69,
27311,
267,
8090,
541,
214,
50276,
856,
84,
209,
186,
3118,
1096,
281,
253,
20661,
3497,
2746,
253,
4081,
8753,
2606,
1332,
476,
1379,
253,
2644,
3280,
3425,
387,
2378,
436,
476,
2572,
253,
6387,
273,
26475,
1048,
3945,
21011,
209,
186,
783,
1566,
24772,
49783,
27311,
285,
1629,
7070,
296,
78,
323,
10554,
534,
1543,
275,
1805,
10554,
1543,
2429,
281,
253,
1083,
835,
760,
247,
2014,
6333,
310,
908,
50276,
5040,
209,
186,
783,
7681,
38135,
273,
436,
2929,
310,
247,
2372,
3710,
253,
4081,
1332,
310,
1754,
327,
253,
337,
69,
2715,
273,
372,
70,
446,
357,
87,
20,
495,
342,
271,
3081,
278,
28908,
296,
78,
6333,
533,
627,
403,
2168,
643,
2069,
12395,
8223,
7274,
326,
403,
1754,
327,
3676,
27311,
8090,
337,
374,
209,
186,
3062,
1189,
253,
1055,
273,
278,
28908,
296,
78,
310,
417,
16575,
5867,
432,
253,
3368,
347,
8753,
2606,
556,
247,
3710,
1055,
327,
1097,
15302,
2429,
281,
253,
1566,
1293,
278,
28908,
296,
78,
8753,
2606,
3738,
253,
4477,
2085,
271,
28913,
1263,
2829,
577,
323,
253,
1055,
273,
278,
28908,
296,
78,
253,
1543,
3559,
275,
253,
2045,
4679,
7180,
374,
285,
495,
921,
5884,
3910,
875,
8753,
2606,
285,
8753,
2606,
253,
4477,
878,
281,
2085,
37699,
323,
253,
28913,
1263,
281,
5276,
253,
1055,
273,
278,
28908,
296,
78,
534,
310,
253,
2022,
7680,
273,
436,
2929,
209,
186,
9088,
310,
12497,
1941,
273,
253,
1055,
273,
7296,
1048,
3945,
21011,
7194,
278,
28908,
296,
78,
285,
337,
69,
1397,
373,
3024,
1379,
1048,
3945,
21011,
715,
2395,
533,
253,
1055,
327,
253,
2457,
10554,
310,
417,
6283,
6760,
281,
1908,
436,
253,
4477,
778,
1304,
253,
3045,
2556,
281,
1027,
3280,
16095,
209,
186,
44295,
3062,
16774,
1543,
327,
253,
5141,
273,
13782,
2105,
878,
281,
320,
2530,
253,
4477,
9059,
326,
253,
1566,
33526,
15180,
6733,
407,
8493,
3602,
285,
30745,
2299,
253,
2929,
1057,
417,
2085,
4569,
4679,
824,
347,
17032,
673,
50276,
186,
1542,
19843,
891,
5583,
253,
4477,
281,
3451,
253,
5884,
963,
993,
285,
28146,
3374,
275,
253,
2929,
50276,
18,
38996,
23538,
19199,
1321,
4036,
531,
91,
285,
16447,
928,
687,
1266,
257,
3676,
27311,
267,
285,
298,
296,
78,
18902,
11454,
50276,
3024,
4896,
323,
23390,
26306,
8251,
494,
2425,
8981,
13479,
1668,
883,
1010,
4022,
50276,
19,
8919,
2320,
391,
44215,
24423,
815,
14134,
269,
23268,
285,
289,
4921,
1795,
89,
440,
292,
27311,
267,
6928,
323,
35156,
2460,
26405,
275,
5213,
8059,
327,
3739,
2460,
12672,
285,
4382,
27310,
7268,
7266,
27812,
24552,
7203,
254,
4104,
50276,
20,
632,
606,
2942,
73,
260,
864,
340,
2788,
328,
1182,
11917,
3471,
4652,
13860,
395,
250,
276,
892,
40563,
5807,
287,
567,
285,
288,
435,
28015,
38622,
32049,
48759,
342,
387,
18437,
39690,
27311,
323,
24705,
2460,
26405,
275,
10061,
273,
253,
19454,
266,
8059,
327,
4382,
8113,
23746,
87,
7266,
854,
12058,
1093,
4765,
50276,
783,
8753,
2606,
1332,
3559,
275,
253,
2929,
1537,
320,
3576,
347,
253,
1566,
3936,
253,
2644,
3280,
3425,
387,
2378,
2581,
685,
247,
20661,
3497,
2746,
2299,
436,
2929,
3198,
2007,
4679,
285,
1941,
281,
6283,
1329,
253,
4477,
3916,
25761,
7681,
38135,
310,
247,
2372,
3710,
3103,
619,
7103,
273,
253,
2929,
310,
42876,
2708,
253,
14924,
7887,
604,
512,
253,
3374,
5393,
403,
4751,
9713,
891,
778,
24033,
619,
6803,
273,
253,
2929,
5474,
33032,
2520,
2929,
16633,
327,
2069,
12395,
28669,
26405,
597,
1750,
326,
275,
253,
6867,
2746,
323,
436,
1895,
50276,
2811,
368,
310,
281,
4647,
247,
1566,
24088,
247,
11935,
2410,
2036,
689,
4229,
8323,
275,
673,
275,
20661,
3497,
8142,
50276,
262,
310,
11132,
281,
3283,
10799,
2740,
10801,
3340,
672,
253,
13301,
1818,
7208,
4103,
281,
253,
10491,
2281,
273,
253,
3280,
7621,
32541,
597,
671,
1750,
326,
841,
7274,
11823,
1048,
3945,
3469,
2979,
597,
9569,
247,
2990,
534,
597,
691,
6584,
684,
253,
878,
323,
20661,
8323,
285,
476,
10534,
1089,
2740,
10801,
50276,
14094,
7558,
9021,
403,
50275,
66,
20178,
7792,
323,
28669,
26405,
50276,
266,
10336,
323,
16161,
28669,
26405,
3237,
50276,
266,
15644,
273,
372,
70,
446,
357,
87,
20,
323,
28669,
26405,
3237,
50276,
3409,
273,
253,
1445,
1543,
436,
2929,
10316,
598,
1142,
4722,
3533,
285,
1057,
4722,
6260,
2299,
891,
452,
7350,
8558,
432,
253,
3862,
20178,
3916,
285,
5816,
4114,
432,
6239,
275,
436,
2170,
50275,
66,
5161,
20178,
12288,
275,
253,
2929,
310,
2529,
275,
7680,
337,
50275,
9088,
310,
247,
1750,
326,
954,
28669,
26405,
3082,
10196,
347,
10957,
2437,
1419,
35707,
257,
366,
2299,
436,
310,
2649,
2032,
954,
1655,
9380,
275,
436,
2170,
897,
11935,
9436,
2036,
35615,
534,
347,
253,
2929,
671,
25957,
5431,
10196,
327,
247,
20661,
3497,
2746,
1199,
273,
436,
789,
1919,
247,
4564,
1107,
3622,
369,
970,
298,
296,
983,
17697,
3632,
4910,
288,
78,
983,
390,
643,
11935,
3210,
534,
671,
858,
417,
10196,
762,
10957,
2437,
1419,
35707,
257,
366,
253,
20178,
7792,
908,
407,
436,
2929,
310,
642,
1027,
685,
253,
8083,
273,
246,
14340,
271,
298,
296,
1814,
833,
7274,
534,
556,
671,
644,
7561,
14290,
275,
253,
6239,
24088,
275,
253,
6519,
3114,
50276,
9088,
310,
671,
247,
1750,
326,
643,
2987,
419,
2254,
3261,
387,
13301,
6498,
689,
2709,
2069,
1179,
265,
436,
310,
43269,
327,
3239,
495,
642,
2720,
789,
556,
6949,
673,
2962,
26405,
275,
767,
11498,
13301,
273,
1029,
4294,
285,
1698,
4294,
1818,
1643,
2987,
403,
9940,
247,
10799,
26405,
342,
3213,
5251,
7200,
841,
7234,
6780,
326,
436,
2929,
310,
5816,
247,
1781,
1180,
273,
10414,
432,
253,
4382,
8113,
285,
13361,
7888,
2223,
762,
253,
1149,
885,
273,
4030,
72,
11273,
2250,
8981,
2444,
281,
11399,
1142,
273,
841,
1072,
3237,
253,
278,
296,
14340,
789,
2708,
323,
1650,
5742,
12453,
1554,
2865,
1079,
3374,
751,
436,
50275,
8826,
1175,
10414,
891,
5583,
2509,
247,
3186,
323,
625,
275,
436,
3884,
50275,
282,
66,
1162,
355,
11935,
27311,
267,
6928,
323,
2250,
26405,
285,
5481,
30105,
1087,
4240,
50276,
67,
2284,
1162,
355,
271,
16774,
7103,
273,
12314,
27311,
267,
285,
18902,
6928,
323,
3425,
14053,
549,
32693,
4765,
50276,
14103,
3227,
1162,
355,
278,
296,
14340,
1554,
382,
486,
11935,
27311,
267,
2990,
323,
2250,
26405,
30105,
1087,
6247,
285,
278,
296,
14340,
387,
246,
81,
7588,
9169,
50276,
4530,
700,
522,
15208,
5973,
1162,
355,
25319,
32829,
6928,
323,
11935,
2425,
5481,
275,
10556,
30105,
1087,
43425,
50276,
250,
253,
3213,
3020,
26405,
6333,
327,
581,
1133,
891,
2096,
2139,
368,
651,
971,
281,
4796,
253,
11935,
6064,
273,
634,
3453,
2317,
24088,
432,
3280,
10491,
2281,
273,
495,
76,
288,
91,
281,
3453,
273,
1884,
288,
91,
436,
310,
7744,
2218,
275,
3672,
751,
6519,
835,
19463,
3210,
403,
2223,
1066,
22163,
6216,
281,
3159,
42494,
942,
387,
1884,
390,
2233,
288,
91,
2299,
436,
3133,
20711,
6168,
474,
281,
253,
26536,
273,
253,
2929,
835,
253,
4736,
310,
281,
452,
1077,
10799,
4522,
383,
11441,
50275,
250,
8753,
2606,
1223,
891,
751,
253,
25636,
273,
253,
1554,
2865,
1079,
17049,
298,
296,
78,
2934,
891,
11435,
253,
490,
77,
569,
875,
253,
2120,
246,
68,
13307,
296,
78,
1566,
285,
253,
15644,
273,
372,
70,
446,
357,
87,
20,
246,
14340,
1223,
253,
298,
296,
1814,
833,
1566,
1057,
3157,
3045,
891,
4282,
604,
2403,
643,
14586,
281,
253,
246,
14340,
2189,
273,
253,
1566,
812,
452,
253,
1072,
3486,
490,
77,
569,
327,
643,
246,
14340,
35615,
812,
320,
4217,
1060,
697,
12744,
752,
3374,
253,
298,
296,
983,
504,
16897,
534,
812,
2649,
31506,
320,
2218,
342,
247,
7321,
246,
14340,
10336,
50275,
6050,
253,
4477,
4518,
1691,
247,
2257,
273,
673,
271,
2341,
715,
436,
789,
891,
1158,
352,
651,
5649,
432,
2987,
1689,
2784,
342,
2571,
275,
253,
1673,
347,
4879,
1840,
627,
403,
1142,
5816,
7437,
432,
1027,
4243,
273,
253,
6239,
534,
812,
320,
908,
281,
3157,
436,
789,
285,
1805,
5999,
366,
352,
342,
643,
4780,
275,
436,
2170,
50276,
7152,
339,
431,
248,
2929,
10262,
247,
22296,
1332,
1925,
8753,
2606,
323,
673,
2962,
26405,
326,
310,
1754,
327,
3213,
3020,
673,
2962,
9162,
253,
1332,
32547,
20661,
8323,
534,
310,
253,
6867,
2746,
3021,
32547,
253,
17776,
273,
3497,
1979,
285,
31482,
352,
671,
3133,
281,
320,
39188,
281,
253,
5203,
6890,
4294,
285,
436,
16988,
247,
2201,
5750,
689,
643,
7274,
253,
2990,
10336,
310,
1754,
327,
767,
5161,
11911,
247,
4460,
1554,
2865,
1079,
17049,
298,
296,
78,
1925,
278,
28908,
296,
78,
326,
27532,
298,
296,
983,
342,
17049,
10291,
285,
247,
1077,
3676,
260,
9866,
1925,
337,
69,
1397,
373,
3024,
2067,
643,
11911,
403,
671,
2908,
824,
347,
337,
69,
6864,
3020,
39690,
285,
387,
18437,
27311,
267,
8090,
285,
253,
387,
18437,
1554,
2865,
1079,
45900,
6333,
717,
1033,
253,
1332,
310,
6760,
327,
767,
15302,
581,
342,
3809,
6890,
13301,
285,
581,
342,
3468,
6890,
13301,
50276,
9088,
403,
2067,
7350,
5001,
253,
7714,
337,
253,
2934,
273,
298,
296,
78,
342,
17049,
10291,
310,
4722,
285,
310,
29563,
432,
2460,
26405,
3210,
751,
372,
70,
446,
357,
87,
20,
2299,
352,
4419,
253,
1463,
14000,
273,
253,
42654,
2616,
849,
581,
476,
4575,
253,
42654,
2616,
891,
4282,
1880,
6283,
19427,
253,
42654,
2803,
310,
9696,
2834,
281,
6283,
19427,
253,
3497,
1979,
374,
275,
2087,
253,
2644,
10336,
3133,
1512,
2570,
285,
4419,
247,
1781,
1180,
273,
4373,
22041,
347,
12800,
275,
253,
30762,
253,
5740,
310,
12744,
275,
690,
2792,
323,
1650,
513,
368,
2085,
253,
2644,
3425,
347,
3280,
281,
253,
1566,
285,
1057,
417,
1246,
253,
16038,
3212,
27934,
10165,
2139,
594,
1142,
11911,
403,
3058,
253,
28913,
1263,
2829,
577,
6492,
5884,
3045,
28153,
672,
247,
5161,
6333,
310,
11035,
50276,
20,
25761,
352,
310,
417,
3477,
281,
15249,
253,
38135,
273,
253,
1332,
1580,
247,
18276,
5301,
342,
643,
3082,
310,
417,
3559,
352,
3133,
326,
253,
4081,
1332,
310,
253,
760,
581,
326,
1057,
417,
897,
253,
20661,
3497,
5122,
2299,
604,
21481,
20661,
8323,
403,
908,
26704,
760,
275,
581,
673,
3213,
247,
3213,
3020,
9162,
310,
6786,
50276,
21,
5661,
7103,
476,
320,
2783,
347,
3710,
1580,
760,
767,
15302,
452,
644,
2783,
3877,
326,
253,
3602,
323,
253,
4600,
264,
71,
10895,
760,
403,
3559,
452,
253,
1072,
3602,
644,
908,
323,
253,
5107,
10895,
281,
1805,
17093,
253,
2442,
273,
253,
1332,
253,
4477,
812,
671,
1347,
4679,
970,
13506,
673,
2962,
326,
2486,
1097,
3809,
285,
3468,
6890,
13301,
50276,
22,
891,
7052,
1804,
281,
2486,
253,
1307,
22296,
275,
253,
4060,
5431,
673,
2962,
26405,
310,
2783,
347,
271,
440,
35421,
1895,
835,
966,
13301,
403,
417,
28734,
31506,
253,
1307,
3213,
3020,
9162,
1805,
18303,
253,
1895,
326,
310,
14042,
50275,
783,
2929,
10262,
247,
2581,
2570,
1566,
323,
3213,
3020,
673,
2962,
9162,
253,
4081,
2746,
15771,
327,
5697,
29563,
432,
24705,
2460,
26405,
3210,
751,
372,
70,
446,
357,
50276,
9088,
2067,
3374,
281,
320,
11512,
2905,
281,
16038,
38135,
19843,
4373,
19484,
17776,
285,
5661,
12820,
50275,
187,
187,
4118,
18435,
27,
2520,
2929,
13330,
342,
26405,
273,
673,
2962,
253,
2929,
556,
2959,
3240,
7000,
10123,
285,
253,
2746,
3133,
281,
452,
2067,
4722,
7794,
4722,
10336,
4327,
3213,
3020,
9162,
2746,
3745,
273,
26475,
1048,
2491,
21011,
2299,
627,
310,
247,
13969,
326,
253,
2929,
651,
7964,
5649,
432,
247,
2007,
19502,
1078,
9311,
275,
17857,
32888,
390,
275,
667,
643,
2074,
18767,
253,
4477,
275,
616,
2457,
2380,
452,
2168,
3636,
253,
7756,
2792,
5439,
407,
253,
30628,
275,
1635,
281,
841,
891,
2868,
352,
651,
320,
9371,
281,
1691,
253,
9021,
1805,
715,
8668,
342,
5368,
6239,
891,
1158,
512,
841,
436,
651,
2430,
247,
2201,
24813,
285,
891,
11907,
253,
4477,
281,
1056,
247,
5352,
19529,
275,
247,
2852,
18767
] |
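For context on the three numeric columns attached to each row (input_ids, attention_mask, labels): they are the kind of output a standard tokenization pass produces from the review prompt and summary text. The sketch below is only a guess at that preprocessing; the tokenizer checkpoint, the 2048-token cap, the way prompt and summary are joined, and the choice to copy input_ids into labels are all assumptions, since none of this is stated in the document.

```python
from transformers import AutoTokenizer

# Placeholder checkpoint: the tokenizer actually used for this dataset is not stated.
tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-neox-20b")

def build_row(review_prompt: str, summary: str, max_len: int = 2048) -> dict:
    """Tokenize one (review prompt, summary) pair into input_ids / attention_mask / labels."""
    text = review_prompt + "\n### Summary:\n" + summary
    enc = tokenizer(text, truncation=True, max_length=max_len)
    return {
        "input_ids": enc["input_ids"],
        "attention_mask": enc["attention_mask"],  # all ones when no padding is applied
        "labels": list(enc["input_ids"]),         # assumption: labels simply mirror input_ids
    }
```

An attention_mask that is entirely ones, as in the rows here, is consistent with sequences that were truncated to a maximum length rather than padded.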
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
summary the paper reviews common assumptions made by recent theoretical analysis of metalearning and applies them to metalearning methods as regularization results show that these regularization terms improve over vanilla metalearning reasons for score overall i vote for reject the main idea of applying theory to practice is reasonable but the regularization methods proposed are mainly known regularizing the singular value is similar to the spectral normalization proposed in 1 the frobenius norm regularization is similar to the commonly used weight decay 1 assumption 1 in du et al states that the ground truth weight should cover all directions evenly it cannot be ensured when the tasks are fixed the proposed regularization penalizes the condition number of the weight matrix during training which is more similar to the spectral normalization proposed in 1 as to regularizing the frobenius norm there exist a line of literature showing weight decay works for general settings apart from metalearning thus i think the regularization proposed in this paper is known 2 the experimental results indeed improve over vanilla metalearning however as shown in 2 even by with some simple tricks metalearning can be more stable and achieves better results this casts doubt on the value of the proposed method 1 spectral normalization for generative adversarial networks iclr 2018 2 how to train your maml iclr 2019 docsepthe main motivation of this paper is based on the theoretical results of metalearning to ensure the assumptions of the theories the authors propose a novel regularizer which improves the generalization ability of the model some results on fewshot learning benchmarks show the proposed method improves wrt those baselines here are the main concerns of this paper 1 the proposed method in this paper is based on the metalearning theory as stated in section 2 however the theoretical setting here is not fully consistent with the fewshot learning setting for example there is no validation set in eq 1 the authors should make more discussions here to show will these differences influence the final results 2 one main theoretical assumption in metalearning theory is the task distribution could the authors make this notion clear should we do empirical results on those tasks with different kinds of task distributions 3 the metalearning loss in eq 4 is a bit different from the popular metalearning objective for example in maml we do not optimize the classifier w till convergence while only a limited number of gradient steps are used 4 the authors should list those baseline values in table 1 which are still important for referencedocsepto improve the practical performance of metalearning algorithms this paper proposes two regularization terms that are motivated by two common assumptions in some recent theoretical work on metalearning namely 1 the optimal linear predictors cover the embedding space evenly and 2 the norms of the optimal predictors remain bounded as the number of tasks grow numerical experiments show that the proposed regularization terms help achieve better performance of metalearning in some tasks this work serves as a nice attempt to instruct the practice of metalearning with theoretical insights below are some of my concerns in some experimental results the improvement due to the proposed regularization seems to be at the same level of the standard deviation as well as the difference between the reproduced results of existing metalearning algorithms and those reported in earlier papers this 
casts doubt on the true efficacy of the proposed methods for the loss function in eq 4 it is more reasonable and natural to introduce two weighting parameters as tunable hyperparameters for the proposed regularization terms the authors often talk about enforcingensuring the assumptions however from my understanding whether the assumptions on the optimal linear predictors or groundtruth predictors hold or not depends on the learning problem itself not on the algorithms therefore there is no way we can enforceensure these assumptions i would prefer using the phrase respecting the assumptions used by the authors on page 8 this seems more accurate and reasonable following the previous point im curious about one question if the learning problem actually doesnt satisfy the two assumptions then is it still helpful to add the proposed regularization terms to the loss function im not sure but my guess is no indeed it might even hurt to solve puzzles like this i would encourage the authors to conduct some synthetic experiments where they can design the data generating process eg they can control whether the true linear predictors satisfy the assumptions or not since this work is a connection between theory and practice i believe that experiments with synthetic data can help explain things more clearly and make the claims more convincingdocsepsummary in this paper the authors aim at bridging the gap between the practice and theory in metalearning approaches specifically they propose two regularization terms to 1 capture the diversity of the tasks and 2 control the norm of the prediction layer thereby satisfying the assumptions in metalearning theory strength the motivation of this paper is interesting before proposing the methodology these theoretical assumptions have not been paid enough attention before the paper is wellorganized and clearly written the experimental setting is designed in a good manner and the results are promising weakness i am skeptical of the novelty of the second regularize in eq4 according to section 32 it is equivalent to w2o1 so what is its difference to a simple l2 weight decay according to section 2 the outerlevel parameters are restricted as a linear layer is this means the proposed regularizes would become trivial while applied on top of a more complicated model eg leo1 too few competitors it would be better to add some comparisons with recent methods the details to calculate the subgradients of the singular values which is quite complicated are missing especially seeing that there is no guarantee that an autodifferentiation tool will do that correct ref 1 andrei a rusu dushyant rao jakub sygnowski oriol vinyals razvan pascanu simon osindero raia hadsell metalearning with latent embedding optimization iclr 2019 above all since the contribution and the technical details to calculate the subgradients are not clear to me i have to currently recommend a weak reject
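To make the two penalties discussed in these reviews concrete (one encouraging the per-task predictors to cover directions evenly, one keeping their norms bounded), here is a minimal sketch. The exact regularizers of the paper are not reproduced in the reviews, so the condition-number proxy via singular values, the squared Frobenius term, and the weighting coefficients lam1 and lam2 are assumptions chosen only to illustrate the idea.

```python
import torch

def diversity_penalty(W: torch.Tensor) -> torch.Tensor:
    """W: (num_tasks, feature_dim) matrix whose rows are the per-task linear predictors.
    Penalizes an uneven spread of singular values, a proxy for the assumption that
    the task predictors cover the embedding space evenly."""
    s = torch.linalg.svdvals(W)               # singular values in descending order
    return s[0] / (s[-1] + 1e-8)              # condition-number-style ratio

def norm_penalty(W: torch.Tensor) -> torch.Tensor:
    """Squared Frobenius norm, keeping predictor norms bounded (close to weight decay)."""
    return (W ** 2).sum()

def regularized_meta_loss(meta_loss, W, lam1=1e-3, lam2=1e-4):
    # Hypothetical weights; the reviews themselves suggest exposing them as tunable
    # hyperparameters rather than fixing them.
    return meta_loss + lam1 * diversity_penalty(W) + lam2 * norm_penalty(W)
```

Here W would be the stack of task-specific linear heads learned on top of the shared representation; as one reviewer notes, whether such penalties still help when the underlying theoretical assumptions do not hold for the data is an open question that synthetic experiments could probe.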
### Summary:
|
this paper is a systematic study of how assumptions that are present in recent theoretical metalearning bounds are satisfied in practical methods and whether promoting these assumptions by adding appropriate regularization terms can improve performance of existing methods the authors review common themes in theoretical frameworks for a metalearning setting that involves a feature learning step based on which linear predictors for a variety of tasks are trained statistical guarantees for such a framework that is statistical guarantees for the performance of a predictor trained on an additional target task are based on the assumption that the set of weight vectors of the linear predictors span the space ie exhibit variety and that the training tasks all enjoy a similar margin separability that is that the representation is not significantly better suited for some of the tasks than others the current submission cleanly reviews the existing literature distills out these two properties and then proposes a regularization framework that could be added to various metalearning algorithms to promote these properties in the learned feature representation finally the authors experimentally evaluate to what degree the properties are already observed by some metalearning methods and whether the proposed additions will improve performance it is established that adding the regularization terms improves performance on most tasks the authors thus argue that incorporating insights obtained from recent theoretical frameworks of analysis can lead to improved performance in practice naturally the purpose of the presented results is not to establish a new state of the art on a set of benchmark tasks but to systematically study and compare the effect of adding regularization terms that will promote the properties that are desirable for a feature representation based on statistical bounds i would argue that the research community should support this type of study the work is well presented and conducted most importantly the study has a clear and general message that will be valuable for researchers and practitioners working on metalearning however the reviewers did not recommend publishing this type of study for iclr the authors are encouraged to resubmit their work to a different venue
|
[30003, 310, 1677, …] (input_ids: 1,572 token IDs encoding the review and summary above; full list omitted) |
[1, 1, 1, …] (attention_mask: 1,572 ones, i.e. no padding; full list omitted) |
[ … token id values omitted (labels column; the ids appear to repeat the tokenized row text) … ] |
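The bracketed numeric columns in each row (input_ids, attention_mask, labels) read as machine tokenizer output that simply duplicates the row's text. As a rough illustration of how such fields are typically produced, here is a minimal sketch assuming a Hugging Face tokenizer; the model name, the maximum length, and the choice to copy input_ids into labels are assumptions for illustration, not details taken from this dump.

```python
from transformers import AutoTokenizer

# Hypothetical tokenizer choice; the tokenizer actually used for this dump is not stated here.
tokenizer = AutoTokenizer.from_pretrained("gpt2")

def build_row(input_text: str, output_text: str, max_len: int = 2048) -> dict:
    """Tokenize one (Input, Output) pair into the list columns seen in this dump."""
    # For causal-LM style training the prompt and the target summary are often concatenated.
    enc = tokenizer(input_text + " " + output_text, truncation=True, max_length=max_len)
    return {
        "Input": input_text,
        "Output": output_text,
        "input_ids": enc["input_ids"],            # token ids for the concatenated text
        "attention_mask": enc["attention_mask"],  # 1 for every kept token, hence the long runs of 1s
        "labels": list(enc["input_ids"]),         # labels appear to mirror input_ids in this dump
    }
```

A row built this way would carry the same three lists shown above, with attention_mask equal to 1 at every position because no padding is applied.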
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper proposes a method to defend against model stealing attacks this method is based on the mean correlation among the selected samples it provides two ways to select samples 1 finding wrongly classified normal samples 2 selecting mixed samples via cutmix experiments on four datasets demonstrate the effectiveness of the proposed method strengths interesting topic and good motivation sacm has high efficiency weaknesses it lacks solid explanations about why the proposed method works generalizability to different source models is not clear comparisons with some related works are missing detailed comments this paper does not provide solid explanations about why the proposed method works more discussion or analysis would make the intuition behind the proposed method more convincing comparisons with some related works are missing this paper claims that the proposed method outperforms previous methods however it lacks comparisons with two related works 12 therefore it is hard to say if the proposed method achieves sota performance table 3 is hard to read the title of table 3 mentions accuracies under different source models but it seems there is only one source model therefore this methods generalizability to different source models is also not clear 1 li et al defending against model stealing via verifying embedded external features aaai 2022 2 chen et al copy right a testing framework for copyright protection of deep learning models ieee sp 2022 this paper has discussed the limitations and potential negative impacts docsepmodel fingerprinting allows a model owner to claim ownership of a stolen model prior works on fingerprinting typically use transferable adversarial examples to perform fingerprinting such techniques have two key shortcomings 1 they dont work in the presence of defenses like adversarial training 2 they cannot be used when the stolen model is used for transfer learning as the output label space is different from the original model to solve these issues the authors propose to use pairwise correlation of the models output of wrongly classified sacw or mixed sacm samples to perform fingerprinting using wrongly classifiedmixed inputs instead of adversarial examples allows the technique to be used in the presence of defenses like adversarial training using pairwise relationships instead of pointwise predictions allows fingerprinting to be performed when the stolen model is used for transfer learning evaluations show the the proposed technique can detect stolen models with high accuracy and can outperform prior works strengths 1 the paper proposes a technique to perform fingerprinting in the presence of transfer learning which is a new and interesting subproblem in model fingerprinting that has received limited attention from prior works 2 in addition to the softlabel setting the paper proposes a method to convert hard labels to soft labels which enables fingerprinting when only the hard labels are available 3 the proposed fingerprinting techniques have been evaluated against several categories of stealing attacks finetuning pruning transfer learning model extraction and adversarial model extraction the proposed techniques shows high detection performance for fingerprinting and outperforms prior works 4 the paper is wellwritten and easy to follow weakness 1 sac does not seem to work under transfer learning a in the hard label setting table 4 the authors have addressed the limitations and societal impact of their work docsepthis paper proposes a model stealing detection method 
based on sample correlation the proposed method calculates the correlation among the model outputs for the misclassified samples the cutmix approach is used to generate more effective sample inputs the experiments evaluate the proposed method and show that it outperforms baselines in different attack scenarios such as finetuning transfer learning and adversarial training strengths 1 the paper considers many realistic attack scenarios in model stealing attacks such as finetuning pruning adversarial training and transfer learning which have not been widely explored 2 it is nice to see that the protection method considers the labelonly cases 3 leveraging cutmix to augment data for fingerprinting is novel weaknesses 1 the threat model is not welldefined in the paper what are the attackers capabilities in each category of stealing attacks does the attacker have access to training data and model architectures 2 the implementation of the model stealing attacks is unclear 3 it is unclear why the misclassified samples help to identify the stolen model how many misclassified samples are necessary for fingerprinting 4 the threshold d in equation 4 is critical to the success of model stealing detection however it is unclear how to select an appropriate value of d 5 as shown in tables 1 and 2 the existing detection methods can achieve 100 auc in many attack categories eg finetune pruning the proposed method only outperforms the existing methods in model extraction attacks 6 the proposed method does not work well in the transfera attack table 4 it would be great if the paper could provide some explanations on it the authors have discussed potential societal impact of their work
### Summary:
|
the reviewers agreed that the proposed method and validation overall are a good contribution we urge the authors to update their paper to reflect the discussed clarifications eg regarding the threat models in use
|
[ … token id values omitted (input_ids for the row above) … ] |
[ … all-1 values omitted (attention_mask column) … ] |
[ … token id values omitted (labels column) … ] |
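The reviews in the row above describe fingerprinting a suspect model through the pairwise correlation of its outputs on selected samples (misclassified or CutMix-mixed ones). The sketch below is only a hedged illustration of that general idea; the mixing routine, the similarity measure, and the threshold are my own simplifications, not the reviewed paper's implementation.

```python
import numpy as np

def cutmix_pair(x_a, x_b, lam=0.5):
    # Paste a patch of x_b into x_a; a simplified CutMix for images shaped (C, H, W).
    h, w = x_a.shape[1], x_a.shape[2]
    cut_h, cut_w = int(h * (1 - lam) ** 0.5), int(w * (1 - lam) ** 0.5)
    y0, x0 = np.random.randint(0, h - cut_h + 1), np.random.randint(0, w - cut_w + 1)
    mixed = x_a.copy()
    mixed[:, y0:y0 + cut_h, x0:x0 + cut_w] = x_b[:, y0:y0 + cut_h, x0:x0 + cut_w]
    return mixed

def correlation_fingerprint(outputs):
    # outputs: (n_samples, n_classes) array of model outputs on the selected samples.
    # The pairwise correlation between sample outputs serves as the model "signature".
    return np.corrcoef(outputs)

def looks_stolen(victim_outputs, suspect_outputs, threshold=0.9):
    # Compare the two correlation matrices; high similarity suggests a derived model.
    f_v = correlation_fingerprint(victim_outputs)
    f_s = correlation_fingerprint(suspect_outputs)
    similarity = np.corrcoef(f_v.ravel(), f_s.ravel())[0, 1]
    return similarity > threshold
```

Because the signature is a sample-by-sample matrix rather than pointwise class predictions, its size does not depend on the suspect model's label space, which, per the review, is what lets such a check survive transfer learning.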
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the paper formulates a novel problem in openset domain adaptation and aims to classify the unknown classes in the target domain this is different from the traditional openset domain adaptation problem setting to do this additional knowledge of interclass relations are employed and embedded so that knowledge learned from the shared classes can be transferred to the unknown classes this paper formulates a novel problem by stepping one step further from the openset domain adaptation to classifying the unknown classes overall the problem is interesting and can be practically useful however the work does not seem to be solid enough for a publication in iclr some of my concerns are as follows in the last sentence of 32 the authors claim but also provides mor general classifiers of the known ones is there any empirical evidence for this statement in eq 4 and 5 it is confusing to use t for both the boundary and the subsript for the target domain and i dont understand the meaning of eq 4 and 5 i dont see how the domain adaptation module can align two domains from the objective in eq 10 and 11 ls make sure the classification accuracy on sourcedomain known classes ladv enables the separation of known and unknown classes in the target domain and lgcn learns a mapping from the knowledge graph node v to the classifier weight w how is the domain adaptation done the knowledge graph should also be mentioned in the problem definition in section 31 i believe the knowledge graph is important for good zeroshot learning however this information is not mentioned in the experiments experiments on more datasets would be helpful to validate the benefit of the proposed method there are some language issues to fix eg to align the domain gap should be to bridge the domain gap or something else in summary the paper provides a solution to a practically useful problem but the novelty is limited and the presentation is not clear enough to understand docsepthey propose a new setting in openset domain adaptation where the goal is to classify known classes into their classes as well as cluster unknown classes well a difference from an existing openset domain adaptation is that it does only require separating unknown instances from known ones whereas this paper aims to cluster unknown instances for this goal they propose a model for open set domain adaptation with zeroshot learning on the unknown classes they combine adversarial learning to align the two domains and the knowledge graph is introduced to generate the classifiers for the unknown classes with the employment of the graph convolution network gcn they provide experiments on digits dataset and show the gain over baselines strong points 1 their proposed setting is interesting and realistic 2 the idea of using graph structure in the target domain sounds reasonable weak points 1 though the highlevel idea of leveraging a graph in the target domain sounds reasonable their description of the method is not clear and not convincing enough in eq 3 it is not clear where winf comes from also what are m in eq 3 and eq 1 in addition in eq 3 the objective is just to predict known classifier weights but why it is enough to get discriminative features for unknown samples 2 where is the attribute v come from this is missing in both method and experiments section 3 their evaluation is not enough to support the validity of their method first they perform experiments only on digits dataset but this is clearly not enough second they are lacking several baselines in openset domain 
adaptation utilizing clustering methods such as do we really need to access the source data and universal domain adaptation through self supervision considering the weak points i recommend rejecting this paper there are many missing parts in methods and they need more experiments to justify their ideas docsepthe paper considers the problem of openset domain adaptation where the target domain has additional group of unknown classes and domainshift with source domain one interesting aspect of the paper is that the method utilizes a knowledge graph for zeroshot learning on the unknown classes further the method utilizes an adversarial learning approach to align the source and target domain strengths s1 knowledge graph for learning classifiers unlike prior works the paper aims at learning the classes present even in unknown bucket by using a zeroshot learning s2 the paper tackles the openset domain adaptation by generating classifiers that work well for unknown classes by utilizing graph learning methods weaknesses w1 motivation for a zeroshot learning and a clarification typically openset domainadaptation focussed on rejecting unknown classes rather than modeling for the same explicitly ref 1 therefore these methods are useful in rejecting images even it they are not present in the predetermined group of unknown classes there are definitely some interesting usecases such as incremental learning where the model adapts continually and learns new classes so it would be interesting to include some additional motivation for the zeroshot learning on unknown classes in real world with domain gap w2 limited novelty of the proposed approach the domain adaptation module and adversarial training procedure is not novel this idea has been utilized in several papers ref 2 4 etc for ex ref 4 utilized tparameter to separate the unknown classes from source class boundaries so the novelty of the paper is limited to utilization of gcn for unknown classes w3 evaluation of the method and comparison against recent methods the experiments are limited to evaluation of the method on digits datasets such as mnist although the results are encouraging more experimental validation would be required across datasets that have richer attribute information for knowledge graph further the authors need to include additional comparisons from recent literature such as sta ref 1 inheritable models ref 5 and closely related works such as ref 3 oneshot learning w4 experimental protocol it would be more interesting to consider an evaluation protocol that shows accuracy of unknown classes both as a binaryclassification problem and multiclass classification problem other aspects a1 in my opinion the paper considers an interesting problem but the idea of using exactly ntns classes is unreasonable for practical reasons first there is already a domain shift among source and target distributions second the target dataset is unlabeled therefore it is hard to anchor out a strong distribution in this scenario this problem has been addressed in a related work by utilizing prototypical learning ref 3 a2 as suggested in the above sections the evaluation protocol should reflect the efficacy of method and its novel elements this often requires providing additional metrics that have typically not been considered in literature a3 including tsne plots would help the reader understand the qualitative performance of the proposed approach for the unknown classes questions q1 approach training strategy is there a prescribed way to alternate between the 
two stages for training the model to gain higher performance or is this a heuristic that yields higher performance q2 experiments settings section how long does the model take to converge the paper mentions that the model has been trained for 1200 epochs on the digits dataset i wonder if this is practical and if it would scale for larger datasets q3 experiments comparison section it is not clear what the domain invariance means in the context of feature alignment for unknown classes minor typos m1 closedset domain adaptation and not closeset m2 there are a few grammatical errors that make the paper a little hard to follow references ref 1 separate to adapt open set domain adaptation via progressive separation cvpr19 ref 2 adversarial discriminative domain adaptation cvpr17 ref 3 classincremental domain adaptation eccv20 ref 4 open set domain adaptation by backpropagation eccv18 ref 5 towards inheritable models for openset domain adaptation cvpr20 i have provided the review for the paper in the previous sections although the paper considers an interesting problem setting more experimental validation would be required to judge its claims further the novelty is limited to the application of already existing ideas im happy to take the discussion with authors in the postreview phase to hear their thoughts at the moment the paper is simply not in an acceptable state
### Summary:
|
the paper addresses openset da where samples from novel classes in the target domain get clustered into new unlabeled classes a key novelty in the learning setup is that it is assumed that one has access to a knowledge graph over classes both source and target that kg is used for grouping target samples into novel classes reviewers were concerned that the method is not explained with sufficient details and the experiments lacked comparisons with openset da baselines no rebuttal was submitted the paper cannot be accepted to iclr
|
[ … token id values omitted (input_ids for the row above) … ] |
[ … all-1 values omitted (attention_mask column) … ] |
[labels: token id sequence omitted] |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the paper proposes a method that accepts two partially overlapping images and a keypoint in one of them and detects the location of the keypoint in the other image regardless of whether it is visible occluded or outside the frame of the image this is an interesting reformulation of the correspondence estimation problem the conventional formulation of which considers only keypoints that are visible in both images under the previous formulation correspondence estimators are considered successful if they can declare that the correspondent of the keypoint is invisible in the target image in addition to the capability to predict the location of invisible keypoints in isolation the paper demonstrates that this capability is beneficial to camera pose estimation strengths the capability to predict the location of points that have just become occluded or exited the frame is useful in many tasks such as perception for robots where such predictions could guide the robots next moves humans are good at this but computer vision research has focused on the limited setting when the keypoints are visible in both images loss functions used for training networks for correspondence estimation only consider descriptors of visible keypoints this is an important contribution of the paper and it goes beyond more qualitative work of predicting unseen objects based on context the approach builds upon the concept of neural reprojection error nre which was recently introduced by germain et al 2021 nre does not require that the appearance of two image regions that are hypothesized to be projections of the same 3d point is similar it can be viewed as an extension of the purely geometric reprojection error germain et al did not consider occluded or outofbounds keypoints i consider the unified treatment of visible occluded and outofbounds keypoints an advantage experiments are thorough findings are explained clearly and limitations of the proposed method are pointed out i consider the use of four publicly available datasets two indoor and two outdoor sufficient over one million keypoints are used in the experiments generalization on additional datasets not used for training is also shown this is important since it allows broader deployment of the algorithm the main limitation of neurhal is low accuracy under favorable conditions due to the low resolution of its output improving this is left as a direction for future work the appendices contain useful additional information experiments and implementation details weaknesses it is unclear to me what the network actually learns i speculate that it learns to warp the source image to the target given the relative pose between the two cameras and the depth of the keypoint in the source image the output of this process is a correspondence map which suggests that multiple warpings are considered this is not described clearly enough the fact that the visibility of keypoint in the target images does not need to be labeled is clear the metrics used for evaluation are somewhat arbitrary or not sufficiently justified figure 3 for example compares the results to random chance which is a very weak baseline camera pose estimation is considered correct when the rotation error is under 20 degrees and the translation error is under 15 m the latter is uninformative without knowing the magnitude of the translation between the two cameras 15 m may be a very small or a very large error depending on the input data other comments i find the term hallucination in the title marginally acceptable 
but i think that it is abused in the rest of the paper for example in local feature matching methods are only able to identify the correspondents location when it is visible while humans can also hallucinate its location when it is occluded or outside the field of view through geometric reasoning humans do not hallucinate they predict or estimate footnote 1 on p 3 is unnecessary the equation it refers to holds for any images in general configuration sharing the same intrinsic parameters k matrix the latter constraint can be easily relaxed section 32 the calibration matrix kc does not encode the boundaries of the image instead of a different matrix what is needed are different boundary conditions for considering pixels to be within the image or not section a2 that discusses additional related work should be moved to the main paper it is very relevant to the problem at hand figure 8 precision on the yaxis is perplexing the description suggests that this should be recall the paper contains an important contribution as discussed above the fact that predicting the locations of invisible points improves camera pose estimation provides further support that the method is useful in downstream tasks docsepthe paper proposed a deep method neurhal to predict visible occluded or outofview keypoint matching from source images to target images in training a correspondence map is obtained from ground truth of camera matrixpose and images in testing the model directly outputs three categories of matchings identified outputpainting and inpainting as an application of neurhal the method is applied to camera pose estimation and tested on scannet and megadepth the experiments shows the method improves the estimation accuracy particularly the outputpainting correspondences strengths the paper proposed an new problem if and how neural network can predict unseen scene keypoints from a sourcetarget image pair human is able to heuristically guess the location of outpainting correspondences from the geometry for example the keypoint displacement is relative camera pose and scene depth this paper uses a deep model to accomplish this goal the paper proposed a method that is based on correspondence map and neural reprojection error germain 2021 with sufficient details neurhal is tested on public datasets for precision of correspondences and camera pose estimation showing better result than baselines weakness the proposed method has limited technique contribution for example the original nre germain 2021 is able to predict outpainting correspondences it gives an extra category for unmatched keypoints as a result this paper adapts germain 2021 to solve this problem in figure 3 the baseline uniform correspondence map is too weak in figure 4 i do not find the precision for identified keypoints it is unfair for the other methods in camera pose estimation section 42 the description of the baseline correspondent to the first light blue method in figure 6 is not clear to me because some methods highly rely on good correspondences but others are not so it is hard to tell if the proposed method is able to improve the stateoftheart camera pose estimation methods i tend to reject the paper because of the lack of contribution in technique and insufficient experiments i would like to see 1 the technique differences compared with germain 2021 2 improve the baselines and compare with the stateoftheart camera pose estimation methods docsepin this manuscript authors proposed a new problem of correspondence hallucination in which for 
keypoint in the source image its correspondence should be detected regardless if it is occluded or outside the field of view in particular the authors proposed a new model and training paradigm that learn to hallucinate correspondence by predicting its probability distribution of its location extensive experiments on both indoor and outdoor benchmarks demonstrated that proposed method can help camera pose estimation and outperforms prior state of the art feature matching approaches strengths 1 this work provides an interesting research direction correspondence learning in which occluded and out of fov correspondence can also help largebaseline camera pose estimation 2 based on this insight authors proposed a new model for hallucinating the correspondence based on attention blocks 3 extensive experiments on indoor and outdoor benchmarks demonstrated that proposed model is able to hallucinate correspondence and outperform prior sota on wide baseline setting in terms of camera poes estimation 4 author provided detailed supplementary material and trained model and source code for better supporting reproducibility weakness 1 authors admitted that the proposed method can only produce low resolution correspondence which can have a negative impact on localization accuracy 2 is there any explanation why correspondence inpainting is harder for inpainting 3 did authors define how to compute kc for transforming points from image plane to correspondence plane it seems that i am not able to find it in the main manuscript 4 it would be better if authors could provide more analysis and ablation study on the relationship between accuracy of camera pose estimation and distribution of hallucinated correspondence and in which case hallucinated correspondence will be more accurate or less accurate 5 authors should also cite several other related work for feature matching learning feature descriptors using camera pose supervision eccv 2020 selfsupervised geometric perception cvpr 2021 patch2pix epipolarguided pixellevel correspondences cvpr 2021 in summary based on the strength and weakness i mentioned i like this papers idea for hallucinating correspondence so that we can obtain better accuracy for camera pose estimation although it would be better to have more analysis and ablation study for the impact of correspondence hallucination on camera pose estimation accuracy
### Summary:
|
this paper receives positive reviews the authors provide additional results and justifications during the rebuttal phase all reviewers find this paper interesting and the contributions are sufficient for this conference the area chair agrees with the reviewers and recommends it be accepted for presentation
|
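Editorial note: the first review above contrasts the neural reprojection error (NRE) of Germain et al. with the purely geometric reprojection error. For readers unfamiliar with the latter, a standard textbook form is sketched below; the symbols (intrinsics K, rotation R, translation t, 3D point X, observed keypoint x, perspective projection pi) are generic placeholders and are not taken from the reviewed paper.

```latex
% standard geometric reprojection error (generic textbook form, not the paper's NRE):
% project the 3D point X into the target view with pose (R, t) and intrinsics K,
% then measure the pixel distance to the observed keypoint x.
e_{\mathrm{rep}}(X, x) \;=\; \bigl\| \, x - \pi\bigl( K ( R X + t ) \bigr) \, \bigr\|_2 ,
\qquad
\pi\!\left(\begin{bmatrix} u \\ v \\ w \end{bmatrix}\right) = \begin{bmatrix} u/w \\ v/w \end{bmatrix} .
```

According to the reviews, NRE generalizes this purely geometric cost by not requiring appearance similarity between the matched regions, and the reviewed paper further extends the treatment to occluded and out-of-view keypoints.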
[input_ids: token id sequence omitted] |
[attention_mask: sequence of 1s omitted] |
[labels: token id sequence omitted] |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the authors present a general framework for the analysis of fewshot hpo with a focus on the optimallty gap across varying warmstarting techniques in hpo the paper provides an interesting study of the optimality gap in light of transfer learning solutions for hpo nevertheless the paper has potentially very minimal impact on the field of automl the derivations and assumptions presented in the main paper appear to be technically correct for the most part the work is clear i refer the authors to line 95 to fix the beginning of the sentence check review summary the authors present a general framework to analyze the optimality gap in fewshot hpo the idea extends in the transfer learning context of hpo where sequential modelbased optimization has already been applied to a collection of source tasks strengths the paper addresses an interesting aspect of hpo that has been overlooked by the community sound derivations the authors look at several ways by which transfer learning has been employed to improve hpo and derive several respective theoretical guarantees weaknesses the theoretical guarantees that define the approximation error to the optimal hyperparameter ie the optimality gap are governed by several assumptions including surrogates being lipschitz continuous and low approximation error more on that below lemma 31 defines the 1wasserstein distance between two distributions distribution pthetad is not clearly defined furthermore it would differ depending on the sample size and this point is not further discussed there is no clear motivation as to why wasserstein distance is used going through the derivations i was surprised to see that metafeatures are not at all used as a proxy to measure data set similarity as was clearly stated in the related work sections the authors cover this topic even though they miss a prominent paper on learning metafeatures 1 and their applications in fewshot hpo 2 assumption 3334 although restrictive these assumptions overlook a major aspect of any surrogate function namely modeling uncertainty it is not mentioned throughout the derivations and one would assume from the current literature trends that uncertainty would play an important role in deriving the optimality gap it would also be interesting to look at the relative performance of hps in measuring the quality of the surrogate instead of the approximation error references 1 jomaa hadi s lars schmidtthieme and josif grabocka dataset2vec learning dataset metafeatures data mining and knowledge discovery 353 2021 964985 2 jomaa hadi samer et al transfer learning for bayesian hpo with endtoend landmark metafeatures fifth workshop on metalearning at the conference on neural information processing systems 2021 docsepthe paper presents a theoretical study of the optimality gap of warmstarted hyperparameter optimization algorithms for a variety of warmstarting strategies that either restrict the search space or use previously fitted surrogate functions the authors discuss the implications of such bounds concluding that the optimality gap of surrogate function transfer is smaller than that achievable by studied pruning strategies when using adaptive weights in particular this shows the importance of the weighting strategies in these second classes of methods the paper does not include any numerical experiments theoretical analysis of hpo methods is lagging behind empirical advances and this paper somewhat tries to close the gap by focusing on fewshot warmstarted hpo algorithms there are a series of limitations to the 
analysis the authors provide most of which are clearly discussed in the paper additionally i would say that a further limitation is that the setting consider is quite homogeneous meaning that it is assumed that all the tasks feature the same loss and that it may sound quite optimistic to assume that for the source tasks optimal hyperparameters have been found also the paper seems not to focus on a specific hpo technique other than putting a certain emphasis on modelbased algorithms and rather fully commits to the fewshot manysourcetasks setting nevertheless i believe the results reported including the type of approach that the authors follow can have a certain impact in the field and stimulate further analysis and empirical investigation superficially the results seem correct although i have not checked all the details of all the proofs the paper is overall clear but some passages would benefit from further proofreading eg sec 2 is full of repetitions the mathematical notation is a bit involved but probably this is a necessity there are some symbols that are not properly introduced such as the tildeo in equation 2 and onward big o with log factors minor in the appendix rewriting the theorems statement would increase readability it is not clear to me how the local search is supposed to be performed in section 4 strengths interesting theoretical analysis of many warmstarting hpo strategies clear and interesting discussion of the obtained bounds clear exposition of the problem setting including motivation weaknesses many assumptions that may limit the applicability of the results in practice no empirical validationvisualization of the bounds the work would benefit from an additional round of proofreading the results contained in the paper may be of interest for the hpo community revitalizing interest in fewshot hpo techniques and theoretical analysis of hpo algorithms the limitations of the analysis are overall understandable and can be addressed by future work after rebuttal i thank the authors for their reply and confirm my recommendation to accept the paper docsepthe authors provide theoretical analysis of the optimality gap of the hyperparameters obtained via warmstarted fewshot hpo using their theoretical analysis they identify situations where transfer surrogate loss functions perform better than hp space pruning the findings are generally interesting for the automl community as it gives us deeper insights into the theory behind the selected families of metalearning approaches however i find it quite difficult to see the implications of the work after reading the paper the work may have potential for more impact if additional interpretations are provided and it is linked more precisely to the empirical observations in literature with some examples of how it relates to different specific approaches and what it says about them the work is of high quality the authors make it clear what the assumptions are and the proofs appear to be correct after quickly reading them perhaps one detail that i find slightly worrying regarding technical quality i get the impression that the statements in the abstract introduction and conclusion about the general implications are somewhat exaggerated since from the theoretical analysis it is not very clear how it gives guidance when eg one scheme is better than another it should be described more clearly how the theoretical analysis gives guidance to identify when some schemes are better after reading the paper it is not too clear to me how i could actually 
use those theoretical results more detailed examples could be provided also transfer surrogate loss functions and hp space pruning methods could be described with more example approaches from literature it is not particularly clear what proportion of metalearninghpo methods this actually encapsulates and hence how much impact the work has to further improve the clarity it would be better to avoid very long sentences that span eg 3 or 4 lines on the other hand i like how authors include a section to define the preliminaries this certainly improves the clarity and makes it easier to follow the proofs there are a few minor typosgrammar mistakes eg l102 however the advantage of the metafeatures rely heavily l126 a output positive aspects the paper provides a theoretical analysis that can be useful to decide when to prefer certain families of hpo methods the analysis is detailed assumptions are clearly described and the theoretical analysis is properly written up the authors provide a detailed technical analysis of limitations of their their work negative aspects currently it is difficult to identify what are the implications of the theoretical analysis and how it can be useful to make decisions on what hpo or metalearning methods to use the authors say that their theoretical results give the guidance eg in abstract introduction or conclusion but it is either so technical that it is hard to grasp or the details of such guidance are actually missing it is also not clear to how many hpo or metalearning methods the results apply is it some very specialized family which would imply a rather small impact of the analysis or is it more general generally i would appreciate more examples for those approaches and some discussion overall i think it is a solid work with some interesting results but the current presentation makes it difficult to understand how to use the results of the analysis i currently select marginally above the acceptance threshold but the paper can move quite easily to accept good paper if the paper gives better guidance on the practical implications of the theoretical results clarifications to the issues described in various sections of the review would be appreciated within the rebuttal the paper would benefit a lot from connecting the theoretical analysis to practical guidance to a larger extent docsepna for reproducibility reviewers na for reproducibility reviewers na for reproducibility reviewers na for reproducibility reviewers the authors only performed a theoretical analysis na for reproducibility reviewers docsepthe authors propose a theoretical framework to evaluate various transferlearning methods that are used to warmstart a hpo job on a target task using hpo runs from multiple source tasks authors cover a variety of popular methods from the literature and provide a lower bound for all of those apart from providing mathematical bounds authors also provide intuitions regarding the bounds and use it to further validate certain empirical observations from the literature the paper is entirely theoretical and does not have any empirical component as the work analyzes existing techniques which already provide experimental results from a theoretical standpoint this paper can have a strong impact in this specific subarea transferlearningwarmstarting for hpo of automl as new methods can use the framework proposed in the paper to compare the lower bounds obtained by their method with respect to the existing ones apart from only validating their hypothesis with empirical results 
from a practical standpoint this work will have limited impact because i the authors do not propose any new methodinsight using their theoretical observation and ii most of the hypotheses proved in the paper seems intuitive eg hpo transfer performance is heavily dependent on the sourcetarget task domain difference the proposed lemmas and bounds described in this paper seem correct to me the assumptions used to prove the bounds seem reasonable to the most extent the authors propose a general framework and use it to cover two styles of hpotransferlearning eg searchspace pruning and transferring the surrogate function itself the framework is sufficient to prove the optimality bounds in both cases the bounds also theoretically justify empirical results of a few techniques from the literature thereby further showing the correctness of the claims authors perform a good job in presenting the materials in a cohesive way to a reviewer without a strong theoretical background the flow of the information and the narrative seems natural and all assumptions are clearly listed down before discussing each lemma or corollaries strength the paper proposes a comprehensive theoretical framework to analyze various hpo warmstarting techniques by analyzing their lower bound the framework can accommodate both pruning based methods and surrogate function transfer methods paper is nicely written and all the assumptions and limitations about each theorem and lemma are clear from the paper whenever certain assumptions are stronger that what is encountered in real life authors mention that and also provide bounds for alternative weaker assumptions which are more realistic the bounds obtained by the authors intuitively make sense to me they also seem to be correct for a couple of proofs that i have checked in the appendix the fact that the results obtained from the theoretical analysis helped to verify empirical observations from the literature convexhull vs bounding boxes further solidify the work weakness i was expecting the authors to provide some novel insight or algorithm based on all the analysis they have done the results do not provide additional help to the practitioners between choosing one strategy over the other because to me it looks like the largest contributing factor for all methods is the domaingap between source and target tasks which is a hard problem itself on the same line authors use the 1wasserstein metric between datasets to measure domaingap they also mention that in practice we can use dataset metafeatures to approximate this distance id have to liked to see a discussion as to how well metafeatures approximate the dataset distance metric used in this paper the authors have developed a holistic theoretical framework to evaluate various warmstarting strategies for hpo the framework is comprehensive intuitive easy to understand and extendable to future work on the other hand the practicality of this analysis is slightly limited overall i found the paper to be an important work for both analyzing past work and making future work on this topic more theoretically grounded and therefore recommend accept
### Summary:
the paper presents theoretical bounds on the performance of transfer learning methods various methods recently proposed in the literature are analysed within a novel framework which will be helpful to improve foundational aspects of transfer learning hpo methods all reviewers acknowledged the novelty of the paper and its potential impact one of the biggest concerns was the amount of insight coming out of the paper but the authors clarified this aspect and two reviewers raised their scores the authors also answered the points raised by the reviewers and all but one reviewer argue for accepting the paper the exception being one reviewer who did not acknowledge the author response i recommend accepting the paper which will be a valuable contribution to improving the theoretical guarantees of transfer learning methods for hpo
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
does the paper present substantively new ideas or explore an under explored or highly novel question somewhat the paper combines two popular existing approaches imitation learning model based control and uncertainty quantification using dropout the novelty is in combining preexisting ideas does the results substantively advance the state of the art no the compared methods are not stateoftheart will a substantial fraction of the iclr attendees be interested in reading this paper yes i think that the topics of this paper would be very interesting to iclr attendees quality unclear motivation to penalize prediction uncertainty to make the predicted states stay in the training data also in some cases references to existing work that includes real robotic systems is out of context at minimum so yes there are similarities between this paper and existing works on learning control for robotics systems using imitation learning model based control and uncertainty aware cost function however there is a profound difference in terms of working in simulation and working with a real system for which model and environment uncertainty is a very big issue there are different challenges in working with a real uncertain system which you will have to actuate and working with set of images for making predictions in simulation clarity easy to read experimental evaluation is clearly presented originality similar uncertainty penalty was used in other paper kahn et al 2017 therefore the originality is in some sense reduced would i send this paper to one of my colleagues to read yes i would definitely send this paper to my colleagues general comment dropout can be used to represent the uncertaintycovariance of the neural network model the epistemic uncertainty coming from the lack of data can be gained through monte carlo sampling of the dropoutmasked model during prediction however this type of uncertainty can only decrease by adding more explored data to current data set without any addition of data the variance reduction which results by penalizing the high variance during training might indicate overfitting to the current training data as the penalty forces the model to predict states only in the training dataset it is unclear how this shows better testtime performance the output of the policy network will simply be biased towards the training set as a result of the uncertainty cost more theoretical explanation is needed or perhaps some intuition this observation is also related to the fact that the model based controller used is essentially a risk sensitive controller docsepthe paper addresses the difficulty of covariate shift in modelbased reinforcement learning here the distribution over trajectories during is significantly different for the behaviour or datacollecting policy and the target or optimised policy as a mean to address this the authors propose to add an uncertainty term to the cost which is realised by the trace of the covariance of the outputs of a mc dropout forward model the method is applied to driving in dense traffic where even single wrong actions can be catastrophic i want to stress that the paper was a pleasure to read it was extraordinarily straightfoward to follow because the text was well aligned with the necessary equations the introduction and related work seem complete to me with two exceptions depeweg s hernandezlobato j m doshivelez f udluft s 2018 july decomposition of uncertainty in bayesian deep learning for efficient and risksensitive learning in international conference on 
machine learning pp 11921201 thomas philip s safe reinforcement learning diss university of massachusetts libraries 2015 the work by depeweg et al addresses quite the same question as the authors of this work but with a broader scope ie not limited to traffic but very much the same machinery there are some important theoretical insights in this work and the connection to this submission should be drawn in particular the proposed method needs to be either compared to this work or it needs to be clarified why it is not applicable the latter appears to be of less significance in this context but i found robust offline policy evaluation underrepresented in the related work i wonder if there is a way for a neural network to hack the uncertainty cost i suppose that the proposed approach is an approximation to some entropy term and it would be informative to see how exactly the approach shown by eq 1 appears to be an adhoc way of estimating whether the uncertainty resulting from an action is due to the data or the model what happens if this approach is not taken the objective function of the forward model is only given in the appendix i think it needs to be moved to the main text especially because the sumofsquares term indicates a homoskedastic gaussian for a likelihood this has implications for the uncertainty estimates see point above overall the separation of data uncertaintyrisk vs model uncertainty is not done this indicates that heterskedastic environments are candidats where the method can fail and this limitation needs to be discussed or pointed out further the authors did not observe a benefit from using a stochastic forward model especially if the prior instead of the approximate posterior is used my point would be that depending on the exact grapical model and the way the sampling is done to train the policy it is actually mathematically right to sample from the prior this is also how it is described in the last equation of section 2 summary overall i liked the paper and the way it was written however there are some shortcomings such as the comparison to the work by depeweg et al which does a very similar thing also justifying the used heuristics as approximations to a principled quantity would help it appears that the question why and how stochastic forward models should be used requires further investigationdocseppros the paper formulates the driving policy problem as a modelbased rl problem most related work on driving policy has been traditional robotics planning methods such as rrt or modelfree rl such as policy gradient methods the policy is learned through unrolling a learned model of the environment dynamics over multiple time steps and training a policy network to minimize a differentiable cost over this rolledout trajectory the cost combine the objective the policy seeks to optimize proximity to other cars and an uncertainty cost representing the divergence from the states it is trained on cons the model based rl formulation is pretty standard except that the paper has a additional model uncertainty cost realistically the output of driving policy should be planning decision ie the waypoints instead of steering angles and acceleration deceleration commands there does not seem to be a need to solve the control problem using learning since pid and ilqr has solved the control problem very well the paper did not seem to reach a conclusion on why stochastic forward model does not yield a clear improvement over the deterministic model this may be due to the limitation of the 
dataset or the prediction horizon which seems to be 2 second the dataset is only 45 minutes which captured by a camera looking down a small section of the road so the policies learned might only do lane following and occasionally doing collision avoidance i would encourage the authors to look into more diverse dataset see the paper desire distant future prediction in dynamic scenes with interacting agents cvpr 2017 overall the paper makes an interesting contribution formulate the driving policy problem as a modelbased rl problem the techniques used are pretty standard there are some insights in the experimental section however due to the limitation of the dataset it is not clear how much the results can generalize to complex settings such as nudging around other cars cutting in pedestrian crossing etc response to rebuttal it is good to know that the authors have a new modified vae posterior distribution for the stochastic model which can achieve significant gain over the deterministic model is this empirical and specific to this dataset without knowing the details it is not clear how general this new stochastic model is i agree that it is worthwhile to test the model using the 45 minute dataset however i still believe the dataset is very limiting and it is not clear how much the experimental results can apply to other large realistic datasets my rating stays the same
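for concreteness, the mc-dropout uncertainty penalty and the rolled-out training objective discussed in the reviews above can be sketched as follows. this is an illustrative reconstruction under assumptions of the editor's own (pytorch, a dropout-equipped forward_model mapping state and action to the next state, and a generic differentiable task_cost), not the authors' actual implementation.

```python
# Illustrative sketch (assumed names: forward_model, policy, task_cost).
# Epistemic uncertainty is estimated by Monte Carlo sampling the dropout-masked
# forward model; the penalty is the trace of the (diagonal) covariance of the
# sampled next-state predictions, added to the task cost along the rollout.
import torch


def mc_dropout_uncertainty(forward_model, state, action, n_samples: int = 10):
    forward_model.train()  # keep dropout active at prediction time
    preds = torch.stack([forward_model(state, action) for _ in range(n_samples)])
    return preds.var(dim=0, unbiased=False).sum(dim=-1)  # trace of covariance per sample


def rollout_cost(policy, forward_model, task_cost, state, horizon=20, lam=0.5):
    """Unroll the learned dynamics and penalise both task cost and uncertainty."""
    total = torch.zeros(())
    for _ in range(horizon):
        action = policy(state)
        total = total + task_cost(state, action).mean()
        total = total + lam * mc_dropout_uncertainty(forward_model, state, action).mean()
        forward_model.eval()  # dropout off for the step itself
        state = forward_model(state, action)  # deterministic prediction to step forward
    return total  # differentiable, so the policy network can be trained by backprop
```

the reviewers' concern is visible directly in this sketch: the penalty can only shrink by keeping rollouts close to the training distribution, so without collecting new data it biases the learned policy towards previously visited states, and it does not separate data (aleatoric) from model (epistemic) uncertainty in heteroscedastic environments.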
### Summary:
reviewers are in consensus and recommend acceptance after engaging with the authors please take the reviewers comments into consideration to improve your submission for the camera ready
3081,
1566,
11649,
2105,
50276,
6549,
18260,
253,
3453,
273,
6276,
3646,
943,
320,
7219,
3061,
26332,
253,
1039,
10801,
3185,
273,
21406,
14636,
285,
17680,
50276,
615,
6226,
3328,
13896,
627,
1057,
417,
1646,
281,
320,
247,
878,
281,
8415,
253,
1453,
1895,
970,
4715,
1580,
33786,
285,
4164,
50070,
556,
14042,
253,
1453,
1895,
1077,
973,
50275,
783,
2929,
858,
417,
1646,
281,
3986,
247,
6452,
327,
2139,
19191,
3579,
1566,
1057,
417,
4917,
247,
2590,
7756,
689,
253,
30027,
1566,
436,
778,
320,
1955,
281,
253,
12291,
273,
253,
10895,
390,
253,
10554,
16892,
534,
3133,
281,
320,
374,
1273,
50275,
783,
10895,
310,
760,
5329,
2909,
534,
10848,
407,
247,
6568,
2819,
1066,
247,
1355,
2593,
273,
253,
3971,
594,
253,
7823,
6311,
1537,
760,
513,
18209,
1563,
285,
13949,
2509,
15708,
28772,
891,
651,
11907,
253,
4477,
281,
1007,
715,
625,
11117,
10895,
923,
253,
2929,
8327,
13392,
2852,
10554,
275,
7870,
13451,
342,
18745,
6083,
30105,
1087,
4240,
50276,
1189,
455,
253,
2929,
2789,
271,
4722,
7680,
36803,
253,
6276,
3646,
1895,
347,
247,
1566,
3169,
391,
77,
1895,
253,
5609,
908,
403,
3965,
2629,
627,
403,
690,
16039,
275,
253,
5661,
2593,
2299,
1955,
281,
253,
12291,
273,
253,
10895,
352,
310,
417,
2590,
849,
1199,
253,
1543,
476,
39970,
281,
2570,
7533,
824,
347,
34408,
3390,
1475,
643,
8458,
9968,
275,
34792,
14270,
3966,
50276,
10927,
281,
30080,
22559,
352,
310,
1175,
281,
871,
326,
253,
4477,
452,
247,
747,
7321,
362,
3348,
12637,
3268,
323,
253,
19191,
1566,
534,
476,
5115,
1534,
6351,
689,
253,
30027,
1566,
310,
436,
16774,
285,
2173,
281,
436,
10895,
1293,
8958,
253,
4278,
352,
310,
417,
2590,
849,
2087,
436,
747,
19191,
1566,
310,
50276,
74,
5194,
326,
352,
310,
32811,
281,
1071,
253,
1566,
970,
253,
5329,
7017,
10895,
2299,
891,
1335,
2868,
253,
10895,
310,
1077,
14155,
285,
352,
310,
417,
2590,
849,
1199,
253,
5661,
1543,
476,
4647,
281,
643,
1781,
15958,
15302,
50276,
2577,
13716,
19931,
253,
1072,
50276,
187,
187,
4118,
18435,
27,
15337,
398,
403,
275,
247,
13969,
285,
8521,
281,
2997,
846,
15966,
342,
253,
4477,
4496,
1379,
30628,
5701,
715,
8180,
281,
3157,
634,
19529,
323,
253,
6568,
4704,
209
] |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper presents a postprocessing method to improve the gan results the postprocessing consists of a projected gradient descent step to update the generated image to fit its target class which works with both conditionalunconditional generative models for the unconditional generative model the authors design a debias method to force the model to uniformly generate images in each class the results show this model can significantly improve the fid and is scores on cifar10 dataset additionally the authors show this postprocessing also works for image interpolation advantages 1 the improvement on the cifar10 dataset is significant and the refined images look much better than baselines 2 the model can work with both conditionalunconditional generative models weakness 1 in algorithm 1 it is not clear how to use epsilon please add more details 2 in section 22 the robust classifier method optimizes both the input images and the network but in section 3 it only optimizes the input images the definition is not aligned 3 have the authors normalized the input images will the input pixel value between 01 or 0255 in the supplementary the epsilon of vae is 25 if the model uses a very large epsilon value will the refined image still look realistic 4 this paper only shows the cifar10 results will this method generalize to largescale and highresolution datasets such as imagenet currently people are interested in highresolution image generation for example biggan can generate highresolution images please the authors test this postprocession method on these generated highres images 5 the projected gradient descent has already been used in some image generation works 1 6 the fid and is scores of cganpd and biggan in this draft are different from the original papers please add more details also missing the definitions of 10k and 50k 1 xia weihao et al gan inversion a survey arxiv preprint arxiv210105278 2021 given the results and novelty are marginal i think this paper is between boardline please the authors address my questions i am happy to change my score docsepthis paper proposes a modelagnostic method for improving the quality of images produced by generative models the method requires a robust classifier trained on the same data source as the generative models based on the perceptually aligned gradients phenomenon the proposed method improves the quality of a generated image by the targeted projected gradient descent method with the help of the robust classifier experiments on cifar10 show that the proposed method does improve several generative models both quantitively and qualitatively the proposed method has some nice properties it is modelagnostic and can be used to improve generative models without retraining the only requirement is a robust classifier trained on the same dataset as the generative models as stated in the paper training a classifier is much easier than training a generator even better the same classifier can help all generative models trained on the dataset these characteristics are attractive in applications there are two weaknesses of the paper first the experiments are on the lowresolution cifar10 dataset although the method is effective on the cifar10 dataset it is uncertain how well the proposed method performs on normal quality images for 32x32 images it is impossible to check out details also lowresolution images are not very useful in real applications as classifiers could focus more on highlevel features it is not clear how they help synthesize realistic details for 
highresolution images second as stated in the paper the method is similar to prior work santukar et al 2019 and turner et al 2019 the paper lists three differences 1 the proposed method builds upon the generated images 2 the proposed method is modelagnostic and 3 the proposed method can be performed on every image generated by the generative models without throwing out anyone although the technical novelty is not significant the proposed method has advantages and could be useful in applications overall i like the proposed method because it is modelagnostic simple and effective however as stated in the previous section since experiments are only performed on lowresolution images it is not clear how the proposed method performs on images with ordinary resolutions and quality novelty is another potential issue although i think that the paper has made sufficient contribution given its nice characteristics listed above docsepthe paper proposes bigroc boosting image generation via a robust classifier a method to refine samples from a base generative model using a robust classifier in this context a robust classifier means one that has been trained to be robust to adversarial samples given a robust classifier and a base generative model both trained on the same dataset the key idea involves using the gradients of the robust classifier to update samples generated from the base generative model in the direction that maximizes the conditional probability of the samples class experiments on the cifar10 dataset have been presented that demonstrate that bigroc improves the sample quality from several base generative models in terms of the fid and is metrics writing the manuscript is wellwritten and easy to understand novelty significance and prior work the authors have explored an interesting idea of using robust classifiers for sample refinement however the key technical contribution is minor compared to santurkar et al 1 it only involves using a robust classifier on pretrained generative models one might argue for the novelty of application however the authors have ignored several works that address the problem of sample refinement eg using gan discriminators 2 3 4 in contrast to these works bigroc also has the additional overhead of training a robust classifier moreover these prior works bring in new technical perspectives to the problem of sample refinement eg from optimal transport 2 energybased modeling 3 gradient flows 4 unlike the proposed method which hardly brings in any new technical insights empirical evaluation it is appreciable that the experiments have been conducted on several base models however only the cifar10 dataset has been studied it is unclear how the method will perform on other datasets how sensitive is the method to the training of the robust classifier what is the additional time overhead of training a robust classifier furthermore the method has not been evaluated against any baseline methods eg 2 3 4 1 santurkar shibani et al image synthesis with a single robust classifier arxiv preprint arxiv190609453 2019 2 tanaka akinori discriminator optimal transport arxiv preprint arxiv191006832 2019 3 che tong et al your gan is secretly an energybased model and you should use discriminator driven latent sampling arxiv preprint arxiv200306060 2020 4 ansari abdul fatir ming liang ang and harold soh refining deep generative models via discriminator gradient flow arxiv preprint arxiv201200780 2020 the paper explores an interesting idea of using robust classifiers to refine samples from 
generative models however the technical contribution is limited several directly relevant methods of sample refinement have neither been discussed nor compared against overall the contribution is not significant enough to warrant acceptance
### Summary:
|
the paper proposes to improve generated images via a postprocessing update procedure guided by gradients from a robust classifier after the author response and discussion all reviewers agree that the paper is below the acceptance threshold reviewer concerns include limited technical novelty and missing experimental comparison to relevant baselines
|
[30003, 310, 1677, 2278, 273, 247, 2561, 2929, …, 1666, 25379] |
[1, 1, 1, …, 1] |
[30003, 310, 1677, 2278, 273, 247, 2561, 2929, …, 1666, 25379] |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
in summary this paper does the following the initial problem is to analyze the trajectory of sgd in training anns in the space of p of probability measures on y times y this problem is interesting but difficult the paper constructs a markov chain that follows a shortest path in tv metric on p the alpha smlc through experiments the paper shows that the trajectories of sgd and alphasmlc have similar conditional entropy my issues with this paper are a the main result is a simulation how general is this could it depend on the dataset could you provide some intuition or prove that for certain dataset these two trajectories are the same or very close b meaning of this trajectory this is not the trajectory in p it is the trajectory of the entropies in general is there an intuitive explanation on why these trajectories are similar and what does it mean for example what would be a possible implication for training sgd could it be that all learning methods will have this characteristic parabolic trajectory for entropies c the theoretical contribution is minor both the techniques and results quoted are known overall i think the paper lacks a takeaway it is an interesting observation that the trajectory of alphasmlc is similar to that of sgd in these plots but the authors have not made a sufficient effort to interpret this docsepthis paper study the trajectory of hhaty versus hhatyy on the information plane for stochastic gradient descent methods for training neural networks this paper was inspired by ziv and tishby 17 but instead of measuring the mutual information ixt and iyt this paper proposed to measure hhaty and hhatyy which are much easier to compute but carries similar meaning as iyt and ixt the interesting part of this paper appears in section 4 where the author makes a connection between the sgd training process and alphasmlcstrong markov learning chain smlc is just simply linear combination of the initial distribution and the final stable distribution of the labels the authors show that the trajectory of the real experiment is similar to that of smlc generally i think the paper is wellwritten and clearly present the ideas here are some pros and cons pros 1 the trajectory presented in this paper is much more reliable than that in ziv and tishby 17 since measuring the entropy and conditional entropy of discrete random variables are much easier also it is easy for people to believe that the trajectory holds for various neural network structure and various activation functions pros 2 the connection to smlc is interesting and it may contain lot of insights cons 1 one of my major concern is if you look at the trajectory of the experiment vs smlc figure 3 they look similar at first glance but if you look at it carefully you will notice that the color of them are different for sgd the trajectory goes to the turning point very soon usually no more than 10 of the training steps whereas smlc goes to the turning point much slower how do the authors think about this phenomenon and what does this mean cons 2 this paper is going to be more meaningful if the author can provide some discussions especially about 1 what does the shape trajectory mean 2 what do the connection between the trajectory and markov chain means 3 how can these connections be potentially useful to improve training algorithm i understand that these questions may not be clearly answerable but the authors should make this paper more inspiring such that other researchers can think deeper after reading this paper cons 3 i suggest the 
authors using sgd instead of gd throughout the paper usually gd means true gradient descent but the paper is talking about batched stochastic gradient descent gd does not have markovity generally i think the paper is on the borderline i think the paper is acceptable if the author can provide more insights against cons 2docsepthe paper tries to describe sgd from the point of view of the distribution pyy where y is a possibly corrupted true classlabel and y a model prediction assuming tv metric of probabilities a trajectory is defined which fits to general learning behaviour of distributions the issue is that the paper abstracts the actual algorithm model and data away and the only thing that remains are marginal distributions py and conditional pyy at this point one can already argue that the result is either not describing real behavior or is trivial the proposed trajectory starts with a model that only predicts oneclass low entropy hy and high conditional entropy and ends with the optimal model the trajectory is linear in distribution space therefore one obtains initially a stage where hy and hyy increase a lot followed by a stage where hyy decrease this is known to happen because almost all models include a bias on the output thus the easiest way to initially decrease the error is to obtain the correct marginal distribution by tuning the bias learning the actual classlabel depending on the observed image is much harder and thus takes longer therefore no matter what algorithm is used one would expect this kind of trajectory with a model that has a bias it also means that the interesting part of an analysis only begins after the marginal distribution is learned sufficiently well and here the experimental results deviate a lot from the theoretical prediction while showing some parabola like shape there are big differences in how the shapes are looking like i dont see how this paper is improving the state of the art most of the theoretical contributions are well known or easy to derive there is no actual connection to sgd left therefore it is even hard to argue that the predicted shape will be observed independent of dataset or modelone could think about a model which can not model a bias and the inputs are meanfree thus it is hard to learn the marginal distribution which might change the trajectory therefore i vote for a strong reject
### Summary:
|
the paper proposes a quantity to monitor learning on an information plane which is related to the information curves considered in the bottleneck analysis but is more reliable and easier to compute the main concern with the paper is the lack of interpretation and elaboration of potential uses a concern is raised that the proposed method abstracts away way too much detail so that the shapes of the curves are to be expected and contain little useful information see anonreviewer2 comments the authors agree to some of the main issues as they pointed out in the discussion although they maintain that the method could still contain useful information the reviewers are not very convinced by this paper with ratings either marginally above the acceptance threshold marginally below the acceptance threshold or strong reject
|
[30003, 310, 1677, 2278, 273, 247, 2561, 2929, …, 12009, 50275] |
[1, 1, 1, …, 1] |
[30003, 310, 1677, 2278, 273, 247, 2561, 2929, …, 12009, 50275] |
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper empirically studies various cnn robustifying mechanisms aiming to achieve rotational invariance the main finding is that such robustifying mechanisms may lead to lack of robustness against pixellevel attacks such as fgsm and its variants the paper does a comprehensive job in studying relevant robustifying schemes and attacks strategies however the paper does not present sufficiently new information worthy of a regular conference paper it can be a good workshop paper though for the robust learning community some analytical insights would really strengthen the work also from an empirical standpoint the authors need to consider other data sets beyond just the mnist data set xxxxxxxxxxxxxx while i appreciate the authors rebuttal and revisions i still do not see sufficient contribution here worthy of a regular iclr paper docsepusing the dataset mnist the authors empirically studied the robustness of several rotationequivariant neural network modelsgcnn hnets ptn et al to geometric transformation and small pixelwise perturbations their experiments showed that the equivariant network modelsstdcnns gcnns hnets et al are robust to geometric transformation but vulnerable to pixelwise adversarial perturbations these findings help us understand the neural network models better however this paper is not acceptable due to lack of innovation and novelty docsepthis paper empirically studies the robustness of equivariant cnns to rotations as well as adversarial perturbations it also studies their sample efficiency parameter efficiency and the effect of rotation and adversarial augmentation during training andor testing the main findings are 1 rotationequivariant networks are robust to small rotations even if equivariance to small rotations is not directly built into the architecture 2 applying rotational data augmentation increases robustness to rotations 3 equivariant networks are more sample efficient than cnns and outperform them for all dataset sizes 4 applying rotational data augmentation decreases robustness to adversarial perturbations and this effect is more pronounced for gcnns if true this is a valuable addition to the literature it is one of the first independent validations of claims regarding sample complexity and accuracy made by the authors of the various equivariant network papers performed by a party that does not have their own method to promote many of the findings do not have an obvious explanation so the data from this paper could conceivably prompt new theoretical questions and investigations the authors chose to highlight one finding in particular namely that gcnns become more sensitive to adversarial perturbations as they are trained on more heavily rotationaugmented data however this appears to be true for both cnns and gcnns the difference being only in degree see fig 4 10 11 this is not apparent from the text though as eg the abstract notes that robustness to geometric transformations in these models equivariant nets comes at the cost of robustness to small pixelwise perturbations since hnets gcnns and roteqnets should be exactly equivariant to 90 degree rotations and some others perhaps it is surprising that figure 1 shows a continuing decline in performance with bigger and bigger random rotations if the network is made rotation invariant through some pooling layer at the end of the network one would expect to see a decline in performance up to 45 degrees followed by an increase back to baseline at 90 degrees etc polar transformer networks achieve good results in fig 
1 but i wonder if this is still true for rotations around points other than the origin since cnns and gcnns differ in terms of the number of channels at a certain number of parameters and differ in terms of number of parameters at a certain number of channels it could be that channel count or parameter count is the more relevant factor rather than equivariance so it would be good to make a scatterplot where each dot is a network either cnn or gcnn at various model sizes the xaxis is parameter count or in another plot 2d channel count and the yaxis corresponds to the accuracy this can be done for various choices of augmentation perturbation the type of network cnn or gcnn could be color coded if indeed the cnngcnn variable is relevant that should be clearly visible in the plot and similarly if the parameter count or channel count is relevant one could also do a linear regression of accuracy or logaccuracy or something using cnngccn paramcount channelcount as covariates and report the variance explained by each in several plots eg fig 4 8 the yaxes do not have the same range making it hard to compare results between subplots the experiments have some weaknesses for one thing it seems like each accuracy value reported comes from a single training run it would be much preferable to plot mean and standard deviation error bars another weakness is that all experiments are performed on mnist even just a simple validation of the main findings on cifar would significantly strengthen the paper because of the limited scope of the experiments it is not clear to me how generalizable and robust the experimental results are with deep network performance it can be hard to know what the relevant hyperparameters are and so we may well be reading tea leaves here it is also unfortunate that no explanation for the observed phenomena is available however it is conceivable that the findings presented in this paper could help researchers who are trying to understand adversarial attacks robustness so it is not a fatal flaw i am certainly glad the authors did not make up some unsupported story to explain the findings as is all too common in the literature these days overall i consider this a borderline paper and am tending towards a reject my main considerations are 1 uncertainty about generalizability 2 uncertainty about usefulness to practitioners or theorists admittedly this is hard to predict but no clear usecase is available at this point 3 a lot of data but no clear central finding of the paper
### Summary:
|
positives the paper proposes an interesting idea to study the effect on vulnerability to adversarial attacks of training for invariance with respect to rotations experiments on mnist fashionmnist and cifar10 an interesting hypothesis partially borne out in experiments negatives no accept recommendation from any reviewer insufficient empirical results not a clear enough message very limited theoretical contribution although additional experimental results on fashionmnist and cifar10 were added to the initial very limited results on mnist the main claim of the paper seems to be somewhat weakened the effect of increased vulnerability to adversarial attacks as invariance is increased is less pronounced on the additional datasets this calls into question how relevant this effect is on more realistic data than the toy problems considered here the size of the network is not varied in the experiments if increased invariance results in poorer performance with respect to attacks one possible explanation is that the invariance taxes the capacity of the network architecture varying architecture depth could partially answer whether this is relevant given the lack of theoretical contribution more insights along these lines would potentially strengthen the work the title uses the term equivariance which strictly speaking is when the inputs and outputs of a function vary equally eg an image and its segmentation are equivariant under rotations but classification tasks should probably be called invariant the reviewers were unanimous in not recommending the paper for acceptance the key concerns remain after the author response
|
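One concrete way to act on the reviewer's suggestion in the row above (separating the effect of the CNN vs. G-CNN choice from parameter count and channel count) is a small regression analysis. The sketch below is only an illustration under assumed names: the `results` records and accuracy values are placeholders, not numbers from the paper under review.

```python
# Hypothetical sketch of the analysis the reviewer proposes: regress accuracy on an
# architecture indicator (CNN vs. G-CNN), log parameter count, and log channel count,
# then report how much explained variance each covariate contributes.
import numpy as np
from sklearn.linear_model import LinearRegression

# placeholder experiment log: (is_gcnn, n_params, n_channels, test_accuracy)
results = [
    (0, 25_000, 16, 0.951),
    (1, 25_000, 8, 0.967),
    (0, 50_000, 24, 0.963),
    (1, 50_000, 12, 0.974),
    (0, 100_000, 32, 0.972),
    (1, 100_000, 16, 0.981),
]
X = np.array([r[:3] for r in results], dtype=float)
y = np.array([r[3] for r in results])
X[:, 1] = np.log10(X[:, 1])   # parameter count on a log scale
X[:, 2] = np.log10(X[:, 2])   # channel count on a log scale

full = LinearRegression().fit(X, y)
names = ["is_gcnn", "log10_params", "log10_channels"]
print("R^2 with all covariates:", full.score(X, y))
for i, name in enumerate(names):
    X_red = np.delete(X, i, axis=1)
    reduced = LinearRegression().fit(X_red, y)
    print(f"R^2 drop without {name}:", full.score(X, y) - reduced.score(X_red, y))
```

The same records can feed the scatter plot the review describes, with parameter count on the x-axis, accuracy on the y-axis, and the architecture indicator as the colour.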
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
reproducibility summary present scope of reproducibility clearly stated code reused author repository communication with original authors yes hyperparameter search yes includes new hyperparameters not tried by the authors ablation study comprehensive discussion on results extensive discussion detailed description of the reproduciblenonreproducible parts recommendations for reproducibility useful criticism for the authors results beyond the paper yes overall organization and clarity gooddocsepthis submission investigates the reproducibility of a recent work on attacks targeted at fairness in algorithms the authors assess the reproducibility of five main claims in the original work and their analysis supports three of these claims the authors also extend the original work by implementing new baseline attacks for added robustness analysis and suggesting a modification to influence the performance of certain types of attacks quality clarity the report very clearly and concisely states the results and the scope of reproducibility and adheres to it technical content is balanced by clear structure and discussion types of attachs and main claims to explore are stated clearly concrete shortcomings in the original reporting and code are pinpointed and this provides good guidance for improving reproducibility in computational analyses and reporting including supporting visualizations and a summary table of the issues and newly implemented solutions the authors have done notable additional work to improve the executability and documentation by modifying the original code base originality significance communication with the original authors is reported and it has been relevant for reproducing the results this work provides partial support to the claims in the original work but also identifies many shortcomings in the reporting and analysis reproducibility detailed reproducibility analysis steps are included and it is indicated that the original report and source code do not seem to be sufficient for a full replication code availability codebase is available via an anonymized github repository and seems clearly documented pros the reporting is sufficiently comprehensive and easy to follow despite the inevitable technicality of the content and it includes useful remarks and recommendations that support independent reproducibility analyses good use of text structuring illustrations and summary table help to follow the text notable extra work to improve the executability and documentation on top of the original code base additional new contributions include two new baseline attacks and a modification that can influence efficiency of certain types of attacks source code of the reproducibility analysis is available cons is the license copyright information is uptodate source code of the reproducibility analysis has an open license but the license file names the author of the original work as the copyright holder but according to the authors of the reproducibility report the original code has been remarkably augmented table 2 makes somewhat strong claims about the nonreproducibility of the results strong support was not identified and there is limited reproducibility no reproducibility is a stronger claim and concerns a specific limited setup this statement could be mitigated to also acknowledge the limitations of the reproducibility analysis itself
### Summary:
|
the paper has very strong reviews i agree with the reviewers that the paper has an exceptional quality in reproduction of the original results and clarity in presentation i recommend for its acceptance
|
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the paper proposes an rso random search optimization method for training deep neural networks this method is gradientfree and based on the markov chain monte carlo search in particular it adds a perturbation to weight in a deep neural network and tests if it reduces the loss on a minibatch the weight is updated if this reduces the loss and retained otherwise merits of the paper this paper shows that repeating the rso process a few times for each weight is sufficient to train a deep neural network as a result the number of weight updates is an order of magnitude lesser when compared to backpropagation with sgd it can make aggressive weight updates in each step as there is no concept of learning rate the weight update step for individual layers is also not coupled with the magnitude of the loss rso is evaluated on classification tasks on mnist and cifar10 datasets where it achieves competitive accuracies issues of the paper one potential issue is that the method is only evaluated on relatively small networks i wonder how it works for larger networks such as resnet50 and resnet101 the current figures only show the comparison in terms of training iterations i would like to see the comparison in terms of training time ie accuracytime curve it would be interesting to see the performance of rso on imagenet andor coco i have read the response and the rating is not changeddocsep summary instead of backpropagation the authors consider a randomized search heuristic to train the parameters of neural networks the proposed work is based on the hypothesis that the initial set of neural network weights is close to the final solution the authors identify the problem that existing randomized search methods update all the parameters of the network in each update thus the proposed method updates only a single weight per iteration experimental results on mnist and cifar10 show that the proposed method delivers competitive results reasons for score overall i vote for rejecting indeed investigating alternative learning methods for deep architectures is highly relevant my major concern is about the novelty of the paper and formal presentation see cons below i do not expect that the authors can address my concern in the rebuttal period pros strong 1 investigating alternative learning methods for deep architectures is highly relevant 2 nice practical implementation details are provided like parallel computation or clever caching 3 this paper provides experiments on wellknown benchmark data sets the results suggest that randomized search heuristics can work well for training the weights of deep neural networks weak 1 highly relevant theoretical work in this field is not referenced or discussed eg nesterovs efficiency of coordinate descent methods on hugescale optimization problems or work on randomized search in general like evolution strategies a comprehensive introduction by hansgeorg beyer and hanspaul schwefel 2 the novelty is unclear to me maybe the presentation is suboptimal but i do not see any novel methodology here or insight to make this more clear it is well known that randomized search heuristics work very well on a wide variety of optimization problems both combinatorial and numerical the real open question in this field is to prove under which conditions these methods will work and under which conditions these methods will not work what is the expected number of objective function evaluations what is the probability that a solutions that is epsilonclose to a good solution is found in polynomial time 
for me answering any of these questions would make the paper at hand acceptable 3 the formal presentation regarding classic an recent results can be improved see below some statements are unfortunate eg on page 2 the authors state the research community has started questioning the commonly assumed hypothesis if gradient based optimizers get stuck in local minima however this is far from being a commonly assumed hypothesis it is a matter of fact from numerical optimization that gradient based optimizers willforsure get stuck in local minima the assumption is that these local minima are bad solutions the authors must be careful with such statements since many researchers spent decades to reveal insights into numerical optimization which shall not be ignored by todays scientists this impreciseness in statements appears for recent works as well on page 3 the authors state that weight agnostic neural networks wann also searches for architectures but keeps the set of weights fixed this is however not correct wanns evaluate the expected performance of a model over various parameters which are shared by all connections computing the expectation over multiple parameters is far from keeping weights fixed docsepin this paper rather than training a dnn using sgd the proposed idea is to perturb the weights of the network and accept the perturbation if it improves the performance this naive idea seems to perform almost as well as sgd and an order of magnitude faster see below while i commend the authors for bringing up the fact that such training is possible which can be very practical in ram bound settings there are at least two things that i would like to see fixed markov chain monte carlo is mentioned in the abstract and never discussed again though the method certainly resembles mcmc to some degree either you make this connection explicit and say under which probabilistic model this approach corresponds to mcmc and potentially connect the values sampled over time with a posterior density or eliminate the references to mcmc as of now you simply mention it but it begs a question more than answers anything why would it be meaningful to compare sgd and rso in terms of cycles given that cycles have vastly different computational costs for sgd and rso in fact a comparison in terms of cycles would be useful but using the same update schedule for sgd that you use for rso which of course would make sgd the competing approach take a higher computational cost per cycle right now you are stating that rso is an order of magnitude faster when measured in a unit cycles that is much more costly for rso that statement is meaningless so a show a fair comparison for instance accuracy vs compute time using a stateoftheart gpuoptimized version of sgd and also of rso b given that a is done it would also be useful to show the current version of accuracy vs cycle time if you consider the two potential versions of sgd cycles parallel updates and sequential updates that might make rso not seem as good compared with sgd but would be much more useful to judge when rso is to be preferred to sgd edit score modified after reading the authors reply edit 2 regarding the issue of optimization getting stuck in local optima or finding global minima moved my comment here for visibility i thank the authors for their flexibility on this issue but id like to weigh in on this saying that from my perspective those statements were actually accurate and useful and should be kept in the paper as they existed originally they are properly backed 
up by citations unlike the comment of getting stuck for sure in a local optimum which is very much dependent on the optimization problem i am willing to follow up with the other reviewers should they consider i am mistaken i have also increased my score after reading the authors response and i agree that the final decision should not depend on this specific issue if the paper is accepted id like it to include the original statement which can be very informative for some readersdocsepthis paper discusses a possible method for training a deep neural network without using backpropagation backpropagation has been very successful in minimizing output errors for both supervised and unsupervised methods and has stood up to challenges from other methods the motivation for finding suitable replacements or approximations is to reduce the computational complexity of training neural networks this can take the form of reducing the size of the nn required to accomplish a task or simply train an nn with a smaller number of operations i believe this is a very important new topic to find viable alternatives to backprop these kinds of methods have advantages on betterutilizing memory bandwidth making cheaper hardware more relevant to the training side of nns the authors do a good job of giving background by citing node perturbation methods lottery ticket hypothesis and genetic methods they all appear to be pointing to an underlying concept that random initializations in overparameterized networks already have embedded sparse representations the main result of the paper is that a small number of sequential weight updates using the authors proposed algorithm rivals the performance of an established method like backpropagation the proposed algorithm is simply to perturb weights from a randomly initialized neural network and keep the perturbation if it reduces the loss on a minibatch this relies on an assumption that a randomly initialized network is close to a final solution i really enjoyed this paper nearly every question i asked myself while reading it was answered in a subsequent section as pointed out this is the first step at a new concept as with any good paper this paper begets a lot more questions than it completely answers suggestions section 3 what is the motivation for using a gaussian distribution to initialize weights not that i see anything wrong with that but is there some reason this might be better or worse than other initializations section 3 we first update the weights of the layer closest this could be an area of additional research as to where to update first if we look at neuroscience we see that layers closer to inputs seems to learn first so might be good to include some discussion on that here section 4 these are good networks to start with but i would like to see larger networks that are more relevant to problems todaytransformers being trained to useful levels using this method could be a huge and important step section 41 it could strengthen the paper to include some analysis on the number of mac operations required and the number of readswrites to memory for sgd vs rso this could be useful in this paper or a subsequent one section 42 some theory likely needs to be developed here it would good to add some discussion about the tradeoffs between these options i believe this is more for future work section 45 if the rso algorithm is more amenable to parallelism that could be an important advantage some discussion of that vs sgd could also build a stronger casezs
### Summary:
|
the paper proposes a variant derivativefree optimization algorithm that belongs to the family of evolution strategies es and zeroorder optimization algorithms to train deep neural networks the proposed random search optimization rso perturbs the weights via additive gaussian noise and updates the weights only when the perturbations improve the training objective function unlike the existing es and blackbox optimization algorithms that perturb all the weights at once rso perturbs and updates the weights in a coordinate descent fashion rso adds noise to only a subset of the weights sequentially layerbylayer and neuronbyneuron the empirical experiments demonstrated rso can achieve comparable performance when training small convolutional neural networks on mnist and cifar10 the paper contains some interesting ideas however there are some major concerns in the current submission 1 novelty there is a wealth of literature in optimization neural networks via derivativefree methods the proposed algorithm belongs to evolution strategies and other zeroorder methods rechenberg eigen 1973 schmidhuber et al 2007 salimans et al 2017 unforunately among all the rich prior works on related algorithms only salimans et al 2017 is merely mentioned in the related works furthermore the experiments only compared against sgd rather than any other zeroorder optimization algorithms many ideas in algorithm 1 was proposed in the prior es literature evaluate the weights using a pair of noise deltaw and deltaw in alg1 line1314 is known as antithetic sampling geweke 1988 also known as mirrored sampling brockhoff et al 2010 in the es literature update the weights by considering whether the objective function has improved or not was proposed in wierstra et al 2014 that is known as fitness shaping given the current submission it is difficult to discern the contribution of the proposed method when compared to the prior works in addition the convergence analysis of the zeroorder optimization was studied in duchi et al 2015 that includes the special coordinate descent version closely related to the proposed algorithm 2 experiments although the experiments showcase the performance of sequential rso the xaxis in figure 4 only reported the iterations after updating the entire network the true computational cost of the proposed rso is the forwardpass x parameters that is much more costly than the paper currently acknowledges also rso requires drawing 5000 random samples and perform forwardpasses on all 5000 samples for every single weight update it will be a great addition to include the multiplications and computation complexity of rso and the baseline algorithms more importantly the paper only compared rso with sgd in all the experiments it will significantly strengthen the current paper by including some of the existing es algorithms in summary the basic idea is interesting but the current paper is not for publication and will need further development and nontrivial modification
|
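as a concrete illustration of the perturb-and-keep update described in the review and summary above, here is a minimal pytorch-style sketch of one random-search weight sweep; it is a simplified reading of the described procedure, not the authors code: the name rso_step, the tensor-level sweep (rather than the reported layer-by-layer, neuron-by-neuron schedule with many candidate samples per weight), and the fixed noise scale sigma are assumptions introduced here for brevity.

```python
import torch

def rso_step(model, loss_fn, x, y, sigma=0.05):
    """Perturb each weight tensor with Gaussian noise and keep the perturbation
    only if it lowers the minibatch loss, trying the antithetic pair +noise / -noise."""
    with torch.no_grad():
        best_loss = loss_fn(model(x), y).item()
        for p in model.parameters():
            noise = sigma * torch.randn_like(p)
            for delta in (noise, -noise):
                p.add_(delta)                      # apply candidate perturbation
                trial_loss = loss_fn(model(x), y).item()
                if trial_loss < best_loss:
                    best_loss = trial_loss         # improvement: keep the perturbation
                    break
                p.sub_(delta)                      # no improvement: revert
    return best_loss
```

repeatedly calling rso_step on fresh minibatches corresponds to the sequential weight updates discussed above; instrumenting the number of forward passes it performs is one way to make the compute comparison with sgd that the reviewers ask for.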
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper proposes an algorithm for multiagent learning by sharing gradient information but not data among agents this algorithm can achieve collaborative learning while keeping data privacy this paper also shows this algorithm can converge in the limit and provides supporting numerical experiments strengths 1 this algorithm solves the featuredistributed optimization problem which is valuable and practical 2 the algorithm shares gradient information rather than data which preserves the privacy 3 this paper proves the algorithm converges in the limit 4 the numerical experiments are comprehensive and convincing weaknesses 1 the theoretical analysis may be able to be improved the current proof only shows convergence in the limit maybe one can obtain an explicit convergence rate na docsepthis paper studies collaborative learning between organizations without sharing data models and even objective functions the authors propose a gradient assisted learning gal method that collaboratively optimizes the aggregation of local loss functions while each worker iteratively fits the gradients of the overarching objective function the authors prove that the proposed gal can achieve performance close to centralized learning when all data models and objective functions are fully disclosed the empirical results support this argument pros this paper addresses a real and timely problem of collaboratively optimizing the aggregation of local loss functions of different organizations while each worker iteratively fits the gradients of the overarching objective functions the paper is wellwritten and easy to follow cons i find the setting is a bit unrealistic and confusing and needs further clarification are there any restrictions on the discrepancy between the data distributions of different organizations in some extreme cases would it be possible that the gradients from other devices cannot assist or have bad effects the experiments are not comprehensive enough can any other existing method be applied here i understand that these methods are not designed for this setting and this may be rigorously reasonable but i find it would be helpful to discuss them limitations are duly discussed while no potential negative societal impact is identified docsepthis paper proposes the gradient assistive learning gal algorithm for a multiorganization collaborative learning setting the proposed algorithm can be considered as a distributed version of the gradient boosting algorithm the authors provide convergence analysis and experimental results to show the benefit of the proposed algorithm strengths the paper studies an interesting distributed learning scenario which i think is relatively less explored in the community the proposed algorithm is a reasonable extension of prior work on assistive learning this paper also builds a connection between the gal algorithm and the classic gradient boosting algorithm the paper is relatively wellwritten and easy to follow authors provide results on a variety of datasets i also appreciate that the authors provided detailed descriptions of the experiments in the appendix as well as the results in the privacy enhancement setting i did not check the proof details but the theoretical result looked sound overall i think this is a good paper weaknesses for the mnist and cifar experiments authors assume that the images are split into patches and different patches are stored in different organizations i dont think this is a realistic experimental setup and its unclear why different organizations only store a
patch of a natural image i would like to see a more realistic scenario where this method can be applied in deep learning models authors mentioned that they do not foresee any negative societal impacts i agree with this docsepgradient assisted learning gal a decentralized collaborative supervised learning methodology is proposed gal works to assist a particular organization dubbed alice with their learning problem by utilizing other organizations learning capabilities but without sharing any sensitive information such as models or raw data between any two organizations this is accomplished by iteratively having each helper organization learn a model of the pseudoresidual using their own data and sending the models prediction back to alice at each iteration alice updates the overall model to be learned by incorporating these predictions and optimizing their associated weightings at testtime alice queries the helper organizations for their predictions and feeds those predictions through the learned weightings to arrive at a prediction a theorem is given that states that under mild technical conditions the loss of the gal model converges as the number of assistance iterations approaches infinity to the optimal loss amongst all models in the span of the organizations model function classes experiments on a variety of datasets shows that gal outperforms prior stateoftheart decentralized methods with performance approaching that of centralized learning paradigms strengths s1 quality the authors formulate their methodology into a technically sound and rigorous mathematical and algorithmic framework the experiments are thorough and demonstrate the strengths of the proposed method in a wide range of applications s2 clarity the paper is wellmotivated and easy to read weaknesses w1 originality the proposed methodology is extremely similar to the prior assisted learning al work 1 providing what seems to be only incremental extensions to the technical aspects of the al framework see below for more details regarding this concern neutral n1 significance since the proposed method gal applies to more general problems eg classification than al and is shown to outperform al in the considered benchmarks the reviewer finds that the paper advances the practical strength and performance of the literatures decentralized collaborative learning methods however the reviewer believes that the approaches taken by the authors to extend al to gal ie replacing regression residuals with classificationregression pseudoresiduals and optimizing the model weights and learning rates are not significant conceptualtheoretical advancements 1 xian xun et al assisted learning a framework for multiorganization learning neurips 2020 the authors do a good job at describing their limitations namely that their theoretical convergence result is not sufficiently strong to describe why gal converges after only a few iterations of assistance and that it may be possible to infer sensitive information from the shared pseudoresiduals the authors demonstrate that upon adding moderate amounts of noise to the pseudoresiduals to take care of the latter limitation the performance of gal does not significantly deteriorate and that performance is still better than that without collaboration however it would be interesting to discuss and potentially test the issue of adversarial sensitivity what happens if one of the organizations is malicious or subject to an adversarial attack either in their data or in the predictions that they feed back to alice 
relative to traditional adversarial machine learning settings wherein the adversary is typically assumed access to just one mode of attack at a time eg the input data at test time the proposed decentralized learning paradigm potentially offers an adversary more avenues for attack eg at a particular organizations input data and at the various predictions fed to alice
### Summary:
|
this paper proposes and analyzes a new approach gradient assisted learning to collaborative training when features are split across several organizations although the work has some limitations all reviewers agreed that the paper contains a strong sound contribution and should be accepted the revisions made during the postrebuttal discussion already address the majority of the concerns raised which were about presentation of the results and clarifying some points wrt related work
|
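the reviews and summary above describe gal as essentially a distributed form of gradient boosting: alice forms pseudoresiduals of an overarching loss, each helper organization fits a local learner on its own feature block, and alice aggregates the returned predictions. the following is a minimal sketch of one assistance round for the squared-error case; it assumes scikit-learn decision trees as local learners and equal aggregation weights, whereas the paper reportedly optimizes the aggregation weights and learning rates, and the names gal_round and helper_features are illustrative rather than taken from the paper.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def gal_round(y, current_pred, helper_features, learning_rate=0.1):
    # pseudo-residuals: negative gradient of 0.5 * (y - pred)^2 w.r.t. the prediction
    residuals = y - current_pred
    helper_models, helper_preds = [], []
    for X_m in helper_features:              # one feature block per helper organization
        h = DecisionTreeRegressor(max_depth=3).fit(X_m, residuals)
        helper_models.append(h)
        helper_preds.append(h.predict(X_m))
    update = np.mean(helper_preds, axis=0)   # equal weights; the paper learns these instead
    return current_pred + learning_rate * update, helper_models
```

iterating gal_round and accumulating the updates corresponds to the assistance iterations whose convergence the papers theorem addresses; note that only residuals and predictions cross organization boundaries, which is the privacy argument the reviewers discuss.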
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
scope problem setting the paper studies the problem of representation learning from multiview data from an identifiability perspective in particular the focus is on latent correlation maximisation approaches as commonly used in nonlinear canonical correlation analysis cca and artificial multiview selfsupervised learning amssl the paper postulates a generative model by which views x1x2 are generated by invertible functions fi applied to a mixture of independent shared z and private ci components xifizci for i12 theory it is shown that latent correlation maximisation with a suitable objective and constraints as well as with invertible encoders identifies or separates both i the shared component z and ii the private components c1c2 up to arbitrary invertible functions further iii a finite sample analysis is provided algorithm experiments the paper proposes a learning objective that combines latent correlation maximisation with a regulariser and a reconstruction objective to encourage independence among zc1c2 and invertibility of the encoders several experiments on synthetic data and image benchmarks are used to validate the theory and compare the proposed method against dcca barlowtwins and byol strengths the paper presents sound i only checked the proof of thm 1 in detail and could not find any major flaws theory proving identifiability of both shared and private components for the assumed multiview setting a finite sample analysis and result is provided which is rare in this area and thus appreciated the paper is overall wellwritten and well structured based on the presented theory a new learning method is developed and implemented the experimental evaluation is relatively diverse multiple datasets and baselines weaknesses the theoretical analysis relies on two relatively strong and restrictive assumptions the first is invertibility of the learnt encoders which as stated by the authors ensures that all information is preserved and corresponds to assuming the reconstruction task is solved in their proposed algorithm this is implemented in the form of a reconstruction objective within an autoencoding framework however the dcca and amssl methods cited in the paper do not use such reconstruction objectives partly owing to the complex nature of the data these methods are often applied to eg natural images where perfect reconstruction is not feasible instead these methods avoid trivial representations eg via redundancy reduction barlowtwins or using moving averagesstop gradients byolsimsiam unfortunately this undermines one of the papers main claims that its theory explains the success of dcca and amssl methods in order to justify this claim further theoretical analysis would be needed that explains why the aforementioned methods still work without explicitly enforcing invertibility there is existing work that actually addresses learning with noninvertible encoders see eg s1 s2 von kgelgen et al 2021 cf contrastive learning theory literature s3 s4 the second is mutual independence between shared and private components this assumption seems central to the proofs of thms 1 and 2 but is only very briefly discussed in 32 this assumption may be less restrictive than invertibility see also questions below but it deserves a more extensive discussion and justification moreover since specifying a latent distribution is part of specifying a generative model i believe this should be moved earlier to the beginning of 31 and inform the comparison with other generative models of multiview data for some of the 
theoretical results it is unclear to what extent they are novel or can already be found in similar form in the literature in particular there seem to be large parallels between thm 1 and the cited work of von kgelgen et al 2021 which also identifies the shared latent component without the above assumptions and uses the same notion of identifiability up to invertible function however this is not discussed in the theory or related work sections questionscomments for the authors for the assumed generative model starting from a setting where zc1c2 are dependent can this equivalently be transformed into a setting with independent latents and possible different f1f2 and if so how in other words does the independence assumption come without loss of generality when arbitrary invertible f1f2 are allowed it seems that dd1d2 are implicitly assumed to be known if so this should be highlighted as an assumption also did you perform any ablation on this the finite sample analysis at the end of 33 is hard to parse for nonexperts it would be nice if you could provide some additional intuition on assumption 2 and thm 3 i did not understand exactly how the clustering experiment in 61 directly evaluates thm 1 ie the existence of an invertible function mapping between ground truth and inferred latents could you please clarify in particular how are the ground truth shared components used in the evaluation it may be nice to complement the existing presentation with performance at predicting the ground truth latents or is this the reported clustering accuracy in fig 1 could you also show tsne for hatz2 what does your analysis have to sayhow does it relate to contrastive amssl approaches such as simclr can this also be phrased as latent correlation maximisation other minor comments suggestions and typos p2 2nd para the claim that theoretical understanding has been lacking seems too strong and should be toned down and complemented with references acknowledging at least some of the many works from previous years that seek to theoretically understand cca amssl p2 i the key assumptions eg invertibility and type of identifiability up to invertible function should be stated here for transparency p3 end of 21 presumably the criterion should hold for all x p7 1st para is hsic maximizes the correlation a typo and should read minimizes p7 reformulation para invertibility cannot be enforced by using an autoencoder at least in practice this should be toned down eg to encourage or promote as used in the next sentence p14 3rd last para there seems to be a typo in that the partial derivative of hs wrt c1 instead of z should be zero otherwise this contradicts the first part of the sentence p14 2nd para hp has not yet been defined references s1 zimmermann roland s et al contrastive learning inverts the data generating process 2021 s2 tian yuandong xinlei chen and surya ganguli understanding selfsupervised learning dynamics without contrastive pairs 2021 s3 arora sanjeev et al a theoretical analysis of contrastive unsupervised representation learning 2019 s4 tosh christopher akshay krishnamurthy and daniel hsu contrastive learning multiview redundancy and linear models 2021 overall i enjoyed reading this paper however i have two main concerns see weaknesses above for details i the invertibility assumption used to obtain the theoretical results is at odds with the fact that latent correlationbased multiview learning typically does not use invertible encoders this is not made sufficiently clear and the claims that the presented 
analysis explains or helps understand the success of current methods is not fully justified and should be toned down such an explanation would also have to explain why these methods work without invertible encoders ii part of the theory appears weaker than results in von kgelgen et al 2021 which are not compared to in sufficient detail in particular i encourage the authors to include a more complete comparison i believe the paper also provides valuable novel contributions in particular the finite sample analysis thm 3 the new learning method sec 4 and the thorough experimental evaluation sec 6 i actually see these as the main contributions thms 1 and 2 seem to follow quite directly from invertibility and independence and think they could feature more prominently together with addressing i and ii above this would make the paper a solid and wellrounded submission and i remain open to increasing my score depending on the authors response postrebuttal updates the authors have provided detailed responses to my comments and have made several modifications to the manuscript that satisfactorily address my two main concerns see the discussion below for details i have decided to increase my score as a result docsepthe paper offers a theoretical analysis of neural canonical correlation analysis cca type methods this reveals conditions under which these methods may work the paper then proposes regularizers that approximately realize these conditions which result in a new ccatype algorithm empirical results indicate that the approximate algorithm behaves as dictated by the theory multiview data eg getting both visual and audio data of the same situation is often analysed with ccastyle algorithms contemporary nonlinear cca models rely on neural encoders to find suitable representations this paper develops a theoretical analysis of a particular cca model this model differs in subtleties from existing models but reflect the same intuitions as current stateoftheart the paper provides theoretical evidence that suitable representations can be recovered from multiview data while i am not an expert in this area this does seem like valuable insights to me the largest caveat of the paper are the subtleties which is where the proposed model differs from others these allow for theoretical insights but it is unclear to which extend the assumptions are reasonable for example in eq 6b it is stated that fq should be invertible this seem like a rather strict assumption in practice in particular does it not imply that fq mathbbrd rightarrow mathbbrd is it reasonable that fq is dimensionality preserving eq 6d seems to be a rather important assumption about independence but i find it rather hard to determine if thats a suitable assumption i get that it comes with mathematical convenience but is it reasonable in general i would have liked the paper to have a remark after both assumption 1 2 about the reasonability of these assumptions algorithmically the assumed constraints are converted in soft constraints regularization and an appropriate optimization is proposed while the empirical analysis is limited to fairly simple settings the evidence does point towards these regularizers working well it would have been nice with an empirical study of the sensitivity towards the scaling of these regularizers eg lambda minor comments in sec 31 the paragraph generative models of multiview analysis feels more like a comment on previous work and it broke my flow while reading perhaps this paragraph could be moved elsewhere in theorem 2 
it is unclear what is meant by a certain invertible function the citation associated with tsne in sec 61 appears to be about sne this should be corrected the parenthesis remark in the conclusion which is later realized should perhaps indicate that the realization is approximate update after rebuttal i am happy with the given replies and retain my positive score the paper is generally easy to read and tackles an interesting problem with a novel solution as far as i can tell the issues i have pointed to can fairly easily be fixed hence the positive score i would have given a 7 if that was an option docsepthis paper proposed a general multiview learning approach and provides a theoretical analysis for the proposed method specifically each view is considered as a commonshared and privateviewspecific components then the multiview learn problem is converted to a identification and disentanglement problems the theoretical analysis provides the bounds of the proposed model experimental results demonstrate the effectiveness for downstream tasks this paper proposed a novel approach for multiview learning the pros and cons are listed below pros the general motivation of the proposed model is reasonable and logical consider there are a few multiview learning works provide the theoretical analysis while this work gives a comprehensive analysis of the model it is potential to extend to wider multiview methods for deeply understanding multiview scenario the experimental results demonstrate the effectiveness of the proposed modules cons the experimental results mainly based on relatively small and simple datasets the potential extension to a more large and sophisticated datasets and sample formats should be discussed the computational cost of the proposed model should be discussed the potential solution for largescale applications should be discussed this paper proposed a novel approach for multiview learning problem a theoretical analysis of the proposed model is provided in experiments most of the datasets are small scale and simple data format more discussions and analysis about its potential realworld applications could be discussed
### Summary:
all reviewers agreed that this is a strong paper that the methodological contributions are both relevant and significant and that the experimental validation is convincing i fully share this viewpoint
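For concreteness, the two-view generative model that the reviews above describe (each view x_i = f_i(z, c_i) with a shared component z, a private component c_i, and invertible mixing f_i) can be sketched in a few lines. The dimensions, the leaky-ReLU nonlinearity, and the linear-CCA probe below are illustrative assumptions for this sketch only, not the paper's actual architecture or objective.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d_z, d_c = 2000, 2, 2            # samples, shared dim, private dim (assumed)

z  = rng.normal(size=(n, d_z))      # shared component z
c1 = rng.normal(size=(n, d_c))      # private component of view 1
c2 = rng.normal(size=(n, d_c))      # private component of view 2

def random_invertible(dim):
    """Random square mixing matrix, resampled until clearly invertible."""
    while True:
        a = rng.normal(size=(dim, dim))
        if abs(np.linalg.det(a)) > 1e-2:
            return a

def leaky_relu(x, slope=0.2):
    # elementwise, invertible nonlinearity (an assumption for this sketch)
    return np.where(x > 0, x, slope * x)

A1 = random_invertible(d_z + d_c)
A2 = random_invertible(d_z + d_c)
x1 = leaky_relu(np.concatenate([z, c1], axis=1) @ A1)   # view 1: f1(z, c1)
x2 = leaky_relu(np.concatenate([z, c2], axis=1) @ A2)   # view 2: f2(z, c2)

def canonical_correlations(a, b):
    """Linear CCA via QR/SVD: singular values of Qa^T Qb."""
    a = a - a.mean(axis=0)
    b = b - b.mean(axis=0)
    qa, _ = np.linalg.qr(a)
    qb, _ = np.linalg.qr(b)
    return np.linalg.svd(qa.T @ qb, compute_uv=False)

# Only the shared z links the two views, so roughly d_z of the canonical
# correlations should be large and the remaining ones close to zero.
print("canonical correlations:", np.round(canonical_correlations(x1, x2), 3))
```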
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
overview in this paper the authors proposed a new format of representation called scene programs to describe the visual scenes to extract the scene programs from scenes the authors exploited the offtheshelf object detection and segmentations model mask rcnn to extract all objects and the corresponding attributes from the images and then detect groups for those objects which are then used to generate the programs which matches the input scenes the experiments are performed on a synthetics datasets which consists of multiple shapes with different attributes the experiments shows that the proposed model can infer more accurate programs from the scenes and those generated programs can be used to recover the input scenes more accurately besides the authors also showed that the generated scene programs can be used for image editing and making visual analogy strengthes 1 the authors proposed a new representation called scene programs to describe the visual scenes with some textual program this is a new scene representation which could be potentially used in various scenarios such as the image synthesis in graphics 2 the authors proposed a hierarchical method to model the structures in scenes specifically the objects in a scene are first extracted and then grouped into multiple clusters which will be used to guide the scene program synthesis 3 the experimental results demonstrate the effectiveness of the proposed method both qualitatively and quantitatively the authors also showed the the programs generated can be sued for image editing and crossmodality matching weaknesses 1 it is a bit unfair to compare the proposed method with the two baseline methods listed in table 2 the authors used a pretrained maskrcnn to detect all objects and predict the attributes for all objects however the counterpart methods have no access to this supervision even in this case cnnlstm seems achieve comparable performance on the first three metrics 2 the advantage of scene program compared with scene graph johnson et al are not clear to me scene graph is also a symbolic representation for images also for all the tasks mentioned in this paper such as image editing and visual analogy scene graph can probably also complete well the authors should comment about the specific advantages of scene program in comparison with scene graph 3 all the images shown in the paper seems arranged uniformly which i think contains some bias to the proposed grouping strategy i would like to see more diverse configurations of the foreground objects it would be good to see if the proposed model can describe more complicated scenes summary this paper proposed a novel scene representations called scene program to extract the scene program the authors proposed a hieratchical inference method the resulting scene programs based on the proposed model outperforms several baseline models quantitatively the authors also showed the proposed scene program is suitable for image editing and visual analogy making however as pointed above there are some unclear points to me especially the advantages of scene program compared with scene graph and the representation power of scene program for complicated scenesdocsepthis paper investigates a descriptive representation of scenes using programs given an input image and an initial set of detections obtained from bottomup detectors a sequence to sequence network is used to generate programs in a domain specific language dsl the authors consider a dataset where simple primitives are arranged in layouts in 3d 
scenes with varying material and color properties they argue that the scene representation lead to better generalization on novel scene types and improve over baselines on image analogy tasks the paper is well written but the evaluation and technical novelty is weak first the use of scene programs is not a contribution of this paper going beyond the works cited in the related work section several recent works have proposed and investigated the advantages of program synthesis for shape generation eg csgnet sharma et al cvpr 2018 and scene derendering wu et al cvpr 2017 visual reasoning modular networks andreas et al 2015 among others at a highlevel the motivation of the program level representation for the considered tasks is not highlighted it seems that an attributebased representation ie the output of the mask rcnn detector that describes the image as a collection of objects material properties and their positions and scales is a sufficient representation the higherorder relationships can be relatively easily extracted from the detections since the images are clean and clutter free a baseline approach where the program synthesis was performed using search and grouping should be compared with the considered tasks are relatively simple achieving 995 tokenlevel accuracy the evaluation beyond the synthetic datasets is fairly limited and it is unclear how well the method generalizes to novel images in clutter and occlusion in summary the paper makes a number of observations that have been motivated in a number of prior works but the contributions of this paper is not highlighted eg over neural scene derendering the main claim that higherorder relationships are being modeled is not apparent due to the simplicity of the scenes being considered for example the program blocks being considered are somewhat arbitrary and a comparison with a clustering based grouping approach should have been evaluated the experimental evaluation is weak in several aspects the generalization to real images is anecdotal with only two examples shown in the figure 7 docsepthis paper presents a system that infers programs describing 3d scenes composed of simple primitives the system consists of three stages each of which is trained separately first the perceptual module extracts object masks and their attributes the objects are then are split into several groups finally each group is mapped to a corresponding dsl program using a sequencetosequence network similar to the ones typically employed in neural machine translation pros the paper is written clearly and easy to read visual program synthesis is very exciting and important direction both for image understanding and generation the results on synthetic datasets are good the authors also demonstrate the applicability of the approach to realworld data albeit significantly constrained i find it surprising that a seq2seq is good at producing an accurate program for a group of objects visual analogy making experiments are impressive cons the proposed model requires rich annotation of training data since all the components of the systems are trained in a supervised fashion its not clear how to use the method on the inthewild data without such annotation related to the previous point even when its possible to synthesize data it is nontrivial to obtain the groundtruth grouping of objects judging by table 2 it seems that the system breaks in absence of the grouping information the data used in the paper is quite simplistic limited number of primitives located in a regular 
grid im wondering if there is a natural way to extend the approach to more complex settings my guess is that the performance will drop significantly notesquestions section 2 paragraph 1 the paper by ganin et al 2018 presents both a system for reproducing an image as well as for sampling from a distribution moreover it presents experiments on 3d data ie not limited to drawing section 34 paragraph 2 im not sure i understand the last sentence how can we know that we successfully recovered the scene at test time could the authors elaborate on the stopping criterion for sampling section 42 paragraph 2 do i understand correctly that the main difference between the test set and the generalization set is the number of groups ie 2 vs 3 if so its a fairly limited demonstration of generalization capabilities of the system section 42 paragraph 4 we search top 3 proposals how do we decide which one is better do we somehow have an access to the ground truth program at test time could the authors explain the representation of a program more clearly how are loops handled how can one subtractadd programs in the analogy making experiment overall i think it is a interesting paper and can be potentially accepted on the condition that the authors address my questions and concerns
### Summary:
this paper presents a dataset and method for training a model to infer from a visual scene the program that would generate or describe it in doing so it produces abstract disentangled representations of the scene which could be used by agents models and other ml methods to reason about the scene this is yet another paper where the reviewers disappointingly did not interact the first round of reviews were mediocre to acceptable the authors i think did a good job of responding to the concerns raised by the reviewers and edited their paper accordingly unfortunately not one of the reviewers took the time to consider author responses in light of my reading of the responses and the revisions in the paper i am leaning towards treating this as a paper where the review process has failed the authors and recommending acceptance the paper presents a novel method and dataset and the experiments are reasonably convincing the paper has flaws and the authors are advised to carefully take into account the concerns flagged by reviewers many of which they have responded to in producing their final manuscript
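As a purely illustrative picture of the scene-program idea discussed in the reviews above, here is a toy sketch of a hypothetical DSL in which a single loop-like statement expands into a row of primitives; the statement names, attributes, and grammar are made up for this example and are not taken from the paper.

```python
from dataclasses import dataclass

@dataclass
class Obj:
    shape: str
    color: str
    x: float
    y: float

def expand_row(shape, color, start_x, y, step, count):
    """One loop-like DSL statement: place `count` primitives along the x axis."""
    return [Obj(shape, color, start_x + i * step, y) for i in range(count)]

# a "scene program" here is just a list of such statements; executing it
# recovers the object-level description that a bottom-up detector would output
program = [
    ("row", dict(shape="cube",   color="red",  start_x=0.0, y=0.0, step=1.0, count=3)),
    ("row", dict(shape="sphere", color="blue", start_x=0.0, y=2.0, step=1.0, count=4)),
]

scene = [obj for _name, kwargs in program for obj in expand_row(**kwargs)]
for obj in scene:
    print(obj)
```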
285,
1332,
323,
3733,
247,
1566,
281,
9441,
432,
247,
5304,
6200,
253,
2086,
326,
651,
4561,
265,
19268,
352,
275,
2509,
594,
352,
11330,
12002,
557,
290,
33195,
14237,
273,
253,
6200,
534,
812,
320,
908,
407,
6083,
3210,
285,
643,
13361,
3082,
281,
1921,
670,
253,
6200,
50276,
2520,
310,
2568,
1529,
2929,
835,
253,
30628,
11034,
5356,
858,
417,
8008,
253,
806,
3790,
273,
10123,
497,
12069,
49636,
936,
24826,
253,
4477,
891,
1158,
858,
247,
1175,
2628,
273,
19392,
281,
253,
7350,
5439,
407,
253,
30628,
285,
16168,
616,
2929,
15672,
19235,
417,
581,
273,
253,
30628,
2335,
253,
673,
281,
1908,
2488,
6128,
50276,
249,
1708,
273,
619,
4361,
273,
253,
6128,
285,
253,
38549,
275,
253,
2929,
891,
717,
25661,
4404,
12767,
436,
347,
247,
2929,
835,
253,
2278,
1232,
556,
4242,
253,
4477,
285,
46705,
14924,
253,
2929,
10262,
247,
4460,
1332,
285,
10895,
285,
253,
4679,
403,
12054,
21414,
253,
2929,
556,
32138,
285,
253,
4477,
403,
15140,
281,
9257,
1379,
715,
2395,
253,
7350,
7908,
2400,
407,
30628,
20415,
273,
534,
597,
452,
10974,
281,
249,
9603,
616,
2457,
7714
] |
[
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1
] |
[
30003,
310,
1677,
2278,
273,
247,
2561,
2929,
432,
260,
2369,
1793,
6698,
15,
7764,
3630,
247,
6010,
253,
2278,
15,
187,
4118,
8439,
27,
187,
39930,
50276,
249,
436,
2929,
253,
4477,
4081,
247,
747,
5981,
273,
6779,
1925,
6200,
5659,
281,
6266,
253,
5304,
13451,
281,
4908,
253,
6200,
5659,
432,
13451,
253,
4477,
28734,
253,
273,
649,
1041,
48164,
1789,
5481,
285,
8223,
569,
1566,
8989,
27657,
9866,
281,
4908,
512,
5113,
285,
253,
3969,
12474,
432,
253,
3888,
285,
840,
2736,
2390,
323,
1110,
5113,
534,
403,
840,
908,
281,
6635,
253,
5659,
534,
10129,
253,
3280,
13451,
253,
4679,
403,
2684,
327,
247,
5132,
49631,
15302,
534,
8414,
273,
2709,
15029,
342,
1027,
12474,
253,
4679,
2722,
326,
253,
4081,
1566,
476,
9441,
625,
7899,
5659,
432,
253,
13451,
285,
1110,
4561,
5659,
476,
320,
908,
281,
9295,
253,
3280,
13451,
625,
13613,
16280,
253,
4477,
671,
2692,
326,
253,
4561,
6200,
5659,
476,
320,
908,
323,
2460,
14835,
285,
2403,
5304,
24760,
50276,
296,
3755,
783,
84,
50276,
18,
253,
4477,
4081,
247,
747,
6779,
1925,
6200,
5659,
281,
6266,
253,
5304,
13451,
342,
690,
45860,
2086,
436,
310,
247,
747,
6200,
6779,
534,
812,
320,
7826,
908,
275,
2710,
15216,
824,
347,
253,
2460,
9066,
275,
15896,
50276,
19,
253,
4477,
4081,
247,
24498,
1332,
281,
1566,
253,
5289,
275,
13451,
5742,
253,
5113,
275,
247,
6200,
403,
806,
10375,
285,
840,
24104,
715,
2709,
9959,
534,
588,
320,
908,
281,
7102,
253,
6200,
2086,
9066,
50275,
20,
253,
5661,
1543,
7568,
253,
12510,
273,
253,
4081,
1332,
1097,
36143,
285,
36878,
253,
4477,
671,
2692,
253,
253,
5659,
4561,
50276,
5092,
320,
23460,
323,
2460,
14835,
285,
2831,
2307,
1319,
11038,
50275,
20881,
1255,
265,
50276,
18,
352,
310,
247,
2372,
16593,
281,
7277,
253,
4081,
1332,
342,
253,
767,
8245,
3082,
7117,
275,
2829,
374,
253,
4477,
908,
247,
3215,
11273,
8989,
3373,
9866,
281,
2736,
512,
5113,
285,
3283,
253,
12474,
323,
512,
5113,
2299,
253,
14317,
3082,
452,
642,
2289,
281,
436,
20446,
1014,
275,
436,
1083,
260,
9866,
42663,
78,
3133,
5115,
10870,
3045,
327,
253,
806,
1264,
17082,
50275,
19,
253,
5750,
273,
6200,
2086,
2429,
342,
6200,
4216,
480,
2116,
1665,
1162,
355,
403,
417,
2590,
281,
479,
6200,
4216,
310,
671,
247,
24762,
6779,
323,
3888,
671,
323,
512,
253,
8892,
5393,
275,
436,
2929,
824,
347,
2460,
14835,
285,
5304,
24760,
6200,
4216,
476,
3164,
671,
3426,
973,
253,
4477,
943,
4385,
670,
253,
2173,
11361,
273,
6200,
2086,
275,
5301,
342,
6200,
4216,
50276,
20,
512,
253,
3888,
2011,
275,
253,
2929,
3133,
10912,
17568,
534,
891,
1158,
4428,
690,
8492,
281,
253,
4081,
32827,
5700,
891,
651,
751,
281,
923,
625,
11117,
16012,
273,
253,
35936,
5113,
352,
651,
320,
1175,
281,
923,
604,
253,
4081,
1566,
476,
6266,
625,
9542,
13451,
50276,
8774,
50276,
2520,
2929,
4081,
247,
4460,
6200,
14237,
1925,
6200,
2086,
281,
4908,
253,
6200,
2086,
253,
4477,
4081,
247,
10549,
1506,
474,
17032,
1332,
253,
4795,
6200,
5659,
1754,
327,
253,
4081,
1566,
41731,
13015,
2067,
8245,
3210,
36878,
253,
4477,
671,
2692,
253,
4081,
6200,
2086,
310,
7470,
323,
2460,
14835,
285,
5304,
24760,
2403,
2299,
347,
8042,
1840,
627,
403,
690,
12744,
2792,
281,
479,
3340,
253,
11361,
273,
6200,
2086,
2429,
342,
6200,
4216,
285,
253,
6779,
1612,
273,
6200,
2086,
323,
9542,
13451,
7152,
33032,
2520,
2929,
2340,
684,
247,
27389,
6779,
273,
13451,
970,
5659,
1677,
271,
3280,
2460,
285,
271,
3302,
873,
273,
843,
20713,
2797,
432,
5004,
484,
25421,
247,
3425,
281,
3425,
2990,
310,
908,
281,
6635,
5659,
275,
247,
5028,
2173,
3448,
277,
3433,
253,
4477,
1908,
247,
10895,
835,
2969,
2248,
23223,
403,
10912,
275,
50107,
275,
495,
69,
13451,
342,
11962,
2144,
285,
3295,
3607,
597,
9059,
326,
253,
6200,
6779,
1421,
281,
1805,
26647,
327,
4460,
6200,
3510,
285,
3157,
689,
1666,
25379,
327,
2460,
24760,
8892,
253,
2929,
310,
973,
3542,
533,
253,
7103,
285,
7681,
38135,
310,
5075,
50275,
7053,
253,
897,
273,
6200,
5659,
310,
417,
247,
7680,
273,
436,
2929,
1469,
4457,
253,
2987,
11106,
275,
253,
2905,
789,
2593,
2067,
3332,
2987,
452,
4081,
285,
6949,
253,
11361,
273,
2086,
9066,
323,
5281,
5978,
24088,
260,
8433,
3024,
17614,
785,
1162,
355,
30105,
1087,
4765,
285,
6200,
372,
12574,
272,
259,
86,
1162,
355,
30105,
1087,
4240,
5304,
14720,
23178,
6928,
285,
250,
284,
1162,
355,
4104,
2190,
2571,
50275,
255,
247,
1029,
5251,
253,
16038,
273,
253,
2086,
1268,
6779,
323,
253,
2783,
8892,
310,
417,
16318,
352,
3133,
326,
271,
11104,
3169,
6779,
26332,
253,
3453,
273,
253,
8989,
27657,
9866,
13562,
326,
8631,
253,
2460,
347,
247,
4849,
273,
5113,
2144,
3607,
285,
616,
6887,
285,
11498,
310,
247,
4209,
6779,
253,
2169,
2621,
7688,
476,
320,
4942,
4354,
10375,
432,
253,
843,
20713,
1580,
253,
3888,
403,
4076,
285,
502,
12216,
1959,
247,
8245,
2746,
835,
253,
2086,
9066,
369,
2684,
970,
3186,
285,
32827,
943,
320,
2429,
342,
50275,
783,
2783,
8892,
403,
4942,
2969,
17170,
898,
2222,
10669,
5251,
7200,
253,
7103,
4457,
253,
13506,
15302,
310,
9648,
3710,
285,
352,
310,
12744,
849,
973,
253,
1332,
2087,
4219,
281,
4460,
3888,
275,
502,
12216,
285,
30796,
50275,
249,
6010,
253,
2929,
2789,
247,
1180,
273,
7313,
326,
452,
644,
17194,
275,
247,
1180,
273,
2720,
2987,
533,
253,
9021,
273,
436,
2929,
310,
417,
16318,
24088,
689,
11454,
6200,
372,
12574,
272,
253,
2022,
1750,
326,
2169,
2621,
7688,
403,
1146,
23115,
310,
417,
5165,
1955,
281,
253,
17647,
273,
253,
13451,
1146,
2783,
323,
1650,
253,
2086,
8336,
1146,
2783,
403,
8489,
10341,
285,
247,
5301,
342,
247,
17524,
1754,
32827,
2746,
943,
452,
644,
6760,
253,
5661,
7103,
310,
5075,
275,
2067,
7794,
253,
26647,
281,
1524,
3888,
310,
34009,
5256,
267,
342,
760,
767,
6667,
2011,
275,
253,
4677,
818,
5474,
33032,
2520,
2929,
10262,
247,
985,
326,
2192,
398,
5659,
12930,
495,
69,
13451,
9924,
273,
2969,
2248,
23223,
253,
985,
8414,
273,
1264,
8661,
1016,
273,
534,
310,
10166,
11794,
806,
253,
39612,
6333,
16756,
1789,
25965,
285,
616,
12474,
253,
5113,
403,
840,
403,
8085,
715,
2067,
2390,
4720,
1016,
1387,
310,
18301,
281,
247,
3969,
277,
3433,
2086,
970,
247,
2160,
2083,
292,
583,
371,
566,
2990,
2074,
281,
253,
4394,
5431,
7091,
275,
11454,
5145,
10234,
50276,
856,
84,
50276,
783,
2929,
310,
3542,
4518,
285,
3477,
281,
1239,
50276,
34309,
2086,
9066,
310,
1077,
12302,
285,
1774,
3884,
1097,
323,
2460,
4685,
285,
5978,
50276,
783,
1543,
327,
13506,
15302,
403,
1175,
253,
4477,
671,
7568,
253,
30437,
273,
253,
2746,
281,
1524,
10186,
941,
23447,
3012,
20793,
50276,
74,
1089,
352,
10084,
326,
247,
22510,
19,
14571,
310,
1175,
387,
9603,
271,
7899,
2086,
323,
247,
1387,
273,
5113,
50276,
34309,
24760,
2403,
4679,
403,
13943,
50276,
5040,
50276,
783,
4081,
1566,
4419,
6793,
22581,
273,
3733,
941,
1580,
512,
253,
4295,
273,
253,
2718,
403,
10166,
275,
247,
22296,
8142,
697,
417,
2590,
849,
281,
897,
253,
1332,
327,
253,
540,
248,
32778,
941,
1293,
824,
22581,
50276,
4919,
281,
253,
2045,
1127,
1014,
672,
697,
1896,
281,
46919,
941,
352,
310,
37825,
281,
4044,
253,
3216,
33024,
32827,
273,
5113,
32721,
407,
2829,
374,
352,
3133,
326,
253,
985,
13471,
275,
5928,
273,
253,
32827,
1491,
50276,
783,
941,
908,
275,
253,
2929,
310,
3240,
8077,
2531,
3710,
1180,
273,
2248,
23223,
4441,
275,
247,
3963,
9860,
516,
12371,
604,
627,
310,
247,
3626,
1039,
281,
9017,
253,
2746,
281,
625,
2570,
7533,
619,
5476,
310,
326,
253,
3045,
588,
5926,
3012,
50276,
21377,
34974,
50276,
4674,
374,
12494,
337,
253,
2929,
407,
36827,
249,
1162,
355,
4765,
10262,
1097,
247,
985,
323,
39306,
271,
2460,
347,
973,
347,
323,
10491,
432,
247,
3268,
25761,
352,
10262,
4679,
327,
495,
69,
941,
26332,
417,
3710,
281,
10263,
50276,
4674,
5910,
12494,
374,
516,
417,
2119,
891,
2096,
253,
1390,
6197,
849,
476,
359,
871,
326,
359,
8379,
12372,
253,
6200,
387,
1071,
673,
812,
253,
4477,
21184,
327,
253,
15910,
17705,
323,
10491,
50276,
4674,
5976,
12494,
374,
513,
891,
2096,
9113,
326,
253,
2022,
3064,
875,
253,
1071,
873,
285,
253,
26647,
873,
310,
253,
1180,
273,
2390,
26332,
374,
4632,
495,
604,
594,
697,
247,
9648,
3710,
20028,
273,
26647,
13789,
273,
253,
985,
50276,
4674,
5976,
12494,
577,
359,
3186,
1755,
495,
18595,
50275,
5430,
513,
359,
7617,
534,
581,
310,
1805,
513,
359,
10380,
452,
271,
2289,
281,
253,
3216,
5083,
2086,
387,
1071,
673,
50276,
16534,
253,
4477,
5513,
253,
6779,
273,
247,
2086,
625,
4518,
849,
403,
17417,
15726,
849,
476,
581,
43444,
1911,
5659,
275,
253,
24760,
2403,
3368,
50276,
1189,
455,
891,
1158,
352,
310,
247,
4722,
2929,
285,
476,
320,
7826,
7607,
327,
253,
1617,
326,
253,
4477,
2953,
619,
3533,
285,
7350,
187,
187,
4118,
18435,
27,
2520,
2929,
10262,
247,
10895,
285,
1332,
323,
3733,
247,
1566,
281,
9441,
432,
247,
5304,
6200,
253,
2086,
326,
651,
4561,
265,
19268,
352,
275,
2509,
594,
352,
11330,
12002,
557,
290,
33195,
14237,
273,
253,
6200,
534,
812,
320,
908,
407,
6083,
3210,
285,
643,
13361,
3082,
281,
1921,
670,
253,
6200,
50276,
2520,
310,
2568,
1529,
2929,
835,
253,
30628,
11034,
5356,
858,
417,
8008,
253,
806,
3790,
273,
10123,
497,
12069,
49636,
936,
24826,
253,
4477,
891,
1158,
858,
247,
1175,
2628,
273,
19392,
281,
253,
7350,
5439,
407,
253,
30628,
285,
16168,
616,
2929,
15672,
19235,
417,
581,
273,
253,
30628,
2335,
253,
673,
281,
1908,
2488,
6128,
50276,
249,
1708,
273,
619,
4361,
273,
253,
6128,
285,
253,
38549,
275,
253,
2929,
891,
717,
25661,
4404,
12767,
436,
347,
247,
2929,
835,
253,
2278,
1232,
556,
4242,
253,
4477,
285,
46705,
14924,
253,
2929,
10262,
247,
4460,
1332,
285,
10895,
285,
253,
4679,
403,
12054,
21414,
253,
2929,
556,
32138,
285,
253,
4477,
403,
15140,
281,
9257,
1379,
715,
2395,
253,
7350,
7908,
2400,
407,
30628,
20415,
273,
534,
597,
452,
10974,
281,
249,
9603,
616,
2457,
7714
] |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper proposes a method to learn object representation and train the model in a selfsupervised manner with a contrastive loss ablation studies involving different kinds of attention maps and object losses are conducted to prove the efficacy however why isnt there any baselines in the experiment section to compare with it is necessary to show whether your method outperforms other stateoftheart models for example the vanilla selfsupervised learning methods of contrastive learning and simclrdocsepthis paper builds upon two recently popular fields of work the objectcentric representation learning and the selfsupervised contrastive learning instead of using the pixelwise reconstruction loss as the main supervision signal this paper introduces both imagelevel and objectlevel contrastive loss to help object discovery the segmentation masks can be extracted as the attention masks in slotcross attention the trained model achieves reasonable performance on segmentation iou and vqa ap more comments are as follows i completely agree with the author that perpixel reconstruction loss might hinder the model to scale up to realworld images i am very happy to see the contrastive loss from simsiam can work here simply maximizing the similarity of positive pair is indeed an easytoimplement and promising technique i am a bit surprised by the fact that the global loss matters that much and i am interested in how the segmentation masks look like in this case after all the object loss is still minimized especially in the ctrall case the iou result 40 is not very good the slot attention paper only has numbers for ari but on the other hand objectcentric representation learning doesnt necessarily require segmentation the learned good object representation is verified by the vqa task it would be very interesting to extend this work by incorporating with techniques in unsupervised video representation learning papers overall i believe this paper presents an interesting direction of objectcentric representation learning i vote for the acceptance of this paper
### Summary:
|
the reviewers agree the paper should be accepted at the workshop congratulations it would be great to include baselines as pointed out by reviewer vy15 if this is not feasible for this workshops cameraready version due to the time limitations we highly recommend including them in future iterations of this work
|
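The record above describes an object-centric model trained with a SimSiam-style contrastive objective, where the similarity of a positive pair is maximised under a stop-gradient instead of relying on a pixel-wise reconstruction loss. The sketch below is only an illustrative reconstruction of that idea for per-object slot embeddings; the `predictor` module and the assumption that slot k in one view is the positive pair of slot k in the other are placeholders, not the paper's actual implementation.

```python
import torch
import torch.nn.functional as F

def simsiam_loss(p, z):
    # Negative cosine similarity with stop-gradient on the target branch (SimSiam).
    z = z.detach()
    p = F.normalize(p, dim=-1)
    z = F.normalize(z, dim=-1)
    return -(p * z).sum(dim=-1).mean()

def object_level_contrastive_loss(slots_a, slots_b, predictor):
    # slots_a, slots_b: (batch, num_slots, dim) slot embeddings from two augmented views.
    # Symmetrized SimSiam loss applied per object slot.
    p_a, p_b = predictor(slots_a), predictor(slots_b)
    return 0.5 * (simsiam_loss(p_a, slots_b) + simsiam_loss(p_b, slots_a))
```

In practice this object-level term would be combined with an image-level term of the same form, which is what the reviews refer to as the global loss.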
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the idea is that as one smoothens the decision boundary of a piecewise linear function f eg from relunet its saliency map g obtained by gfracdfdx gets closer g n2 to the normal of the closest boundary hyperplane n the authors then propose two variants of explanation techniques based on the nearest decision boundary hyperplane and try it on explaining trained image deep classifiers the results seem to corroborate as for a better alignment with the nearest boundary hyperplanes normal also the proposed methods achieve better explanations as measured by locality and overlap with groundtruth bounding boxes strengths the general idea of alignment with the nearest decision hyperplane normal and the specific modification of ig seem quite novel and plausible in the experiments big achieves significantly better results using various explanation metrics questions to authors smoothening the learnt function at some point should start losing the discrmination ability of the learnt function has the authors pushed enough to find some indication of this tradeoff from theorem 1 i can understand why a smoother learned functions can give rise to a more faithful saliencybased explanation but i cannot see how it advocates smoothgrad as explanation wouldnt smoothgrad be faithful to a verylikely different function than the actual learned function and thus not necessarily faithful to the true learned function the text before definition 6 argues for big based on the existence of multiple boundary segments near a point and proposed definition 6 that integrates over the segment connecting a point x to its nearest adversarial x however shouldnt the nearest decision boundary segment for all points along the line segment xrightarrow x remain the same the integral is taken over standard saliency g which of course can change linear regions but the rationale of wanting to find different decision boundary hyperplanes does not seem to hold for the proposal the previous question could be simply rectified if the meaning of boundary segments is the linear regions boundary segments as opposed to the decision boundary segments but then i think boundary has been used as decision boundary at occasions before this definition eg in def 5 am i mistaken if not the text needs a rewrite to distinguish between regions boundary segments and decision boundary segments due to the approximation using an ensemble of adversarial example methods we should expect that the found segment is very likely not the closest decision boundary segment since we know from many works that the density of linear regions are extremely high in the input space in light of that how reliable are the observations in the experiments section especially with regards to the deviation from the normal vector figure 3a following up on the previous question could the fact that bsm does not show improvement on standard models be due to this approximation minor points on many occasions when referring to boundary facets of a polytope better to use hyperplane as opposed to segment to avoid confusion with line segments that are used as linear path in definition 3 falpha epsilon rightarrow fx epsilon in theorem 1 forall x in b better to replace rightarrow although a minor point it makes reading the statement challenging in the first glance in theoretm 1 it might be better to use ofrac1sigma c two different notations are used for definition or else better to refer to figure 3a and 3b as tables page 7 a smaller difference between the difference between attributions page 7 instead 
evaluates computes page 8 it is naturally to treat big frees users from the baseline selection the paper has an interesting and original idea which brings consistent improvement to the established explanations techniques such as gradientbased saliency maps and integrated gradients however there are some questions that makes me keep my rating only at borderline accept postrebuttal comments general rationale for the updated score after some more thoughts and the discussions during the rebuttal phase the reviewer remains unconvinced about the claim of the normality of sm to the extension of a segment in the decision boundary in this regard there are two significant concerns i there are explicit statements about this both in the revised paper see discussion with the authors for some instances as well as the authors arguments during the discussion and ii the motivation for the proposed method big is based on this claim furthermore the paper cites other papers that as far as the reviewer understands do not explicitly discuss this claim therefore i do not think i can vouch for accepting the paper claiming and building on some formal statements that i cannot personally verify consequently i reduce my rating from 6 to 3 since there might be a simple point that i am missing here which would prove the claim i reduce my confidence as well from 4 to 3 summary of the technical discussion the authors at various points implicitly suggest or explicitly claim that sm which is the gradient of the networks function wrt the input fracdfdx is perpendicular to the extension of a segment in the decision boundary this is then used to motivated a variant of ig called big which integrates sm over a line path from the sample to the nearest adversarial example for the reviewer it is possible to see i how fracdfdx for a linear binary classifier will always be orthogonal to the decision boundary since the decision boundary is by definition a hyperplane with the sm as its normal as described in section 31 ii how fracdfifjdx is orthogonal to the surface fifj0 however it is unclear to the reviewer how fracdfdx can be guaranteed to be prependicular to the decision boundary of a general function f mathbbrdrightarrowmathbbrk with k being the number of classes in fact i believe for the simplest case of linear binary classification as soon as we redundantly model each class with a separate linear model to become analogue to the multiclass setup the gradient of each of the linear functions ie fracdf1dx and fracdf2dx will no more be orthogonal to the decision boundary docsepthe paper focuses on the intersection of gradient attribution and adversarial robustness first it analyzes the weaknesses of vanilla gradients the gradient does not have to point towards the decision boundary of an nlayer relu network then the paper provides some insights into the smoothing of onelayer relu networks theorem 1 finally a boundarybased saliency map and an extension of integrated gradients are proposed and evaluated in terms of boundary alignment and object localization the paper has an interesting topic adding theoretical insights to explainability methods the paper does especially well on providing a good intuition about the relationships of normals polytopes and decision boundary content of 31 and first part of 32 i also found the paper overall well written some minor typos and duplicates are listed below the papers story of first analyzing the limitations of gradients fixing the errors and then evaluating the methods is also good i address my 
concerns about the generality and rigor of theorem 1 the evaluation and the limitations below proof of theorem 1 theorem 1 contains an lessapprox sign after checking the appendix it turns out that the proof is only correct for the case that dombrowski et al 2019 points out that the random distribution p epsiloni fracexp epsiloni 2exp epsiloni 22 closely resembles a normal distribution with a standard deviation sigma sqrtlog 2 fracsqrt2 pibeta however under which conditions does it resemble a normal distribution dombrowski et al 2019 only made this comment to explain a possible connection to smoothgrad see page 8 in dombrowski et al 2019 no concrete conditions are given on when or how close the distributions matches i did not even found how sigma was derived in dombrowski et al 2019 if you know where please point me to it i did a small experiment myself and plotted the distributions for each plot the corresponding beta is given on top and the normal distribution has sigma sqrtlog 2 fracsqrt2 pibeta for the notebook with the code see this linkhttpsf002backblazeb2comfilennnnnnnniclr2022robustmodelsaremoreinterpretablebecauseattributionslooknormalipynb plots for different betashttpsf002backblazeb2comfilennnnnnnniclr2022betaplotspng as you can see it is only close for beta approx 1 two solutions exist either provide a theorem with leq or give a rigorous discussion on the cases where only approx or even holds the other limitation of theorem 1 is that it only holds for onelayer relu networks i would find a short discussion helpful why it does not hold for nlayer relu networks in addition it should be emphasized throughout the paper that theorem 1 is only for onelayer networks for example in the last paragraph of the introduction we present an analysis that sheds light on the previouslyobserved phenomeon of robust interpretability showing that alignment between the normal vectors of decision boundaries and models gradients is a key ingredient proposition 1 theorem 1 please make clear in that sentence and others that theorem 1 only addresses onelayer networks at the end of section 32 figure 10 is referenced as empirical validation of theorem 1 but i do not understand the figure and caption distances in logarithm between sg and bsg against different standard deviations of the gaussian noise results are computed on resnet50 notice the first column corresponds to 0 please clarify what you want to evaluate with this figure eg the first column says sigma015 evaluation i think the evaluation of the normality to the decision boundary can be improved in figure 3 pairs of gradient attribution method and the corresponding boundary attributions eg ig vs big are compared to evaluate how normal the attributions are however why not measure the normality in the feature space zx directly zx is defined such that fix wit zx we know that wi must be normal to the decision boundary as shown in figure 2a the corresponding change in zspace of an attribution gx would be delta z zx zx alpha gx now we can measure the similarity of the normal wi and the different attributions just compute cosdelta z wi for all the different attributions this evaluation would relate the estimated directions in xspace to the groundtruth normals in zspace the current evaluation of attributions methods against their boundary equivalent cannot provide such a groundtruth reference the evaluation using the groundtruth bounding boxes is a good proxy task and seems to be executed correctly it might make sense to only use images where the bounding box 
covers less than 50 of the image as done in schulz et al 2020 the attribution method in schulz et al 2020 might also be an interesting candidate for the evaluation as it was also able to outperform intgrad and smooth grad i would also suggest focusing on one or two metrics for the bounding box task instead of four i would also encourage the authors to include the sanity check for weight reinitialization adebayo et al 2018 it is easy to implement and should be passed by any new attribution method limitations while i do not think that the paper requires a humansubject evaluation its lack should be mentioned in the limitation section also the saliency maps look more concentrated would humans actually profit from it even if there is a significant difference would you expect a large effect size please also list that theorem 1 is only for onelayered networks in the limitations limitation 2 not applicable to perturbation attributions arises from focus of the paper and i think there is not need to mention it minor comemnts in fact the fact page 4 smaller difference between the difference between page 7 thefore page 7 lost clause it is naturally to treat page 8 it should be table 3 and not figure 3 iamgenet page 6 i think it should be the rhs of the above equation is smoothed gradient page 15 references schulz et al 2020 httpsopenreviewnetforumids1xwh1rywb adebayo et al 2018 httpsarxivorgabs181003292 after rebuttal update the authors were able to rectify their proof and also provided details to my other questions while the initial submission was a clear reject the rebuttal was well done i agree with the concerns of the others reviewers about novelty overall i increased my rating to marginal above acceptance while i like that the paper aims to provide a more theoretical justification on attributions i am not satisfied with the rigor of the theory and the empirical evaluation i am not convinced that the proof is correct and the alignment with the normals should be checked using groundtruth knowledge overall the paper is wellwritten but please fix the grammar i cannot recommend the paper in its current form for acceptance if the proof were corrected and the evaluation extend to a groundtruth assessment of the normal i would reconsider my rating technical novelty the papers technical contribution is novel i am less convinced about the significance in its current form empirical novelty the paper does not present new empirical evaluations or datasets confidence i am confident about my assessment i read the proof and investigated the issue of resemblesanormaldistribution in depth i still might have missed other issues of the proof while i did looked at the referenced literature about adversarial examples i am more familiar with the interpretability side of the related work docsepthe paper has two main contributions a first it shows that one reason behind the attributions being more interpretable for adversarial robust models is that for these models the gradient with respect to the input is more closely aligned with the normal direction to a close decision boundary they empirically verify this claim by showing that the l2distance between attributions and their boundary variantsattributions computed at a close point on the decision boundary are lower for robust models than for standard models b using the previous fact they devise two new attribution methods bsm and big which can be used to get more interpretabilityexplanation from even a normal nonrobust model they again verify this claim empirically 
through various quantitative metrics aimed at finding the relation between positive attributions inside a localized bounding box of an object in the image strengths 1 the motivation of the idea is well explained in the paper 2 the mathematical foundation required for understanding is also well explained 3 i like the effort put in the paper in understanding the reasoning behind interpretable attributions for robust models and then using the info to devise new attribution methods 4 for both claims the paper does extensive qualitative and quantitative experiments weakness 1 the new attributions devised in the paper seem very similar to the agi attributionmentioned in the paper approach in big the attributions are computed along interpolations of x and its closest adversarial image whereas in agi the attributions are computed along each step of the adversarial image generation 2 in table1 mentioned in the paper the improvements along the two metrics used in other papers are not really significant the improvement only comes along with the two new metrics proposed in this paper i would like to see a comparison against some other metrics used in the related works such as top1 localization accuracy as used in 1 and 2 3 for a fairer comparison with the agi method can the authors use only the pgd attack for the adversarial image generation or the authors can also incorporate other adversarial images and not just pgd in agi for instance the agi method can be used to compute the attributions along each step of pgd cw and autopgd attacks and the final attribution is just the mean attribution of all three approaches 4 3 showed that their attribution technique works well with even multiple objects in the image can the authors show some qualitative results of comparison for multiple objects across different attribution methods references 1 attentionbased dropout layer for weakly supervised object localization choe et al 2019 2 on the benefits of models with perceptually aligned gradients aggarwal et al 2020 3 scorecam scoreweighted visual explanations for convolutional neural networks wang et al 2020 i have a few concerns regarding the quantitative experiments in the paper which are mentioned in the weakness section i will be willing to update my ratings if the authors address all my points docsepthis paper introduces boundary attributions which leverage the connection between boundary normal vectors and gradients to yield explanations for nonrobust models that carry over many of the favorable properties that have been observed of explanations on robust models it also proposes a big to explain models strengths 1 table 1 shows the empirical results are good weakness 1 my major concern for this paper is that the conclusion has already known for example ilyas et al shows that robust models can produce better perceptual aligned features when gradient descent and adversarial robust models are known to have smooth decision boundary 1 1 theoretically principled tradeoff between robustness and accuracy icml 2019 the conclusion is not novel to me which is already known to the community
### Summary:
|
this paper makes the following contributions 1 it shows that one reason behind the attributions being more interpretable for adversarial robust models is that for these models the gradient with respect to the input is more closely aligned with the normal direction to a close decision boundary 2 using the previous fact the authors devise two new attribution methods bsm and big which can be used to get more reliable explanations from even a normal nonrobust model while the reviewers agree that the premise of this paper is interesting some concerns remain post the rebuttal more specifically some reviewers opine that the agi and big methods are somewhat similar and other reviewers are not very convinced about some of the details eg the generalization of the orthogonality of sm to the decision boundary from the binary classification case section 31 to the more general case of relunets multiclass classifiers section 32 given this we are unable to accept this paper at this time we hope the authors find the reviewer feedback useful
|
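These reviews describe boundary-based integrated gradients (BIG) as integrating the saliency map along the straight-line path from an input x to a nearby adversarial example x' that approximates the closest decision-boundary point (found, e.g., with PGD or an ensemble of attacks). The snippet below is a minimal sketch of that description, assuming the adversarial example has already been computed; the function and argument names are illustrative and this is not the authors' code.

```python
import torch

def boundary_integrated_gradients(model, x, x_adv, target_class, steps=50):
    # Integrate d f_target / d x along the segment from x_adv (near the decision
    # boundary) to x, analogous to standard IG with x_adv playing the baseline role.
    x, x_adv = x.detach(), x_adv.detach()
    total_grad = torch.zeros_like(x)
    for i in range(1, steps + 1):
        alpha = i / steps
        point = (x_adv + alpha * (x - x_adv)).requires_grad_(True)
        score = model(point)[:, target_class].sum()
        total_grad += torch.autograd.grad(score, point)[0] / steps
    return (x - x_adv) * total_grad
```

One of the reviews also suggests checking attributions directly against the boundary normal in feature space; in the notation used there, that amounts to computing cos(Δz, w_i) between the induced feature-space change and the last-layer weight vector of class i.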
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1
] |
[
275,
2328,
44827,
9327,
1162,
355,
6247,
604,
368,
871,
835,
4496,
1127,
479,
281,
352,
891,
858,
247,
1355,
3368,
4266,
285,
17944,
253,
10670,
323,
1016,
7484,
253,
3969,
9840,
575,
261,
1677,
327,
1755,
285,
253,
2622,
3268,
556,
40009,
50276,
2609,
2808,
374,
1315,
317,
2609,
19,
268,
487,
1464,
575,
1542,
253,
24849,
342,
253,
2127,
923,
436,
3048,
3614,
71,
4699,
2135,
1559,
1370,
2275,
19,
681,
5073,
2477,
9866,
9866,
79,
9335,
32888,
938,
1423,
18848,
461,
19286,
609,
3062,
22416,
494,
12157,
1595,
8303,
6204,
6320,
532,
1362,
67,
50276,
42045,
323,
1027,
701,
284,
3614,
71,
4699,
2135,
1559,
1370,
2275,
19,
681,
5073,
2477,
9866,
9866,
79,
9335,
32888,
938,
1423,
2461,
14095,
1033,
1251,
50275,
284,
368,
476,
923,
352,
310,
760,
2810,
323,
9840,
1192,
89,
337,
767,
5482,
2226,
2057,
2085,
247,
10012,
342,
458,
82,
390,
1918,
247,
26565,
5955,
327,
253,
2219,
835,
760,
1192,
89,
50276,
263,
1014,
50276,
15532,
50275,
783,
643,
12291,
273,
10012,
337,
310,
326,
352,
760,
6556,
323,
327,
293,
4071,
774,
86,
6928,
891,
651,
1089,
247,
2159,
5955,
9371,
2139,
352,
1057,
417,
2186,
323,
295,
12026,
774,
86,
6928,
275,
1635,
352,
943,
320,
21947,
4768,
253,
2929,
326,
10012,
337,
310,
760,
323,
327,
293,
4071,
6928,
323,
1650,
275,
253,
1390,
12494,
273,
253,
10199,
50275,
664,
1246,
271,
1783,
326,
703,
1397,
1708,
327,
253,
3786,
45912,
4188,
485,
251,
273,
10237,
4665,
1430,
4645,
326,
12420,
875,
253,
2622,
11390,
273,
3061,
13674,
285,
3210,
27935,
310,
247,
2234,
24405,
13989,
337,
10012,
337,
50276,
32897,
1056,
2590,
275,
326,
6197,
285,
2571,
326,
10012,
337,
760,
12453,
327,
293,
4071,
6928,
50275,
255,
253,
990,
273,
2593,
4567,
4677,
884,
310,
23378,
347,
16774,
12820,
273,
10012,
337,
533,
891,
513,
417,
2096,
253,
4677,
285,
11743,
50275,
8155,
1972,
275,
42407,
875,
48237,
285,
270,
8433,
1411,
1027,
2629,
21492,
50276,
1171,
253,
305,
12064,
6046,
1543,
403,
10302,
327,
501,
3024,
1235,
4366,
253,
806,
5084,
10140,
281,
50275,
17,
50276,
32897,
19148,
752,
368,
971,
281,
7472,
342,
436,
4677,
24088,
253,
806,
5084,
2296,
40009,
10496,
50275,
15419,
2368,
50276,
74,
1158,
253,
7103,
273,
253,
5222,
1319,
281,
253,
3061,
7548,
476,
320,
5520,
275,
4677,
495,
8557,
273,
11786,
863,
2382,
1332,
285,
253,
3969,
7548,
863,
8303,
24088,
25477,
4632,
1943,
403,
2429,
281,
7472,
849,
2622,
253,
863,
8303,
403,
2299,
2139,
417,
2557,
253,
5222,
1319,
275,
253,
4735,
2317,
1182,
89,
575,
18711,
314,
50276,
91,
89,
310,
2931,
824,
326,
4993,
50276,
88,
262,
1182,
89,
359,
871,
326,
38435,
1364,
320,
2622,
281,
253,
3061,
7548,
347,
2011,
275,
4677,
374,
66,
253,
3969,
1818,
275,
1182,
5641,
273,
271,
863,
2382,
305,
89,
651,
320,
18687,
1182,
50276,
91,
89,
50276,
91,
89,
50276,
1637,
305,
89,
1024,
359,
476,
2557,
253,
14259,
273,
253,
2622,
38435,
285,
253,
1027,
863,
8303,
816,
11897,
7349,
3005,
1182,
50276,
22084,
323,
512,
253,
1027,
863,
8303,
436,
7103,
651,
14588,
253,
5998,
10746,
275,
1269,
5641,
281,
253,
3216,
33024,
5222,
932,
275,
1182,
5641,
253,
1655,
7103,
273,
863,
8303,
3082,
1411,
616,
7548,
6425,
2550,
2085,
824,
247,
3216,
33024,
3806,
50275,
783,
7103,
970,
253,
3216,
33024,
41113,
12783,
310,
247,
1175,
17335,
4836,
285,
3133,
281,
320,
11407,
9113,
352,
1537,
1056,
3282,
281,
760,
897,
3888,
835,
253,
41113,
3817,
10949,
1679,
685,
2456,
273,
253,
2460,
347,
2218,
275,
5807,
335,
91,
1162,
355,
9169,
253,
863,
2382,
1332,
275,
5807,
335,
91,
1162,
355,
9169,
1537,
671,
320,
271,
4722,
7431,
323,
253,
7103,
347,
352,
369,
671,
2104,
281,
562,
32231,
540,
4971,
285,
6032,
3805,
891,
651,
671,
1804,
13654,
327,
581,
390,
767,
17082,
323,
253,
41113,
3817,
4836,
3185,
273,
1740,
50276,
74,
651,
671,
11907,
253,
4477,
281,
2486,
253,
45985,
2451,
323,
2801,
294,
19078,
1320,
519,
2275,
333,
80,
1162,
355,
4765,
352,
310,
3477,
281,
3359,
285,
943,
320,
4817,
407,
667,
747,
863,
2382,
1332,
50274,
17465,
569,
50276,
6050,
891,
513,
417,
1158,
326,
253,
2929,
4419,
247,
7497,
538,
720,
7103,
697,
3480,
943,
320,
5393,
275,
253,
12291,
2593,
671,
253,
3779,
4364,
8115,
1007,
625,
16761,
651,
7497,
2686,
11528,
432,
352,
1014,
604,
627,
310,
247,
1534,
3064,
651,
368,
1902,
247,
1781,
1055,
1979,
4496,
671,
1618,
326,
10012,
337,
310,
760,
323,
327,
293,
333,
2122,
6928,
275,
253,
7364,
12291,
374,
417,
7763,
281,
20452,
863,
8303,
15877,
432,
2770,
273,
253,
2929,
285,
891,
1158,
627,
310,
417,
878,
281,
3748,
352,
50274,
37585,
389,
24551,
1641,
50275,
249,
958,
253,
958,
3239,
577,
50276,
6795,
254,
3064,
875,
253,
3064,
875,
3239,
818,
50276,
783,
922,
3239,
818,
50276,
38483,
13604,
352,
310,
10748,
281,
1555,
50276,
6377,
854,
50276,
262,
943,
320,
2829,
495,
285,
417,
4677,
495,
50276,
16726,
1541,
292,
3239,
721,
50276,
74,
1158,
352,
943,
320,
253,
38309,
273,
253,
1840,
5150,
310,
43966,
11786,
3239,
1458,
50275,
250,
3065,
50276,
10629,
335,
91,
1162,
355,
9169,
5987,
5758,
15337,
3024,
39061,
2352,
18,
89,
2484,
18,
610,
34338,
50276,
796,
32442,
80,
1162,
355,
4765,
5987,
39962,
2061,
5375,
1093,
2313,
1237,
4529,
50275,
6438,
30080,
22559,
5731,
50276,
783,
4477,
497,
2104,
281,
9004,
1419,
616,
4737,
285,
671,
2530,
4278,
281,
619,
643,
3533,
1223,
253,
3302,
19529,
369,
247,
2590,
12009,
253,
30080,
22559,
369,
973,
2218,
891,
5194,
342,
253,
7350,
273,
253,
2571,
30628,
670,
38135,
4583,
891,
2559,
619,
13716,
281,
16888,
1840,
14924,
1223,
891,
751,
326,
253,
2929,
13698,
281,
2085,
247,
625,
10527,
22861,
327,
863,
8303,
891,
717,
417,
10048,
342,
253,
8132,
263,
273,
253,
3762,
285,
253,
16774,
7103,
891,
717,
417,
13762,
326,
253,
4737,
310,
3451,
285,
253,
12420,
342,
253,
5222,
932,
943,
320,
10141,
970,
3216,
33024,
3640,
4583,
253,
2929,
310,
973,
15720,
533,
4496,
4993,
253,
28146,
891,
2550,
5583,
253,
2929,
275,
697,
1655,
830,
323,
14924,
604,
253,
4737,
497,
15045,
285,
253,
7103,
9017,
281,
247,
3216,
33024,
6803,
273,
253,
2622,
891,
651,
24033,
619,
13716,
50276,
48746,
38135,
253,
9380,
7681,
7680,
310,
4460,
891,
717,
1679,
13762,
670,
253,
8453,
275,
697,
1655,
830,
50275,
358,
5378,
474,
38135,
253,
2929,
1057,
417,
1246,
747,
16774,
27163,
390,
15302,
50276,
39943,
891,
717,
13224,
670,
619,
6803,
891,
1239,
253,
4737,
285,
6949,
253,
2523,
273,
29217,
266,
1939,
35360,
275,
6864,
891,
1335,
1537,
452,
9829,
643,
3374,
273,
253,
4737,
1223,
891,
858,
3261,
387,
253,
23378,
6239,
670,
48960,
6667,
891,
717,
625,
7615,
342,
253,
4665,
1430,
1930,
273,
253,
2905,
789,
50276,
7152,
339,
431,
248,
2929,
556,
767,
2022,
9021,
50274,
66,
806,
352,
2722,
326,
581,
1921,
3212,
253,
863,
8303,
1146,
625,
4665,
494,
323,
48960,
10237,
3210,
310,
326,
323,
841,
3210,
253,
11786,
342,
1675,
281,
253,
3280,
310,
625,
8244,
15616,
342,
253,
2622,
3884,
281,
247,
2810,
3061,
7548,
597,
45190,
12654,
436,
1750,
407,
4645,
326,
253,
298,
19,
19893,
875,
863,
8303,
285,
616,
7548,
11640,
1595,
8303,
10302,
387,
247,
2810,
1127,
327,
253,
3061,
7548,
403,
2406,
323,
10237,
3210,
685,
323,
2629,
3210,
50274,
67,
970,
253,
2045,
958,
597,
45018,
767,
747,
863,
2382,
3082,
270,
3610,
285,
1943,
534,
476,
320,
908,
281,
755,
625,
4665,
1430,
911,
45525,
432,
1014,
247,
2622,
1327,
18848,
461,
1566,
597,
969,
12654,
436,
1750,
45190,
949,
2710,
11745,
17082,
11205,
387,
4560,
253,
5886,
875,
2762,
863,
8303,
3304,
247,
15783,
41113,
3817,
273,
271,
1789,
275,
253,
2460,
20544,
50273,
18,
253,
16038,
273,
253,
2934,
310,
973,
5544,
275,
253,
2929,
374,
253,
15965,
12153,
2424,
323,
4685,
310,
671,
973,
5544,
495,
891,
751,
253,
3434,
1691,
275,
253,
2929,
275,
4685,
253,
14720,
3212,
4665,
494,
863,
8303,
323,
10237,
3210,
285,
840,
970,
253,
8692,
281,
45018,
747,
863,
2382,
3082,
577,
323,
1097,
3916,
253,
2929,
1057,
9470,
18276,
285,
11745,
4679,
50276,
20881,
1255,
50275,
18,
253,
747,
863,
8303,
32434,
275,
253,
2929,
1646,
1077,
2074,
281,
253,
639,
74,
863,
2382,
13012,
275,
253,
2929,
2746,
275,
1943,
253,
863,
8303,
403,
10302,
2112,
20670,
569,
273,
1269,
285,
697,
8642,
48960,
2460,
5727,
275,
639,
74,
253,
863,
8303,
403,
10302,
2112,
1016,
3213,
273,
253,
48960,
2460,
5978,
374,
275,
2829,
18,
5393,
275,
253,
2929,
253,
11701,
2112,
253,
767,
17082,
908,
275,
643,
9380,
403,
417,
1663,
1534,
253,
7756,
760,
3249,
2112,
342,
253,
767,
747,
17082,
4081,
275,
436,
2929,
891,
651,
751,
281,
923,
247,
5301,
1411,
690,
643,
17082,
908,
275,
253,
2905,
2987,
824,
347,
1755,
18,
14536,
7200,
347,
908,
275,
337,
285,
374,
495,
323,
247,
22870,
83,
5301,
342,
253,
639,
74,
1332,
476,
253,
4477,
897,
760,
253,
23256,
69,
2983,
323,
253,
48960,
2460,
5978,
390,
253,
4477,
476,
671,
19071,
643,
48960,
3888,
285,
417,
816,
23256,
69,
275,
639,
74,
323,
4227,
253,
639,
74,
1332,
476,
320,
908,
281,
11897,
253,
863,
8303,
2112,
1016,
3213,
273,
23256,
69,
260,
88,
285,
1125,
412,
35333,
8104,
285,
253,
2457,
863,
2382,
310,
816,
253,
1599,
863,
2382,
273,
512,
1264,
7274,
577,
495,
2692,
326,
616,
863,
2382,
5853,
2987,
973,
342,
1014,
2709,
5113,
275,
253,
2460,
476,
253,
4477,
921,
690,
18276,
1543,
273,
5301,
323,
2709,
5113,
2439,
1027,
863,
2382,
3082,
50276,
250,
3065,
50273,
18,
4116,
3169,
5926,
483,
3828,
323,
22112,
22296,
1789,
14536,
2093,
70,
1162,
355,
6247,
50276,
19,
327,
253,
5373,
273,
3210,
342,
591,
916,
1230,
50276,
2132,
27935,
639,
5209,
18758,
1162,
355,
9169,
50276,
20,
4868,
12583,
4868,
24676,
5304,
22909,
323,
27311,
267,
11454,
6928,
259,
606,
1162,
355,
9169,
50276,
74,
452,
247,
1643,
7350,
5001,
253,
11745,
4679,
275,
253,
2929,
534,
403,
5393,
275,
253,
14855,
2593,
891,
588,
320,
7378,
281,
5731,
619,
17503,
604,
253,
4477,
2953,
512,
619,
2792,
50276,
7152,
33032,
2520,
2929,
23970,
7548,
863,
8303,
534,
25057,
253,
4602,
875,
7548,
2622,
11390,
285,
27935,
281,
4917,
22909,
323,
1327,
18848,
461,
3210,
326,
4459,
689,
1142,
273,
253,
13857,
3607,
326,
452,
644,
2540,
273,
22909,
327,
10237,
3210,
352,
671,
29328,
247,
1943,
281,
5513,
3210,
50276,
296,
3755,
20556,
50276,
18,
2829,
337,
2722,
253,
16774,
1543,
403,
1175,
50276,
20881,
1255,
50276,
18,
619,
2201,
4468,
323,
436,
2929,
310,
326,
253,
6452,
556,
2168,
1929,
323,
1650,
209,
1031,
284,
1162,
355,
2722,
326,
10237,
3210,
476,
4711,
1805,
39612,
15616,
3386,
672,
11786,
18499,
285,
48960,
10237,
3210,
403,
1929,
281,
452,
6032,
3061,
7548,
337,
50276,
18,
28055,
3505,
74,
6216,
5454,
2727,
875,
31640,
285,
7200,
17857,
1686,
6247,
50276,
783,
6452,
310,
417,
4460,
281,
479,
534,
310,
2168,
1929,
281,
253,
3114,
2490,
187,
4118,
18435,
27,
2520,
2929,
2789,
253,
1563,
9021,
50276,
18,
352,
2722,
326,
581,
1921,
3212,
253,
863,
8303,
1146,
625,
4665,
494,
323,
48960,
10237,
3210,
310,
326,
323,
841,
3210,
253,
11786,
342,
1675,
281,
253,
3280,
310,
625,
8244,
15616,
342,
253,
2622,
3884,
281,
247,
2810,
3061,
7548,
374,
970,
253,
2045,
958,
253,
4477,
45018,
767,
747,
863,
2382,
3082,
50276,
1768,
78,
285,
1943,
50276,
4609,
476,
320,
908,
281,
755,
625,
9630,
22909,
432,
1014,
247,
2622,
1327,
18848,
461,
1566,
1223,
253,
30628,
5194,
326,
253,
26536,
273,
436,
2929,
310,
4722,
690,
7350,
3464,
1501,
253,
30080,
22559,
625,
5742,
690,
30628,
1121,
460,
326,
253,
639,
74,
285,
1943,
3082,
403,
8489,
2074,
285,
643,
30628,
403,
417,
1077,
13762,
670,
690,
273,
253,
4278,
24088,
253,
26647,
273,
253,
9373,
38931,
1319,
273,
924,
281,
253,
3061,
7548,
432,
253,
8985,
9162,
1083,
2593,
4562,
281,
253,
625,
2087,
1083,
273,
774,
328,
1507,
23559,
14407,
49996,
2593,
4567,
1677,
436,
359,
403,
7591,
281,
2997,
436,
2929,
387,
436,
673,
359,
3524,
253,
4477,
1089,
253,
37317,
8680,
4217
] |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the paper observes the onesided convergence phenomenon of gans training and proposes the onesided mvi condition suitable for this problem then the convergence analysis is provided for the proposed amsgradeg and amsgradegdrd algorithms pros the paper is well written and easy to follow the convergence analysis of the proposed algorithms is solid cons my main concerns of the paper focus on the motivation and the experiments when training the gan models we usually require that the loss function given by the discriminator approaches zero in terms of either kl divergence or wasserstein distance although fig 1 gives the gradient information of both the generator and discriminator it is hard to say if the model really converges well as a nonconvex problem it is possible that the gradient of the generator approaches zero while the model actually doesnt converge in such a situation the generated data may be meaningless moreover with the low quality of the generated images it is highly possible that the gan model does not converge thus i am not sure if the gan model is a proper application of the proposed algorithms in the experiments the authors use the wasserstein gan arjovsky et al 2017 to conduct the experiments the vanilla wgan arjovsky et al 2017 model may fail to converge due to the lack of the 1lipschitz constraint of the discriminator the generator only works well with a promising and working discriminator this means that the used wgan model may not satisfy assumption 1 or assumption 3 in such a situation it is hard to differentiate if the proposed mechanism or the nonworking discriminator causes the zero gradient problem of the generator to make the experiments more convincing i suggest the authors experiment on a 2dimensional toy dataset first which will be much simpler and easier to find if the model does converge it seems that the theoretical part of the paper does not support the motivation well if this problem can be solved ill raise my rating docsepthis paper analyzes the performance of adamtype algorithms amsgrad to be specific in nonconvex nonconcave minimax optimization the authors propose that adamtype algorithms can converge to a stationary point with the standard mvi assumption and an even weaker onesided mvi assumption the authors verify their claims using experiments this paper looks particularly interesting to me because of its clarity in presentation and its novelty in the theoretical results as i am not an expert in minmax optimization or gans i cannot be very confident that the results are completely new however analyzing adamtype algorithms in the training of gans is as far as i am aware an important yet not sufficiently explored direction the authors have chosen a general nonconvex nonconcave problem to analyze and have shown the results under the standard mvi assumption and the onesided mvi assumption the onesided convergence behavior is also supported by the experiments therefore i like the results of this paper in terms of weaknesses i would like to discuss the following questions with the authors 1 in terms of the assumptions in table 1 since the authors not only need mvi but also some other assumptions eg lipschitzness and boundedness in assumption 1 why do the authors only list mvi in the table i spent some time reading the other papers mentioned in this table and it seems to me that different papers have very different assumptions and assumption 1 does not seem to be used in all of them therefore i dont think table 1 is clear enough i wish the authors could 
explain table 1 a little more thoroughly 2 in terms of the optimization algorithm i wonder whether the other adaptive algorithms could also obtain similar results of course amsgrad is a perfectly fine choice however i dont see the technical difficulty if the authors just use a general nondecreasing ht since the authors are claiming the convergence results for adamtype algorithms a general choice of ht would make their argument stronger besides is it possible to use popovs extra gradient to reduce the number of queries to the sfo 3 the experiments part section 4 looks less convincing to me or even unnecessary because the main point of this paper is on the theoretical side and to show that onesided convergence exists i dont see why it is needed to compare the performance of the proposed algorithm with sgda also there are no fid scores inception scores and its really hard to tell which images are better again i am not an expert in this area and i might be missing something i look forward to the authors discussions this paper provides a solid contribution to the area of minmax optimization with their theory supported by experiments in training gans docsepthis manuscript developed several algorithms eg amsgradeg amsgradegdrd for nonconvexnonconcave minmax optimization the convergence result of amsgradegdrd is shown under the onesided mvi condition polynomialtime complexity results are established some toy experiments are conducted for gan on mnist and fashionmnist datasets strengths the paper is wellwritten the onesided mvi assumption is an interesting observation weaknesses 1 the main technical proofs follow closely to liu et al 2020 except for changing the gradient to movingaveraged gradient amsgradeg uses the same mvi assumption as in liu et al 2020 amsgradegdrd only requires onesided mvi assumption but the convergence measure is also weaker only partial gradient in terms of x please explain what is the technical contribution of your theorem 31 and theorem 32 when compared with liu et al 2020 2 onesided mvi is not weaker than mvi assumption since they cannot imply each other however theorem 32 is a weaker convergence result since the lhs is only a partial gradient in terms of x 3 for both theorem 31 and theorem 32 the convergence crucially relies on large minibatch this is not practical 4 assumption 1 4 requires bounded iterates however it seems that the authors did not consider a projection in their algorithm 5 i have a very big concern about experiments i the evaluation is not comprehensive and not convincing for example the authors only show the results on mnist dataset which is too small for gan also they only provided subjective generated pictures without providing quantitative results eg inception score and frechet inception distance ii the experiments presented in figure 3 are very poor for example columns d e f are all pictures of visually bad quality i personally do not think column e is better than d and f at all this manuscript proposed an interesting onesided mvi assumption for nonconvexnonconcave minimax problems and developed some algorithms with polynomialtime complexity to firstorder stationary points the theory requires certain unrealistic assumptions eg large minibatch bounded iterates and overall is not surprising empirical results are poorly evaluated and are not convincing docsepthis paper analyzes two variants of the adam optimizer and proves their convergence under either the standard mvi condition or the newly proposed onesided mvi condition the aforementioned adam 
variants are then empirically evaluated by training gans on mnist fashionmnist and cifar10 demonstrating better sample quality than sgd the proposed onesided mvi condition is interesting and the analysis under this condition provides new insights into the convergence of adaptive optimization algorithms on minmax problems such as gans the theoretical analysis is clear and easy to follow the authors argue that it is very unlikely that there is an optimal discriminator based on the observation that the gradient norm of discriminators remains large throughout the training process of gans however it is possible there exists an optimal solution for discriminators which the optimizer fails to converge to and there is no convincing evidence in the paper to prove otherwise for instance the mvi condition seems to hold reasonably well in some scenarios such as the dcgan experiments presented in appendix c furthermore the adam variants are proposed to simplify the analysis of adam therefore it is necessary to show if the variants perform similarly to adam in practice which is not done in the empirical study in addition the generative performances of different optimizers are only compared qualitatively without using metrics like fid there are also noticeable typos or errors throughout the paper eg in sec 21 stochastic gradient descent sgd was originally proposed by goodfellow et al 2016 and in sec 5 provide the theoretical guarantee of the onesided convergence of adam under onesided mvi condition where it should be adamlike algorithms instead this paper presented some interesting theoretical analysis of adaptive optimization algorithms but part of its theory and experiments are not convincing enough and the overall writing needs to be improved
### Summary:
this paper studies the convergence of adamtype algorithms two variants of amsgrad in particular in minmax problems that satisfy a onesided minty variational inequality condition the reviewers identified several weaknesses in the paper and the authors did not provide a rebuttal to these concerns so there was consensus to reject the paper
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper presents heterogeneous graph transformer hgt a new architecture that combines useful properties of gnns and transformers to design hgt for combinatorial reasoning problems particularly boolean satisfiability sat the idea of combining these two powerful models is appealing the motivation behind considering both homogeneous attention between two literals or between two clauses and heterogeneous attention between a literal and a clause seems new and interesting too however there are several missing pieces in the paper in the present form as outlined below the closest existing work in terms of model architecture appears to be graph attention network or gatconv 2017 however the first time this work is mentioned is quite late in section 42 im not sure why i would have expected to see this relationship mentioned in the intro or related work sections without it the proposal seems more novel than it actually is i also find the title transformers satisfy too distant from the proposed technique which is a combination of heterogeneous gnns and attention mechanism extending gatconv why the emphasis on transformers they are a particular way of using an alltoall attention mechanism which isnt quite what the proposed approach uses similarly the emphasis of the abstract and intro on solving csps is too broad for whats supported in the paper after the sentence a csp can be formulated as a cnf formula on page 2 the rest of the formulation as well as all experiments are for the sat problem it would be better to frame the work as a new approach for satisfiable instances of sat its ok to mention that it can be generalized to csps but the focus is best left on sat i should note that even within sat the focus here is on 4 families of random instances leaving out a whole universe of handcrafted as well as application industrial instances used to evaluate sat solvers the claim in the 2nd paragraph of the intro that conventional csp solvers rely on handcrafted heuristics and hence the resulting model is bounded by the greedy strategy which is generally suboptimal is misleading it suggests that existing and proposed neural methods are closer to being optimal than conventional csp solvers which cannot be farther from the truth the fact is neural methods for csp and sat are one to two decades behind conventional solvers in terms of the scale of problems they can convincingly solve its an important research area but not one that has yet even come close to conventional solvers the presentation is difficult to follow in places as it leaves many important details unclear eg section 33 tries to give an intuitive connection to transformers but relies on specific prior knowledge of queries keys similarity computation etc in transformers or attention mechanisms which is left undescribed i would suggest cutting down some standard details about csps or factor graphs to instead focus on newer concepts in section 431 the choice of vi floorxi 05 eps appears unusual and unnecessarily reliant on the parameter epsilon why not simply use vi floorxi 05 the typical way to implement rounding later in that same section i did not follow the intuition behind the regularizer term in eqn 8 the motivation was to shift the value of each xi away from 05 and towards either 0 or 1 presumably it doesnt matter whether xi goes towards 0 or towards 1 as long as the regularizer pushes it either way how does the relu term achieve this for one it is asymmetric while the motivation was a symmetric move away from 05 in the experiments its not clear what 
exactly are competing approaches trained on is it on the same distribution as the proposed model or something else were any hyperparameters of baseline approaches tuned during training im not sure whats the intended empirical support for the claim just before the start of 5.2 models trained on these graphs generalize to other csp problems dataset and larger graphs is there a particular experiment you are referring to importantly the experiment section is completely missing accuracy results on the test set when comparing to rlsat and pdp sure training accuracy gives some intuition but its impossible to judge the value of the method without comparing accuracy on unseen test instancesdocsepsummary of the paper the paper proposes a novel architecture called heterogeneous graph transformers hgt to solve constraint satisfaction problems csp using unsupervised learning their model seems to achieve good results on classical problems encoded as csp 3sat kcoloring kcover kclique with a few hundreds of variables and clauses along with faster running time as compared to other contemporary methods because of a clauseparallel model architecture the main contribution of the paper is to modify typical messagepassing steps in graph neural networks to take more of the clausevariable structure into account and accelerate messagepassing inbetween clauses and variables respectively strengths a new approach to solving csp using graph neural networkstransformers with some modeling improvements leading to reduced solution times and better success rate in finding satisfiable assignments weaknesses the experimental setup and results need significant improvements before this work can be published see details in questions below writing the motivating story in the introduction is not so clearcut there is a mention of previous work being limited to sequential algorithms but no strong intuition is offered for why clauseparallelism is useful additionally some key technical terms eg crossattention metapaths are mentioned with little context making it difficult for the reader to start grasping the papers contributions early on novelty the attention mechanisms presented are very much similar to other attention mechanisms like selsam et al 2018 the use of transformertype models is not uncommon and was used prominently for tsptype problems in this paper which was not cited kool wouter herke van hoof and max welling attention learn to solve routing problems international conference on learning representations 2018 recommendation there are some positives to this work particularly in the careful messagepassing design however i have to recommend a reject because the experimental results are very very limited at this point it is unlikely that the authors will be able to address all the issues i raise here but i encourage them to do so in the near future and submit to future conferences questions to the authors in no particular order 1 please show comparisons with a stateoftheart optimization csp solver such as ibm cplexs cp optimizer or others 2 test accuracy figure 4 only showcases the speedup achieved by the method compared to other approaches please report the test accuracy numbers mean and standard deviation otherwise the tables show only training results which does not tell the reader much 3 homogeneous and heterogeneous graph attention the distinction between homogeneous and heterogeneous graph attention is not clearly stated the only distinction seems to be different sizes of the initial node v in v and edge u in u embedding
sizes v is of fv dimension and u is of fu dimension as opposed to starting with the same dimensions as in velickovic et al 2017 ideally in heterogeneous attention e(vi, uj) and e(uj, vi) should lead to different values however based on the definition in equation 3 both of them will attain the same value it would be nice if you can clearly state the differences 4 four encoderlayers in the encoder and three decoderlayers in the decoder how did you choose the numbers of layers here same question for all the hyperparameters described under general setup in section 5.3 5 section 4.3.1 the definition of l(xi, eia) after equation 5 is different from how l(xi) is used with a single argument in 5 what is the definition of eia also can you clarify what you mean by is applied to specify the polarity of each variable 6 table 3 the time difference with pdp is clearly implementationdependent can you comment on the significance of these results also it is impossible to guess how many learnable parameters pdp and your model have can you add that information to the table minor the modern approach is trending to solve csp through neural symbolic methods --> one modern trendy approach to solving csp is through neural symbolic methods unclear what this sentence means: hence the resulted model is bounded by the greedy strategy which is generally suboptimal the adjacent matrix --> the adjacency matrix literals forming a partition different from --> literals forming a partition are different from a highly paralleled messagepassing --> a highly parallelized messagepassing page 2 para 1 missing full stop in the last line figure 2b the annotations encoder and decoder in the light blue boxes seem to be interchanged docsep summary the paper presents a model for inferring the solution of a constraint satisfaction problem over boolean variables expressed in conjunctive normal form cnf the proposed model builds on existing works which represent the factor graph of the cnf as a bipartite graph and use a graph neural network for the learning task it redefines the graph to facilitate the message passing among the nodes which are in the same part of the graph ie cnf variablesclauses it also introduces a crossattention mechanism between the nodes belonging to different parts the proposed model is evaluated on a set of benchmark sat instances and is compared with an existing baseline method strong points the preliminary results are encouraging the ideas seem to be novel and there is a potential for impact the paper is wellwritten and easy to follow aspects to be improved ablation study it seems that metapaths can be applied without the attention mechanism if this is the case it should be clarified to what extent the obtained results depend on the combination of both ideas instance selection sat instances vary significantly in terms of difficulty it seems that the problem instances used in the experimental evaluation are all easy instances i was not able to evaluate the difficulty of the instances from the dataset description to have an estimate of the difficulty of the problems i solved all instances in the random 3sat class using the cdcl solver minisat and obtained these runtimes
\begin{array}{lrrr}
\textbf{problem class} & \textbf{instances} & \textbf{total runtime} & \textbf{average runtime} \\
\text{rand3-100-430} & 1000 & 0.709 & 0.00071 \\
\text{rand3-150-645} & 100 & 0.570 & 0.00570 \\
\text{rand3-200-860} & 100 & 6.171 & 0.06171
\end{array}
these results indicate that the random 3sat instances especially the ones with 100 and 150 variables are particularly easy sat instances the results of the experiments in the paper lead me to
believe that the instances of other problem classes ie kcoloring kcover and kclique are also among the easier sat instances the reported accuracies in tables 2 and 3 are quite high both for the proposed method and the baseline this indicates that there is room for including more challenging instances in the empirical analysis note that i am not suggesting that the learned models should compete with standard sat solvers im simply using standard solvers as a proxy to estimate the difficulty of the instances the authors can use a similar approach to categorize a set of instances eg a subset selected from a recent sat competition into different levels of difficulty eg easy medium and hard this will make the empirical analysis more reliable and ideally further demonstrates the advantages of the proposed method analysis and reporting the space in the experimental evaluation section can be used more effectively in contrast to the detailed description of the instances baseline and setup the main results are aggregated into two small tables i suggest spending this space on a more extensive empirical analysis moreover the section and figure on efficiency of testing can be summarized in a few sentences if the authors insist on presenting the inference speed as a merit of their proposed approach they should identify instances where the solving time using a sat solver is prohibitively long certainly more than a fraction of a second recommendation the paper presents interesting ideas which can potentially extend to domains beyond sat solving however the proposed ideas need to be evaluated in a more extensive and rigorous empirical study additional suggestions for improvement an interesting question which is for example studied by rlsat is the performance of models trained on one problem class when evaluated on another class this is not a central question yet would be a nice experiment to be included in an appendix docsepsummary the paper transformers satisfy presents an improved graph neural network model to solve sat problems prior applications of gnns to sat have used convolutional gnns instead of graph attention networks gats and this work suggests a modification of gats to improve their performance on the bipartite graphs encountered in sat the main part of the modification is that their gnns consider up to two hops between nodes such that variables can attend to other variables and clauses can attend to other clauses the paper then also suggests a way to allow crossattention between clauses and variables the authors report moderate accuracy gains at predicting satisfiability on small synthetic sat formulas on the positive side the authors explore the use of metapaths a way to exploit the bipartite structure to speed up the gnns but the authors do not contrast that with plain gats applied to their problem so it is hard to say how much that idea contributes my concerns are as follows the paper claims to address constraint satisfaction problems csp but focuses on sat the paper claims to consider transformers but uses graph neural networks evaluation done on very synthetic benchmarks does not necessarily carry over to other benchmarks table 2 shows that the performance of their approach does not outperform their baseline in 2 out of 3 benchmarks table 3 shows moderate gains but only on random sat problems the motivation of the work is to improve the performance of neural networks on sat problems but i am not convinced that there is any perspective to beat cdcl solvers with neural networks at this 
problem writing style the paper is understandable but could be written better formulations in several places are unclear for example abstract we define the heterogeneous attention mechanism based on metapaths for the selfattention between literals the crossattention based on the bipartite graph links between literals and clauses what does that mean what are meta paths what is crossattention introduction first sentence what does it help the reader to see only the claim that csp is important if you feel that sat and csp need motivation please provide concrete applications introduction we apply the crossattention mechanism to optimize message exchanges between heterogeneous nodes most of the terms here were not yet introduced and i only understood this sentence after reading the rest of the paper the main contribution section 42 is not explained in detail only the math is given and the reader has to reconstruct the motivation minor comment check capitalization of references
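the rounding question raised in the first review above (the comparison between vi = floor(xi + 0.5 - eps) and the plain vi = floor(xi + 0.5)) is easy to check numerically. the short python sketch below is not taken from the paper under review; the value of EPS and the exact sign and placement of the eps term are assumptions made only for illustration. it shows that the two roundings agree everywhere except within eps of the tie point xi = 0.5.

```python
import math

EPS = 1e-6  # assumed value; the review does not state the epsilon used in the paper

def round_with_eps(x, eps=EPS):
    # the perturbed rounding questioned in the review, reproduced here as
    # v_i = floor(x_i + 0.5 - eps); the sign of eps is an assumption
    return math.floor(x + 0.5 - eps)

def round_plain(x):
    # the standard rounding the reviewer suggests: v_i = floor(x_i + 0.5)
    return math.floor(x + 0.5)

for x in [0.3, 0.499, 0.5, 0.501, 0.7]:
    print(x, round_plain(x), round_with_eps(x))
# the two only differ at (or within eps of) the tie x = 0.5,
# where the eps variant rounds down to 0 instead of up to 1
```

this only makes the reviewers point concrete: away from the 0.5 tie the epsilon has no effect, so plain rounding would behave identically on almost all inputs.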
### Summary:
this paper presents a new graph neural network gnn architecture with attention and with applications to boolean satisfiability the reviewers expressed concerns over various aspects of the paper such as a need for better ablations and an analysis of the difficulty level of the sat problems used in evaluation no rebuttal was provided
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the paper provides an optimal transport ot based algorithm for improving existing summary networks for learning from setstructured data the proposed approach views each set as a distribution over a set of global prototypes to learn the distribution over the global prototypes the proposed approach minimizes the ot distance to the sets empirical distribution over data points empirical results demonstrate that the proposed framework improves upon the existing summary network approaches as well as metricbased fewshot classification and generative modeling applications originality the paper is original and proposes an optimal transport ot based algorithm for improving existing summary networks for learning from setstructured data quality the paper is technically sound however the novelty is a little bit limited the proposed approach doesnt improve upon the preexisting summary networks architecture clarity the paper is clear and easy to follow algorithm 1 should be moved to the main body rather than the appendix significance the work is significant but novelty is a bit limited limitations the paper lacks theoretical analysis for the proposed approach error bars are missing and not reported in the results questions to authors the paper provides an optimal transport ot based algorithm for improving existing summary networks for learning from setstructured data the paper is original and technically sound however the novelty is a little bit limited the proposed approach doesnt improve beyond the preexisting summary networks architecture empirical results demonstrate that the proposed framework improves upon the existing summary network approaches as well as metricbased fewshot classification and generative modeling applications the paper lacks theoretical analysis for the proposed approach and the error bars are not reported for the empirical results docsepthis work introduces a straightforward development for set representation learning in the metalearning context based on the intuition that the sets encountered in realworld metalearning tasks tend to have common attributes as illustrated in figure 1 the idea is to jointly learn these common attributes $\beta_{1:K}$ referred to as global centres or global prototypes and the parameters of a summary network using an optimal transport derived loss function that compares the empirical set distribution $p_j$ with a set summary distribution over the global centres/prototypes $q_j = \sum_{k=1}^{K} h_{jk} \delta_{\beta_k}$ the idea is elegant in its simplicity and effectiveness strengths the intuition of the work is clear and the approach is reasonable a priori this makes it easy to follow the motivations and illustrations and lowers the barrier to the community adopting the presented techniques the proposed method is simple and does not require complicated architectural or training adaptations to improve the models to which it is applied the experiments are extensive varied and compelling the related work section is mostly good and does a good job of comparing the contained approach with existing work although see the minor comments weaknesses the main weakness is in the presentation of the manuscript i think these problems can be addressed by the authors within the rebuttal period and my score is based on the expectation that this happens if the authors do not improve the technical aspects of the presentation i will lower my score 1 technical all the tables should include errors currently the tables 1 and 3 do not include errors the table captions should also include details of the number
of repetitions used to calculate the mean and the error that is being reported standard deviation or standard error on the mean an indication should be made when results are within error 2 a metastatistic would be useful for table 2 in many cases the original model and the ot model performances are within error but overall 14 of the 16 headtohead comparisons result in the ot model having the higher mean it is unlikely that the addition of the ot loss does not improve performance given such a high number of headtohead wins eg in a simple binomial model with p = 0.5 and 16 trials the probability of 14 or greater successes is 0.03 3 nontechnical figure 2 would be improved by using a better colour scheme and different markers for each model to make it easier to distinguish the different models in black and white or for a colour blind person this is probably an access issue which is why i have included it in the weaknesses rather than the minor comments and it is in the authors interest to improve the presentation of their results in any case 4 nontechnical the last sentence of page 2 reads despite their effectiveness there is no clear evidence that the output of these summary networks could describe a sets summary statistics well which have been proven crucial for the setstructured inference problems chen et al 2021 if x is necessary for y and z has y then z has x if describing a sets summary statistics well is crucial for setstructured inference problems and summary networks are effective at setstructured inference problems then the output of summary networks describes a sets summary statistics well either effectiveness is clear evidence or describing a sets summary statistics well is not crucial for these problems this sentence should be revised 5 there is no supplementary code provided which means i cannot verify the experimental claims this has resulted in a lower confidence score in my review minor comments and suggestions the following comments are minor and potentially subjective so feel free to ignore them my score does not depend on these comments being acted on infinite mixture prototypes for fewshot learning allen et al icml 2019 seems like a relevant reference that could be discussed in the related work section on developments to metric based approaches the approaches differ in fundamental ways and may be complementary but both have the concept of learning multiple centres and it may help to position this work to discuss the ideas presented here in contrast with those presented by allen et al algorithm 1 is in the appendix and that should be stated in the text when it is referenced eg page 4 algorithm 1 should be included in the main text as good as possible on page 5 should be as well as possible the related work section has many sentences that should be revised different from them that focus on where most related work to ours is different from these studies usually compute class prototypes figure 3 is not particularly compelling and takes up a lot of space that could be better used by presenting algorithm 1 in summary i think this is a well motivated and elegant proposal that has been shown to be effective in a variety of experimental settings i found the authors claims to be well supported and i cannot find technical fault with their work that would support rejection so i will recommend acceptance the lack of supplementary code reduces my confidence in the review as i can only assess the claims as they are presented the main issue with the work is its presentation which may be easily
improved within the rebuttal docsep this work proposes a method to improve set representation and applies it in the context of metalearning more precisely the method consists in jointly learning a summary network and prototypes using an optimal transport loss the prototypes and summary network output should minimize the sum of the optimal transport costs for all the sets of the meta dataset experiments are conducted in the context of fewshot learning and other tasks used to evaluate summary networks my main concern is on the novelty of this work indeed it seems to me that some ideas presented here can be found in previous papers on set representation learning that are not mentioned in 1 prototypes against which ot is computed are learned potentially jointly with an elementwise embedding with or without supervision although the outputs used in practice seem to differ 2 presents a similar method in the context of graph representation it is therefore difficult for me to understand the real contributions of this paper it would be great if the authors could comment on the similarities and differences between their method and these ones pros the paper is wellmotivated and mostly clear experiments are varied and seem sound in particular the proposed framework seems to improve existing methods for set representations deepsets and set transformer cons i have concerns on the novelty of this work questions and remarks equation 6 seems to be an instance of a wasserstein barycenter could you comment on this it could be worth looking at this other paper on set representation in the related work 3 which has a different approach to the current and previous work yet also similar ideas could the authors elaborate on the computational complexity of computing the transport plans how many sinkhorn iterations do you use how sensitive are your results with respect to the entropic regularization parameter epsilon 1 a trainable optimal transport embedding for feature aggregation and its relationship to attention grégoire mialon dexiong chen alexandre daspremont julien mairal 2 wasserstein embedding for graph learning soheil kolouri navid naderializadeh gustavo k rohde heiko hoffmann 3 rep the set neural networks for learning set representations konstantinos skianis giannis nikolentzos stratis limnios michalis vazirgiannis the paper seems sound but i have concerns on the novelty i am willing to raise my score if the authors clarify their contribution wrt 1 and 2 docsepthis paper proposed an improved training of summary networks using an optimal transport based auxiliary loss the summary network with negligibly increased parameters shows much better performance on amortized clustering pointcloud classification sum of digits fewshot classification and fewshot generation tasks strengths 1 this paper provides a promising research direction in using setstructured data on various downstream tasks especially the paradigm using welllearnt setrepresentations as a conditional input deserves more exploration to tackle the limitation of fewshot generation problems 2 the proposed algorithm is wellmotivated and is effective to implement for realworld applications in particular it is easy to combine the summary network with the majority of existing metricbased metalearning models and generative models 3 the experimental study is extensively conducted on various tasks related to setstructured data and the improvement is consistent over different tasks this shows the effectiveness of the proposed plugandplay framework weakness 1 the
main idea of this paper that uses optimal transport in representation learning has been widely studied such as in https://openreview.net/pdf?id=zk6vtvb84s and https://arxiv.org/pdf/2007.05840.pdf it would be better to clarify what the contribution of this paper is and what the major difference is in contrast to the existing literature of optimal transport for representation learning this paper is wellmotivated and definitely effective for realworld metalearning applications however the authors should discuss the literature of representation learning using optimal transport
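to make the optimal transport objects discussed in the reviews above concrete (the empirical set distribution $p_j$, the prototype distribution $q_j = \sum_{k=1}^K h_{jk} \delta_{\beta_k}$, and the sinkhorn iterations and entropic regularization parameter asked about in the third review), here is a minimal numpy sketch. it is an illustration under assumptions, not the papers code: the cost is taken to be squared euclidean distance, the set weights are uniform, and all shapes and values are made up. very small eps would need a log-domain sinkhorn for numerical stability.

```python
import numpy as np

def sinkhorn_ot_cost(X, prototypes, h, eps=1.0, n_iters=200):
    # entropic ot cost between the empirical distribution of a set X (uniform
    # weights over its n points) and q = sum_k h_k * delta_{beta_k} over K prototypes
    n, K = X.shape[0], prototypes.shape[0]
    a = np.full(n, 1.0 / n)                                      # empirical set weights p_j
    b = h                                                        # summary weights over prototypes
    C = ((X[:, None, :] - prototypes[None, :, :]) ** 2).sum(-1)  # assumed squared euclidean cost
    Kmat = np.exp(-C / eps)
    u, v = np.ones(n), np.ones(K)
    for _ in range(n_iters):                                     # standard sinkhorn updates
        u = a / (Kmat @ v)
        v = b / (Kmat.T @ u)
    P = u[:, None] * Kmat * v[None, :]                           # transport plan
    return float((P * C).sum())

# toy usage with made-up sizes: a set of 30 points in R^5 and 4 global prototypes
rng = np.random.default_rng(0)
X = rng.normal(size=(30, 5))
prototypes = rng.normal(size=(4, 5))
h = np.array([0.4, 0.3, 0.2, 0.1])  # stand-in for the summary network output, must sum to 1
print(sinkhorn_ot_cost(X, prototypes, h))
```

sweeping eps and n_iters in this sketch is one way to probe the sensitivity questions raised by the third reviewer about the entropic regularization and the number of sinkhorn iterations.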
### Summary:
the proposed method for set representation learning with an application to meta learning is wellmotivated and reasonable reviewers original concerns about novelty and technical presentation have been well explained and addressed in the revision if some theoretical analysis can be provided regarding the proposed method it would make this work stronger in summary a positive recommendation is given here
2957,
253,
3861,
9117,
285,
6010,
2990,
3453,
943,
15338,
253,
2020,
273,
253,
8654,
4616,
4815,
323,
512,
253,
5239,
273,
253,
11419,
10895,
4679,
403,
5196,
275,
253,
3634,
273,
1643,
11860,
4715,
285,
643,
8892,
908,
281,
7472,
6010,
6928,
619,
2022,
4468,
310,
327,
253,
38135,
273,
436,
789,
6296,
352,
3133,
281,
479,
326,
690,
5697,
3559,
1060,
476,
320,
1119,
275,
2045,
9380,
327,
873,
6779,
4715,
326,
403,
417,
5393,
275,
337,
3861,
9117,
1411,
534,
14366,
310,
10302,
403,
6311,
7826,
26277,
342,
271,
3284,
3020,
21496,
342,
390,
1293,
20446,
3738,
253,
18012,
908,
275,
3946,
1646,
281,
9184,
374,
10262,
247,
2074,
1332,
275,
253,
3634,
273,
4216,
6779,
352,
310,
3103,
2834,
281,
479,
281,
2096,
253,
1524,
9021,
273,
436,
2929,
352,
651,
320,
1270,
604,
253,
4477,
812,
4385,
327,
253,
22620,
285,
3910,
875,
616,
1332,
285,
841,
4394,
50276,
856,
84,
50276,
783,
2929,
310,
973,
24013,
8550,
285,
6571,
2590,
50276,
16217,
3825,
403,
12848,
285,
3133,
3590,
50276,
249,
1798,
253,
4081,
7792,
3133,
281,
3157,
5368,
3082,
323,
873,
14237,
3676,
19598,
285,
873,
39707,
50276,
5040,
50276,
74,
452,
7350,
327,
253,
38135,
273,
436,
789,
50276,
34974,
285,
16157,
50276,
29813,
721,
3133,
281,
320,
271,
4227,
273,
369,
2152,
6339,
28556,
9229,
812,
368,
4385,
327,
436,
50276,
262,
812,
320,
4409,
281,
1007,
387,
436,
643,
2929,
327,
873,
6779,
275,
253,
2905,
789,
495,
534,
556,
247,
1027,
2746,
281,
253,
1655,
285,
2045,
789,
2568,
671,
2074,
5697,
50276,
16534,
253,
4477,
21184,
327,
253,
15180,
10454,
273,
12672,
253,
4616,
5827,
849,
1142,
16338,
27721,
25142,
513,
368,
897,
50276,
5430,
7996,
403,
368,
1543,
342,
1675,
281,
253,
994,
12189,
37820,
4764,
299,
4277,
50273,
18,
50276,
66,
6194,
494,
8654,
4616,
21496,
323,
4735,
20828,
285,
697,
2954,
281,
4116,
650,
2184,
603,
278,
451,
251,
27625,
279,
72,
260,
864,
247,
1591,
395,
250,
277,
4938,
2013,
834,
49137,
1914,
278,
1094,
267,
50276,
19,
369,
2152,
6339,
21496,
323,
4216,
4715,
594,
248,
300,
38301,
13979,
6563,
301,
295,
6475,
451,
478,
796,
73,
30942,
41231,
465,
687,
73,
615,
344,
20592,
288,
2727,
8420,
50276,
20,
1234,
253,
873,
11454,
6928,
323,
4715,
873,
14237,
17022,
3223,
15777,
1629,
757,
261,
305,
757,
24836,
295,
1479,
311,
290,
46031,
15252,
261,
1579,
79,
3783,
49068,
21728,
362,
1370,
343,
15287,
24836,
253,
2929,
3133,
3590,
533,
891,
452,
7350,
327,
253,
38135,
891,
717,
7378,
281,
7164,
619,
4868,
604,
253,
4477,
19148,
616,
7680,
8772,
337,
285,
374,
5474,
33032,
2520,
2929,
4081,
271,
5520,
3733,
273,
6010,
2990,
970,
8654,
4616,
1754,
24026,
2957,
253,
6010,
2990,
342,
9768,
4360,
2559,
3602,
921,
1199,
1805,
3045,
327,
717,
430,
1025,
17524,
1127,
18534,
9162,
2020,
273,
24321,
1643,
11860,
9162,
285,
1643,
11860,
5978,
8892,
20544,
337,
436,
2929,
3400,
247,
12532,
2561,
3884,
275,
970,
873,
34218,
941,
327,
2710,
15450,
8892,
3340,
253,
22199,
970,
973,
29343,
85,
873,
12554,
569,
347,
271,
17697,
3280,
22828,
625,
31880,
569,
281,
18915,
253,
12291,
273,
1643,
11860,
5978,
3237,
50276,
19,
253,
4081,
5933,
310,
973,
24013,
8550,
285,
310,
3576,
281,
320,
9009,
323,
1524,
10186,
4893,
275,
1798,
352,
310,
3477,
281,
13398,
253,
6010,
2990,
342,
253,
5020,
273,
5368,
7982,
3169,
5148,
613,
920,
3210,
285,
1006,
800,
3210,
50276,
20,
253,
5661,
1263,
310,
18171,
5196,
327,
2710,
8892,
2905,
281,
873,
34218,
941,
285,
253,
7756,
310,
5185,
689,
1027,
8892,
436,
2722,
253,
12510,
273,
253,
4081,
10358,
395,
1993,
7792,
50276,
20881,
1255,
337,
253,
2022,
2934,
273,
436,
2929,
326,
4648,
8654,
4616,
275,
4715,
6779,
556,
644,
7561,
5421,
824,
347,
5987,
5758,
15337,
3024,
9275,
301,
32786,
23,
87,
18698,
67,
2759,
84,
5987,
39962,
2061,
9275,
8602,
30541,
1449,
9275,
352,
651,
320,
1805,
281,
19148,
752,
253,
7680,
273,
436,
2929,
310,
285,
752,
253,
2201,
3064,
310,
275,
4499,
281,
253,
5368,
6239,
273,
8654,
4616,
323,
6779,
4715,
436,
2929,
310,
973,
24013,
8550,
285,
7964,
3576,
323,
1524,
10186,
5148,
613,
920,
4893,
2299,
253,
4477,
943,
2319,
253,
6239,
273,
6779,
4715,
970,
8654,
4616,
2490,
187,
4118,
18435,
27,
783,
4081,
1332,
323,
873,
6779,
4715,
342,
271,
2898,
281,
1313,
70,
4715,
310,
973,
24013,
8550,
285,
5272,
30628,
3236,
7350,
670,
38135,
285,
7681,
9759,
452,
644,
973,
5544,
285,
9713,
275,
253,
18520,
604,
690,
10527,
1783,
476,
320,
2530,
5001,
253,
4081,
1332,
352,
651,
1056,
436,
789,
10046,
50276,
249,
6010,
247,
2762,
17401,
310,
1677,
1060
] |
[
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1
] |
[
253,
4156,
3861,
9117,
253,
4081,
2746,
46926,
253,
14366,
4181,
281,
253,
5239,
16774,
3268,
689,
941,
2792,
16774,
1543,
7568,
326,
253,
4081,
7792,
19132,
2220,
253,
5368,
6010,
2990,
7274,
347,
973,
347,
7982,
3169,
1643,
11860,
9162,
285,
1006,
800,
14053,
4893,
50275,
19164,
414,
50276,
783,
2929,
310,
3236,
285,
29328,
271,
8654,
4616,
14366,
1754,
5933,
323,
11138,
5368,
6010,
6928,
323,
4715,
432,
873,
34218,
941,
50276,
15177,
50276,
783,
2929,
310,
22335,
3590,
2299,
253,
38135,
310,
247,
1652,
2372,
3710,
253,
4081,
2746,
36908,
3157,
2220,
253,
638,
20137,
6010,
6928,
10336,
50276,
498,
15752,
50276,
783,
2929,
310,
2590,
285,
3477,
281,
956,
5933,
337,
943,
320,
4395,
281,
253,
2022,
2133,
2581,
685,
253,
30762,
50276,
9188,
40348,
50276,
783,
789,
310,
1534,
533,
38135,
310,
247,
2372,
3710,
50276,
17465,
569,
50275,
783,
2929,
19756,
10527,
1783,
323,
253,
4081,
2746,
50276,
3775,
8965,
403,
5816,
285,
417,
2361,
275,
253,
1543,
50275,
34974,
281,
4477,
50276,
783,
2929,
3400,
271,
8654,
4616,
14366,
1754,
5933,
323,
11138,
5368,
6010,
6928,
323,
4715,
432,
873,
34218,
941,
253,
2929,
310,
3236,
285,
50276,
23693,
1037,
3590,
2299,
253,
38135,
310,
247,
1652,
2372,
3710,
253,
4081,
2746,
36908,
3157,
4457,
253,
638,
20137,
6010,
6928,
10336,
16774,
1543,
7568,
326,
253,
4081,
7792,
19132,
2220,
253,
5368,
6010,
2990,
7274,
347,
973,
347,
7982,
3169,
1643,
11860,
9162,
285,
1006,
800,
14053,
4893,
253,
2929,
19756,
10527,
1783,
323,
253,
4081,
2746,
285,
253,
2228,
8965,
403,
417,
2361,
323,
253,
16774,
1543,
50276,
7152,
33032,
2520,
789,
23970,
247,
15246,
2440,
323,
873,
6779,
4715,
275,
253,
5148,
613,
920,
3634,
1754,
327,
253,
30328,
326,
253,
5239,
14494,
275,
1524,
10186,
5148,
613,
920,
8892,
5257,
281,
452,
1846,
12474,
347,
12800,
275,
4677,
337,
253,
2934,
310,
281,
26277,
3037,
841,
1846,
12474,
9840,
18,
76,
6289,
281,
347,
4156,
23221,
390,
4156,
3861,
9117,
285,
253,
3602,
273,
247,
6010,
2990,
970,
271,
8654,
4616,
6012,
2957,
1159,
326,
26662,
253,
16774,
873,
3268,
268,
75,
342,
247,
873,
6010,
3268,
689,
253,
4156,
23221,
10075,
9117,
2805,
75,
2204,
76,
18,
76,
288,
17443,
69,
2585,
357,
292,
518,
253,
2934,
310,
20654,
275,
697,
17647,
285,
12510,
20544,
50276,
783,
30328,
273,
253,
789,
310,
2590,
285,
253,
2746,
310,
5272,
247,
30400,
436,
2789,
352,
3477,
281,
956,
253,
42852,
285,
33954,
285,
45742,
253,
11394,
281,
253,
3114,
25987,
253,
3559,
5609,
50276,
783,
4081,
1332,
310,
2969,
285,
1057,
417,
2430,
9542,
27934,
390,
3733,
41655,
281,
3157,
253,
3210,
281,
534,
352,
310,
3732,
50276,
783,
4679,
403,
9470,
12848,
285,
18511,
50276,
783,
2905,
789,
2593,
310,
6571,
1175,
285,
1057,
247,
1175,
2628,
273,
10941,
253,
6221,
2746,
342,
5368,
789,
3738,
923,
253,
5884,
5701,
50276,
20881,
1255,
265,
50276,
783,
2022,
14855,
310,
275,
253,
9759,
273,
253,
7714,
891,
1158,
841,
1895,
476,
320,
9713,
407,
253,
4477,
1561,
253,
30080,
22559,
2180,
285,
619,
4868,
310,
1754,
327,
253,
15355,
326,
436,
6569,
604,
253,
4477,
513,
417,
3157,
253,
7681,
7794,
273,
253,
9759,
891,
588,
2406,
619,
4868,
50276,
18,
7681,
512,
253,
7180,
943,
2486,
6332,
4390,
253,
7180,
337,
285,
495,
513,
417,
2486,
6332,
253,
2829,
3403,
621,
943,
671,
2486,
4278,
273,
253,
1180,
273,
49495,
908,
281,
10173,
253,
1599,
285,
253,
2228,
326,
310,
1146,
2361,
2629,
11254,
390,
2629,
2228,
327,
253,
1599,
271,
14011,
943,
320,
1160,
672,
1543,
403,
1561,
2228,
50276,
19,
247,
8866,
255,
2531,
651,
320,
4217,
323,
2829,
374,
275,
1142,
2219,
253,
3236,
1566,
285,
253,
14366,
1566,
16226,
403,
1561,
2228,
533,
4583,
1638,
273,
253,
1668,
1481,
936,
2522,
14023,
1543,
275,
253,
14366,
1566,
1907,
253,
2169,
1599,
352,
310,
11543,
326,
253,
1635,
273,
253,
14366,
2957,
1057,
417,
3157,
3045,
1677,
824,
247,
1029,
1180,
273,
1481,
936,
2522,
24088,
275,
247,
2969,
47585,
1566,
342,
268,
1762,
285,
1668,
7587,
253,
5912,
273,
1638,
390,
3687,
34574,
310,
209,
4838,
50276,
20,
1327,
48746,
4677,
374,
651,
320,
5520,
407,
970,
247,
1805,
10688,
6974,
285,
1027,
9588,
323,
1016,
1566,
281,
1056,
352,
6927,
281,
12129,
253,
1027,
3210,
275,
2806,
285,
3168,
390,
323,
247,
10688,
9645,
1436,
436,
310,
3164,
271,
2289,
2523,
534,
310,
2139,
891,
452,
2908,
352,
275,
253,
32213,
2581,
685,
253,
5884,
5701,
285,
352,
310,
275,
253,
4477,
1600,
281,
3157,
253,
9759,
273,
616,
1543,
275,
667,
1083,
50276,
21,
1327,
48746,
253,
1390,
6197,
273,
3239,
374,
9563,
5747,
616,
12510,
627,
310,
642,
2590,
1941,
326,
253,
3453,
273,
841,
6010,
6928,
812,
6266,
247,
5239,
6010,
9990,
973,
534,
452,
644,
11464,
9560,
323,
253,
873,
34218,
17032,
3237,
260,
864,
1162,
355,
43425,
604,
1269,
310,
3309,
323,
340,
285,
1182,
556,
340,
840,
1182,
556,
1269,
604,
12930,
247,
5239,
6010,
9990,
973,
310,
9560,
323,
873,
34218,
17032,
3237,
285,
6010,
6928,
403,
3576,
387,
873,
34218,
17032,
3237,
840,
253,
3453,
273,
6010,
6928,
6266,
247,
5239,
6010,
9990,
973,
2057,
12510,
310,
2590,
1941,
390,
12930,
247,
5239,
6010,
9990,
973,
310,
417,
9560,
323,
841,
3237,
436,
6197,
943,
320,
17265,
50276,
22,
627,
310,
642,
24864,
2127,
2530,
534,
2097,
891,
2550,
12654,
253,
5661,
3916,
436,
556,
7369,
275,
247,
2406,
7162,
4868,
275,
619,
2278,
50276,
37585,
5701,
285,
13991,
50276,
783,
1563,
5701,
403,
5884,
285,
7826,
17854,
594,
1928,
1959,
281,
11823,
731,
619,
4868,
1057,
417,
3469,
327,
841,
5701,
1146,
14001,
327,
50276,
2050,
8234,
7802,
3861,
9117,
323,
1643,
11860,
4715,
512,
257,
1162,
355,
17857,
1686,
6247,
3133,
751,
247,
4623,
3806,
326,
812,
320,
5469,
275,
253,
2905,
789,
2593,
327,
16936,
281,
7982,
1754,
7274,
253,
7274,
9184,
275,
7936,
4088,
285,
778,
320,
48301,
533,
1097,
452,
253,
4473,
273,
4715,
2709,
23221,
285,
352,
778,
1361,
281,
1899,
436,
789,
281,
2319,
253,
5697,
3559,
1060,
275,
4499,
342,
1110,
3559,
407,
512,
257,
1162,
355,
50276,
41528,
337,
310,
275,
253,
30762,
285,
326,
943,
320,
4767,
275,
253,
2505,
672,
352,
310,
23378,
24088,
3239,
577,
50276,
41528,
337,
943,
320,
2908,
275,
253,
2022,
2505,
50276,
284,
1175,
347,
1896,
327,
3239,
608,
943,
320,
347,
973,
347,
1896,
50276,
783,
2905,
789,
2593,
556,
1142,
14683,
326,
943,
320,
17265,
50274,
19623,
432,
731,
326,
2770,
327,
50274,
2811,
954,
2905,
789,
281,
20451,
310,
50274,
19623,
432,
841,
2175,
3798,
11897,
966,
3861,
9117,
50276,
13206,
495,
310,
417,
3782,
18511,
285,
3936,
598,
247,
2257,
273,
2317,
326,
812,
320,
1805,
908,
407,
15250,
5933,
337,
275,
6010,
891,
1158,
436,
310,
247,
973,
17194,
285,
20654,
10419,
326,
556,
644,
2011,
281,
320,
3576,
275,
247,
5235,
273,
5661,
7533,
891,
1119,
253,
4477,
3916,
281,
320,
973,
4516,
285,
891,
2550,
1089,
7681,
9331,
342,
616,
789,
326,
651,
1329,
18235,
594,
891,
588,
5583,
14924,
253,
3480,
273,
24864,
2127,
11355,
619,
7162,
275,
253,
2278,
347,
891,
476,
760,
2939,
253,
3916,
347,
597,
403,
3559,
253,
2022,
2523,
342,
253,
789,
310,
697,
9759,
534,
778,
320,
4354,
5520,
1561,
253,
30080,
22559,
5474,
33032,
436,
789,
29328,
247,
1332,
281,
3157,
873,
6779,
285,
10384,
352,
275,
253,
3634,
273,
5148,
613,
920,
625,
10534,
253,
1332,
8414,
275,
26277,
4715,
247,
6010,
2990,
285,
3861,
9117,
970,
271,
8654,
4616,
2957,
253,
3861,
9117,
285,
6010,
2990,
3453,
943,
15338,
253,
2020,
273,
253,
8654,
4616,
4815,
323,
512,
253,
5239,
273,
253,
11419,
10895,
4679,
403,
5196,
275,
253,
3634,
273,
1643,
11860,
4715,
285,
643,
8892,
908,
281,
7472,
6010,
6928,
619,
2022,
4468,
310,
327,
253,
38135,
273,
436,
789,
6296,
352,
3133,
281,
479,
326,
690,
5697,
3559,
1060,
476,
320,
1119,
275,
2045,
9380,
327,
873,
6779,
4715,
326,
403,
417,
5393,
275,
337,
3861,
9117,
1411,
534,
14366,
310,
10302,
403,
6311,
7826,
26277,
342,
271,
3284,
3020,
21496,
342,
390,
1293,
20446,
3738,
253,
18012,
908,
275,
3946,
1646,
281,
9184,
374,
10262,
247,
2074,
1332,
275,
253,
3634,
273,
4216,
6779,
352,
310,
3103,
2834,
281,
479,
281,
2096,
253,
1524,
9021,
273,
436,
2929,
352,
651,
320,
1270,
604,
253,
4477,
812,
4385,
327,
253,
22620,
285,
3910,
875,
616,
1332,
285,
841,
4394,
50276,
856,
84,
50276,
783,
2929,
310,
973,
24013,
8550,
285,
6571,
2590,
50276,
16217,
3825,
403,
12848,
285,
3133,
3590,
50276,
249,
1798,
253,
4081,
7792,
3133,
281,
3157,
5368,
3082,
323,
873,
14237,
3676,
19598,
285,
873,
39707,
50276,
5040,
50276,
74,
452,
7350,
327,
253,
38135,
273,
436,
789,
50276,
34974,
285,
16157,
50276,
29813,
721,
3133,
281,
320,
271,
4227,
273,
369,
2152,
6339,
28556,
9229,
812,
368,
4385,
327,
436,
50276,
262,
812,
320,
4409,
281,
1007,
387,
436,
643,
2929,
327,
873,
6779,
275,
253,
2905,
789,
495,
534,
556,
247,
1027,
2746,
281,
253,
1655,
285,
2045,
789,
2568,
671,
2074,
5697,
50276,
16534,
253,
4477,
21184,
327,
253,
15180,
10454,
273,
12672,
253,
4616,
5827,
849,
1142,
16338,
27721,
25142,
513,
368,
897,
50276,
5430,
7996,
403,
368,
1543,
342,
1675,
281,
253,
994,
12189,
37820,
4764,
299,
4277,
50273,
18,
50276,
66,
6194,
494,
8654,
4616,
21496,
323,
4735,
20828,
285,
697,
2954,
281,
4116,
650,
2184,
603,
278,
451,
251,
27625,
279,
72,
260,
864,
247,
1591,
395,
250,
277,
4938,
2013,
834,
49137,
1914,
278,
1094,
267,
50276,
19,
369,
2152,
6339,
21496,
323,
4216,
4715,
594,
248,
300,
38301,
13979,
6563,
301,
295,
6475,
451,
478,
796,
73,
30942,
41231,
465,
687,
73,
615,
344,
20592,
288,
2727,
8420,
50276,
20,
1234,
253,
873,
11454,
6928,
323,
4715,
873,
14237,
17022,
3223,
15777,
1629,
757,
261,
305,
757,
24836,
295,
1479,
311,
290,
46031,
15252,
261,
1579,
79,
3783,
49068,
21728,
362,
1370,
343,
15287,
24836,
253,
2929,
3133,
3590,
533,
891,
452,
7350,
327,
253,
38135,
891,
717,
7378,
281,
7164,
619,
4868,
604,
253,
4477,
19148,
616,
7680,
8772,
337,
285,
374,
5474,
33032,
2520,
2929,
4081,
271,
5520,
3733,
273,
6010,
2990,
970,
8654,
4616,
1754,
24026,
2957,
253,
6010,
2990,
342,
9768,
4360,
2559,
3602,
921,
1199,
1805,
3045,
327,
717,
430,
1025,
17524,
1127,
18534,
9162,
2020,
273,
24321,
1643,
11860,
9162,
285,
1643,
11860,
5978,
8892,
20544,
337,
436,
2929,
3400,
247,
12532,
2561,
3884,
275,
970,
873,
34218,
941,
327,
2710,
15450,
8892,
3340,
253,
22199,
970,
973,
29343,
85,
873,
12554,
569,
347,
271,
17697,
3280,
22828,
625,
31880,
569,
281,
18915,
253,
12291,
273,
1643,
11860,
5978,
3237,
50276,
19,
253,
4081,
5933,
310,
973,
24013,
8550,
285,
310,
3576,
281,
320,
9009,
323,
1524,
10186,
4893,
275,
1798,
352,
310,
3477,
281,
13398,
253,
6010,
2990,
342,
253,
5020,
273,
5368,
7982,
3169,
5148,
613,
920,
3210,
285,
1006,
800,
3210,
50276,
20,
253,
5661,
1263,
310,
18171,
5196,
327,
2710,
8892,
2905,
281,
873,
34218,
941,
285,
253,
7756,
310,
5185,
689,
1027,
8892,
436,
2722,
253,
12510,
273,
253,
4081,
10358,
395,
1993,
7792,
50276,
20881,
1255,
337,
253,
2022,
2934,
273,
436,
2929,
326,
4648,
8654,
4616,
275,
4715,
6779,
556,
644,
7561,
5421,
824,
347,
5987,
5758,
15337,
3024,
9275,
301,
32786,
23,
87,
18698,
67,
2759,
84,
5987,
39962,
2061,
9275,
8602,
30541,
1449,
9275,
352,
651,
320,
1805,
281,
19148,
752,
253,
7680,
273,
436,
2929,
310,
285,
752,
253,
2201,
3064,
310,
275,
4499,
281,
253,
5368,
6239,
273,
8654,
4616,
323,
6779,
4715,
436,
2929,
310,
973,
24013,
8550,
285,
7964,
3576,
323,
1524,
10186,
5148,
613,
920,
4893,
2299,
253,
4477,
943,
2319,
253,
6239,
273,
6779,
4715,
970,
8654,
4616,
2490,
187,
4118,
18435,
27,
783,
4081,
1332,
323,
873,
6779,
4715,
342,
271,
2898,
281,
1313,
70,
4715,
310,
973,
24013,
8550,
285,
5272,
30628,
3236,
7350,
670,
38135,
285,
7681,
9759,
452,
644,
973,
5544,
285,
9713,
275,
253,
18520,
604,
690,
10527,
1783,
476,
320,
2530,
5001,
253,
4081,
1332,
352,
651,
1056,
436,
789,
10046,
50276,
249,
6010,
247,
2762,
17401,
310,
1677,
1060
] |
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper tackles the inverse problem of seismic deblending by using the well known plugandplay pnp framework together with a recently proposed selfsupervised denoiser which is adapted to follow the noise structure of the particular problem at hand experiments on a real dataset artificially blended by the authors show that the approach outperforms a stateoftheart method based on a fourier sparsity prior and an ablation study show that the proposed combination of pnp with a carefully trained selfsupervised denoiser is efficient strengths the paper is generally very didactic quite unusually 5 out of 9 pages are devoted to the introduction background and related work sections in that sense the paper can almost be regarded as a tutorial paper giving a subsequent overview of the field of seismic deblending and of recently proposed selfsupervised denoising techniques to nonexpert readers the proposed approach is sound and seem to outperform a more conventional technique on a real dataset weaknesses while 5 pages are devoted to introductory materials only half a page is devoted to presenting the methodology on the one hand this seems short on the other hand this can be explained by the fact that the proposed methodological contribution is very incremental two relatively wellknown existing techniques are straightforwardly combined and applied to a known problem in the field the experiments are relatively limited eventhough this may be justified by the lack of available real data for this problem still more detailed quantitative results could have been provided no detail is given on the key structbs step of the method the selfsupervised denoising not the paper nor in the supplementary material the reader is entirely referred to the recent reference 47 for this in fact the reference to 47 is implicit and the acronym structbs is never made explicit this makes the methodological part of paper not selfcontained which is not good for research reproducibility and transmission a lot of important implementation details are scattered across the experimental section or in the appendix which would make the proposed method very hard to reimplement in the ablation study not using the pnp iterations and using only selfsupervised denoising seem to be equivalent to what is done in 47 but this is not explicitly stated by the authors more details should be given on these results than a single number since according to the authors themselves 47 is the closest work to what they propose the limitations and societal impact of the work are adequately addressed by the authors in section 6 docsepthe paper introduces the problem of seismic deblending and provide a stateofthe art result in this problem using denoising network in the pnp setup denoising model is learned as a part of pnp strengths 1 introduces seismic deblending to ml community seismic problems are generally less explored in ml community and this paper introduces a good problem 2 good results with learned denoiser weaknesses 1 writing the paper devotes very less space to the main algorithm pnp denoiser also for people who are not familiar with the admm algorithm like me the section 4 reads very difficult with a lot of terms introduced but not explained like yk xupdate yupdate lsqr stuctbs theta these details are provided partially in the supplementary however more discussion space should be provided to the algorithmic part 2 novelty i am not sure if paper provides enough novelty for this conference it uses denoiser architecture in a seismic setting and shows 
good results however it is not clear what the contributions are apart from applying existing denoiser in seismic pnp 3 experimental validation is limited the experiments are performed on a single dataset more empirical validation is needed limitations are adequately addressed docsepthis paper combines a selfsupervised deep image denoising method and pnp framework to improve seismic deblending the problem is interesting and significant however the rationale of the proposed algorithm is unclear it seems like a combination of existing techniques and the technical contribution seems limited strength 1 this paper aims to improve the performance of seismic deblending which is a significant problem and can benefit geophysical studies 2 the authors explore to improve the conventional deblending problem with deep image denoising technology 3 the proposed method can outperform a conventional method according to the experiment weakness 1 this paper is poorly organized and the meanings of many symbols in formulas and figures are not well introduced see questions for detail this makes it difficult to understand the background and the problem formulation the reader needs to spend a long time reading this paper before understanding the problem 2 the authors introduced their algorithm very briefly in method the motivation of the proposed algorithm is unclear because the authors only provide the resulting update rules instead of showing the original optimization problem and the regularization term especially the biggest modification to the algorithm in this paper seems to be using a selfsupervised denoiser for yupdate however the rationale behind it is confusing i wonder how this update rule is derived indeed too much essential information has been omitted 3 partly because the authors did not give any theoretical illustrations for their algorithm the technical contribution seems limited according to the paper the authors 1 follow the conventional optimization problem formulation of seismic blendingdenoising 2 use pnp framework to solve this optimization problem 3 change a term for yupdate without properly explaining the rationale based on the selfsupervised image denoiser then this paper seems like a combination of existing techniques the inspiration it can bring to the community is limited 4 the experiments are not sound the authors have mentioned in the related work that there are other seismic deblendingdenoising methods however only one conventional method is used as the baseline the authors should report their performance to better demonstrate the effectiveness of the proposed algorithm in comparison to sota otherwise the authors should explain why the other methods are not applicable the authors have adequately addressed the limitations and social impact of their work docsepthis paper proposed a plugandplay pnp algorithm for reflection seismology rs the key concept of pnp is to leverage an image denoiser as an implicit regularizer to impose prior within an iterative optimization algorithm by using deep image denoisers pnp combines the physical constraints and trainable priors proposed algorithm in this paper combines pnp with selfsupervised deep image denoisers to solve the rs problem strength new application of pnp to reflection seismology rs weakness the proposed method is not novel the claimed novelty is the inclusion of selfsupervised image denoiser in the pnp framework however this has been done in 1 for medical imaging the presentation of the work is not well organized the introduction and 
background is too long while the method section is too short 1 rare image reconstruction using deep priors learned without ground truth not applicable
### Summary:
the paper studies a seismic deblending problem this is a problem in reflection seismology in which multiple excitations are applied simultaneously and then an underdetermined inverse problem is solved to recover the underlying composition of the earth existing approaches to this problem are mostly based on regularization eg frequency domain sparsity the paper proposes an alternative method based on plugandplayadmm with a selfsupervised regularizer the regularizer here is a blind spot network which tries to predict a pixel based on its surroundings in simulation studies based on synthetic blending of real seismic data the proposed algorithm outperforms the regularization approach
reviews of the paper were mixed reviewers all recognized careful pedagogical manner in which the paper lays out its problem of interest at the same time several reviewers raised concerns that the exposition was overly focused on background material at the expense of explaining the papers technical contributions exposition aside much of the discussion in the reviews and authors response centers on the novelty and depth of the papers technical contributions the reviewers note that the application of selfsupervised denoising within a plugnplay framework is not a novelty of the paper nor is it argued as one rather the technical contribution lies in a combination of existing ideas selfsupervised denoising ala struct bs plugnplay which is well suited to the reflection seismology application reviewers generally felt that the paper would be stronger if it focused more on this methodology and on the technical justification of the approach while the paper introduces a method that has value for reflection seismology it is current form the concerns are significant enough to place it below the bar for acceptance
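As context for the plug-and-play discussion in the review and summary above, the sketch below shows one common way such a scheme is organised: an ADMM loop in which the data-fidelity sub-problem is solved approximately (here with a few conjugate-gradient steps) and the proximal/regularisation step is replaced by a plug-in denoiser. This is only a minimal illustration under assumed interfaces; the operators `A`, `At` and the function `denoise` (which could be a self-supervised blind-spot network) are placeholders and are not taken from the paper under review.

```python
import numpy as np

def pnp_admm(y, A, At, denoise, rho=1.0, n_iter=30):
    """Illustrative plug-and-play ADMM for y = A(x) + noise.

    The regularisation sub-problem is replaced by a plug-in denoiser,
    which is the core idea the reviewers refer to as "PnP".
    """
    x = At(y)                 # crude back-projected initial estimate
    v = x.copy()
    u = np.zeros_like(x)
    for _ in range(n_iter):
        # x-update: argmin_x ||y - A(x)||^2 + rho * ||x - (v - u)||^2
        x = _solve_data_term(y, A, At, v - u, rho)
        # v-update: proximal step replaced by the denoiser
        v = denoise(x + u)
        # dual variable update
        u = u + x - v
    return x

def _solve_data_term(y, A, At, z, rho, cg_iter=10, tol=1e-8):
    """A few conjugate-gradient steps on (A^T A + rho I) x = A^T y + rho z."""
    x = z.copy()
    b = At(y) + rho * z
    r = b - (At(A(x)) + rho * x)
    p = r.copy()
    rs_old = np.vdot(r, r)
    for _ in range(cg_iter):
        Ap = At(A(p)) + rho * p
        alpha = rs_old / np.vdot(p, Ap)
        x = x + alpha * p
        r = r - alpha * Ap
        rs_new = np.vdot(r, r)
        if np.sqrt(abs(rs_new)) < tol:
            break
        p = r + (rs_new / rs_old) * p
        rs_old = rs_new
    return x
```

In a deblending setting `A` would encode the simultaneous-source blending operator and `denoise` the self-supervised network; with `denoise = lambda x: x` the loop reduces to a plain least-squares solver, which makes the role of the learned prior easy to isolate in an ablation.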
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper proposes a clustering algorithm by learning from the weighted similarity of each data partition more precisely the authors propose an unsupervised framework which aims to minimize an objective of a similaritybased classifier then the upper bound of excess risk of this classifier is provided by rademacher complexity they show that the weighted kernel similarity is a special case of the general framework and they apply it to discriminative clustering by iteratively optimizing the class labels and the weights of similarity finally experiments are conducted to verify the effectiveness of the proposed algorithm strength this paper presents a learnable similarity function that adaptively adjusts the weights of the similarity between one sample and the samples of some class the authors also provide theoretical results to support the propose algorithm and give the strict proofs the experimental results of the proposed algorithm outperform that of the other comparable clustering algorithms weaknesses 1 the expected loss in eq 4 seems to be not matched with the corresponding empirical loss moreover the authors should explain how the empirical loss in eq 4 can be transformed into eq 5 the detailed derivative process should be listed in the appendix this is important 2 its not suitable to use boldface to represent a scalar variable 3 there are some typos in this paper 4 some variables lack necessary description eg mathbb1yi neq yj in eq 7 the details of the paper needs further refinement docsepthis paper proposes a new discriminative clustering framework by learning an unsupervised classifier from unlabeled data an unsupervised classifier is learned from every hypothetical data partition and the optimal data partition is then regarded as the partition corresponding to the minimum generalization error bound of the unsupervised classifier the authors use a similaritybased classifier as the unsupervised classifier in this framework and derive its generalization error bound as the sum of discriminative similarity between different clusters the proposed algorithm cds via unsupervised kernel classification cdsk then minimizes the betweencluster discriminative similarity extensive experimental results with the comparison with a broad range of baselines confirm the superior performance of the proposed algorithm strengths the proposed cds framework seems interesting and it is a principled framework of data clustering in fact the maximum margin based clustering methods are implicitly based on such a framework and it seems nice to explicitly formulate this framework in this paper this paper also provides strong theoretical results for the generalization error bound of unsupervised similaritybased classifier using rademacher complexity as explained in remark 34 and the appendix the derived generalization bound is a generalized version of the wellestablished generalization bound for kernel machines so it has an independent theoretical interest this paper also conducts extensive experimental results with various baselines from the similaritybased clustering and discriminative clustering literature i particularly appreciate the detailed discussion in section 5 which places cdsk in a clear position in the literature and explains its significance weakness it could add more value to this paper if the authors provide more examples showing why the derived discriminative similarity is better than conventional similarities such as the regular kernel similarity and the similarity used in the sparse graph 
especially subspacebased clustering literature this paper presents a novel and interesting clustering method with theoretical explanations and the detailed discussion with the similaritybased clustering and discriminative clustering literature is appreciated i encourage the authors to add more examples showing the comparison between the derived discriminative similarity and conventional similarity used in data clustering docsepthis paper proposes a new clustering framework called clustering by discriminative similarity cds cds learns an unsupervised similaritybased classifier from each data partition and searches for the optimal partition of the data by minimizing the generalization error of the learnt classifiers associated with the data partitions in contrasts with kernel similarity with uniform weights the induced discriminative similarity with learnable weights enhances its capability to represent complex interconnection between data based on cds cdsk is proposed as a new clustering method with its effectiveness demonstrated by experimental results positive 1 under the framework of cds discriminative similarity is induced by the generalization error bound for unsupervised similaritybased classifier the authors conduct a complete and detailed theoretical analysis and the results provide theoretical guarantee on the discriminative similarity can be induced from kernel density classification 2 moreover based on the cds model the authors develop a clustering algorithm termed clustering by discriminative similarity via unsupervised kernel classification cdsk 3 cdsk uses a psd kernel as the similarity function and outperforms competing clustering algorithms including nonparametric discriminative similarity based clustering methods and similarity graph based clustering methods demonstrating the effectiveness of cdsk negative 1 clustering performance highly depends on the effective data similarity one of the main contributions of this paper is to propose a discriminative similarity as is known to all there exists a lot of classic similarity learning paradigms such as metric learning methods and subspace learning methods what are the main differences between these methods and the proposed method in this paper i hope that the authors will theoretically or experimentally discuss the specific similarities and differences in a more detail way taking lrr as an example the authors hope to obtain the similarity matrix has a property of low rank what kind of properties does the discriminative similarity contain in this paper 2 the authors let skij2alphaialphajlambdaalphaialphajkxixj be the discriminative similarity between data from different classes the authors did not clearly explain what the essential definition of this formula is could the authors explain the physical meaning of this formula more specifically i hope the author can give an example to explain how does the similarity with learnable weights reflect discrimination 3 the authors use gaussian kernel as the predefined kernel whether different predefined kernel have an impact on the final clustering results 4 does there exist kernel learning methods based on samples weighting i believe there may be exist it is suggested that the authors make a deep analysis on the related work and summarize what are the key differences between these efforts and the proposed method in this paper 5 the authors conduct a complete and detailed theoretical analysis but the description of the algorithm process is not clear and the optimization process is not 
detailed enough 6 the experiments are inadequate in this paper it is hoped that the authors give the parameter analysis and convergence analysis of the algorithm in an experimental way the authors should compare the efficiency of different algorithms through the running time of different algorithms the paper contributes some new ideas and the motivation of this paper is clear this paper proposes a new clustering framework termed clustering by discriminative similarity cds based on this model the authors develop a cdsk clustering algorithm the paper is well organized the authors conduct a complete and detailed theoretical analysis experiments on realworld datasets validated the effectiveness of the proposed methods docsepthis paper proposes a discriminative similarity clustering method via unsupervised classification and provides the generalization bound for the similarity classification the experimental results show the effectiveness of the proposed method strengths 1 the paper is technically solid and it provides the theoretical analysis of the generalization bound 2 the paper proposes a simple yet effective clustering method and provide the hyperparameter tuning strategy 3 the experimental results show the effectiveness of the proposed method weaknesses 1 the motivation is unclear for me what are the benefits of the proposed discriminative similarity clustering method compared with other similarity based clustering methods 2 the compared methods in experiments are not the most recent ones among the 10 compared methods only one was proposed in 2021 and all other methods were proposed before 2015 it would be better to compare with some more recent stateoftheart clustering methods 3 the computational complexity has been already analyzed in page 6 i do not understand why section computational complexity is repeated in experiments maybe the authors want to compare with other methods wrt computational complexity but i do not find such comparison it would be better to show the comparison results of the computational complexity and running time on the used data sets 4 the captions of table 1 and table 2 are exactly the same which need to be corrected moreover in the caption it says c in the left column is the cluster number but i do not find c in tables 1 and 2 although there are some concerns the paper proposes an interesting and effective method and also provides some theoretical analysis therefore i recommend for weak accept
### Summary:
|
the authors provide a framework for unsupervised clarification based on minimizing a betweencluster discriminative similarity it is more flexible than existing methods whose kernel similarity implicitly assumes uniform weights and the authors connect to ideas such as maxmargin and weighted kernel approaches this yields a clustering algorithm naturally that alternates between updating class labels and similarity weights moreover the reviewers and i appreciate the analysis of generalization error through rademacher complexity arguments and detailed author responses i might add while the paper draws connections to weighted kernel methods and have since added references to sparse subspace clustering etc there is recent interest in using similar arguments to derive error bounds and uniform concentration results for centerbased methods that might be included in the survey of related work for instance recent work from swagatam das and collaborators the authors have importantly added details on the optimization using smo and the revision should include these details in a clear exposition together with the computational complexity discussion mentioned in their response
|
[ …input_ids — token-ID encoding of this record's text; full list of integers omitted… ]
[ …attention_mask — all 1s; full list omitted… ]
[ …labels — token-ID list; full list omitted… ]
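
The record above reviews a clustering method that alternates between optimizing cluster labels and the weights of a kernel similarity. As a rough, self-contained illustration of that general idea — not the paper's CDSK algorithm — the sketch below alternates a greedy label assignment with a heuristic per-sample weight update on a Gaussian kernel; the function names, the specific update rules, and the normalization are all assumptions made for this example.

```python
# Minimal sketch (illustrative only): alternate between cluster-label updates and
# per-sample similarity-weight updates on top of a fixed Gaussian kernel.
import numpy as np

def gaussian_kernel(X, gamma=1.0):
    # Pairwise Gaussian (RBF) kernel matrix used as the fixed base similarity.
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * np.maximum(d2, 0.0))

def weighted_similarity_clustering(X, n_clusters=2, n_iters=20, gamma=1.0, seed=0):
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    K = gaussian_kernel(X, gamma)
    w = np.ones(n)                                  # learnable per-sample weights
    labels = rng.integers(0, n_clusters, size=n)    # random initial partition
    for _ in range(n_iters):
        # Label step: assign each point to the cluster with the largest weighted
        # within-cluster similarity (a greedy surrogate for minimizing the
        # between-cluster similarity).
        S = w[None, :] * K
        scores = np.stack(
            [S[:, labels == c].sum(axis=1) for c in range(n_clusters)], axis=1
        )
        labels = scores.argmax(axis=1)
        # Weight step: up-weight samples that are strongly similar to their own
        # cluster, then renormalize so the overall scale stays fixed.
        same = (labels[:, None] == labels[None, :]).astype(float)
        w = (K * same).sum(axis=1)
        w = n * w / (w.sum() + 1e-12)
    return labels, w

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = np.vstack([rng.normal(0.0, 0.3, (50, 2)), rng.normal(3.0, 0.3, (50, 2))])
    labels, w = weighted_similarity_clustering(X, n_clusters=2, gamma=2.0)
    print("cluster sizes:", np.bincount(labels, minlength=2))
```

A faithful implementation would instead minimize the paper's generalization-bound objective in both steps; this sketch only shows the shape of the alternating loop that the reviews refer to.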
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper studies iterative magnitude pruning imp which is used to find a sparse subnetwork that can train to achieve the accuracy dense networks can achieve the authors found that empirically training on a small fraction of randomly chosen data is sufficient to find a good initialization training on easy training data can reduce the number of steps to find a good initialization mean train loss barrier is predictive of final test accuracy strength finding initializations with sparse trainable networks is a fundamental question in neural network training the observations made by the paper are interesting and can provide insights into understanding sparse trainable networks in particular it shows that using easy data is sufficient and can reduce rewinding time which can be practically useful weakness the writing of this paper can be improved i think the connection between the sections can be made stronger for example in section 5 the authors found that easy data does not reduce the amount of training needed in the learning rate warmup but did not talk much about how this phenomenon is related to the previous sections i think this paper could delve deeper into the observations made while the observations made by the authors are novel the authors did not give much explanation of the phenomena oberserved i think the paper can be made much stronger if the authors could propose and investigate some explanations of the phenomena and discuss more how the observations can facilitate new algorithms this paper has some discussion of the limitations in section 6 docsepthe paper deals with performing pruning with respect to data the emphasis of this work is in understanding the dynamics of the learning process more specifically how do the early stages of the training help you with performing pruning based on the neural network weights the paper is extremely clearly written the claims are very crisp and clear more importantly the impact of this is pretty huge if this will hold at scale for very large data sets the method section is well explained and the continually hammered in connection to the lottery ticket hypotheses gives good context to the solution albeit there are a few things that are still very empiricalad hoc line 136 is the 2 a stiff criteria or did you try to change it more importantly what would have been very convincing if this would have been tried on several modalities of data i have a sneaking suspicion that for tabulartime series this would not hold line 147 is p the probability of the network the wights a few typos and assortment of other things line 115 missing a space line 118 double dashes sometimes authors relay way too much on references for example el2n context some of the benchmarks of gwmp other methods etc this is extremely nitpicking but the reviewer is colorblind such it took me a but to correlate in fig 5 easiest examples and low el2n can you please make it consistent between the title of the figures and legend of the scatter plots no there is no discussion on it or perhaps i missed it in the supplementary material docsepthis paper focuses on the pretraining phase of the imp procedure the authors show that only a small fraction of data is required for finding a matching initialization in particular the authors show that the length of such pretraining is reduced if we use as training set only the easiest examples then they also introduce two measures of the loss landscape of the dense network that correlate well with the imp performance of the pretrained initialization the 
authors also investigate if the same behavior can be observed during learning rate warmup the paper provides new insights into the role of the pretraining phase for finding matching lottery tickets i think that the empirical evidence shown in this paper can significantly contribute to the research in this field reducing the amount of data and time required for this training phase the contributions presented in this paper also help to shed light into the behavior of the network during such pretraining phase the paper is well written and easy to follow the experimental validation of the assumptions presented in the paper is solid the authors adequately addressed the limitations and potential negative societal impact of their work docsepthis paper studies the role of data in finding earlystage lottery tickets with carefullydesigned experiments the authors showed that training on a small fraction of easy data suffices to get a good initialization for imp and in particular it takes fewer pretrain iterations in addition the authors proposed to use the distribution of perexample loss barriers to predicting the quality of imp initialization overall these findings offer interesting insights into the early phase neural network training dynamics strengths the paper is wellwritten and quite easy to follow although the ingredients are not new the idea of studying the role of data in finding good lottery tickets is novel i believe some of the findings of this work are interesting to many people in our community experiments are welldesigned and support the authors claims very well weaknesses all experiments are done on relatively small datasets quite often the results on cifar do not transfer to larger datasets like imagenet one or two runs on imagenet would make the results more convincing and i believe nowadays most academic labs can afford to run a few experiments on imagenet the authors only used one single metric to rank the training data it would be great if the authors could try out one or two other metrics in addition it seems that the data selection procedure could be expensive i think the authors should add some discussions in the main paper about that the authors discovered an interesting phenomenon but didnt give any explanation of why easy data could help the authors discussed the limitations at the end of the paper i dont see any negative societal impact of this work
### Summary:
|
this paper presents comprehensive experiments studying the role of data in finding lottery tickets in the early stage of training all reviewers liked the paper and agreed that the paper has novel and insightful results worth sharing with the community
|
[ …input_ids — token-ID encoding of this record's text; full list of integers omitted… ]
[ …attention_mask — all 1s; full list omitted… ]
[ …labels — token-ID list; full list omitted (it continues beyond the end of this excerpt)…
285,
1329,
253,
4477,
3916,
1077,
973,
50276,
20881,
1255,
265,
50276,
455,
4679,
403,
2218,
327,
4942,
1355,
15302,
3240,
2223,
253,
1543,
327,
260,
338,
274,
513,
417,
3700,
281,
4067,
15302,
751,
4440,
257,
292,
581,
390,
767,
6613,
327,
4440,
257,
292,
651,
1056,
253,
1543,
625,
21414,
285,
891,
2868,
31735,
954,
11073,
39803,
476,
7848,
281,
1408,
247,
1643,
4679,
327,
4440,
257,
292,
50276,
783,
4477,
760,
908,
581,
2014,
7982,
281,
5958,
253,
3733,
941,
352,
651,
320,
1270,
604,
253,
4477,
812,
1611,
562,
581,
390,
767,
643,
17082,
275,
1635,
352,
3133,
326,
253,
941,
5438,
5199,
812,
320,
8214,
891,
1158,
253,
4477,
943,
823,
690,
11985,
275,
253,
2022,
2929,
670,
326,
50276,
783,
4477,
6888,
271,
4722,
11562,
533,
42126,
1918,
667,
8813,
273,
2139,
3477,
941,
812,
1361,
253,
4477,
5469,
253,
7364,
387,
253,
990,
273,
253,
2929,
891,
13414,
923,
667,
4016,
38058,
3486,
273,
436,
789,
2490,
187,
4118,
18435,
27,
2520,
2929,
10262,
11088,
4679,
12392,
253,
2554,
273,
941,
275,
4560,
36284,
14997,
275,
253,
2393,
3924,
273,
3733,
512,
30628,
10490,
253,
2929,
285,
5821,
326,
253,
2929,
556,
4460,
285,
47860,
1543,
4409,
9628,
342,
253,
3114
] |
Below is given review of a research paper from cnoference journal. Please write a summary the review.
### Review:
this paper describes a starganbased approach to generating synthetic images of various defects based on the underlying training data the proposed method considers two different types of domains eg the foreground domain as defect types and the background domain as product types and the derived synthetic images are included in the training data to improve the performance of defect classification detailed comments are as follows 1 the proposed method is reasonable but the technical novelty is only moderate the gan formulation described in the paper heavily relies on stargan v2 with additional addition of the content transfer strategy introduced in mokady et al 2020 2 the presentation needs significant improvements for example the contentswriting of section 3 are way too similar to stargan v2 this could negatively affect the evaluation of the technical contributions of this work 3 the experimental results are not convincing the experiment is tested on only the sdi dataset and the comparisons do not include existing sota techniques for defect classification it is not clear whether the claimed main advantage of increasing the diversity of defects on each specific target product would still be valid for the case that defects of different products vary significantly also it would be more insightful to include the experiment of defect localization say over the mvtec ad dataset to further justify the usefulness of the proposed approach 4 the ablation study on page 7 should be made more comprehensive eg with explicit quantitative results rather than with qualitative examples as in figure 5 5 the paragraph of crossdomain effect with respect to table 4 needs to be discussed in more detail is the defect transferring still effective for the case that defects of different products vary significantly the technical novelty of the proposed method can be improved and the experiments lack comparisons with sota techniques for defect classification this work is not ready for publication yet docsep this paper introduces defect transfer gan to transfer or generate different types of defects such as stretches or spots foreground and apply them onto different product images background the method builds upon starganv2 and add the cyclecontent consistency loss and classification loss between foreground and background experiments were reported on the surface defect inspection dataset results show that the proposed method achieved better fid and kid scores the paper shows that training on the synthesized dataset performs better than traditional data augmentation strength the paper addresses an interesting and important application the technical implementation details are well documented the appendix provides great details on the experimental protocol and additional results the authors promise to publish the code and dataset so this could be a nice contribution to the community the classification accuracy was higher when using the proposed data synthesis approach weakness i have a hard time to find anything that is technically novel all the training losses are inherent from startganv2 or cyclegan the main difference lies in the foregroundbackground separation but there was not ablation study to validate this only results on a closed dataset is reported i feel that this paper would be a lot more better suited for conferences with a focus on applications i am not sure whats the takeaway lesson on the technical components from this paper i think this paper tackles an interesting applications the results are thorough and 
solid my main concern of this paper is that the technical novelty is limited i am thus leaning negative about this paper docsepauthors proposed the gan method that can translate the defect between images the obtained results from the proposed gan is able to disentangle the defectspecific content from the background images in the weaklysupervised manner by using the proposed gan method authors augmented the training data the accuracy gain obtained using the data augmentation shows consistent and significant gap to the conventional method without the data augmentation it is hard to understand what is the style and what is the content in the targeted application even though the method is proposed to tackle the defect recognition problem while style and content are not clearly defined in such a context this confuses the concept of style and content throughout the draft the technical novelty looks weak most of components are similar to the stargan v2 while authors insist that they added few more modules such as backgroundforeground classifier however this extension is rather trivial methodwise it is hard to find the improvement in the applicationside there could be however it was not clearly explained why such stylecontent separation and bgfg classification are important for the targeted application experiments are somehow limited only 1 dataset is involved for the experiment furthermore the presentation for the results is rather weak figures 3 through 6 are not clearly explained they are not the natural images and it is hard to empathize that the images obtained is operating well for the given scenario i think authors need to better explain such a subtle improvement due to the limited presentation experiments and technical novelty i am on the borderline for this draft yet however i could go towards the accept if authors could made effective rebuttal to my comments docsepthis work applies gan for data augmentation to enhance defect classification concretely based on stargan v2 structure the proposed framework is able to encode the foreground and the background separately enabling diverse defect synthesis by either transferring the defect from a reference sample or synthesizing from randomly sampled noises strengths applying gans to defect synthesis is interesting the results of referenceguided defect transfer are promising weaknesses my main concern is the technical novelty from the task perspective using gans for data augmentation is not new and using gans for stylecontent transfer is also not new this paper just applies it to a new area from the technique perspective most of the architectures are borrowed from stargan v2 some loss terms are added to improve the performance but all losses seem to be trivial despite the good results i cannot see many insights from this work eg how can this work guide other researchers the proposed approach is only evaluated on the surface defect inspection sdi dataset the superiority cannot be fully verified to be honest i do not think this work is suitable for iclr as most audiences may not get interested i strongly recommend the authors submit this work to a conferencejournal in the industry field docsep the paper prposes a stargan based model to distangle the defect foreground background transfer the style of foreground and then synthesize the defected image for different products the quality of the synthesized defect images are evaluated using fid kid the approach is also appled to augment defect images to improve the performance of defect classification 
strength the paper is well writen and easy to follow the proposed methods seems to mitigate the data collection and labelling costs of the defect inspection problem widely available in production industry weakness from technical part the model is basically a tailored stargan v2 tuned for the defect generation task the novelty and contribution of methodology is limited particularly for iclr 2022 the employed sdi dataset is not publicly available and its difficult for researchers to evaluate and compare the performances while stylegan v2 and biggan are involved for comparison how do you train them for the synthesis by fintuning or training from scratch only scratch and spots are involved the number of categories for defects are too limited and its not convincing to support the claim much more defects and products need to be tested and evaluated the work presented in the paper can be regarded as an application of stargan v2 while the paper is well writen and easy to follow the novelty and contribution of methodology is limited particularly for iclr 2022 as only scratch and spots are involved the number of categories for defects are too limited and its not convincing to support the claim much more defects and products need to be tested and evaluated
### Summary:
|
the paper proposes a gan based method for synthesizing various types of defects as foreground on different product images background the method builds upon starganv2 and adds the cyclecontent consistency loss and classification loss between foreground and background while the paper considers an important problemapplication the reviewers found it lacking sufficient novelty for publication the paper will be more suited for publication at an application oriented venue
|
Below is given review of a research paper from cnoference journal. Please write a summary the review.
### Review:
1 the authors considered uniform upper bound of the stochastic gradients gi the authors may argue that the classical theoretical analysis of sgd assumes that the stochastic gradients are uniformly bounded but one can even strongly argue that this bound is actually infty moreover an even stronger argument can be made that the above assumption is in contrast with strong convexity please see sgd and hogwild convergence without the bounded gradients assumption by nguyen et al as one of the instances please understand there are relaxed assumptions such as strong growth condition on a stochastic gradient as in assumption 4 of 1 2 there is work 2 that proposes we propose a flexible framework which adapts the compression level to the true gradient at each iteration maximizing the improvement in the objective function that is achieved per communicated bit how is work different from theirs i suggest the authors mention this work and make connections with their present results 3 please check theorem 34 and 35 in qsgd paper there is a count of bits communicating in each round for qsgd which depends on s the quantization level moreover you also mentioned that quantization level is roughly exponential to the number of quantized bits in light of your formulation how the results in qsgd paper are relevant additionally in section 3 equations 3 and 4 are not your contributions therefore please adequately cite their sources 4 what is the idea behind proposition 1 5 what is the point of theorem 2 on the other hand where are the derivationsproofs of equations 18 and 20 those are the main results of this paper if i am not wrong 6 in terms of experimental results i had a hard time interpreting table 1 and associated text in the paragraph test accuracy vs compression ratio especially i do not know how do i understand the last sentence in the abovementioned paragraph to do proper experiments by using compression techniques the authors can check a very elaborative work and codebase by hang xu et al compressed communication for distributed deep learning survey and quantitative evaluation in that case i would encourage the authors to plot relative datavolume vs test accuracy similar to figures 6 and 7 therein in the present papers the experiments and their presentations are substandard additional comments 1 recently with the boomingplease refrain from using these types of words you are preparing a scientific document not a scifi novel 2 i have some reservations on this statement existing algorithms often quantize parameters into a fixed number of bits which is shown to be inefficient in balancing the communicationconvergence tradeoff seide et al 2014 alistarh et al 2017 bernstein et al 2018 in my opinion qsgd is robust and stable for 1bit sgd both seide and bernstein the proposed solution is to use errorfeedback 3 show good performance in some certain tasksbad sentence 1 dutta et al aaai 2020 on the discrepancy between the theoretical analysis and practical implementations of compressed communication for distributed deep learning 2 khirirat et al 2020 a flexible framework for communicationefficient machine learning from hpc to iot docsepthis paper proposed an adaptive quantized method which is derived by minimizing a constrained quantization error bound the theoretical analysis suggests adjusting the quantization level according to the gradient norm convergence rate of the model and the current iteration number theoretical results show that the dynamic bits leads to better error bound than the fixed bits the result 
is intuitive overall the paper is clearly written but the improvement is not significant enough to warrant a publication at iclr 1 in the proof of proposition 1 it is assumed that fracgi gp in ufracls fracl1s is this empirically supported and the final result looks wrong to me for example pfracgigp fracls epsilon0 should be 1s and s s2epsilon0 can be much larger than 1 2 bn in 18 depends on alpha and k how can we estimate them in practice on the other hand 18 shows that more bits may need to be used in the later stage of training due to exponent n n while 20 suggests a smaller number of bits in later iterations as gradient norm typically has a trend of decreasing during training they are contradictory to each other 3 in the experiment 18 or 20 is used 4 as the proposed method uses dynamic number of bits how do we compute its compression ratio is it averaged compression ratio across iterations 5 from table 1 we can see that though the proposed method has slightly higher compression ratio than qsgd its accuracy is worse since the compression overhead is nontrivial in practice and there is no report on training time it is not clear if the proposed method is faster in terms of cpu wallclock time 6 the baseline looks weak there are a number of papers showing that error feedback can fix the poor performance of signsgd karimireddy 2019 tang 2019 zheng 2019 and is momentum used in the experiment karimireddy sai praneeth et al error feedback fixes signsgd and other gradient compression schemes icml 2019 tang hanlin et al doublesqueeze parallel stochastic gradient descent with doublepass errorcompensated compression icml 2019 zheng s huang z kwok j communicationefficient distributed blockwise momentum sgd with errorfeedback neurips 2019docsepthe paper considers distributed learning using sgd and aims at improving the convergence rate of the quantized sgd to this purpose it theoretically characterizes the tradeoff between communication cost and the model accuracy in training with quantized stochastic gradients based on the theoretical analysis the authors proposed a dynamic quantized sgd framework to optimize the quantization strategy to be analyzing the tradeoff between communication and model error in addition to the theoretical analysis the performance of the proposed communication scheme is evaluated extensively in computer vision tasks on cifar10100 and nlp on agnews dataset and is shown to outperform the other considered quantization techniques strengths although the idea of changing the number of quantization bit during training has been proposed and studied before but this paper looks at it from a new angle and by analyzing the convergence rate under different assumptions for the loss function it computes the required number of bits to minimize the upper bound of the convergence error weaknesses and questions 1 in the proof of proposition 1 it is explicitly assumed that the sgs are uniformly distributed in each quantization bin although this might be true for sufficiently small quantization bins large bits for low bits which is the interested region in distributed training this assumption is not valid this should be mentioned in the body of the claims although this assumption is not true in general to be more accurate the distribution of error for example for e0 can be written as pe1se suml pgigp els 2 theorem 2 assumes gaussian noise while the quantization noise is nongaussian and prop 1 assumes it is uniform it seems that this assumption is not in line with the previous assumptions and 
practical cases 3 the experiments section is not satisfactory the qsgd and terngrad methods are relatively old and better quantization methods based on dithering transforming has already been proposed compared to adaptive and adaqs both these methods achieve higher accuracy with a lower compression ratio on the other hand qsgd4 bits with a higher compression ratio achieves a comparable accuracy moreover the convergence rate plots fig 1 shows that all methods converge almost at the same rate despite the theoretical analysis and the motivation of the dynamic bt allocation for a fair comparison the compression ratio of all methods should be fixed to the same value and the convergence rate as in fig 1ab in addition to the final accuracy should be evaluated the limited experiments and not appropriate comparison setups are major shortcomings of the paper minor theorem 1 uses assumption 3 not stated in the body docsepit is known that assumption 3 equation 10 bounded variance with assumption 2 strong convexity leads to a contradiction thus having these two assumptions together is strong there are some recent works that overcome assumption 3 by a different assumption known as the expected smoothness like in the following work gower r m richtarik p and bach f stochastic quasigradient methods variance reduction via jacobian sketchingarxiv180502632 2018 the authors may revise their strongly convex results part using this kind of assumption in proposition 1 equation 13 it is not clear what the authors mean by the probability of a random variable i checked the proof but i did not understand it either some reported results are already known in the literature and the paper gives the impression that these results are new especially that the authors give the proofs examples of such results the unbiasedness of the quantization its bounded variance and the first part of theorem 1 page 5 concerning the quadratic example this is a trivial case and the only case where one can hope the lower bound to match the upper bound in fact alpha beta iff lmu and from assumptions 1 2 we get that f is quadratic with mul which implies h mu i equation 20 for me this one of the main results of the paper but i did not see its proof anywhere
### Summary:
|
the reviewers have a strong consensus towards rejection here and i agree with this consensus although i think some of the reviewers concerns are misplaced for example the paper does not appear to use a magnitude upper bound that would be vacuous together with a strong convexity assumption although variance bounds strong convexity do cover only a small fraction of strongly convex learning tasks these assumptions arent vacuous some feedback i have that perhaps was not covered by the reviewers pros studying the setting where the number of bits varies dynamically is very interesting although as reviewer 3 points out not entirely novel there is significant possibility for improvement from this method and your theory seems to back this up cons the experimental setup is weak and is measuring the wrong thing when we run sgd to train a model what we really care about is when the training finishes the total wall clock time to train on some system for compression methods with fixed compression rates its fine to use the number of bits transmitted as a proxy because when the number of bits transmitted is uniform over time this will be monotonic in the wallclock time however when the bits transmitted per iteration can change over time this can have a difficulttopredict effect on the wallclock time because of the potential for overlap between communication and computation where below a certain number of bits sent the system is not communicationbound wallclock time experiments comparing against other more modern compression methods would significantly improve this paper
|
[
15891,
253,
4477,
778,
9059,
326,
253,
8946,
10527,
1783,
273,
256,
35333,
19584,
326,
253,
19191,
27935,
403,
17568,
11542,
533,
581,
476,
1014,
7052,
9059,
326,
436,
3033,
310,
2686,
2192,
555,
25761,
271,
1014,
10046,
4154,
476,
320,
1160,
326,
253,
1840,
9376,
310,
275,
4499,
342,
2266,
17133,
414,
4496,
923,
256,
35333,
285,
47891,
32778,
14940,
1293,
253,
11542,
27935,
9376,
407,
295,
39170,
1162,
355,
347,
581,
273,
253,
10872,
4496,
2096,
627,
403,
19595,
13260,
824,
347,
2266,
3116,
1617,
327,
247,
19191,
11786,
347,
275,
9376,
577,
273,
337,
50276,
19,
186,
9088,
310,
789,
374,
326,
29328,
359,
12661,
247,
12112,
7792,
534,
5223,
84,
253,
13800,
1268,
281,
253,
2032,
11786,
387,
1016,
19502,
46875,
253,
7756,
275,
253,
8103,
1159,
326,
310,
6786,
591,
32452,
2372,
849,
310,
789,
1027,
432,
31187,
891,
1804,
253,
4477,
3748,
436,
789,
285,
1056,
10291,
342,
616,
1246,
1543,
50276,
20,
186,
32897,
2451,
10012,
5910,
285,
4791,
275,
2805,
8433,
69,
2929,
627,
310,
247,
1385,
273,
9886,
26728,
275,
1016,
3790,
323,
2805,
8433,
69,
534,
7024,
327,
256,
253,
36643,
1268,
25761,
368,
671,
5393,
326,
36643,
1268,
310,
11467,
17619,
281,
253,
1180,
273,
2677,
1025,
9886,
275,
1708,
273,
634,
15895,
849,
253,
1543,
275,
2805,
8433,
69,
2929,
403,
4623,
23000,
275,
2593,
495,
7424,
495,
285,
577,
403,
417,
634,
9021,
3103,
4496,
18212,
26542,
616,
4973,
50276,
21,
186,
5371,
310,
253,
2934,
3212,
13989,
337,
50276,
22,
186,
5371,
310,
253,
1127,
273,
10012,
374,
327,
253,
643,
1133,
835,
403,
253,
3538,
569,
16314,
84,
273,
7424,
1283,
285,
1384,
1110,
403,
253,
2022,
1543,
273,
436,
2929,
604,
891,
717,
417,
3430,
50276,
23,
186,
249,
2426,
273,
5661,
1543,
891,
574,
247,
1892,
673,
29375,
2829,
337,
285,
2330,
2505,
275,
253,
12494,
1071,
7200,
4632,
13800,
4313,
3340,
891,
513,
417,
871,
849,
513,
891,
2096,
253,
1390,
6197,
275,
253,
1840,
13012,
12494,
281,
513,
1463,
4679,
407,
970,
13800,
5609,
253,
4477,
476,
2451,
247,
1077,
14883,
800,
789,
285,
2127,
4793,
407,
10913,
1269,
86,
1162,
355,
21012,
5511,
323,
5939,
3676,
4715,
6630,
285,
11745,
7103,
275,
326,
1083,
891,
651,
11907,
253,
4477,
281,
7484,
4103,
2856,
580,
311,
2123,
4632,
1071,
7200,
2074,
281,
8442,
721,
285,
818,
15308,
275,
253,
1246,
9380,
253,
4679,
285,
616,
27228,
403,
749,
15291,
50275,
38092,
5701,
337,
186,
45019,
314,
342,
253,
1766,
19275,
32897,
35531,
432,
970,
841,
3510,
273,
3000,
368,
403,
13828,
247,
8249,
3389,
417,
247,
660,
18279,
4460,
50276,
19,
186,
74,
452,
690,
33196,
327,
436,
3908,
5368,
11333,
2223,
2677,
907,
3602,
715,
247,
4229,
1180,
273,
9886,
534,
310,
2011,
281,
320,
31334,
275,
26259,
253,
5511,
585,
41801,
5454,
2727,
396,
504,
1162,
355,
4059,
355,
43418,
73,
1162,
355,
4240,
270,
1808,
6339,
1162,
355,
4765,
50276,
249,
619,
4743,
2805,
8433,
69,
310,
10237,
285,
6474,
323,
337,
2713,
256,
35333,
1097,
396,
504,
285,
270,
1808,
6339,
253,
4081,
2900,
310,
281,
897,
2228,
44333,
50276,
20,
186,
9029,
1175,
3045,
275,
690,
2176,
8892,
14367,
6197,
50274,
18,
5387,
893,
1162,
355,
39951,
2284,
9169,
327,
253,
26210,
875,
253,
10527,
1783,
285,
8542,
27558,
273,
21012,
5511,
323,
5939,
3676,
4715,
374,
26856,
343,
343,
255,
1162,
355,
9169,
247,
12112,
7792,
323,
5511,
20246,
5145,
4715,
432,
288,
5902,
281,
891,
302,
50276,
7152,
33032,
2520,
2929,
4081,
271,
17825,
2677,
1025,
1332,
534,
310,
6012,
407,
28699,
247,
20793,
36643,
2228,
3033,
253,
10527,
1783,
5936,
19427,
253,
36643,
1268,
2556,
281,
253,
11786,
5222,
14940,
2281,
273,
253,
1566,
285,
253,
1655,
19502,
1180,
10527,
1543,
921,
326,
253,
7870,
9886,
5644,
281,
1805,
2228,
3033,
685,
253,
4229,
9886,
253,
906,
310,
27350,
4583,
253,
2929,
310,
4518,
3542,
533,
253,
7756,
310,
417,
1534,
2217,
281,
7501,
247,
9311,
387,
17857,
32888,
50276,
18,
275,
253,
4737,
273,
13989,
337,
352,
310,
8025,
326,
1315,
317,
7311,
50276,
17788,
275,
1484,
1124,
5200,
1315,
29404,
18,
84,
310,
436,
45190,
4516,
285,
253,
2457,
906,
4453,
3430,
281,
479,
323,
1650,
268,
1124,
72,
304,
81,
50276,
1124,
5200,
50276,
4259,
17,
943,
320,
337,
84,
285,
256,
50276,
84,
19,
4259,
17,
476,
320,
1199,
4067,
685,
337,
50276,
19,
270,
79,
275,
1283,
7024,
327,
9765,
285,
465,
849,
476,
359,
6642,
731,
275,
3946,
327,
253,
643,
1133,
1283,
2722,
326,
625,
9886,
778,
878,
281,
320,
908,
275,
253,
1996,
3924,
273,
3733,
1955,
281,
23653,
295,
50276,
79,
1223,
1384,
5936,
247,
4577,
1180,
273,
9886,
275,
1996,
25142,
347,
11786,
5222,
5431,
556,
247,
9058,
273,
11052,
1309,
3733,
597,
403,
34126,
281,
1016,
643,
50276,
20,
275,
253,
3368,
1283,
390,
1384,
310,
908,
50275,
21,
347,
253,
4081,
1332,
4648,
7870,
1180,
273,
9886,
849,
513,
359,
11897,
697,
13800,
4313,
310,
352,
17522,
13800,
4313,
2439,
25142,
50276,
22,
432,
2829,
337,
359,
476,
923,
326,
2167,
253,
4081,
1332,
556,
5777,
2169,
13800,
4313,
685,
2805,
8433,
69,
697,
7200,
310,
7197,
1580,
253,
13800,
18332,
310,
37825,
275,
3946,
285,
627,
310,
642,
1304,
327,
3733,
673,
352,
310,
417,
2590,
604,
253,
4081,
1332,
310,
7938,
275,
2426,
273,
27754,
3402,
13273,
673,
50276,
23,
253,
8245,
4453,
5075,
627,
403,
247,
1180,
273,
9380,
4645,
326,
2228,
8680,
476,
4993,
253,
4105,
3045,
273,
7871,
35333,
46247,
303,
1250,
6421,
6247,
12717,
6247,
1182,
24176,
6247,
285,
310,
10254,
908,
275,
253,
3368,
50276,
18970,
303,
1250,
6421,
618,
74,
819,
1351,
678,
1162,
355,
2228,
8680,
26019,
7871,
35333,
285,
643,
11786,
13800,
15849,
17857,
1686,
6247,
50276,
85,
606,
15761,
3642,
1162,
355,
33478,
39757,
2721,
7529,
19191,
11786,
18499,
342,
4021,
5858,
2228,
3118,
561,
456,
13800,
17857,
1686,
6247,
50276,
91,
24176,
256,
30287,
606,
1182,
41291,
536,
480,
5511,
20246,
5939,
2972,
3020,
10254,
256,
35333,
342,
2228,
44333,
5723,
2824,
6247,
7152,
339,
431,
248,
2929,
19401,
5939,
4715,
970,
256,
35333,
285,
13698,
387,
11138,
253,
14940,
2281,
273,
253,
2677,
1025,
256,
35333,
281,
436,
4096,
352,
28055,
45589,
253,
5454,
2727,
875,
5511,
2105,
285,
253,
1566,
7200,
275,
3733,
342,
2677,
1025,
19191,
27935,
1754,
327,
253,
10527,
1783,
253,
4477,
4081,
247,
7870,
2677,
1025,
256,
35333,
7792,
281,
22318,
253,
36643,
5700,
281,
320,
18918,
253,
5454,
2727,
875,
5511,
285,
1566,
2228,
275,
1635,
281,
253,
10527,
1783,
253,
3045,
273,
253,
4081,
5511,
6974,
310,
…remainder of token ID sequence omitted… ] |
[ …sequence of 1s omitted… ] |
[ …token ID sequence omitted… ] |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the paper proposes a monitor that observes the input and output of a machine learning model and checks if they satisfy requirements two scenarios are investigated first when the monitor is only present at inference time and the second when the requirement is known at training time if a hypothesis class h is pac learning then using the verifier at inference stage can give a hypothesis with a guarantee in its generalization error when requirements are known in training phase then the inputoutput requirements do not increase the rademacher complexity the proofs in appendix are dense and difficult to follow while the theorem statements themselves appear to be rather unsurprising the clarification questions will help the reviewer better understand the significance of this work the rademacher bound appears to be straightforward adaption of standard result it is not clear why one would expect hc to have higher complexity than h there are no potential negative societal impact of the work to the best of this reviewers understanding docsepthis paper considers the problem of introducing a concurrent verifier either in the learning or inference phase in the setting of pac learning a concurrent verifier checks if the predicted output satisfies a given requirement represented as a constraint if so it doesnt change the predicted output otherwise it forces the output to take a value that satisfies the requirement the authors consider the generalization error of learning with concurrent verifiers in the inference and in the learning phase and show that a under the realizability assumption learning with concurrent verifier in the inference phase doesnt suffer from too much generalization error b for learning with concurrent verifier in the learning phase the authors use the rademacher complexity based generalization error bounds and show that these dont increase in the presence of concurrent verifier in the learning phase the authors also provide analysis of the running time overhead of using a concurrent verifier in the different settings they also describe a structured prediction setting where their theory is applicable this is a theoretical paper with no experimental results strengths 1 very relevant problem 2 the results are good and show when the use of concurrent verifiers is beneficial 3 the results expand our current understanding of learningpredicting with concurrent verifiers 4 the presentation introduces the relevant theory and definitions making it easier to read weaknesses 1 the paper is very dense particularly the statement of the main technical results shifting the proofs to the appendix but not providing any intuition behind the proofs makes it hard to understand the intuition behind the main theorems 2 the running time complexity analysis for using the concurrent verifier is a bit sketchy this is particularly so in sec 63 3 the math is often hard to follow for the general reader the paper is unnecessarily notationally heavy at places the authors have discussed the limitations of their approach to some extent there is no potential negative societal impact of their work docsepthe paper analyzes the generalization bounds of ml models augmented with concurrent verifiers as background it presents pac learnability and rademacher complexity and then it introduces concurrent verifiers it shows how generalization changes when a verifier is applied to a model at inference time when both when the hypothesis space is realizable and when it is not when h is pac learnable with 01 loss inference time 
verification itv gives the minimum possible generalization error while satisfying the constraints when h is not pac learnable this is not obtained the paper also considers learning time verification ltv using rademacher complexity to show that adding a concurrent verifier to a model at learning time does not reduce its learnability giving bounds both for multiclass prediction and structured prediction originality the results are original drawing upon bounds derived in earlier work and extending them to the concurrent verifier settings the background material is both cited appropriately and explained clearly the relevant bounds built upon are cited and tighter bounds that are not built upon but perhaps could be are also referenced quality all claims are well supported with proofs provided in the appendices i have only verified the proofs through section 5 not those of section 6 i have not found evidence of anything unsound in my review and the results are sensible so i expect even the proofs i have not verified thoroughly will be sound too clarity the submission is well written and well organized the background material is clearly explained most concepts are introduced prior to their use at least up until line 280 eg mercer kernels are not introduced the following are largely typographic suggestions but these issues do not meaningfully degrade the overall quality of the paper nit line 134 consists should read consisting nit line 156 all should be allow line 185 it should say sample rather than example yes nit line 185 sbeis line 191 the notation f and fc is only introduced in the appendix and so seems out of scope here strikeline 237 if rhoh 0 is this necessarily a misclassificationstrike line 275 what does f refer to in the definition of empirical local rademacher complexity significance the new setting machine learning with a concurrent verifier is an important setting for many application domains i do not know whether the bounds are relevantmeaningful in these application domains in practice as my understanding is that in modern ml systems typical generalization performances can be well within the bounds given by rademacher complexity nevertheless the setting is an important one and the results are to the best of my knowledge novel and so tell us important properties of how generalizability can change remain unchanged when a verifier is applied both during learning and at inference time to a pretrained model the conditions in the theorems capture the limitations of the results eg for itv the generalization guarantee requires that h is pac learnable
### Summary:
|
all reviewers liked the presented approach of using a concurrent verifier in both the learning and inference phase in the pac setting the approach also presents theoretical proofs for bounds when using such a verifier the reviews also provide many details for improving the presentation for both improving the correctness and clarity which would be great to incorporate in the next version of the paper
|
[ …token ID sequence omitted… ] |
[ …sequence of 1s omitted… ] |
[ …token ID sequence omitted… ] |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this work studied a variancebased regularization method based on weighting correction for domain generalization it also theoretically analyzed the potential benefits of the proposed method over erm and the connection between the proposed variancebased regularization method and the group dro problem from an optimization perspective strengths 1 this paper proposed a weighting correction scheme for varianceregularized domain generalization methods inspired by the generalization error bound 2 it theoretically compared the proposed method with erm 3 it demonstrated that the objective function of the proposed method is equivalent to a group dro problem from an optimization perspective weaknesses 1 more experiments need to be provided to validate the observations of this paper 2 it is unclear how the generalization gap between the training and test data in line 101102 is derived 3 the computational complexity of varianceregularized domain generalization method with weighting correction scheme is not analyzed the authors showed that no potential negative societal impact was observed in this work docsepthis paper investigated the variancebased regularization method for domain generalization in particular the authors originally study how the empirical efficient estimator hatmathbfq effects the generalization performance for a convex hull of domains the authors first give an generalization bound for indistribution generalization based on the indomain variance and outofdomain variance thm 1 then they study the relationship of empirical efficient estimator and the group dro objectives and show the equivalence between group dro and the variancebased regularization thm 3 overall this paper gives some new insights for the variancebased domain generalization such as the effects of empirical efficient estimator and the indistribution generalization bound based on variance however i still think the theorical results in this paper is insufficient for answer the question can variancebased regularization improve domain generalization besides the main contribution of studying the effect of empirical efficient estimator seems a bit trivial since there exists work on variancebased domain generalization 12 1 krueger et al outofdistribution generalization via risk extrapolation rex icml 2021 2 xie el tal risk variance penalization arxiv preprint arxiv200607544 2020 strengths 1 some new insights for the variancebased domain generalization such as the effects of empirical efficient estimator and the indistribution generalization bound based on variance 2 although there exists work on variancebased generalization 3 the study of variancebased domain generalization is interesting to me 3 the theoretical derivations are complete 3 duchi et al variancebased regularization with convex objectives 2017 weaknesses 1 the main concern to me is the significance of the theoretical results thm 1 gives an indomain generalization result bound the generalization risk of training domains which is far away from the setting of domain generalization where the target domain is unseen during training and the discrepancy between the target domain and training domains can be large although the authors have discussed this problem in eq 9 i still think they should give a generalization bound on the standard setting of domain generalization 2 another major concern is the novelty of this paper to my knowledge xie et al 2 have proposed the relationship between group dro objective and the variances i think the novelty of this paper is the study of 
empirical efficient estimator for variancebased domain generalization which seems a bit minor 3 for experimental results in tab 1 i notice the results on pacs are amazingly strong 969 the details of experimental setting and analysis for the results is missing but i think the experiment is important for this work na docsepthis paper addresses the connection between domain generalization and group distributionally robust optimization varianceregualarized erm is approximated by group dro with phidivergence ball unlike previous works they take empirical domain distribution as an anchor of uncertainty sets of dro instead of the uniform distribution and provides guarantees of indistribution generalization they argue that with some conditions their method can outperform erm for domain generalization strengths this paper did a focused job of generalizing and analysizing the connection between varizancebased regualarization and domain generalization their theoretical result for generalization bound for indistribution seems solid weaknesses their work seems similar to 1 the work 1 showed that group dro formulation with unifrom distribution as a anchor is bounded with risk variance and the equality can be established with some condition in this paper the study is extended into general anchor including empirical distribution however theorem 3 and 4 can be straightforwardly extended from the results of 1 they argue that a weighting correction scheme can improve generalization in some cases it would be better to show some experiments to make their arguments valid 1 xie et al risk variance penelization 2020 the author have addressed the limitations docsepin this paper the author studied the problem of whether the variancebased regularization approach can improve domain generalization with a mild assumption of the outofdistribution data is generated by the shift of the domain distribution the paper proposed a weighting correction scheme based on the previous varianceregularized domain generalization methods they prove the guarantees for indistribution generalization and show the potential advantages of the proposed method when compared to the empirical risk minimization experimental results show improvements in some cases strength 1 the paper is wellwritten with strong mathematical support 2 the paper studies the question of whether variancebased regularization can improve domain generalization based on strong theoretical support they proposed a simple weighting correction scheme and provide guarantees of indistribution generalization weakness 1 it could be better to include the experiments in the main paper the main paper did not mention that there are experiments in the appendix more experiments can be done to support the proposed idea 2 based on the description from line 194 to 204 the sample size m is critical to the proposed approach in line 199 it looks like a larger m could benefit the proposed method while line 203 shows that the m should not be too large an empirical experiment might be required to demonstrate the impact of the sample size m na
### Summary:
|
this paper has been widely discussed between reviewers and authors unfortunately even after the reviewers updated their scores the paper was still judged to be below the acceptance threshold i encourage the authors in taking into account the reviewers comments while preparing the next iteration of their work
|
[ …token ID sequence omitted… ] |
[
... (attention_mask values, all 1; list omitted)
] |
[
... (labels token ID list omitted)
] |
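The three bracketed columns above, like the matching columns after each of the following rows, are the tokenized form of that row's Input and Output text: input_ids, an attention_mask that is all 1s here (i.e., no padding), and labels whose token IDs appear to simply repeat the input_ids. A minimal sketch of how such a row is typically built is given below; the tokenizer checkpoint and maximum length are illustrative assumptions, since the dump does not name them.

    # Minimal sketch of producing one row's input_ids / attention_mask / labels.
    # Assumptions (not stated in this dump): a GPT-NeoX-style tokenizer checkpoint
    # and a 2048-token limit; labels simply copy input_ids, as they appear to here.
    from transformers import AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-neox-20b")  # assumed checkpoint

    def build_row(input_text: str, output_text: str, max_length: int = 2048) -> dict:
        # Concatenate the prompt (Input) and the target summary (Output) and
        # tokenize them once, as in standard causal-LM fine-tuning.
        enc = tokenizer(input_text + " " + output_text,
                        truncation=True, max_length=max_length)
        return {
            "input_ids": enc["input_ids"],
            "attention_mask": enc["attention_mask"],  # all 1s when nothing is padded
            "labels": list(enc["input_ids"]),         # mirrors input_ids, as observed here
        }

Decoding one of the omitted lists with the same tokenizer, e.g. tokenizer.decode(row["input_ids"]), should give back the concatenated review-and-summary text for that row.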
Below is given review of a research paper from cnoference journal. Please write a summary the review.
### Review:
pros targets an important problem adversarial attacks semantically constrained as opposed to being constrained by an artificial norm ball extensive results with a wide variety of models datasets and more importantly applications with not only attack evaluation on standard models but application to adversarial training a user study and evaluating against certified defenses cons the disentangled representations of stylegan were used for generating realistic perturbations and the application of training with said perturbations adversarial training was considered 1 this paper isnt cited let alone compared to given the significant similarity in the methodology 2 considered constraining adversarials to be within the output space of a learned generative model this work was also not cited the emphasis given to the result that adversarial training improves clean performance isnt fully justified as unrealistic images eg images not too perturbed were controlled for by limiting the number of iterations if one limits the number of iterations for a normbased attack or considers a smaller epsilon ball the same control would be met but a comparison to normbased adversarials with the same control is not conducted note that adversarial training was originally viewed and used as a regularization method 34 improving iid performance and the robustnessaccuracy tradeoff can straightforwardly be mitigated by considering a weaker form of robustness by using a weaker attack fgsm 3 or controlling the strength of the perturbation where the latter is exactly the filtering performed here it is intuitive that semantic adversarials would provide benefit for the robustnessaccuracy tradeoff but this experiment certainly does not demonstrate that due to the obvious confounders the emphasis on the breaking of a certified defense can also be seen as overclaiming as breaking a certified defense implies succeeding against the defense within their considered threat model this is not done here thus it doesnt violate any of the claims made by randomized smoothing thus it does not break the certified defense though the impracticality of normbased adversarial defenses are wellunderstood this experiment simply shows that if one goes outside of a certified defenses threat model the certified defense can perform worse which is entirely expected conclusion on one hand the research direction is valuable and the experimental results are extensive on the other hand comparison to the literature is sorely lacking there exists techniques which share much of the same functionality of the method 1 a comparison to normbased adversarials in the adversarial training experiment should have been done to clarify concerns that controlling for unrealistic adversarials is the source of the result not the contribution of semantic adversarials and the certified defense is not broken as the attack went outside of the defenses threat model in summary this work does not compare to any baselines when baseline experimentation is clearly needed to justify the novelty and contribution of the work and fair treatment of prior literature is missing with not only citations missing for quite similar approaches but claims are made which on the surface invalidate previous work breaking randomized smoothing when said previous work was not fairly evaluated as the attack went outside of the defenses clearly specified threat model thus not invalidating any of their claims i implore the authors to consider their contribution in the context of the literature more carefully 
references 1 httpsopenaccessthecvfcomcontentcvpr2020htmlgowalachievingrobustnessinthewildviaadversarialmixingwithdisentangledcvpr2020paperhtml 2 httpsopenaccessthecvfcomcontentcvpr2019htmlstutzdisentanglingadversarialrobustnessandgeneralizationcvpr2019paperhtml 3 httpsarxivorgabs14126572 4 httpsarxivorgabs170403976docsepthis paper proposes a mechanism to generate adversarial examples by applying latent variables level manipulation based on the stylegan framework unlike previous works mostly focused on image level perturbations and geometry transformations this work tends to control higher level latent sampling such as style so as to generate a styleadversarial examples although a similar idea has been proposed by song et al 2018 this work is along the same direction and achieves better performance the loss is proposed for general classification tasks such as object classification object detection and semantic segmentation the experimental results show not only qualitatively confusing human vision but also quantitatively improve the performance on testing clean images the paper is well written and easy to read experimental results demonstrate the proposed idea qualitatively and quantitatively sufficient ablation analysis to make the proposed method convincing however i still have some concerns theres no experiment to compare with existing methods such as xie et al 20 and pgd a standard experimental protocol ex imagenet should be conducted and fairly compared deeper analysis of the impact by yadv and etaadv in both object detection and semantic segmentation although final results in table 1 shows that stylebased adversarial training benefits the performance ablation studies of yadv and etaadv in both tasks should be conducted since the attack is in the feature space the defense should also happen in the feature space a overall i think the paper is valuable the proposed idea is novel and sufficient experiments are provided to demonstrate the idea rich visualization to help readers understand the concept im willing to raise my rate if my concerns are addressed a m lecuyer et al certified robustness to adversarial examples with differential privacy httpsarxivorgabs180203471docsepthe paper presents a new method for generating unrestricted adversarial examples based on stylegan this work separates stylistic and noise modifications so as to control higherlevel aspects and lowerlevel aspects of image generation by handling style and noise variables separately and changing the different levels of synthesis networks the model can input various types of perturbations in generating adversarial images as the authors claim the style variables from different layers affect different aspects of images generation of adversarial images are tested in both untargeted and targeted attacks overall the paper is wellmotivated wellwritten and the method is evaluated with three tasks classification semantic segmentation and object detection on the other hand although the different layers of the networks are concerned with different aspects of the images and the proposed method can generate a variety of images we may not be able to intentionally control specific aspects of the images this is an incremental work on top of stylegan so that the novelty of the paper is not very high please make clear how the parameter values are determined for example how did you select the step sizesdocsepsummary the paper proposes a method of generating adversarial samples to enhance classification performance specifically it finds 
variables of pretrained generative model to produce images that the pretrained classifier gives wrong answers the new classifier is then trained by adding these samples to the existing training dataset this method achieved good performance because the distributions of the samples generated in this way is closer to the distribution of the real images than the ones generated with normbounded perturbations pros the effectiveness of the proposed method is reasonably explained by comparing with the preceding works authors thoroughly analyzed the qualitative results it helps in understanding the underlying mechanisms questions is there a difference in performance between using nontargeted adversarial samples and targeted adversarial samples if so how different is it which of the two to use depends on the outcome what is the criteria that you divide layers as high mid and lowlevel ones have you checked layerwise effect how one image varies in different settings for example generation results in figure 3 using only one original image
### Summary:
|
the paper received two borderline accept recommendations and one accept recommendation from three reviewers with low confidence and a reject recommendation from an expert reviewer although all reviewers found that the paper addresses an important and challenging problem of semantically constraining adversarial attacks as opposed to constraining them artificially by an artificial norm ball however during the discussion phase it has been pointed out that there were some important weaknesses indicating that the paper may need one more evaluation round the meta reviewer recommends rejection based on the following observations in terms of evaluation while it is understandable the authors were unable to compare to gowal et al due to the lack of publicly available implementation showing song et als adversarials hurt performance and and are farther than the image manifold has been found puzzling as this was done by song et al only to keep human prediction the same while changing model prediction furthermore the paper did not contain a user study similar to song et al for a fair comparison finally the discussion revealed that the comparison to normbounded adversarial inputs may not have clarified whether this experiment faithfully demonstrates an advantage for the contribution as the norm could be contained to a point where accuracy is not reduced and the discussion on the certified defense being broken was inconclusive
|
[
... (input_ids token ID list omitted)
] |
[
... (attention_mask values, all 1; list omitted)
] |
[
... (labels token ID list omitted)
] |
Below is given review of a research paper from cnoference journal. Please write a summary the review.
### Review:
the authors investigate the value of fully connected layers at the end of convolutional neural networks in the small data regime they demonstrate that the addition of these layers significantly improves model quality in this regime strengths 1 i found the paper to be very clear and well written it was easy to understand what the authors were doing and why 2 the results seem significant this is not a phenomenon that i was aware of previously although i am not an expert in the use of deep learning in the small data regime weaknesses 1 distillation is known to improve quality for a constant parameter count so im not convinced that the distillation experiments disprove the hypothesis that the quality gains of adding fully connected layers are from increased parameter count a more convincing argument is that youre not increasing the parameter count much because of the dimensionality reduction in your fc stack id encourage the authors to include data on the increase in model parameters with their proposed addition the above sections detail limitationsquestions id like to address docsepthis paper proposes a training method that involves a module plugged into the final features of a backbone as an additional classification head for joint training with the existing classification head along with the original head the newly added head also pass through the softmax for the cross entropy loss the training is performed with the summation of the two crossentropy loss the proposed module is dubbed feature refiner fr consisting of two fully connected layers followed by the layer normalizations and operating with the gradient gate gg right before the classification head gg acts exactly like a stopgradient technique which is widely used in selfsupervised learning to train the original head only based on the output of the frozen backbone which is the input for the head and the backbone is trained only with the extra head with fr at the inference phase the original backbone with the original classification head is used for the forward propagation the authors claim that this training method works well on lowdata regimes for me it is limited to the lowdata regimes which may not be expected some experimental results on small datasets including the cifar datasets and the caltech datasets for supervised learning active learning only on cifar and semisupervised learning presumably on cifar are provided to show the effectiveness of the proposed method the authors try to show the universality of the proposed method of training with some backbones not constrained on resnets strengths this paper is easy to follow the proposed idea looks somewhat interesting weaknesses the main concern with the proposed method is that it seems to have a very similar training pipeline to simsiam 1 which showed that using the stopgradient is a key to training a backbone in selfsupervised training specifically feature refiner rf and the classification heads seem to be the predictor and the projector in the method 1 2 respectively the order is reversed but would not be a matter from my standpoint a difference is at the loss but as the authors claim in line 83 p3 assuming the network is trained in a kd way the proposed method is a supervised simsiam with a singleview i hardly agree with the authors claim that the method performs like a kd except for using only a singleview image the training procedure is very close to simsiam therefore the authors should argue the difference between the proposed method with simsiam there is no intuition why 
the proposed method has the benefit of training with small data the experimental setup is somewhat unconvincing the setup is inconsistent and does not follow the authors claim of requiring pretraining for the method in line 234 page 9 the authors specify that they use an imagenetpretrained resnet for the caltech datasets training only providing a seemingly inappropriate reason all the training dataset is small so the experimental verification of the claim is limited using small data for training and a small dataset is technically different therefore it would be better to justify the proposed method on a larger scale dataset such as imagenet with a small data regime for a stronger claim comparison of the proposed method with mlpmixer and vitb16 in figure 4 seems unfair mlpmixer and vitb16 have the stem of performing nonoverlapping patchification for the input so training them with 32x32 size images in the cifar datasets degrades the model accuracy regardless of the size of training data as aforementioned the authors claim the proposed method is a joint kd however the loss is not a straight kdbased loss eg the kld loss and training a backbone with the extra head may not leverage the kd concept in my opinion can the authors elaborate on the concept prerebuttal comment this work presents a training method using the stop gradient technique with an extra fc head for model training in a supervised manner except for using a singleview image and supervised loss the overall training pipeline is quite similar to the previous selfsupervised methods 1 2 so the authors should elaborate on the difference and provide any intuition why the proposed method could work well another concern is that the experiments are not convincing because of the inconsistent experimental setups and smallscale experiments therefore i am leaning towards rejection but would like to see the authors response and the other reviewers comments for my final decision 1 chen et al exploring simple siamese representation learning cvpr 2021 2 grill et al bootstrap your own latent a new approach to selfsupervised learning neurips 2020 limitation is provided but any potential negative social impacts do not seme to be provided docsepthis paper shows that one can improve accuracy in the lowdata regime by adding fully connected layers to cnns during training and distilling knowledge to a classification head resulting in a network with the same number of parameters and better generalization performance i found the method proposed in this paper very interesting and the results showed clear improvements over the base network considering how easy this is to implement i can see this paper having a large amount of impact to the community and the experiments showing improvements in the active learning and semisupervised learning setting show the diversity of this approach however the lack of knowledge distillation baselines and theoretical backing has me questioning whether this is any different than a typical teach student distillation approach for instance what would happen if one jointly trained resnet18 and resnet50 applied a loss similar to 1 in this case the test time network would contain no more parameters so it seems like a reasonable comparison and it follows a similar intuition of this paper train with more parameters and drop them during testtime although i am not too familiar with current kd methods i would be surprised if none of them showed a similar gain in performance so it would be a necessary baseline to compare this papers method 
to a lesser concern i have is with overfitting since this method seems to be training a network with more parameters on a small amount of data i wonder if performance on robustness benchmarks would be worse than the baseline network overall i think that this paper has the potential to have high impact as it is well written and has a simple easy to follow method that is effective under several tasks i am giving it a reject as i dont believe the current experiments are sufficient to prove that this method is more advantageous than other knowledge distillation methods but i am eager to have the authors quell my doubts with a more thorough evaluation 1 zhang et al deep mutual learning na
### Summary:
|
the paper shows that using final fullyconnected layers helps the generalization of convolutional neural networks in lowdata regimes the addition of these layers significantly improves model quality resulting in a network with the same number of parameters and better generalization performance initially reviewers had mixed evaluation of the paper all the reviewers saw that the proposed method is simple and easy to follow at the same time providing clear improvements over baselines also agreed that the results are significant and surprising effect there were some concerns raised by the reviewers but the authors rebuttal mostly addressed and improved the paper with sufficiently more experiments and analysis supporting the main claim reviewer dx6o mentioned that there are few updates promised by the authors which cant be validated until camera ready but it does not seem to warrant block publication the authorreviewer discussion period was active and the authors did a great job clearing various concerns and questions and all reviewers agreed to support acceptance of the paper the paper demonstrates a simple yet effective method for small data regime which would be interesting to the broad neurips audience both for practitioners as well as researchers
|
[ ...token-id sequence encoding the review and summary above; the full numeric listing is omitted here for readability... ] |
[ ...corresponding mask of 1s, one entry per token; omitted for readability... ] |
[ ...second token-id sequence for the same example; omitted for readability... ] |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this work proposes a new approach based on projective clustering for compressing the embedding layers of dnns for natural language modeling tasks the authors show that the tradeoff between compression and model accuracy can be improved by considering a set of k subspaces rather than just a single subspace methods for compressing dnns are an active area of research and this paper presents a promising approach to do so as well as interesting results rating the paper presents interesting ideas for compressing embedding layers however since this is an empirical paper i would expect a more comprehensive set of empirical results and a better comparison with other related methods overall the paper seems not very mature in its current form hence my rating is ok but not good enough rejection pros the proposed method is appealing due to its simplicity and the idea of considering multiple subspaces for embedding is plausible in the context of compressing embedding matrices of nlp models the results show improvements as compared to using just a single subspace the framework provides several ideas for future works cons typically the svd takes the form a = u d vt where u and v are the left and right singular vectors and the diagonal entries of d are the singular values from the discussion it is not clear whether you factor the singular values into u or whether you simply ignore the singular values also how do you enforce the orthogonality constraints on u and v during the fine tuning stage have you considered a simpler lowrank factorization a = ef in your experiments where no orthogonality constraints on e and f are imposed it would be good to see the progression for k = 2 3 4 5 in figures 5 and 6 further the ensemble approach in figure 6 hasnt been discussed in detail anywhere in the paper it is not exactly clear to me how you are computing the ensemble it would be very helpful to see some tables that show the total number of weights accuracy k j etc in order to better understand the performance how do you determine k and j in practice are you using some heuristic or are you simply doing a grid search i would like to see how your method compares to albert and whether a modified albert as you suggest in your future work section is doing better it would be interesting to see if your approach is also useful for compressing a fully connected layer in different settings this should be easy to test and could be reported in the appendix minor comments it is nice to see that you have many generalizations and extensions in mind but this section appears very lengthy to me compression rater should read compression ratesdocsepthis paper extends the idea of using subspace clustering to compress the neural nets by considering multiple subspaces and projecting each point to its closest subspace the paper needs more investigation on the related works basically the idea and the technique are not novel see the related literature below 1 trittenbach holger and klemens bhm oneclass active learning for outlier detection with multiple subspaces proceedings of the 28th acm international conference on information and knowledge management 2019 2 liu risheng et al fixedrank representation for unsupervised visual learning 2012 ieee conference on computer vision and pattern recognition ieee 2012 3 xu dong et al concurrent subspaces analysis 2005 ieee computer society conference on computer vision and pattern recognition cvpr05 vol 2 ieee 2005 4 feng jianzhou et al learning dictionary via subspace segmentation for sparse representation 2011 18th ieee 
international conference on image processing ieee 2011 pros smoothly readable the contribution section is described thoroughly and properly providing the codes for reproducing results cons abstract the abbreviations like nlp or svd should be defined first and then used assuming that the reader already has corresponding field knowledge about systems such as glue distilbert or roberta and mentioning them in the abstract may be bold details of the methods such as the use of the aj matrix or the k1 subspace should not be mentioned in the abstract but rather in the contribution or introduction section accordingly the last sentence open code for should not be mentioned in the abstract but in the code description section the figures 1-3 in the paper do not look well organized which makes the proposed simple idea seem extremely complex results it would be better to discuss the comparable results more thoroughly the model compression literature should be reviewed and the typical methods should be compared with in the experiments discussion and conclusion only a discussion of the results is provided in this section and the conclusion is not provided explicitly future work better not to start the section with numbered items right away better to have a starting sentence first appendix b is titled results before finetuning and includes figures with no explanation provide a proper description and discussion for each subfigure docsepsummary this paper applies projective clustering to the embedding layer of deep networks with large model sizes such as roberta the idea of finding more than one subspace to factorize the embedding weight matrix has nice intuition and insights i vote for accepting strengths 1 the paper has convincing evidence showing the reduction in percent of accuracy drop when applying projective clustering to the embedding weight vectors 2 the paper has illustration figures that clearly show the intuition of the approach as well as how the compression is achieved weaknesses 1 it would be better if more baselines could be included in the experiment comparisons in particular since steps 2-3 of the proposed messi pipeline page 4 are partitioning of all the input neurons and computing an svd for each partition i would be really interested in seeing the comparison of projective clustering vs simpler clustering methods such as kmeans to partition the input neurons in the evaluation 2 the authors discussed extensions such as using l1 error and l1 distance but no experiments were performed for the extensions some experimental results would help establish the flexibility of the framework of projective clustering in model compression tasks questions during rebuttal period 1 please provide some results regarding the weaknesses above especially the result of more baseline methods 2 is projective clustering the only way to find clusters in multiple subspaces what are some alternatives for example in subspace clustering all the data points can be projected to the same subspace and form clusters we may run subspace clustering multiple times to get clustering results in different subspaces
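to make the requested kmeans baseline concrete, here is a small illustrative numpy sketch (not the papers messi implementation) contrasting a single rank-j factorization of an embedding matrix with a crude kmeans partition of its rows followed by an independent rank-j svd per cluster; the toy sizes, the partitioning routine and all names are assumptions made for illustration

```python
import numpy as np

def single_subspace_error(A, j):
    # one global rank-j factorization A ~ E @ F via truncated SVD
    U, S, Vt = np.linalg.svd(A, full_matrices=False)
    E, F = U[:, :j] * S[:j], Vt[:j, :]
    return np.linalg.norm(A - E @ F) ** 2  # squared reconstruction error

def kmeans_then_svd_error(A, k, j, iters=25, seed=0):
    # crude stand-in for projective clustering: kmeans on the rows,
    # then an independent rank-j SVD inside every cluster
    rng = np.random.default_rng(seed)
    centers = A[rng.choice(len(A), size=k, replace=False)]
    for _ in range(iters):
        assign = ((A[:, None, :] - centers[None]) ** 2).sum(-1).argmin(1)
        for c in range(k):
            if np.any(assign == c):
                centers[c] = A[assign == c].mean(0)
    err = 0.0
    for c in range(k):
        rows = A[assign == c]
        if len(rows) == 0:
            continue
        U, S, Vt = np.linalg.svd(rows, full_matrices=False)
        r = min(j, len(S))
        err += np.linalg.norm(rows - (U[:, :r] * S[:r]) @ Vt[:r, :]) ** 2
    return err

# toy "embedding matrix": 1000 rows (tokens) x 64 dims
A = np.random.default_rng(1).standard_normal((1000, 64))
print(single_subspace_error(A, j=8), kmeans_then_svd_error(A, k=4, j=8))
```

on rows with real cluster structure the partitioned variant usually reaches a lower reconstruction error at the same per-row rank, which is the intuition behind using k subspaces rather than one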
### Summary:
|
the paper proposes to use projective clustering to compress the embedding layers of dnns this is a novel and interesting idea which can impact the area of knowledge distillation there were some concerns about the empirical study which were addressed to some extent by the authors during the rebuttal
|
[ ...token-id sequence encoding the review and summary above; omitted for readability... ] |
[ ...corresponding mask of 1s, one entry per token; omitted for readability... ] |
[ ...second token-id sequence for the same example; omitted for readability... ] |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the paper modifies the original attention block in the rest paper by 1 rewinding the conv based multihead attention in rest to the original multihead attention form no conv as in vit and 2 adding an upsampling operation to the downsampled v image classification performance is validated on imagenet besides the paper conducted ablation studies on how different windowbased attention win cwin hwin global affect ap flops and throughput when transferred to the object detection task strengths the paper presents results across a range of different vision tasks including imagenet classification object detection and segmentation on coco semantic segmentation on ade20k and achieves competitive results besides flops and parameters throughput is also benchmarked to give one more practical aspect of model performance weaknesses novelty is too limited at its core what the paper proposes is to upsample kv where kv are downsampled for saving computations the current presentation is more like a patch fix to rest if the generalization ability of the proposed upsampling operation beyond rest were validated as there exist many other vit variants saving computations by downsampling eg pvt the proposed method could be much strengthened the paper proposes a specific change to rest the change might benefit other methods but this is not validated in the work docsepthis paper proposes a vision transformer network based on the previously proposed restv1 architecture by involving a new element that performs upsampling of a token sequence the upsampling module is actually attached to the projected output of value to recover the dimensionality of the original sequence length the authors additionally refine the baseline restv1 network 1 removing the multihead interaction components and 2 reconfiguring the network stages due to the appearance of the upsampling module the modifications seem to work well but the key design choices still remain unexplained the authors provide experimental results on imagenet and some downstream tasks to justify the effectiveness of the proposed network architecture strengths the paper is easy to follow the exploration of using pixelshuffle as the upsampling module for recovering the original signal is a nice idea and interpreting it as a convolutional hourglass architecture also seems to make sense weaknesses novelty is incremental the major change over the baseline restv1 is the pixelshuffle only and the rest of the modifications are not new and cannot be one of the contributions any intuitions or insights into why the architecture should be designed like this are missing why should the upsampling module be involved what can we learn from the architectural modifications from restv1 to restv2 such as why the block number at the first stage is halved the experimental justifications in sections 3.4 and 4.3 do not seem to provide enough backup for the explanation of the proposed architectural design for example figure 3 tells us the upsampling module seemingly reduces the difference in the log amplitude between particular frequencies and the center frequency however this does not indicate the upsampling module is necessarily needed furthermore one may naturally ask the questions 1 why do only some specific frequencies benefit from information recovery 2 if the upsampling module really helps information flow shouldnt the entire frequency range have the same effect 3 why do the outputside layers not benefit from it furthermore figure 4 is not clearly illustrated the details of pixelshuffle are not clearly presented is it the pixelshuffle operation used in the superresolution field 
if so then why does the dimensionality remain the same after upsampling in figure 2b the authors did not provide the limitations and potential negative societal impact of their work docsepthis paper proposes an improved vision transformer architecture based on restv1 to address the issue that the downsample operation in restv1 will impair the longdistance modeling ability an upsample operation is employed to reconstruct the lost information besides the authors apply the proposed restv2 to different downstream tasks extensive results demonstrate the effectiveness and efficiency of this transformer backbone strengths 1 the proposed architecture is simple but seems effective for different downstream tasks 2 it is interesting that the authors find the gap between the theoretical flops and the actual speed and they consider the actual speed when designing the model 3 the experiments and theoretical analysis are sufficient weaknesses 1 the main contribution of this paper is to introduce the upsample operation into restv1 to compensate for the information lost in the downsample though this simple design can provide a performance benefit the generalizability of this design seems narrow does it only work well for the specific efficient transformer model with the downsample operation 2 for eq 2 is the norm operation also eliminated along with the conv if so the paper does not mention it which is somewhat confusing 3 i think a more detailed description of fig 3 is needed for a better understanding eg the meaning of coordinates and curves yes the authors address the efficiency to some extent they point out the gap between the theoretical flops and the actual speed and consider the actual running speed more when designing the model
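since both reviews ask what the pixelshuffle based upsampling of the downsampled value map actually does, below is a minimal pytorch sketch of one plausible reading of the idea; it is an illustration only, not the exact restv2 module, and the 1x1 projection, the layer names and the upscale factor r are assumptions

```python
import torch
import torch.nn as nn

class ValueUpsample(nn.Module):
    """Illustrative sketch: recover the full token length of a value map that was
    spatially downsampled by a factor r before attention, via pixel-shuffle."""
    def __init__(self, dim, r=2):
        super().__init__()
        # expand channels by r*r so pixel-shuffle can trade them for spatial size
        self.proj = nn.Conv2d(dim, dim * r * r, kernel_size=1)
        self.shuffle = nn.PixelShuffle(r)

    def forward(self, v, h, w):
        # v: (B, N, C) tokens living on the downsampled h x w grid, N = h * w
        b, n, c = v.shape
        assert n == h * w
        v = v.transpose(1, 2).reshape(b, c, h, w)
        v = self.shuffle(self.proj(v))            # (B, C, r*h, r*w)
        return v.flatten(2).transpose(1, 2)       # (B, r*r*N, C)

# toy usage: 7x7 downsampled value tokens with 64 channels, upsampled back to 14x14
up = ValueUpsample(dim=64, r=2)
out = up(torch.randn(2, 49, 64), h=7, w=7)
print(out.shape)  # torch.Size([2, 196, 64])
```

in this reading the channel width is unchanged and only the spatial token count is restored, which would be consistent with the dimensionality in figure 2b looking the same after upsampling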
### Summary:
|
this paper introduces an improvement over rest by addressing the issues introduced by downsampling operations in msa all reviewers have recognized the contribution of this paper and the impressive performance achieved by the proposed algorithm in the rebuttal the authors have addressed the reviewers major concerns well and updated the paper with new results
|
[ ...token-id sequence encoding the review and summary above; omitted for readability... ]
[1, 1, 1, ...] |
[30003, 310, 1677, ...] |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper proposes an approach to handling the challenge of applying rl via supervision methods to stochastic environments the method is empirically demonstrated to outperform two recent baselines across three environments at the bottom of page 2 the paper claims three contributions 1 the observation that rl via supervision agents can only reliably achieve outcomes that are independent from environment stochasticity 2 the proposed method esper 3 the stochastic offline rl benchmark tasks used to evaluate esper however observation 1 was previously made in shaking the foundations delusions in sequential models for interaction and control ortega et al arxiv 2020 the environments 3 contribution cannot be fully evaluated as code is only available for the simplest of the three environments open sourcing all environments would improve the significance of this contribution esper the proposed method remains a novel contribution that significantly outperforms two recent related baselines the empirical evaluation of this method could be improved by including cql in figures 46 and quantifying the correlation between performance and target return currently visualized in figure 6 i agree with the conclusion the authors reach from this plot but in this case believe the raw value would be more informative than the plots currently used the space saved could then be used to include the more insightful results currently in appendix a3 on the representations learnt the authors have adequately addressed the limitations and potential negative societal impact of their work i would be very interested in seeing the follow up study proposed regarding testing the performance of esper on other types of goals as i agree this may require further innovation docsepthis paper first identify that the existing offline reinforcement learning via supervised learning rvs methods that conditioned on returnsoutcomes eg decision transformers can only reliably achieve outcomes that are independent from environment stochasticity in other words they do not consider whether the offline demonstration results are achieved by pure luck based on this observation the authors proposed learning learn environmentstochasticityindependent representations of trajectories by adversarially clustering offline experiences by data collection policy to disentangle actions from environment stochasticity trajectories in each cluster can be annotated with the average return for trajectories these pieces together give esper environmentstochasticityindependent representation rvs with esper achieves substantial improvements on several simple stochastic environment benchmarks gambling an illustrative singlestep stochastic environment connect four with a stochastic opponent and 2048 strengths i appreciate the insightful observation that existing rvs can only reliably achieve outcomes that are independent from environment stochasticity this paper presents a clear illustration to explain why this could be an issue i believe future rvs research could potentially benefit a lot from this work the proposed solution is theoretically sound empirically it shows decent improvements compared to return conditioning on several simple benchmarks including gambling connect four and 2048 i think these are enough proof that the issue does exist and that the proposed method esper achieve some initial successes weaknesses my concerns are about whether this paper has discussed enough limitations of the proposed solution ie clustering by policy and enough future challenges 
besides highdimensional observation space i think clustering by behaviorpolicy implies that when the data is collected with many behavioral modes the number of clusters need to be able to scale with that for that reason i think we will perhaps see that it struggles with highdimensional and continuous action space potentially statedependent noops and nearnoops could also lead to unnecessary complexity furthermore my haunch is that clustering by behavior will go against behavioral compositionality making it difficult to interpolate desired outcomes as opposed to returnconditioning and thus probably go against the authors desire to pave the way for creation of more general agent as stated in the conclusion section given the early position of this paper which focus on a limited set of empirical evaluations i would like to see much more discussion about the potential limitations and future challenges of the proposed approach to help future works to extend this contribution if these could be sufficiently addressed or clarified during rebuttal which i think would not be very difficult i will increase my rating to 6 or 7 to recommend acceptance the potential negative societal impact is addressed however the limitation is not sufficiently discussed which i think is also my main concern about his paper and discussed in the strengths and weaknesses section docsepoffline decision transformers trained with supervised learning do not account for the fact that some favorable outcomes are due to stochastic dynamics and not to the quality of the policy therefore by design some probability mass may be distributed to suboptimal actions the solution proposed is to focus on trajectory statistics that are independent of uncontrollable randomness in the environment from cluster assignments based on these statistics useful conditioning quantities such as expected returns can be learnt three predictors are optimized alternatively with two losses on the one hand a transition predictor is trained to predict the next state conditioned on the previous state action and the trajectory cluster assignment on the other hand the cluster assignment predictor is trained to maximize the loss of the transition predictor while maximizing the likelihood of a policy predictor trained to predict the action conditioned on the current state and cluster assignment finally another predictor is trained to predict the return conditioned on a cluster assignment it is then used to label trajectories for policy learning conditioned on expected returns the authors propose a benchmark of three stochastic environments and build offline datasets with various policies ranging from random policies to experts empirically their approach outperforms naive returnconditioning and a conservative qlearning baseline strengths to the best of my knowledge this method is new it is simple to understand well motivated and easy to implement claims made at the end of the introduction are supported by experimental results the paper is mostly easy to follow and understand the authors highlight a potential key limitation of decision transformers trained with offline trajectories in stochastic environments as transformers become more popular as agents such insights are beneficial to the community weaknesses the paper is titled why decision transformers fail in stochastic environments but not a single decision transformer is trained nor evaluated in this work while the authors provide evidence that mlpbased behavioral cloning agents conditioned on returns fail in 
toy stochastic environments they do not illustrate this phenomenom in more complex environments with recent architectures for instance decision transformers in atari with sticky actionshttpsarxivorgabs170906009 or crafterhttpsarxivorgabs210906780 could be considered the main limitation of this work is the lack of empirical evidence beyond toy environments and multilayer perceptrons while i agree that mlps can be used instead of transformers in the mdps considered the question of whether decision transformers fail in complex stochastic environments is not answered
### Summary:
|
the authors explore a fundamental limitation of decision transformers and related rl via supervised learning approaches when applied to stochastic environments they propose a new and simple approach that clusters experiences to disentangle action quality from environment stochasticity their esper approach achieves large improvements on a number of simple stochastic environments including gambling connect four and 2048 the reviewers were all satisfied by the novelty technical soundness and relevance of this work for the neurips community but were initially of mixed opinions about the selection of challenge domains having discussed this point at length with the reviewers i am satisfied with the authors choice of environments for two reasons 1 they allow the specific shortcomings of previous methods to be isolated and addressed directly and 2 they are consistent with the environments used by previous related work published in toptier conferences i am recommending this paper for acceptance accordingly
|
[30003, 310, 1677, ...] |
[1, 1, 1, ...] |
[30003, 310, 1677, ...] |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
i am sorry but i am super confused with this paper there is no clarity and about half of the sentences are written with broken english the model as far as i can understand from the partial explanations and figure 2 looks like a kitchen sink a combination of pieces from previously explored methods in the context of traffic flow estimation this might be fine but there is no motivation provided for this rather than spending the method section with repeating well known loss equations kldivergence convolution etc please focus on the architecture provided in the paper and the motivations behind it more importantly how it differs from previous approaches and why these choices have been made this paper is not ready for publication it needs a rewrite at least preferably working out the original motivations behind architectural choices docsepthis paper has potential but i do not think it is ready for publication i will ask some questions make some suggestions 1 your first sentence makes a claim about there being a large body of research on traffic flow forecasting i dont doubt this but you should cite some papers please 2 your contributions raise the following questions for me contribution 1 is that you use a very large dataset for training you dont say and a small dataset for testing thus proving that your method works and generalizes your method may be effective but compared to what your method may generalize but how do we know that if youve only tested it on one small dataset contribution 2 says that you creatively used lagged data in a time series model this is probably a good idea but it does not sound all that creative to me compare with eg an ar model contribution 3 says that you use driving distance to model spatial correlation again this is probably a good idea and when we get further we learn that you applied a graph convolution network were these the choices that you claim are novel are they novel what other choices might be reasonable and how would they compare 3 section 3 immediately jumps into the use of autoencoders but i think you need to justify why we care about using autoencoders in the first place if the problem is traffic forecasting why dont you tackle that problem head on 4 section 3 mentions sparsity without justifying why i care about sparsity this might be an important tool for regularization in a deep neural network or it might not begiven enough data and other regularization techniques weight decay early stopping dropout 5 is the spatial dependency that you end up learning qualitatively different than the spatial dependency you would get by instead assuming a particular parametric form as is done in kernel methods gaussian processes eg the gaussian kernel or the matern kernel parameterizes the covariance between observations at two spatial locations 6 in your experiment i believe you randomly split 15 minute blocks into traintestvalidate i think this evaluation will be overoptimistic insofar as if 10301045 and 11001115 are in the train set but 10451100 is in the test set it will be relatively easy to predict 10451100 i would suggest considering traintestvalidate splits based on larger chunks eg leave the data in 15 minute blocks but randomly select hours 4 blocks to put in traintestvalidatedocsepthe paper uses a number of deep learning approaches to analyse sets of traffic data however as these sets of traffic data are never explained it is difficult to follow or understand what is going on here some major comments 1 many of the key concepts in the paper are not discussed 
the primary one would be that of what the two data sets contain without knowledge of this it is difficult to ascertain what is going on 2 many of the processes used are not described in enough detail to either understand what is going on or to reproduce the work without this it is difficult to make headway wit the work 3 it is not clearly articulated what the experiments performed are doing for example how have you applied the other techniques to this data 4 key terms are not defined such as traffic flow 5 the english structure of the paper is poor with many mistakes a thorough proofreading is essential some more specific points with the larger road network the difficulty of flow forecasting grows this seems to be a consequence of the other ones not a challenge in its own right what is superiority spatiotemporal traffic flow forecasting task is currently under a heated discussion and has attracted a large research population evidence to back up this statement your contributions arent contributions but rather a list of what you have done how does your related work relate to what you have done hard to parse to extract temporal relationships within the history traffic flows we model this process as a layering structure with autoencoder as cell appendices b and c should be in the main paper what is in x1 when take the sparsity constrains into consideration what are the sparsity constraints how do you obtain the weights figure 2 should come much sooner as it relates a lot of the concepts together on both datasets we slice traffic flow information into 15 minutes windows where 70 of data is for training 10 for validation and remaining 20 for testing is that each 15 mins is split 701020 proof by example is not a proof
### Summary:
|
the paper proposes an interesting neural architecture for traffic flow forecasting which is tested on a number of datasets unfortunately the lack of clarity as well as precision in writing appears to be a big issue for this paper which prevents it from being accepted for publication in its current form however the reviewers did provide valuable feedback regarding writing explanation presentation and structure that the paper would benefit from
|
[
30003,
310,
1677,
2278,
273,
247,
2561,
2929,
432,
260,
2369,
1793,
6698,
15,
7764,
3630,
247,
6010,
253,
2278,
15,
187,
4118,
8439,
27,
187,
74,
717,
7016,
533,
891,
717,
2221,
13477,
342,
436,
2929,
627,
310,
642,
19843,
285,
670,
2716,
273,
253,
14683,
403,
3542,
342,
7154,
48087,
50275,
783,
1566,
347,
2080,
347,
891,
476,
2096,
432,
253,
7898,
22909,
285,
4677,
374,
4453,
751,
247,
8576,
16338,
50276,
66,
5019,
273,
7437,
432,
3786,
14859,
3082,
275,
253,
3634,
273,
7137,
2685,
13418,
436,
1537,
320,
4030,
533,
627,
310,
642,
16038,
2530,
323,
436,
50275,
30786,
685,
9100,
253,
1332,
2593,
342,
24385,
973,
1929,
2957,
7424,
465,
392,
2373,
9515,
27311,
3966,
4496,
2770,
327,
253,
10336,
2530,
275,
253,
2929,
285,
253,
42852,
3212,
352,
625,
15538,
849,
352,
19986,
432,
2045,
7274,
285,
2139,
841,
10165,
452,
644,
1160,
50275,
2520,
2929,
310,
417,
4704,
323,
9311,
352,
3198,
247,
24813,
387,
1878,
13027,
2444,
562,
253,
3236,
42852,
3212,
27934,
10165,
5474,
33032,
2520,
2929,
556,
2442,
533,
891,
513,
417,
1158,
352,
310,
4704,
323,
9311,
891,
588,
1642,
690,
3533,
50276,
11145,
690,
13991,
50276,
18,
634,
806,
6197,
2789,
247,
1750,
670,
627,
1146,
247,
1781,
2133,
273,
2561,
327,
7137,
2685,
16923,
272,
891,
13414,
5545,
436,
533,
368,
943,
26542,
690,
9380,
4496,
50276,
19,
634,
9021,
7164,
253,
1563,
3533,
323,
479,
50274,
1987,
2382,
337,
310,
326,
368,
897,
247,
1077,
1781,
10895,
323,
3733,
368,
13414,
1333,
285,
247,
1355,
10895,
323,
5175,
3021,
18597,
326,
634,
1332,
2987,
285,
2087,
4219,
634,
1332,
778,
320,
3576,
533,
2429,
281,
752,
634,
1332,
778,
39970,
533,
849,
513,
359,
871,
326,
604,
368,
306,
760,
5762,
352,
327,
581,
1355,
10895,
50275,
1987,
2382,
374,
2296,
326,
368,
2833,
1242,
908,
16653,
2400,
941,
275,
247,
673,
2962,
1566,
436,
310,
3164,
247,
1175,
2934,
533,
352,
1057,
417,
3590,
512,
326,
10995,
281,
479,
7277,
342,
24088,
271,
549,
1566,
50275,
1987,
2382,
495,
2296,
326,
368,
897,
6276,
4181,
281,
1566,
8820,
5921,
969,
436,
310,
3164,
247,
1175,
2934,
285,
672,
359,
755,
2007,
359,
3037,
326,
368,
3732,
247,
4216,
27311,
2990,
497,
841,
253,
10165,
326,
368,
1750,
403,
4460,
403,
597,
4460,
752,
643,
10165,
1537,
320,
5272,
285,
849,
651,
597,
7277,
50276,
20,
2593,
495,
4745,
27287,
715,
253,
897,
273,
6753,
2083,
351,
398,
533,
891,
1158,
368,
878,
281,
15249,
2139,
359,
1557,
670,
970,
6753,
2083,
351,
398,
275,
253,
806,
1659,
604,
253,
1895,
310,
7137,
16923,
272,
2139,
13414,
368,
18915,
326,
1895,
1481,
327,
50276,
21,
2593,
495,
25957,
37139,
414,
1293,
816,
5411,
2139,
891,
1557,
670,
37139,
414,
436,
1537,
320,
271,
1774,
4968,
323,
37820,
275,
247,
3676,
11454,
2990,
390,
352,
1537,
417,
2353,
3870,
2217,
941,
285,
643,
37820,
5609,
2801,
10027,
2393,
15910,
5926,
483,
50276,
22,
310,
253,
8820,
18925,
326,
368,
990,
598,
4715,
36143,
1027,
685,
253,
8820,
18925,
368,
651,
755,
407,
3185,
7384,
247,
1798,
36833,
830,
347,
310,
2218,
275,
10295,
3082,
50276,
72,
12064,
4870,
24088,
253,
305,
12064,
10295,
390,
253,
45171,
10295,
4764,
4219,
253,
26677,
875,
7313,
387,
767,
8820,
8593,
50276,
23,
275,
634,
3368,
891,
2868,
368,
12421,
8085,
1458,
7017,
8336,
715,
1140,
565,
383,
30716,
891,
1158,
436,
7103,
588,
320,
689,
32581,
2531,
37900,
347,
604,
13062,
9104,
1857,
285,
1903,
23643,
1010,
403,
275,
253,
6194,
873,
533,
884,
1857,
37965,
310,
275,
253,
1071,
873,
352,
588,
320,
4942,
3477,
281,
3283,
884,
1857,
37965,
891,
651,
1804,
7296,
1140,
565,
383,
30716,
36509,
1754,
327,
4067,
30151,
24088,
3553,
253,
941,
275,
1458,
7017,
8336,
533,
12421,
3609,
3038,
577,
8336,
281,
1691,
275,
1140,
565,
383,
7210,
456,
406,
339,
431,
248,
2929,
4648,
247,
1180,
273,
3676,
4715,
7274,
281,
30648,
5239,
273,
7137,
941,
2299,
347,
841,
5239,
273,
7137,
941,
403,
1620,
5544,
352,
310,
2834,
281,
956,
390,
2096,
752,
310,
1469,
327,
1060,
50276,
8826,
2201,
5701,
337,
1142,
273,
253,
2234,
12342,
275,
253,
2929,
403,
417,
5469,
253,
3625,
581,
651,
320,
326,
273,
752,
253,
767,
941,
5239,
3831,
1293,
3640,
273,
436,
352,
310,
2834,
281,
24228,
752,
310,
1469,
327,
50275,
19,
1142,
273,
253,
4870,
908,
403,
417,
2529,
275,
2217,
2508,
281,
2057,
2096,
752,
310,
1469,
327,
390,
281,
18302,
253,
789,
1293,
436,
352,
310,
2834,
281,
1056,
1481,
1106,
19311,
253,
789,
50276,
20,
352,
310,
417,
4518,
35144,
752,
253,
4679,
2684,
403,
2509,
323,
1650,
849,
452,
368,
3732,
253,
643,
5609,
281,
436,
941,
50276,
21,
2234,
2426,
403,
417,
2931,
824,
347,
7137,
2685,
50276,
22,
253,
48087,
2605,
273,
253,
2929,
310,
4105,
342,
1142,
16503,
247,
11080,
4737,
24042,
310,
5667,
50276,
8826,
625,
2173,
2792,
50276,
3113,
253,
4067,
3971,
2990,
253,
10183,
273,
2685,
16923,
272,
17202,
50276,
2520,
3133,
281,
320,
247,
9936,
273,
253,
643,
4394,
417,
247,
5691,
275,
697,
1211,
987,
50275,
5371,
310,
34385,
50275,
1033,
255,
7173,
358,
23702,
7137,
2685,
16923,
272,
4836,
310,
4390,
762,
247,
16934,
5955,
285,
556,
17755,
247,
1781,
2561,
3072,
50276,
22432,
281,
896,
598,
436,
3908,
50275,
12550,
9021,
403,
2649,
9021,
533,
2581,
247,
1618,
273,
752,
368,
452,
2218,
50275,
5430,
1057,
634,
2905,
789,
14588,
281,
752,
368,
452,
2218,
50275,
10984,
281,
14390,
281,
4908,
11935,
7688,
1561,
253,
2892,
7137,
14221,
359,
1566,
436,
1232,
347,
247,
2242,
2158,
2605,
342,
6753,
36465,
347,
894,
50275,
9691,
1271,
270,
285,
260,
943,
320,
275,
253,
2022,
2929,
50275,
5371,
310,
275,
1269,
18,
50275,
9453,
1379,
253,
37139,
414,
1030,
44196,
715,
8180,
50276,
5371,
403,
253,
37139,
414,
10806,
50275,
5430,
513,
368,
4044,
253,
13461,
50275,
13206,
374,
943,
1705,
1199,
19473,
347,
352,
7033,
247,
2257,
273,
253,
12342,
2366,
50275,
251,
1097,
15302,
359,
15512,
7137,
2685,
1491,
715,
1458,
2909,
8323,
835,
5571,
273,
941,
310,
323,
3733,
884,
323,
12820,
285,
5780,
1384,
323,
5175,
50276,
261,
326,
1016,
1458,
29202,
310,
8085,
818,
9104,
938,
50275,
16314,
407,
1650,
310,
417,
247,
4737,
2490,
187,
4118,
18435,
27,
783,
2929,
29328,
271,
4722,
11454,
10336,
323,
7137,
2685,
16923,
272,
534,
310,
5762,
327,
247,
1180,
273,
15302,
19235,
253,
3480,
273,
19843,
347,
973,
347,
50276,
40540,
50276,
249,
4028,
4620,
281,
320,
247,
1943,
2523,
323,
436,
2929,
534,
16897,
352,
432,
1146,
7607,
323,
9311,
275,
697,
1655,
830,
2299,
253,
30628,
858,
2085,
9865,
8680,
5001,
4028,
8813,
9759,
285,
2605,
50276,
3529,
253,
2929,
651,
5649,
432,
209
] |
[ attention_mask: 1 for every position (no padding); full array omitted ] |
[ labels: token-ID array omitted ] |
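The numeric columns in each row (input_ids, attention_mask, labels) are just the tokenized form of the Input/Output text columns. A minimal sketch of how such a row is typically produced, assuming a Hugging Face-style tokenizer — the dump does not say which tokenizer or vocabulary generated these IDs, so the checkpoint name below is only a placeholder:

```python
from transformers import AutoTokenizer

# Placeholder checkpoint: the dump does not identify the real tokenizer.
tokenizer = AutoTokenizer.from_pretrained("gpt2")

prompt = "Below is given a review of a research paper ... ### Summary:"  # the Input column
summary = "..."                                                          # the Output column

# Sequence lengths in this dump appear capped at roughly 2k tokens.
enc = tokenizer(prompt + "\n" + summary, truncation=True, max_length=2048)

input_ids = enc["input_ids"]            # token IDs, as in the input_ids column
attention_mask = enc["attention_mask"]  # 1 for every real token (hence the all-1s column)
labels = list(input_ids)                # causal-LM targets, commonly a copy of input_ids

print(len(input_ids), attention_mask[:5], labels[:5])
```

The labels arrays in this dump appear to start with the same IDs as input_ids, which is the usual setup when fine-tuning a causal language model on prompt-plus-summary pairs.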
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the paper discusses a method for generating long videos in which new content is introduced as the camera moves forwards training on such videos especially in high resolution is prohibitively expensive one of the possible ways to mitigate the issue is to train a two stage architecture in which dynamics is trained on lowres followed by superresolving the lowres long video with the second stage to show advantages of their approach the authors collect two new datasets in which the camera moves forward according to the results the method outperformed current works on these two datasets and finishes second on previously available datasets strengths the paper has some s1 the paper shows that treating temporal noise is important the idea of filtering it with a lowpass filter is interesting this way only lowfrequency longer events are captured by the model s2 results are quite impressive on the two new datasets very impressive there also are weaknesses unfortunately w1 subsampling in space and time has been proposed in the previous literature it has been shown that training at high res both in space and time is prohibitively expensive in tganv2 for example they generate longer videos at low resolution and as they increase resolution with more generators they drop the frame rate this way they achieved efficient training of 2562 resolutionhighest resolution of this paperin 2018 another similar idea was used in a discriminator of dvdgan 9 they downsampled the resolution for video discriminator and reduced framerate for image discriminator tganv2 is not cited dvdgan is cited but the similaritiesdifferences are not discussed w2 the proposed framework is reasonable and seems to be working well on the proposed datasets the key new interesting idea is temporal filtering the rest of the framework from the high level is known see w1 i acknowledge the amount of effort the authors put into making it work of course w3 other video datasets the paper fails to report ucf faceforensics and others from styleganv tats arguing that these datasets contain less new content and camera movements while it might be case the method only reports better scores on their own proposed datasets the problem with this is that it takes a lot of effort to tune each method to each dataset making it not very clear if the proper and sufficient tuning for styleganv was made if one compares to existing numbers on existing datasets such as ucf on which tats and styleganv show reasonable performance one can guarantee that the method is evaluated against the numbers in which proper time was invested otherwise its always possible to select a dataset on which the method scores best w4 other resolutions styleganv and mocoganhd show training on much higher resolutions as high as 1024 while this method focuses on the temporal part of videos its not clear whats the resolution upper bound its important to understand that even for this work since scaling gans is a nontrivial task both in terms of computational resources and quality tganv2 saito masaki et al train sparsely generate densely memoryefficient unsupervised training of highresolution temporal gan international journal of computer vision 12810 2020 25862606 please see above docsepbrief summary the paper addresses the problem of video generation with a focus on longer time horizon videos which requires consistency to this end the authors introduce two new benchmark datasets on horseriding and mountain biking the key observation is that the main components for temporal consistency are 
preserved at lower spatial resolution the authors therefore propose using a hierarchical architecture to first create long lowresolution video followed by sliding windows to create higher resolution videos at shorter timesteps the authors have further performed a human evaluation on mechanical turk to find their method was preferred over 80 of the time pros 1 the idea is simple yet elegant implementationwise it is quite interesting to find that a straightforward extension of stylegan to videos where input images are simply concatenated leads to promising results 2 new datasets contributions are always welcome the two contributed datasets on mountain biking and horseback riding could be useful for future research especially to evaluate temporal consistency 3 the authors have done a human eval which is very important in video generation and found their method was preferred 80 of the time 4 the authors also show colorchange as a heuristic for temporal which clearly show the benefits of hierarchical training 5 the visualizations provided in the supplementary are very cool cons 1 for the user study the authors should have provided equally same as an option and used sanity check that equally same is picked when both videos are obtained from the same model i am not sure if this could have created any issues in the assessment my guess is the effect could be mild but not negligible 2 i am wondering if a better evaluation for longterm consistency would be conditional generation where first few and last few seconds of real video are provided for instance if a tree is seen at the last frame fig1 10s one could have a softmetric that the tree be seen albeit at lower resolution at some intermediate frame could make the task on human evaluation easier as well given that they would be comparing more similar videos i think authors have done a good job of highlighting failure cases as well as pointing out where fvd metric is not very predictive of human eval performance docsepthe paper presents a video generation model that is capable of producing new content eg new object or scenery object motion and changes in camera viewpoint over time for longer videos than prior works in order to achieve this temporal modeling is emphasized in the proposed architecture unlike existing works in which frame quality is often prioritized as their main contribution the authors redesign the temporal latent representation and train the carefullydesigned model which has the capability to operate over long time scales with a vast temporal receptive field on longer videos at a low resolution and shorter videos at a high resolution two new benchmark datasets are introduced to best evaluate the proposed model since there are no existing datasets with long enough videos evaluated on 4 datasets with different characteristics the proposed model outperforms prior methods especially qualitatively on aspects including generating plausible dynamics and object persistence producing new content while maintaining consistencies over time strength the paper has suggested several interesting insights and many useful practices for video generation for example a multiresolution twostage strategy might be a good solution for training and deployment for models to handle long videos the lowresolution generator should be fully convolutional over time to learn longterm temporal correlations whereas the superresolution generator can operate in a framebyframe basis videos with longer sequence length tend to exacerbate the issue of overfitting and 
therefore some strong augmentations might be required the paper has provided a deep investigation on the metrics of video generation particularly they propose to analyze color change over time as a simple way to diagnose potential bias captured by the different models in addition existing commonlyused metrics such as fvd and lpips are also discussed and the authors have found these metrics to agree less with the qualitative results or user study results the qualitative results are convincing it was shown that the proposed model is capable of generating videos with rich motion and scenery changes existing methods are incapable of generating realistic long videos and explanations and analysis presented in the paper are reasonable two new datasets are proposed which might be beneficial for future researchers weaknesses an indepth failure case analysis is absent which might be interesting to have for readers the proposed architecture seems to heavily depend on the correct data augmentation in use the authors have discussed the limitations and potential negative societal impact from the qualitative results it seems that the model also struggles when objects in the scene interact with each other docsepthis paper takes the initiative to improve the problem of video generation in terms of the dynamics of the generated videos the main strategy proposed by the author is to increase the real video lengths seen by the discriminator by reducing the resolution the reduced resolution is compensated separately with a superresolution network the authors provide some analysis of the color dynamics of the real dataset and the generated videos using different methods strengths 1 the literature has overlooked video dynamics and temporal modeling for video generation in generative models this research has selected the correct challenge in video generation to address 2 great writing and supplementary materials 3 decoupling the temporal dynamics from frame resolution has some novelties however it has been remotely mentioned by previous research but has never been explicitly modeled with two networks in two different stages weaknesses 1 although i respect the authors braveness and choice of problem i believe the paper conveys such a message that the main reason for low dynamics in video generation models is data since we are not showing long videos to the discriminator the generated videos do not have enough dynamics or look repetitive although i agree with this argument imo the main challenge is modeling the temporal dynamics we can still wisely select shorter videos but with enough frames change but we need better structures to model the temporal dynamics 2 i appreciate the honest selection of the generated video samples in the supplementary materials it is clear that in some scenarios like horse riding we see novel objects enter the scene and the motion is not repetitive as the baselines however in some cases like acid acidgridmp4 second row we see a lot of distortions in the generated video although the method works better for high dynamics it may not handle low dynamics i believe the authors are honest about the negative societal impacts i have no more points to add here
### Summary:
|
all four reviewers enjoyed this paper and were particularly impressed by the videos provided in the supplementary material the results are very impressive indeed the reviewers also agreed that using a multi stage approach was interesting and effective the two new datasets were deemed useful to the generation community and the proposed metrics and human evaluations were appreciated by the reviewers a few smaller concerns included a missing failure analysis and some clarifications questions which were addressed in the rebuttal given the above i recommend acceptance
|
[ input_ids: token-ID array omitted ] |
[ attention_mask: 1 for every position (no padding); full array omitted ] |
[ labels: token-ID array omitted ] |
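One reviewer in the row above singles out low-pass filtering of the temporal noise as the interesting ingredient of the video-generation model. As an illustration only — not the paper's code, and with an arbitrary box kernel standing in for whatever filter the authors actually used — the idea can be sketched as:

```python
import torch
import torch.nn.functional as F

def lowpass_temporal_noise(num_frames, channels, kernel_size=9, seed=0):
    # Per-frame latent noise, shape (channels, num_frames).
    g = torch.Generator().manual_seed(seed)
    noise = torch.randn(channels, num_frames, generator=g)
    # Normalized box filter applied along the time axis: frame-to-frame
    # jitter is averaged out, so only slow, low-frequency variation survives.
    kernel = torch.ones(1, 1, kernel_size) / kernel_size
    smoothed = F.conv1d(noise.unsqueeze(1), kernel, padding=kernel_size // 2)
    return smoothed.squeeze(1)

z = lowpass_temporal_noise(num_frames=128, channels=4)
print(z.shape)  # torch.Size([4, 128])
```

Smoothing the noise along time means that whatever variation remains corresponds to longer-lived events, which matches the reviewer's reading that only low-frequency changes are captured by the model.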
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the paper proposes using featurelevel adversarial perturbation to explore interpretable adversarial attacks the proposed method can generate targeted universal disguised physically realizable and blackbox attacks at the imagenet scale the method can also be used as an interpretability tool for detecting bugs in networks and to evaluate this the paper designs copypaste attacks to cause targeted misclassification strengths the paper is well written and easy to follow the method is evaluated on a largescale imagenet dataset weaknesses the evaluation of the disguise or realistic objective is not very convincing the paper lacks a quantitative comparison with previous methods see the detailed questions in the next section i think the paper adequately addressed the limitations and potential negative societal impact docsepthe paper presents a method for generating featurelevel adversarial examples in the most successful variant of the attacks adversarial perturbations in representation space of a generative model allow generation of patches which will then be added to natural imagesthe patches are optimized to be classified as real by a discriminator increase entropy of a surrogate classifier and misclassify the targeted classifier the authors empirically verified the robustness of the adversarial examples including its correctness in physical world then the authors study how the proposed method can be used for interpreting mistakes made by dnns strength the general method of the paper is presented relatively clear and the idea of using featurelevel adversarial examples for model interpretability weakness the focus of the paper is not clear the proposed method has its advantage that it can be used as a interpretability tool whereas as an attack the authors said it may be detectable by a human however the paper focus more on the attack than the interpretability besides some important contents are placed in the appendix the authors stated the limitations of their work in the discussion and claimed that they are future works compared to other works on adversarial examples the negative societal impact of this work is not significant since the generated adversarial examples here are easily detectable by human beings docsepthe authors use a pretrained generative model to automatically identify featurelevel adversarial examples which are adversarial perturbations that are both interpretable to a human and robust ie still serve as adversarial attacks in related contexts they target image classifier networks to do this they use several loss terms a term that penalizes the predicted image class for being similar to the true image class a term that penalizes highfrequency patterns a term that encourages the realisticness of the image a term that encourages the image to look like a particular class and a term that encourages that class not to be the true class they demonstrate the utility of all these loss terms with an ablation study they evaluate three kinds of adversarial attack square patch region and generalized any shape patch impressively they show that the attacks generated by their method are physically realizable and also transfer to a held out classifier they are also able to use the attacks generated by the model to handcraft copypaste attacks which is very useful for interpreting mistakes made by the target network the paper is very high quality clearly presented and probably high significance although the method itself makes from only minor alterations to previous work what the paper lacks in 
originality it more than makes up for in other factors including extensiveness of analyses evaluation of societal impact and provides a valuable tool to neural network interpretability a few criticisms of an otherwise great paper its often hard to identify the perturbation from the original image in eg fig 3 that hinders the readers ability to understand the difference i appreciate that an additional sidebyside comparison might make the figure unmanageably large but i encourage the authors to figure out ways to make this key figure more useful to the reader the description and labels in fig 3 caption are unclear each patch and generalized patch is labeled with its mean fooling confidence under random insertion in source images labeled adv and the confidence for the disguise class labeled img is disguise class missing a d as in disguised class and the img and adv labels are confusing to me it isnt perfectly obvious which is the class of the original and which is the predicted class after adversarial attack the authors should make it even clearer what the difference between their work and brown et al 2017 is is it simply that the present work uses a generator and brown et al do not figure 6 could be clearer it isnt obvious at a glance which row ought to be paired with which others moving associated rows closer to each other would solve this i feel like the bad example in fig 11 is perhaps a bit contrived its not obviously caucasian skin certainly not in the patch adversary and caucasian would not have been my first guess for the skin colour of the region adversary or generalized patch adversary id encourage the authors to find a more convincing example the authors do an exceptionally good job of discussing the limitations and potential social impact of their work they clearly make as much effort as can be expected of them to mitigate risks moreover they also engage constructively with a lay audience although their work focuses on images clearly there is much interest in the interpretability and robustness of nonimagebased networks such as language models it isnt clear that gradientbased optimization central to their method will work in that domain due to the difficulty of generating realistic language examples from a generative model that uses an optimizable latent space as is possible using image gans given the considerable interest in the language domain the authors might consider providing a slightly more detailed discussion than they currently do docsepthe authors propose a new method for generating adversarial perturbations of images these perturbations operate at a higher featurelevel rather than at a pixellevel as such the perturbations are more understandable to humans and can potentially provide insights into model behavior strengths 1 through extensive experiments the authors demonstrate that their method can provide insights on the behavior of deeplearningbased image classifiers the perturbations produced by the authors proposed method are indeed interpretable and i found it quite interesting to see how the suggested perturbations changed model outputs in the provided examples as such i believe this method could be a valuable addition to the modeldebugging toolbox to help machine learning practitioners better diagnose spurious correlations learned by their models 2 to my knowledge i am not an expert in the adversarial example literature the method proposed by the authors is novel and the literature review seems very thorough 3 i generally with a couple exceptions see weaknesses 
section found the writing to be highquality and easy to follow weaknesses 1 my biggest issue with this work is not with the method itself but rather with the vocabulary used to describeframe it specifically to my knowledge and indeed this is in the second sentence of the paper an adversarial attack involves modifying an image with a smallnorm perturbation such that a deep learning model produces a wildly different output while a human does not realize the image has been modified in this work the authors describe their method as a form of adversarial attack even though the method is designed to produce perturbations that are obvious to human observers for example i wouldnt expect the attacked image in figure 1d to be correctly classified after such a large perturbation i would encourage the authors to find another way to frame their method perhaps focusing more on the interpretability benefits as the current framing confused me when i first read the paper similarly i would highly recommend that the authors choose another word instead of disguised to describe something that is perceptible and resembling something other than the target class on my first reading i assumed that disguised referred to something that is imperceptible 2 the authors may wish to citediscuss cycleconsistencybased methods for discovering humaninterpretable perturbations that change the output of deep learning image classifiers see eg 1 and an application in 2 1 explanation by progressive exaggeration httpsarxivorgabs191100483 2 ai for radiographic covid19 detection selects shortcuts over signal httpswwwnaturecomarticless42256021003387 the authors discussed the limitations of their work in a satisfactory manner
### Summary:
|
this paper proposed the use of generative models to create feature level adversarial perturbations the resulting attack images have unusual transferability properties the reviewers agree that the threat model is interesting and the clarity and thoroughness of the paper is above the bar
|
Below is given review of a research paper from cnoference journal. Please write a summary the review.
### Review:
the paper presents a novel way of performing fewshot kb completion without having to rely on metatrain datasets constructed in an adhoc fashion specifically they propose learning a hypothesis proposal module that given different support evidence graphs finds a common hypothesis that is supported by the evidence the hypothesis is then instantiated with the different examples and passed to the encoder to generate a representation of the support examples during inference the proposed method checks if there is an evidence close enough to the proposed hypothesis using the evidence proposal module to first generate a hypothesis and then instantiating it with the query and using the encoder to generate the query evidence representation the authors present 2 approaches for the hypothesis proposal and evidence proposal modules an optimization based training free method and a fully trainable gcnn approach for the latter they present a selfsupervised pretraining method for training the optimization free method is competitive against other meta learning approaches while the fully trainable approach outperforms them especially for the scenario where inference is performed on unseen entities dubbed the inductive setting in the paper strengths the paper presents an interesting approach for adapting to fewshot kb completion which is contrary to the popular metalearning approach it is especially interesting to see that the training free approach with randomly initialized encoders performs competitively against other meta learning approaches the results especially for the transductive setting for conceptnet and nell and inductive setting for all 3 datasets are quite compelling with substantial improvements over the baseline in addition the approach also appears to be fairly robust to distribution shifts weakness given that during inference for each entity pair the model needs to compute a forward pass through evidence proposal module and the encoder module it would be good to also compare the inference time of the proposed method compared to the baseline approaches since the approach should produce interpretable masks it might be helpful and insightful to include some qualitative examples eg for the running example in the paper chop kitchen does the hypothesis proposal module actually identify the underlying accurate hypothesis and is that correctly regenerated from the evidence proposal module the definition of pretraining is a bit overloaded from the supplementary material it seems like the model is jointly trained with all 3 losses as opposed to first doing selfsupervised pretraining followed by finetuning it might be good to clarify that in the paper minor this setup for fewshot link prediction seems to be very well suited for incontext learning setups i wonder if it makes sense to also compare against the same minor the paper is a bit hard to parse some of the notation eg equations 9 and 11 are somewhat nonstandard in equation 16 t is not defined and it is somewhat unclear how all the pieces fit together it would be nice if the authors could present even in the appendix if space is a concern an algorithm that clearly defines how the different proposals are connected for a training and inference pass through the dataset the authors have covered the limitations fairly well i am a little concerned wrt inference cost of the proposed method if it is indeed an issue it would also be good to mention that in the limitations as well docsepin this paper the authors investigated fewshot knowledge graph completion 
task previous work usually model this under metalearning framework the authors proposed to leverage subgraph between the head entity and tail entity which may be shared between support and query triplets through this way they are able to better predict the target entity through the shared subgraph which contains richer local information that the model can leverage to get a model that can be generalized to specific kg they proposed to pretrained encoderdecoder model with the pretraining task to recover the masked nodes and relations in the subgraph the experimental verify the effectiveness of the proposed model which outperforms the baseline models on multiple existing datasets dramatically strengths the authors proposed an approach to leverage local subgraph for fewshot relational reasoning tasks which achieves stateoftheart performance on multiple datasets they proposed to pretrain an encoderdecoder model which can generalize to specific kg the paper is wellwritten and easy to follow weaknesses on some datasets like fb15k237 the proposed approach underperforms an existing baseline model which may raise the question of the generalization ability of the pretrained model and on what kind of datasets will this model performs better na docsepthe author proposes a method that can perform well on any novel fewshot tasks without relying on specifically designed metatraining set the proposed approach relies on an intuition that entities linked by the same relation should have a similarstrctured subgraph the framework is called connection subgraph reasoner it has two major components hypothesis proposal module and evidence proposal module the hypothesis proposal module generates a hypothesis embedding from each support graph such that they are most similar to each other then the evidence model attempts to find the closest evidence to the hypothesis embedding in a leanringfree optimization the output masks are optimized directly theres also a learning version where the masks are outputed by a neural model in experiments strong performance is shown especially in inductive setting the proposed method is novel i think the idea of subgraph matching is interesting in experiments strong performance is shown especially in inductive setting i did not find a limitation section in the main text
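To make the matching procedure described in these reviews a bit more tangible, here is a rough, learning-free Python sketch: a shared hypothesis is formed from the support subgraph embeddings, and a candidate tail entity is scored by how close its connection-subgraph embedding lies to that hypothesis. The centroid/Euclidean choices, the 16-dimensional random embeddings, and the candidate names are illustrative placeholders, not the paper's actual modules or data.

```python
import numpy as np

def propose_hypothesis(support_embs):
    # Learning-free stand-in for the hypothesis proposal step: the point with
    # minimal total squared distance to all support-subgraph embeddings is
    # simply their centroid.
    return support_embs.mean(axis=0)

def evidence_score(query_emb, hypothesis):
    # Evidence step: a candidate is more plausible the closer the embedding of
    # its connection subgraph is to the shared hypothesis.
    return -np.linalg.norm(query_emb - hypothesis)

# Toy usage with random 16-d "subgraph embeddings" for a 3-shot task.
rng = np.random.default_rng(0)
support_embs = rng.normal(size=(3, 16))
candidates = {name: rng.normal(size=16) for name in ["kitchen", "garage", "forest"]}

hypothesis = propose_hypothesis(support_embs)
best = max(candidates, key=lambda name: evidence_score(candidates[name], hypothesis))
print("predicted tail entity:", best)
```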
### Summary:
|
this paper studies fewshot knowledge graph completion problem it proposes learning a hypothesis proposal module that given different support evidence graphs finds a common hypothesis that is supported by the evidence the authors present 2 approaches for the hypothesis proposal and evidence proposal modules an optimizationbased training free method and a fully trainable gcnn approach the reviewers agree that the proposed method is interesting and solid the experiments are thorough and the results provide valuable insights for future work reviewers raised concerns and questions are properly addressed by the authors response
|