| Input | Output | input_ids | attention_mask | labels |
|---|---|---|---|---|
| string, lengths 251–41.6k | string, lengths 137–9.7k | list, lengths 157–2.05k | list, lengths 157–2.05k | list, lengths 157–2.05k |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper presents one endtoend multitask learning architecture for depth segmentation map estimation and the driving prediction the whole architecture is composed of two components the first one is the perception module segmentation and depth map inference the second one is the driving decision module the training process is sequential initially train the perception module then train the driving decision task with freezing the weights of the perception module the author evaluated the proposed approach on one simulated dataset experimental results demonstrated the advantage of multitask compared to the single task advantages the pipeline is also easy to understand it is simple and efficient based on the provided results the proposed framework aims to give better understanding of the application of deep learning in selfdriving car project such as the analysis and illustration in figure 3 questions there are several typos needed to be addressed eg the question mark in fig index of section 51 there should be comma in the second sentence at the last paragraph of section 52 multitask especially the segmentation part is not novel for selfdriving car prediction such as xu et al cvpr 17 paper from berkeley the experiment for generalization shows the potential advancement however it is less convincing with the limited size of the evaluation data the authors discussed about how to analyze the failure causes however if the perception learning model does not work well then it would be hard to analyze the reason of incorrectly prediction in general the paper has the merits and these investigations may be helpful for this problem but it is not good enough for iclr docsepmajor contribution this paper details a method for a modified endtoend architecture that has better generalization and explanation ability the paper outlines a method for this implemented using an autoencoder for an efficient feature extractor by first training an autoencoder to ensure the encoder captures enough depth and segmentation information and then using the processed information as a more useful and compressed new input to train a regression model the author claimed that this model is more robust to a different testing setting and by observing the output of the decoder it can help us debug the model when it makes a wrong prediction organizationstyle the paper is well written organized and clear on most points a few minor points 1 on page 5 the last sentence there is a missing table number 2 i dont think the last part finetune test is necessary since there are no formal proofs and only speculations technical accuracy the problem that the paper is trying to address is the blackbox problem in the endtoend selfdriving system the paper proposes a method by constructing a depth image and a segmentation mask autoencoder though it has been proved that it is effective in making the right prediction and demonstrated that it has the cause explanation ability for possible prediction failures i have a few points the idea makes sense and the model will always perform better when the given input captures more relevant and saturated representations the paper listed two important features depth information and segmentation information but there are other important features that are missing in other words when the decoder performs bad it means the encoder doesnt capture the good depth and segmentation features then it will be highly possible that the model performs badly as well however when the model performs bad it does not necessarily mean 
the decoder will perform badly since there might be other information missing for example failure to detect the object lines and traffic lights etc in conclusion the question is really how to get a good representation of a selfdriving scene i dont think to design two simple autoencoders for depth image construction and image segmentation is enough it works apparently but it is not good enough adequacy of citations good coverage of literature in selfdrivingdocsep summary this submission proposes a multitask convolutional neural network architecture for endtoend driving going from an rgb image to controls evaluated using the carla open source simulator the architecture consists of an encoder and three decoders on top two for perception depth prediction and semantic segmentation and one for driving controls prediction the network is trained in a twostep supervised fashion first training the encoder and perception decoders using depth and semantic segmentation ground truth second freezing the encoder and training the driving module imitation learning on demonstrations the network is evaluated on the standard carla benchmark showing better generalization performance in new driving conditions town and weather compared to the carla baselines modular pipeline imitation learning rl qualitative results also show that failure modes are easier to interpret by looking at predicted depth maps and semantic segmentation results strengths simplicity of the approach the overall architecture described above is simple cf figure 1 combining the benefits of the modular and endtoend approaches into a feedforward cnn the aforementioned twostage learning algorithm is also explained clearly predicted depth maps and semantic segmentation results are indeed more interpretable than attention maps as traditionally used in endtoend driving evaluation of the driving policy the evaluation is done with actual navigation tasks using the carla corl18 benchmark instead of just offline behavior cloning accuracy often used in endtoend driving papers easier to overfit to not guaranteed to transfer to actual driving simple ablative analysis table 2 quantifies the generalization performance benefits of pretraining and freezing the encoder on perception tasks esp going from 16 to 62 of completed episodes in the new town and weather dynamic navigation scenario weaknesses writing i have to start with the most obvious one the paper is littered with typos and grammatical errors way too many to list for instance the usage of the and a is almost nonexistent overall the paper is really hard to read and needs a thorough pass of proofreading and editing also please remove the acknowledgments section i think it is borderline breaking the doubleblind submission policy i dont know these persons but if i did that would be a breach of iclr submission policy furthermore i think its contents are not very professional for a submission at a top international academic venue but that is just my opinion novelty this is the main weakness for me the architecture is very close to at least the following works xu h gao y yu f and darrell t endtoend learning of driving models from largescale video datasets cvpr17 this reference is missing from the paper whereas it is very closely related as it also shows the benefit of a segmentation decoder on top of a shared encoder for endtoend driving calling it privileged training codevilla et als conditional imitation learning icra18 the only novelty in the current submission wrt cil is the addition of the depth and 
segmentation decoders mller m dosovitskiy a ghanem b koltun v driving policy transfer via modularity and abstraction corl18 the architecture also uses a shared perception module and segmentation although in a mediated way instead of auxiliary task to show better generalization performance including from sim to real additional missing related works include kim j and canny jf interpretable learning for selfdriving cars by visualizing causal attention iccv17 uses posthoc attention interpretation of black box endtoend networks sauer a savinov n and geiger a conditional affordance learning for driving in urban environments corl18 also uses a perception module in the middle of the cil network showing better generalization performance in carla although a bit lower than the results in the current submission pomerleau da alvinn an autonomous land vehicle in a neural network nips89 the landmark paper for endtoend driving with neural networks insights significance in light of the aforementioned prior art i believe the claims are correct but already reported in other publications in the community cf references above in particular the proposed approach uses a lot more strongly labeled data depth and semantic segmentation supervision in a dataset of 40000 images than the competing approaches mentioned above for instance the modular pipeline in the original carla paper uses only 2500 labeled images and i am sure its performance would be vastly improved with 40000 images but this is not evaluated hence the comparison in table 1 being unfair in my opinion this matters because the encoder in the proposed method is frozen after training on the perception tasks and the main point of the experiments is to convince that it results in a great fixed intermediate representation which is in line with the aforementioned works doing mediated perception for driving the finetuning experiments are also confirming what is know in the litterature namely that simple finetuning can lead to catastrophic forgetting table 3 finally the qualitative evaluation of failure cases 53 leads to a trivial conclusion a modular approach is indeed more interpretable than an endtoend one this is actually by design and the main advocated benefit of modular approaches failure in the downstream perception module yields failure in the upstream driving module that builds on top of it as the perception module is by design outputting a human interpretable representation eg a semantic segmentation map then this leads to better interpretation overall reproducibility there are not enough details in section 31 about the deep net architecture to enable reimplementation structure similar to segnet no detailed description of the number of layers nonlinearities number of channels etc will the authors release the perception training dataset collected in carla described in section 42 recommendation although the results of the proposed multitask network on the carla driving benchmark are good it is probably due to using almost two orders of magnitude more labeled data for semantic segmentation and depth prediction than prior works which is only practical because the experiments are done in simulation prior work has confirmed that combining perception tasks like semantic segmentation with endtoend driving networks yield better performance including using a strongly related approach xu et al in addition to the lack of novelty or new insights the writing needs serious attention for these reasons i believe this paper is not suitable for publication at iclr
### Summary:
|
the paper presents a unified system for perception and control that is trained in a stepwise fashion with visual decoders to inspect scene parsing and understanding results demonstrate improved performance under certain conditions but reviewers raise several concerns that must be addressed before the work is accepted reviewer pros simple elegant design easy to understand provides some insight behind system function during failure conditions error in perception vs control improves performance under a subset of tested conditions reviewer cons concern about lack of novelty evaluation is limited in scope references incomplete missing implementation details hard to reproduce paper still contains many writing errors
|
[6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, ..., 1142, 4028, 6332]  (input_ids: 2,048 token ids; middle values elided for readability) |
[1, 1, 1, ..., 1]  (attention_mask: 2,048 ones) |
[6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, ..., 1142, 4028, 6332]  (labels: 2,048 token ids; middle values elided for readability) |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
summary this paper introduces a generative model for 3d point clouds authors aim at theoretically showing the difficulties of using existing generative models to learn distributions of point clouds and propose a variant that supposedly solves the issues pros the problem of designing generative models for 3d data is important cons paper is often hard to follow and contains a significant number of typos authors claim to identify a fundamental problem with the existing generative models for point clouds yet section 2 tries to show that a specific version that uses deepset does not satisfy theoretical guarantees what if we use eg a recurrent network instead as is the counter example proof itself is quite confusing it would really help if the proof was more formal jointly learning an inference network q has certainly been done before and i am not sure authors provide an elaborate enough explanation of what is the difference with adversarially learned inference adversarial feature learning it is not clear why authors did not follow the evaluation protocol of achlioptas17 or wu16 more closely in particular evaluation for the classification task should be compatible with the proposed model which would give a much better picture of the learned representations docsepsummary this paper proposes a generative point cloud model based on adversarial learning and definittis representation theorem of exchangeable variables the main focus in experiments and the exposition is on 3d point clouds representing object shapes seems the surface but could also be the interior of objects please clarify the main idea is to represent a point cloud using a global latent variable that captures the overall shape and a collection of local latent variables that code for the position of a point on the shape the model consists of thee components i an encoder that takes a point cloud as input and maps it to a point estimate of the global latent variable of the shape represented by the input cloud a pointnet architecture is used here ii a decoder that takes the estimated global latent variable and a local latent variable and maps it to an output point in the cloud to be produced by the model iii a discriminator network that aims to distinguish points from a given shape and the points produced by pipelining the encoder and decoder critically different from conventional gans the discriminator is optimized per shape ie each point cloud is considered as a distribution over r3 specific to that shape iv a shape prior that once the encoderdecoder model from above is trained is used to model the distribution over the global latent variables this model is trained presumably in a conventional gan style using the global latent variable representations inferred across the different training point clouds as compared to prior work by achiloptas et al 2017 the proposed approach has the advantage to allow for sampling an arbitrary number of points from the target shape rather than a fixed predefined number in addition the authors propose to minimize a weighted average of a lower bound and upper bound on the wasserstein distance between the distributions of points corresponding to given shapes this approach translates to improved quantitative evaluation measures experiments are conducted on a simple toy data set as a proof of concept and on data from modelnet10 and modelnet40 two performance metrics are introduced to assess the autoencoding ability of the model to what extent does the encoderdecoder pipeline result in point clouds similar to 
the shape from which the input pointcloud is generated overall i find the idea of the paper interesting and worth publishing but the exposition of the paper is less than ideal and needs further work the experimental validation of the proposed approach can also be further improved see more specific comments below specific comments the counter example at the bottom of page 2 is limited in the sense that the oracle assumption seems highly nonrealistic casting doubt on the relevance of the argument the notation in section 3 before 31 is rather sloppy for example please define p and g the elements of the divergence dpg that appears in the first paragraph of section 3 it is not defined in which space theta lives it is not clear what the authors intend with the notation gthetau sim ptheta what prior distributions pz and pu are used what is the choice based on abbreviation ipm is referred several times in the paper but remains undefined in the paper until end of page 4 please define earlier the model gtheta does not appear in the training objective function 4 how is this module trained precisely lack of clarity in the following passage in our setting each point xi in the point cloud can be considered to correspond to single images when we train gans over images the notion of divergence dpg is not made concrete in section 3 and 31 which makes the notation of rather little use the following paper merits a discussion in the related work section towards a neural statistician iclr17 httpsopenreviewnetpdfidhjdbuf5le the manuscript contains many typos for example vedio op page 4 circile on page 5 condct on page 8 etc please proof read your paper and fix these the refenence to bengio 2018 is incomplete what do you refer to precisely there seems to be no mention of the dimension of the local latent variables zi please comment on the choice and its impact on the behavior of the model the quantitative evaluation in table 1 is interesting and useful it is limited however in the sense that it only measures autoencoding capabilities to what extent can the shape be reproduced given a sample point cloud from the given shape quantitative evaluation of generative modeling performance is unfortunately missing from this paper as it is in much of the gan literature could you please comment on how this canwill be fixed the toy data set experiments could be dropped to make room for experiments suggested below an experimental study of the effect of the mixing parameter s would be useful to include for example by taking s on a grid from 0 to 1 one could plot the coverage and distancetoface measures experimental evaluation of autoencoding using a variable number of input points is interesting to add ie how do the two evaluation measures evolve as a function of the number of points in the input point cloud similar it is interesting to evaluate how auto encoding performs when nonuniform decimation of the input cloud is performed eg what happens if we chop off part of the input point cloud eg the legs of the chair does the model recover and add the removed parts this is potentially useful to practitioners which have to deal with incomplete point clouds acquired by range scanners analysis of shapes with different genus and dimensions would be interesting does the model manage to capture that some shapes have holes or consists of a closed 2d surface ball vs an open surface disk despite a simple prior on the local latent variables z docsepauthors provide a variant of wgan called pcgan to generate 3d point clouds the drawback of 
a vanilla gan with a deepset classifier is analyzed the rationality that decoupling the point generator with the object generator is also discussed a sandwiching objective function is proposed to achieve a better estimation of wasserstein distance compared with aae and the simplified variants of the proposed pcgan the proposed pcgan achieves incremental results on point cloud generation comments 1 authors calculate wu in a primal form via solving an assignment programming problem have authors ever tried sinkhorn iteration to my knowledge sinkhorn iteration is a very popular method to solve ot problem effectively it would be nice if authors can provide some reasons and comparisons for their choice on the optimizer of wu 2 authors proved that the sandwiching object ws is closer to the real wasserstein distance but it increases the variance of the loss function specifically the dynamics of wu and wl according to lemma1 is epsilon2epsilon1wp g while the dynamics of ws is 2epsilon1 wp g and 2epsilon1 epsilon2 epsilon1 according to the assumption in lemma 1 does it mean that the ws is not as stable as wl or wu during training additionally authors combined wu with wl with a mixture 201 ie the s in eqs6 13 14 is smaller than 005 in such a situation both the value and the dynamics of ws will be very close to that of wu does it mean that wl is not so important as wu authors should analyze the stability of their method in details essentially the proposed method is a variant of wgan which estimates wasserstein distance with lower bias but may suffer from worse stability in the experiments both the setting and the experimental results show that the proposed ws will be very close to wu as a result the improvement caused by the proposed method is incremental compared with its variants typos the end of the 2nd line of lemma 1 p g should be mathbbp mathbbg the 3rd line of lemma 1 epsilon1 epsilon1 page 14 eq14 lambda should be s page 14 eqs13 14 wmathbbp mathbbg should appear on the right
### Summary:
|
reviewers mostly recommended to reject after engaging with the authors however since not all author answers have been acknowledged by reviewers i am not sure if there are any remaining issues with the submission i thus lean to recommend to reject and resubmit please take reviewers comments into consideration to improve your submission should you decide to resubmit
|
[30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, ...]  (input_ids: token-id list, truncated at the end of this excerpt)
7882,
273,
616,
1332,
275,
4278,
50276,
405,
4303,
253,
4081,
1332,
310,
247,
12955,
273,
259,
1247,
534,
8197,
369,
2152,
6339,
4181,
342,
2406,
8492,
533,
778,
11089,
432,
7197,
7882,
275,
253,
4679,
1097,
253,
4758,
285,
253,
5661,
1543,
921,
326,
253,
4081,
37280,
588,
320,
1077,
2810,
281,
259,
86,
347,
247,
906,
253,
7756,
4269,
407,
253,
4081,
1332,
310,
32809,
2429,
342,
697,
11640,
50275,
555,
993,
50276,
783,
990,
273,
253,
374,
2109,
1386,
273,
18057,
337,
268,
305,
943,
320,
14168,
4482,
81,
14168,
4482,
72,
50276,
783,
495,
5784,
1386,
273,
18057,
337,
299,
4277,
18,
50276,
4259,
18,
50276,
6377,
1638,
16186,
1047,
29331,
943,
320,
256,
50276,
6377,
1638,
16186,
84,
1012,
1638,
259,
1991,
81,
14168,
4482,
72,
943,
3176,
327,
253,
987,
2490,
187,
4118,
18435,
27,
15337,
398,
6571,
8521,
281,
12009,
846,
15966,
342,
253,
4477,
2299,
1580,
417,
512,
2488,
9172,
452,
644,
14969,
407,
30628,
891,
717,
417,
2119,
604,
627,
403,
667,
5780,
3374,
342,
253,
19529,
891,
3021,
9644,
281,
5583,
281,
12009,
285,
501,
538,
2225,
4496,
1379,
30628,
5701,
715,
8180,
281,
3157,
634,
19529,
943,
368,
7617,
281,
501,
538,
2225,
209
] |
[ … all-1 integer list omitted … ] |
[ … token-id integer list omitted … ] |
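For readers who want to work with rows of this dump, the sketch below illustrates one way the text and token-id columns could be loaded and inspected. It assumes a Hugging Face `datasets`-style layout with the columns named at the top of the dump (Input, Output, input_ids, attention_mask, labels); the file name and the tokenizer used here are placeholders, since neither is identified anywhere in the dump itself.

# Sketch under the assumptions stated above: load the dump as JSON lines and decode one row.
# "review_summaries.jsonl" and "gpt2" are placeholders, not taken from the dump.
from datasets import load_dataset
from transformers import AutoTokenizer

ds = load_dataset("json", data_files="review_summaries.jsonl", split="train")
tok = AutoTokenizer.from_pretrained("gpt2")

row = ds[0]
print(row["Input"][:300])    # review text, as in the Input column
print(row["Output"][:300])   # reference summary, as in the Output column

# input_ids is the tokenized prompt; attention_mask is all 1s for real tokens.
print(tok.decode(row["input_ids"], skip_special_tokens=True)[:300])
assert all(m == 1 for m in row["attention_mask"][: len(row["input_ids"])])

# In this dump the labels column appears to repeat input_ids; in setups that mask the
# prompt, those positions would instead be -100 and skipped by the loss.
supervised = [t for t in row["labels"] if t != -100]
print(f"{len(supervised)} label positions would contribute to the loss")

If the tokenizer guess is wrong, the decoded text will not match the Input column, which is an easy sanity check.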
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the paper aims to validate whether post hoc explanation methods are effective for detecting unknown spurious correlations the authors design 3 kinds of spurious signal detection reliability measures known spurious signal detection measure kssd causeforconcern measure ccm false alarm measure fam based on the 3 measurements the authors conduct extensive experiments to validate the reliability of 3 kinds of post hoc explanation methods for detecting spurious correlations strengths the authors develop 3 kinds of spurious signal detection reliability measures known spurious signal detection measure kssd causeforconcern measure ccm false alarm measure fam the authors carefully design the experiments to assess the effectiveness of an explanation method for detecting a models reliance on spurious training signals the authors further conduct a user study to measure the ability of users to use the post hoc explanation methods tested to detect model reliance on spurious signals the authors findings indicate that the explanation methods are less effective to detect nonvisible spurious signals and only the concept activation approach for visible spurious signals is effective which cast doubt on the reliability of current post hoc tools for spurious signal detection weaknesses the paper is somewhat hard to follow the authors claim that they find that these methods also seem to attribute to the spurious signal fam 05 even when the signal is not being relied on by the model it is unclear whether the model here is the normal models or spurious models fam 05 means that both spurious model and normal model generate similar explanations on spurious inputs but it is unclear whether the similarity of the explanation comes from the spurious signals or not so we dont know whether the normal model attributes the spurious signals the authors claim that section 3 as indicated the ccm measure further indicates that these methods do not indicate the presence of spurious signals when the signal is unknown for both the tag and stripe signals it is unclear how the authors get this conclusion from low ccm values we only know that section 21 if this measure ccm is high then it is unlikely that such a method alert a practitioner that a spurious model exhibits defects in table 1 gbp has high ccm while other methods have low ccm why do the authors think that the ccm measure indicates that all these methods do not indicate the presence of spurious signals even with both high and low ccm the authors did not report the kssd ccm fam and kstest values in section 4 which makes some discussions hard to follow for example the last sentence in section 4 lastly a difference in means test for the fam measure indicates that the normal models do not rely on the spurious signals overall this finding suggests that tcav is less susceptible to false positives tables 1 2 display the kssd ccm fam values in different settings these values can describe the difference of explanations in various settings eg normal and spurious models however even for the same model trained on the same datasets with different initial states the outputs and explanation results might be different it would be better if the authors displayed the values of kssd ccm fam of explanation among the same models trained with different initial states on the same inputs with the results as a comparison it would be easier for readers to understand how much the spurious model or spurious inputs affect the explanation results it would be better if the authors displayed the standard 
variance of the measurements eg table 1 2 minor in table 12 fam values are always much higher than ccm which means spurious model and normal model generate more similar explanations on spurious inputs than normal inputs it is worth some discussion minor gbp achieve much higher ccm and fam scores than other methods table 1 2 it would be better if the authors gave more discussion about it the overall idea of the paper is good but it is hard to follow and some analyses are not clear docsepthe authors present work that aims to test whether post hoc explanation methods can detect unknown spurious signals they perform an analysis on two different medical image datasets to which they introduce two different types of synthetic spurious signals pronounced spurious signals stripes and hospital tags and nonpronounced signals blurs when testing the ability of different post hoc explanation methods to identify spurious signals they find that feature attribution and concept activation are able to identify pronounced but not nonpronounced spurious signals that are known when faced with unknown signals none of the methods that were tested appears to be able to identify them they also provide results from an empirical study suggesting that none of the methods allowed practitioners to identify unknown spurious signals and only concept activation appears to enable practitioners to identify known spurious signals the main contribution of this work are the empirical results and alerting researchers and potentially practitioners to very substantial problems with current posthoc explanation the technical contributions are negligible first of all i would like to thank the authors for this interesting piece of work below i will briefly list the main strength and weaknesses of this submission and make suggestions for improvement i hope you will find them helpful and constructive strengths this work raises very important and timely concerns with regard to using nns for critical application domains like health care the scope of this work is very extensive 2 synthetic and 1 empirical dataset 3 different spurious signals 3 considering the different variants of feature attribution algorithms 6 different posthoc methods weaknesses the spurious signal detection reliability measures are not explained sufficiently how do you quantify similarity for the three spurious signal detection reliability measures equations would be helpful how is the ground truth defined for the kssd measure is this normal input and normal model how should these scores be judged what value do you interpret as problematic and why is there a rule of thumb that you can recommend you should mention in the limitations not in a food note that the normal model could still contain unknown spurious signals in the original data set the empirical study needs a more detailed description maybe in the supplement what population was recruited for the study is this a population of medical practitioners machine learning engineers or random subjects this should also be considered in the limitations is this representative of the people that will use these methods also is there reason for concern when choosing a withinsubject design here briefly when participants detect a spurious signal with one method will this not influence their judgement based on the next method was the order of methods counterbalanced the results of the empirical study should be reported properly measures of variance across participants are differences between normal and spurious and blinded and 
not blinded statistically significant using a formal test also why are the results mentioned in the discussion and not the results it is not clear to me why the spurious mhg hospital tag signal appears to be present in all classes and not just the spurious prepuberty class in figure 5 can you clarify i commend the authors for doing a formal statistical test kolmogorovsmirnoff for the concept activation method i would suggest including a visualization of these test results maybe a bar plot of only the spurious concept and indicating which comparisons differed significantly with asterisk or lines of course the same would be useful for other comparisons that you report in tables or figures for example on the three performance measures across different methods that you defined a formal test would substantiate your claims to improve clarity of the figures i would suggest adding arrows to figure 1 to direct the readers attention to the spurious signals overall the manuscript seems to be quite packed i would suggest to move some results to the supplement for example 3 of the 4 methods for feature attribution and figure 2 may be good candidates minor points there are some typos in the manuscript please have it proofread p6 we results report for the small dnn model in the paper we report results p3 which of these associations are be spurious are spurious overall this paper raises important questions about the usefulness of current posthoc explanation approaches for nns and provides a range of empirical results to support these claims while the contribution is incremental compared to previous studies i believe the authors do raise some important points that will help to stimulate more research in improving posthoc explanation methods however the manuscript is quite dense lacks some important clarifications eg about the performance measures and the empirical study and more rigorous statistical tests to substantiate the authors claims for this reason i cannot support acceptance of the manuscript as is but i am willing to reconsider should my concerns be addressed docsepthe authors present an analysis on posthoc explanation metrics that measure the reliance of a model to spurious signals the paper offers insights on three metrics kssd ccm and fam by deploying them in the analysis of dnns trained on medical image datasets the authors also conduct a blinded study the paper offers an analysis on three metrics for explaining posthoc reliance of models on spurious signals even though at first inspection the metrics proposed seem reasonable at the authors admission they turn out not to be all that good detecting reliance on spurious signals especially when the signal is unknown a priori admittedly the hardest case issues of the paper clarity from the main body of the paper the analysis of the metrics is lacking one has to refer to the appendix to get a better idea of the metrics proposed and how they are used in practice even though the authors claim a test on dog breeds dataset these are not included also all dogs belong to the same species canis familiaris the classification task refers to dog breeds not species contributions the paper proposed 3 metrics that as admitted by the authors are not performing as well as one would like that on itself is not an issue as seminegative results have merit however there is no indication of proposals of the authors on what the better metrics should be overall this paper is an analysis paper of 3 selfproposed metrics on posthoc analysis of spurious signals the metrics 
turn out not to be accurate enough for the tasks at hand especially when the spurious signal is implicit or unknown there is no discussion how to make the metrics better at detecting spurious signals post rebuttal update the score to 6
### Summary:
|
this paper demonstrates that current posthoc methods to explain blackbox models are not robust to spurious signals based on three metrics especially when the spurious signals are implicit or unknown technical novelty is limited because the paper presents primarily empirical results instead of novel machine learning techniques however the problem is very important and timely and significance to the field and potential impact of the presented results to advance the field are high as reviewers emphasized there are ways to further improve the paper including the clarity of presentation although the authors improved in the revised manuscript overall this paper deserves borderline acceptance
|
[ … token-id integer list omitted … ] |
[ … all-1 integer list omitted … ] |
[ … token-id integer list omitted (its first entries repeat the input_ids list above) …
50274,
498,
15752,
50276,
4064,
253,
2022,
2133,
273,
253,
2929,
253,
1783,
273,
253,
17082,
310,
14999,
581,
556,
281,
3730,
281,
253,
30762,
281,
755,
247,
1805,
2934,
273,
253,
17082,
4081,
285,
849,
597,
403,
908,
275,
3946,
50275,
9154,
2167,
253,
4477,
1750,
247,
1071,
327,
4370,
35328,
10895,
841,
403,
417,
2908,
50276,
12563,
512,
9097,
5663,
281,
253,
1072,
3417,
476,
261,
7615,
261,
253,
9162,
4836,
10770,
281,
4370,
35328,
50276,
1439,
3417,
50275,
1987,
8303,
253,
2929,
4081,
495,
17082,
326,
347,
8176,
407,
253,
4477,
403,
417,
9591,
347,
973,
347,
581,
651,
751,
326,
327,
3139,
310,
417,
271,
2523,
347,
3300,
460,
72,
800,
1543,
452,
15785,
2299,
627,
310,
642,
14011,
273,
18595,
273,
253,
4477,
327,
752,
253,
1805,
17082,
943,
320,
50275,
1189,
455,
436,
2929,
310,
271,
1783,
2929,
273,
495,
1881,
856,
7334,
17082,
327,
1501,
37806,
1783,
273,
46541,
6298,
253,
17082,
1614,
562,
417,
281,
320,
7899,
2217,
323,
253,
8892,
387,
1133,
3340,
672,
253,
46541,
2625,
310,
15424,
390,
7202,
627,
310,
642,
5955,
849,
281,
1056,
253,
17082,
1805,
387,
15549,
46541,
6298,
50272,
5996,
30080,
22559,
50275,
11183,
253,
4868,
281,
721,
50276,
187,
187,
4118,
18435,
27,
2520,
2929,
14371,
326,
1655,
1501,
37806,
3082,
281,
5513,
2806,
3364,
3210,
403,
417,
10237,
281,
46541,
6298,
1754,
327,
1264,
17082,
3340,
672,
253,
46541,
6298,
403,
15424,
390,
7202,
50276,
48746,
38135,
310,
3710,
984,
253,
2929,
10262,
8558,
16774,
1543,
3185,
273,
4460,
5145,
4715,
5609,
2299,
253,
1895,
310,
1077,
1774,
285,
14793,
285,
8453,
281,
253,
1673,
285,
2442,
3486,
273,
253,
3559,
1543,
281,
7170,
253,
1673,
403,
1029,
347,
30628,
21947,
627,
403,
4088,
281,
2007,
3157,
253,
2929,
1690,
253,
19843,
273,
9759,
3738,
253,
4477,
5520,
275,
253,
17265,
7714,
50276,
1189,
455,
436,
2929,
22828,
45210,
14924
] |
Below is given review of a research paper from cnoference journal. Please write a summary the review.
### Review:
the paper claims to provide a first order gradient algorithm that achieves global equivalent rates of convergence than in euclidean space on two particular models of geometry though important the hypersphere and the hyperbolic space the strategy proposed here consists in using the geodesic maps in order to write the minimization problem on the euclidean space with a controlled distortion which makes possible the use of a relaxed version of convexity inequalities in the new coordinate system the minimised function is not convex but not far from it in a way the authors are able to control quantitatively my opinion on the technical content of the paper is hindered by the difficulty of reading this paper see the remarks for improving its readability below the overall result seems a bit weak for all this work 40 pages long paper in total incl supplementary material and the calculations do not seem particularly enlightening i would suggest an important rewriting of the paper ideally the main ideas of the paper should be illustrated in the fist technical part section in a simple and enlightening example also putting forward a skeleton of proof of the main result of the paper with more details on the objects would be very helpful for the reader other comments on the readability in the contribution section the third point on reductions is not clear at all i suggest a rewriting of the sentence that makes it more understandable the method uses geodesic maps and in particular maps geodesics from the constant curvature space to geodesics in the euclidean space this condition is very stringent for instance there is a theorem by kobayashi which quantifies that affine maps a condition which implies geodesic maps are often isometries i would suggest the authors spend more time on the definition of a geodesic map and provide a discussion in particular they could write the explicit definition of such maps in the two cases of interest the hypersphere and the hyperbolic space instead of pointing to the socalled classical geodesic maps the beginning of section 2 is difficult to read the strategy is explained in wordy manner example our approach is to obtain a lower bound that is looser by a constant depending on r and that is linear over b in this way the aggregation becomes easier helping the reader with equations or more precise definitions would be helpful section 21 is hard to follow example after applying some desirable modifications like regularization with a 1strongly convex function and removing the unknown x by taking a minimum over x note 4 comes from averaging 3 for y x the contributions to reductions techniques in section 3 is only accessible to the expert reader and the main text is only statement of the results the authors could make their point more explicit here docsepthis paper considered the problem of minimizing strongly and nonstrongly geodesically convex functions on hyperbolic and spherical manifolds manifolds of constant curvature 1 and 1 respectively and proposed accelerated algorithms for such problems in particular the authors showed the proposed algorithms enjoy global accelerated rates that match their euclidean counterparts a key to the main result is lemma 22 which asserts a certain quasar convexitytype condition of the pullback of the objective function to some euclidean domain through a geodesic map based on this lemma the main result follows from combining techniques for developing accelerated algorithms in euclidean space such as the approximate duality gap technique and a 
certain discretization scheme for continuous dynamics some reduction results which obtain accelerated algorithms for the strongly convex case from the nonstrongly convex case and vice versa are also presented i believe that the technique is new to the best of my knowledge and i think that obtaining accelerated algorithms for manifold optimization problems is definitely an important topic therefore the results should be interesting to a broad audience of the conference and deserve some merits however i have some doubts about the paper 1 the presentation of the proofs in the supplementary material is unsatisfactory there are so many arguments like trivialeasy to seeprove follows straightforwardlytrivially etc it makes the proofs very difficult to follow 2 in lemma 22 its a bit surprising to me that the constants gammap and gamman depends only on the radius r of the geodesic ball but not the function f which implicitly contains k due to the rescaling nor the point x perhaps some intuition of why this is the case and some interpretation of these two constants would be good it is important as the gamma constants are involved in the complexity in theorem 24 and hence in theorem 25 3 the setting of manifolds of constant curvature seems to be quite restrictive and as pointed out in the paper it seems difficult that the technique could be extended to other manifolds 4 the practicality of the main algorithm algorithm 1 seems to be very limited i understand that this paper focuses more on theoretical side i am not hoping for practical use either however it would be good to demonstrate that performancebehaviour of the proposed algorithm does corroborate with the theories at least in some toy examples such as optimization problems on the poincare disk other comments 1 on page 3 the tildev vector of the same norm is not welldefined if tildev satisfies the definition then it can be checked that 2 tildev also satisfies the definition 2 on page 3 it is mentioned that rge dx0 x implies x in expx0 barb0r i believe such implication requires geodesic completeness of the manifold 3 on page 3 in the notation section the constant k appeared without definition 4 the notions of curvature of manifolds and angles between points on manifolds are used without definition it would be good to present the definition somewhere in the paper or the supplementary material 5 a recent paper an accelerated firstorder method for nonconvex optimization on manifolds by criscitiello and boumal which studied accelerated algorithms for nongconvex optimization on a more general class of manifolds is missing from the comparison 6 sometimes the tangent space tx m is mistakenly written as tx for example on page 4 and also in the supplementary material 7 in lemma 23 it is a bit strange that the smoothness property of the pullback function would require the assumption of the existence of a stationary point could you please provide some explanationdocsepthis paper proposes a global accelerated method on riemannian manifolds with the same rates as accelerated methods in the euclidean space up to log factors reductions have also been studied on riemannian manifolds quality i think this paper has high quality in theory clarity i have no experience on riemannian manifolds before this paper reads difficult for me i think this paper is too technical and some descriptions are not clear originality there are a number of works that study the problem of firstorder acceleration on riemannian manifolds this paper studies the special case of constant 
sectional curvature ie the hyperbolic and spherical spaces i am not sure whether there are literatures studying the optimization algorithms either accelerated or nonaccelerated on the constant sectional curvature before significance this paper gives the stateoftheart rates in the special case of constant sectional curvature i think it is significant i have some comments i have no experience on the optimizaton on riemannian manifolds before my comments may be too strict for the analysis on riemannian manifolds 1 previous literatures have studied the optimization on riemannian manifolds of bounded sectional curvature while this paper focuses on the special hyperbolic and spherical spaces that have constant sectional curvature is there any literature focusing on the constant sectional curvature before either accelerated or nonaccelerated what is the critical difference when transforming the analysis on the bounded sectional curvature to constant sectional curvature is it a straightforward extension or very challenging 2 i am not sure whether each step of the proposed method needs more computations than the standard accelerated gradient method for example function f is a composition of f and h1 can nabla fx be efficiently computed the binarylinesearch needs to compute gammai1 and xi1lambda do they need more computations docsepunfortunately though i am familiar with the literature on accelerated gradient methods in euclidean spaces i am not familiar enough with riemannian geometry to provide a confident review of this paper that being said i can provide some feedback overall this paper is very dense with mathematics that will not be not particularly familiar to most machine learning people moreover the suggested applications to ml a single line in the paper is not particularly convincing without more discussion and context eg a fully worked example this is compounded by the fact that no experiments were presented this paper would be greatly improved by an experiment that showed the improvement of this algorithm over vanilla gradient approaches on a riemannian manifold even if the experiment was totally synthetic even better would be to present a real machine learning application with that in mind i have a concern that iclr is not the right venue for this paper after all iclr is focussed on learning representations and though i tend to have a relaxed approach to this i think this paper might not be of general enough interest to the iclr community even though it may well be an excellent paper some more minor comments this paper focuses on hyperbolic and spherical manifolds how important are these spaces in practical problems i wonder if the limited scope of the results covers the machine learning applications discussed in the introduction this needs more discussion if these spaces are presented primarily because the analysis is easier then this is another reason why iclr might not be the right venue i am surprised that any of the parameters of the manifold do not appear in the bound is there some intuition why the bound relies only on the constants of the function f mu and l and not on any property of m eg the main results section the results are in terms of l and mu and ignore only factors of loglmu it would be good to add a discussion of this or be explicit with the dependency in the bounds if there is one very minor comment the word unfeasible is unusual though apparently it is a real world infeasible is more commondocsepsummary this paper provides a generalization of agd to constant sectional 
curvature spaces or subsets of them and proves the same global rates of convergence that hold in the euclidean space additionally they provide reductions for the bounded sectional curvature case their basic strategy involves the use of geodesic maps to accumulate local linear lower bounds in a way that accounts for the geometric distortion incurred by the map strengths the paper is written well and organized in a reasonable fashion they have a clear description of the general techniques applied in their work and push overly technical arguments to the appendix they provide global rates which also apply to gconvex functions not just strongly convex where i have checked their statements are mathematically sound weaknesses the domain of applicability for their main rates are restricted to the constant curvature spaces and it could be argued that it is relatively narrow in scope i am not sure of the convention in this community but perhaps it would helpful also to have some experimental results and code to assist in reproduction and discussion of practical import and comparison recommendation i gave a score of 7 as it seems to provide technical progress over previous results and the authors are clear in describing their contributions my score is relatively uncertain as i was not able to check many of the technical arguments and lemmas update i would reduce my score to a 6 based on the opinions of my fellow reviewers it appears that the restricted scope and lack of experimental results is quite a problem within this community and venue
### Summary:
|
reviewers generally appreciate the theoretical contribution of the paper namely accelerated gradient descent on the sphere and hyperbolic space with the same convergence rate as the euclidean counterpart however there are several major concerns with the current work from a theoretical standpoint the geodesic map which plays a crucial role in the algorithm and theoretical analysis exists if and only if the manifold has constant sectional curvature sphere and hyperbolic space it is not at all clear how the current approach can be extended beyond this setting from an algorithmic viewpoint the stated algorithm has not been experimentally validated it is suggested that at least some synthetic experiments eg on the sphere or poincare disk be carried out finally the current presentation is quite dense and should be considerably improved
|
[
… (2,048 integer token IDs omitted — the tokenized form of this row's text) …
] |
[
… (2,048 ones omitted — an all-ones attention mask covering the full sequence) …
] |
[
… (2,048 integer token IDs omitted — this list appears identical to the token ID list above, consistent with labels mirroring input_ids) …
] |
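The numeric columns in each row appear to be a tokenized copy of that row's text: the first and third arrays hold the same token IDs, and the attention_mask is all ones, which is consistent with unpadded sequences. As a rough illustration of how such rows could be produced — a minimal sketch only, not the dataset's actual preprocessing script; the tokenizer name, the max_length of 2048, and the build_row helper are all assumptions —

    from transformers import AutoTokenizer

    # Assumed tokenizer; the actual vocabulary behind these IDs is not specified here.
    tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-neox-20b")

    def build_row(text, max_length=2048):
        # Tokenize the concatenated prompt + review + summary text.
        enc = tokenizer(text, truncation=True, max_length=max_length)
        input_ids = enc["input_ids"]
        return {
            "input_ids": input_ids,
            "attention_mask": [1] * len(input_ids),  # unpadded sequence -> all ones
            "labels": list(input_ids),               # causal LM: labels mirror input_ids
        }

With this convention the training loss is computed over the whole sequence, prompt included, rather than over the summary alone.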
Below is given review of a research paper from cnoference journal. Please write a summary the review.
### Review:
summary this paper extends on the framework of matrix computation in hinz van d geer 2019 to give a tight upper bound for linear regions in particular the paper shows improvement over the bounds derived in serra et al 2018 and extends the bounds for more complex networks with skip and residual connections the paper also shows why skip and residual connections can be beneficial by showing that they lead to networks with larger number of linear regions pros 1 the work of hinz van d geer 2019 can be seen as generalization of the results shown in serra et al 2018 through some choice of lambda parameters used in the expression for bounds this paper provides a tighter upper bound compared to serral et al 2018 2 the paper shows that in the presence of residual connections resnets the number of linear regions is larger this is understandable since there is no bottleneck effect in the case of resnets as shown by serra et al 2018 the bottleneck effect leads to reduced dimensions if the dimension of the earlier layers are small having earlier layers with smaller dimensions can lead to decrease in the number of linear regions however in the case of residual networks such an effect can not be observed thus we tend to have more linear regions in the case of residual connections cons 1 the paper needs significant improvement in the presentation the parameter lambda that is used in deriving the bounds is not formally introduced but the paper starts discussing this from the introduction it is difficult to appreciate the contributions or novelty when the paper does not carefully explain the insightsideas of the proposed formulation nor the differences with prior work 2 the results shown in this paper seem incremental in light of the work of hinz van d geer 2019 docsepthis paper studies the counting of linear regions of a multilayer relu network and gives an upper bound on the number of linear regions that is tighter than existing results networks with skip connections and pooling layers are also considered the authors then compare their bounds for standard multilayer networks and networks with skip connectionspooling layers and conclude that the latter has certain advantages in expressive power the paper is not written very clearly it is very notionheavy and lacks necessary explanations of the results the definitions in this paper are pretty confusing and do not seem to be very rigorous for example in definition 3 a histogram v is defined as an infinite vector with nonnegative integer entries and finite sum however in the discussion below the histogram is of length 5 more importantly the definition of gamma is very confusing definition 10 is not wellwritten and im not quite sure if gamma can be any quantity satisfying the bound condition or is it chosen more specifically the theoretical results are not presented very clearly either it is not obvious to me what is the upper bound for the number of linear regions of a multilayer relu network the discussion on pooling and skip connection is given without an explanation of the network structure it is not clear to me how the matrix a and the skip connections appear in the network the comparison between the vanilla feedforward network and the ones with skip connectionspooling layers may not be logically correct since the authors only compared the upper bounds of the number of linear regions the experimental results do not necessarily demonstrate the advantage of networks with skip connectionspooling layers it is possible that the bound for these types of networks 
are too loose from the technical aspect it seems that the proof framework of this paper is mainly the same as previous works except that a tighter bound of gamma is implemented this makes the technical contribution of this paper limited for the above reasons i would like to suggest that the authors should improve the presentation of the paper and explain the contributions more clearly and more convincingly docsep overview of the paper the number of regions can indicate the expressive power of neural networks the paper provides a tighter upper bound of the regions number by providing a better choice of the gamma parameter the idea is built upon the work hinz van de geer 2019 and the results is compared with serra et al 2018 the authors also study networks with pooling or skip connections contribution and strength i may not have a good sense of the contributions in the paper since i am not very familiar with the related works the main contribution is theorem 1 and 2 which gives a better choice of gamma the experimental results then verifies the chosen gamma parameters yields a tighter upper bound of the linear regions questions and comments the paper provides many background materials from hinzvandegeer 2019 yet i feel like some intuitive explanations are missing which makes it not very easy to follow closely for example what is the implication of the space dimension in definition 6 does it say something about the structure of a set or the size of a set it should be helpful to provide some examples for finding the space dimension the better choice of gamma in theorem 1 and theorem 2 also appear to be mysterious to me why we consider the downwardmove function in definition 12 i am eager to look at an intuitive idea behind theorem 1 and theorem 2 rather a pure comparison between gammaours vs gammaserra by the way gammaours is not clearly defined in the paper can the authors further elaborate on the extreme large ratio in table 1 educated guess on the rating i make an initial rating of the paper yet it is quite open to discuss given my understanding of the topic it seems the current paper lacks some transitions between the rather dense definitions and theorems in section 2 this makes the paper hard to follow for diverse background i will reconsider the rating after the author response and seeing other reviews to reevaluate the value and soundness of the paperdocsepthis paper studies a method to give upper bounds on the total number of linear regions in a deep relu network these bounds are tighter than those previously obtained and they apply not only to fully connected networks but also to networks with skip connections strong points 1 understanding the complexity of the function computed by a deep relu networks is of significant interest 2 the authors state sharper bounds than previously available for this complexity as measured by the number of linear regions weak points many parts of the paper are not clear or imprecise 1 definition 1 is incompatible with the definition pf of the linear regions of f for example consider a relu networks with a single neuron the partition rn0 into linear regions simply divides the input space into two halfspaces thus the regions d of this function are the closures of these halfspaces and hence are not disjoint they share the same hyperplane as their boundary 2 the definition of rlnn just before definition 9 is not clear is there a bound on the number of neurons in the hidden layer if one interprets containing one relu activation function as having a single 
neuron then it appears that 11 is a trivial histogram if there can be any number of neurons then in the definition 10 of whengammann satisfies the bound condition the max of hash over h in rlnn appears to be a histogram whose entries are not welldefined since they are infinite i am probably missing something but really didnt understand what is meant here 3 for a given relu network architecture bounding the total number of linear regions in the worst case over all configurations of weightsbiases is typically of no practical value thought it is certainly an interesting problem in extremal combinatorics the reason is that this number even on average is unbelievably large this is true even in extremely simple networks take for instance a single hidden layer relu network in input dimension n0 one hidden layer of size n1 and output dimension 1 for generic configurations of the weights and biases the number of linear regions grows like the sum of the binomial coefficients n1 j as j goes from 0 to n0 by zaslavskys theorem so in the case when n1 n0 for example we get 2n1 regions this in no way reflects the fact that single layer relu networks often learn very simple functions this is exemplified by the shocking numbers in table 1 where we see numbers like 102956 recommendation i found that this paper was difficult to read the results could be interesting from the point of view of a theoretical analysis of deep relu networks so i encourage the authors to rewrite the body of the paper to increase the clarity and precision around the definitions of linear regions ultimately i dont think this paper is ready for publication
### Summary:
|
this paper studies the number of linear regions of a multilayer relu network and gives a new upper bound reviewers concern about the writing and the results are incremental compared with previous results
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper proposes a new framework for spatiotemporal disentanglement in particular it contributes to the disentanglement of content and dynamics using neural odes strength 1 i think it is a wellwritten paper with interesting experimental design especially the swap and multiview experiments that measure the degree of disentanglement 2 the proposed model consistently outperforms wellestablished approaches for video prediction including drnet ddpae and phydnet which is also a pdebased model for spatiotemporal disentanglement weakness 1 although the effectiveness of the model has been validated on several datasets most of them are synthetic i strongly encourage the authors to include real human motion datasets for example drnet was evaluated on kth and phydnet was evaluated on human36m 2 the model was only compared with pknl and phydnet on the wave and sst datasets and i am curious about how models beyond disentanglement such as svg and mim perform in this case also the results in figure 2 appear to be quite blurry and do not show significant improvement these experiments mainly show the general ability of the proposed model for video prediction rather than spatiotemporal disentanglement correctness in eq 12 the authors use a gaussian prior to convey dynamic information and exclude the spatial information is it a good thing or a bad thing from the view of modeling temporal dynamics a question is can such a simple dynamic model be applied to datasets with complex motion information docsepthe authors present a generative model for videos where the latent trajectories have two components a term without a slowness loss that represents content and a term with a slowness loss that represents style they present results on a dataset simulating the wave equation and on videos of moving mnist digits and 3d chairs the results are generally good especially for long rollouts and they demonstrate something like disentangling by showing that the identities of the digits can be swapped in the moving mnist data my main objection with the paper is that it has nothing to do with pdes or separation of variables the actual latent trajectories are simulated as odes not pdes which are then used to generate images the justification in terms of separation of variables is also a baitandswitcha slowness penalty is added to the loss for one of the latent trajectories that is all factored latent trajectories are a wellestablished modeling technique for time series already the use of odes parameterized by neural networks rather than say lstms or other rnn architectures is also more a difference in degree than in kind from other sequence models much like how resnets become neural odes in the limit of very deep networks simply using rungekutta updates parameterized by a neural network is still technically a kind of rnn if you define rnn very loosely as any nonlinear iterated function learnable by gradient descent so im not sure that this paper actually does most of what it claims to be doing in the motivationdocsepthe paper presents a spatiotemporal disentanglement method for handling sequence data solving highdimensional pdes for deriving the exact dynamics is difficult hence this work proposes learning timeinvariant and timedependent representations separately to solve this problem to achieve this goal the authors devised a model that incorporates a temporal ode process the provided experiments indicate that the method achieves good performance although the gain is not consistent the overall derivation and methodology of 
this paper are technically sound and i guess discarding sprime was a practical choice for learning timeinvariant representation although the proposed model is generally applicable the experiments only cover prediction results on classical image datasets nevertheless the underlying timescales of datasets are sufficiently varied to demonstrate the effectiveness in general temporal sequences quality the paper is clearly written overall however section 4 was a bit hard to follow which is the core part of this paper for example the system coordinates and reasons for the relaxation sec 42 were not clearly explained for people who are not familiar with the area it would be harder to understand the architecture if fig 1 was not given originality the originality of the paper is not stellar but sufficient for acceptance significance the significance of this work is mainly for model architecture i believe these kinds of approaches which can internally model continuous dynamics are heavily preferred when solving realworld problems therefore the significance of the paper is sufficient for acceptance
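One way to see the second reviewer's remark that Runge-Kutta updates parameterized by a neural network form an iterated map much like a recurrent cell is the minimal sketch below: a fixed-step RK4 integrator applied to a learned vector field is just h_{t+1} = F(h_t) unrolled over time. The two-layer MLP, state dimension, and step size are placeholder assumptions for illustration and are not the architecture from the paper.

```python
# Minimal sketch (assumed components): a learned ODE right-hand side f_theta and a
# fixed-step RK4 update; unrolling the update produces the latent trajectory.
import numpy as np

rng = np.random.default_rng(0)
d = 4                                        # latent state dimension (assumed)
W1, b1 = 0.5 * rng.normal(size=(16, d)), np.zeros(16)
W2, b2 = 0.5 * rng.normal(size=(d, 16)), np.zeros(d)

def f_theta(h):
    """Stand-in for the learned dynamics dh/dt = f_theta(h)."""
    return W2 @ np.tanh(W1 @ h + b1) + b2

def rk4_step(h, dt):
    """One classical Runge-Kutta step; plays the role of the recurrent cell."""
    k1 = f_theta(h)
    k2 = f_theta(h + 0.5 * dt * k1)
    k3 = f_theta(h + 0.5 * dt * k2)
    k4 = f_theta(h + dt * k3)
    return h + dt / 6.0 * (k1 + 2.0 * k2 + 2.0 * k3 + k4)

# Unrolling the integrator gives the time-dependent latent states used for decoding.
h = rng.normal(size=d)
trajectory = [h]
for _ in range(20):
    h = rk4_step(h, dt=0.1)
    trajectory.append(h)

print(np.round(np.stack(trajectory)[:3], 3))   # first few latent states
```

Whether one calls this an RNN or an ODE solver, the computational structure is the same kind of learned recurrence, which is the substance of the reviewer's objection.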
### Summary:
this paper proposes a model for disentangling content and dynamics but unlike the majority of previous work the dynamics are modeled using odes rather than their discrete approximations rnns the reviewers agree that the paper is well written and the results look good especially for longer trajectories hence i am happy to recommend this paper for acceptance
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the authors consider the problem of differential performance across subgroups commonly present in classifiers and propose to mitigate it they augment subgroup data for each class using cyclegan and balance the performance across subgroups in each class using a consistency regularizer and a robust objective that minimizes the difference between the minimum and maximum subgroup performance this approach is demonstrated in several datasets including the isic skin cancer data the paper is written well easy to follow and i was able to understand and appreciate the contributions quickly the appendices are also very thorough and the code is organized well there is sufficient detail to help readers reproduce the results comments 1 the authors can specify very early on that the subgroups are prespecified by the user and not automatically discovered 2 relations to the area of group fairness may be helpful the goal there is also to ensure parity of predictions among groups and various methods are used the groups though are assumed to be common across classes it may be possible to extend this approach to that area and the idea of data augmentation may be very appealing there a small discussion on this can be quite informative to the community there is also the notion of subgroup fairness discussed here which considers exponentially or infinitely many subgroups httpproceedingsmlrpressv80kearns18ahtml 3 have the authors considered how their methods can be modified when the subgroups are common across classes this points directly to the group fairness comment before 4 just to clarify the robustness metric used in the experiments is the same as the metric of interest for gdro in table 1 correct and is the gap same as the metric of interest for sgdro does the performance for groups in each class vary between robust accuracy and robust accuracy gap how are the metrics reported in the experiments consolidated across classes it may be helpful to define the experimental metrics explicitly 5 saying that your minimization of ihaty x x is parallel to lemma 1 is confusing since you minimize the upper bound in thm 1 and lemma 1 minimizes the lower bound agree that your minimization is stronger 6 have you tried any preliminary experiments with greater than 2 subgroups per class just curious what it takes to use something like stargan like you mention 7 what are the risks of letting the users specify the subgroups in some problems it may be hard to diagnose which subgroups are meaningful is minimizing the worst case performance over many possible automatically discovered subgroups an interesting future direction in your opinion what does it take to do that 8 how well will this method work when there is considerable imbalance in the subgroups considered 9 in table 5 what do the quantities in the parenthesis mean typo sec d42 gdro weighte weightdocsepthe paper focuses on data augmentation in cases when the classifier performance is worse on specific parts of the data the problem is also closely related to that of the spurious correlation problem the setting where the classifier might pick up on random patterns in the data to make its decision the solution proposed by the paper is quite intuitive first given the subgroups in a class a cyclegan model is used to learn different versions of the same training example each corresponding to a different subgroup once the augmented versions of the examples are available the the classifier is then trained with additional penalty terms ensuring that the predictions are consistent 
across different versions of the same training example empirical results show that the proposed method does reasonably well as compared to the competitors while the reviewer is not an expert in the area the contribution of the paper indeed sounds appealing the proposed method is quite intuitive and simple to implement and the empirical results are quite encouraging on top of that the proposed method is quite general does not seem to be limited to just images and can be applied to a variety of domains one drawback of course is the need for manual identification of the subgroups but given that stateoftheart methods also need manual annotations at least to my knowledge that is probably fine finally the paper is quite well written and easy to follow a few comments and suggestion the current assumption is that the subgroups are specific to a single class however that may not always be the case in the real world consider for instance the problem of fair classification where subgroups eg socially salient groups might span multiple classes does the proposed method extend to such cases from the first sight that does not seem to be the case it is not clear what is meant by the statement handcrafting these augmentations may be impossible if the subgroup differences are difficult to express however if the differences are difficult to express wouldnt it also mean that separating these subgroups is difficult in the first place in such cases even the proposed method would also have a problem one potential problem with the proposed approach is its application in domains with small training datasets it looks like training the cyclegan would require a relatively large amount of data how does the proposed approach expected to perform in such cases in eqs 2 and 3 the kl divergence is computed between the predicted distributions presumably softmax output given that dnns tend to be quite badly calibrated httpsarxivorgpdf170604599pdf is it worth computing the full kl divergence shouldnt minimizing the difference between the argmax class be sufficientdocsepmachine learning models are trained to optimize performance on the entire training set and can often exhibit inaccurate performance on a subgroup such inaccuracy often results from the models dependence on spurious features this paper proposes model patching a twostep method to avoid this problem the first step learns intersubgroup transformations where an example from one subgroup is transformed into examples of all the other subgroups within a class the second stage uses these transformations as controlled data augmentations to learn a classifier that is robust to subgroupspecific variation in particular the paper uses cyclegan to learn the transformations between pairs of subgroups the second stage uses the original data and the augmented data from stage 1 and minimizes a subgroup robust objective plus a selfconsistency loss the subgroup robust objective captures the discrepancy between the best and the worst performing subgroup within a class on the other hand the selfconsistency loss enforces consistency on the augmented data experiments the authors perform extensive experiments on three benchmark datasets mnist celeba and waterbirds on all the datasets camel improves both aggregate and robust accuracy by at least 53 and also reduces the subgroup gap significantly then the authors perform ablations on the two major components of the framework it seems that learned augmentations perform better than other heuristic augmentations and substituting the 
consistency loss with other losses reduces the robust accuracy by 25 finally the authors apply the proposed framework on the realworld isic skin cancer dataset and found that it improves robust accuracy by 117 compared to the other methods strengths extensive experimentation i thought that the experiments were sufficient to demonstrate the effectiveness of the proposed approach in particular i liked the model patching ablations experiments which showed that both the learned augmentations and the subgroup consistency regularizer are important i thought that the use of cyclegan and sgdro is wellmotivated and appropriate for this setting weaknesses no overlapping subgroups across classes this paper considers a setting where each class is partitioned into multiple subgroups i am not sure whether the proposed framework can be easily generalized for the setting when the subgroups are overlapping across multiple classes questions for the authors 1 what happens if the group information is not known or noisyincorrect can the objective function in subsection 221 be modified to handle such situations 2 if i expand the square term given in theorem 1 there will also be a product term in addition to the consistency loss and cyclegan loss however this additional product term is absent from the objective considered by camel so i did not follow why camel induces the desired conditional independence haty perp z x 3 in table 4 why does camel heuristic have very low robust acc 53 for lambda200 moreover for this value of lambda the maximum subgroup gap is quite large 4 subsection 422 does not report performance for the datasets celeba and mnist what is the effect of consistency loss ablations on these datasets in summary this paper proposes a model patching based twostep method to design a classifier that is robust and reduces performance disparity across different subgroups experiments on standard datasets clearly show that the proposed method is more effective than some of the existing approachesdocsep summary this paper introduces a method camel to make cnn models robust to the effect of subgroups in classes camel uses cyclegan to transfer the subgroup of each input image in each class and applies consistency regularization among transferred images this paper additionally introduces a novel objective sgdro camel shows preferable performance on various data including a realworld dataset of skin cancer classification strengths 1 the proposed method is straightforward and sound 1 the authors conducted extensive experiments with various datasets including a realworld dataset of skin cancer classification they also performed comprehensive ablation studies to show how and why camel works 1 camel can be combined with heuristic augmentation eg rotations to improve robustness weaknesses 1 the datasets used in the experiments are too simple to show the effectiveness of camel and sgdro specifically the number of subgroups in each class of all datasets is at most 2 on the other hand especially sgdro assumes that subgroups have structure a more complicated subgroup setting should be considered 1 when the number of subgroups in each class and the number of classes is limited as the experiments one can treat mathcalytimesmathcalz as target and use erm this should be a strong baseline 1 i concern the scalability of this method if cyclegans are used as the paper sumyinmathcalyfracmathcalzy2 cyclegan models need to be trained on a pair of subgroups beforehand and used during camel training even stargans are used mathcaly 
stargans are required feedback the paper will be improved if generated images of cyclegans are presented to visually show how cyclegans change subgroups
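To ground the discussion of the consistency regularizer and the subgroup-robust objective in the reviews above, here is a small sketch of how such terms could be computed from model outputs. The mean-prediction KL form of the consistency penalty and the worst-minus-best gap term are assumptions chosen for illustration; they are not claimed to match the paper's Eqs. 2-3 or its exact sgdro formulation.

```python
# Hedged sketch (assumed formulation): penalize disagreement between a model's
# softmax prediction on a real example and on its subgroup-translated versions,
# plus a within-class gap term between the worst and best performing subgroup.
import numpy as np

def kl(p, q, eps=1e-12):
    p, q = np.clip(p, eps, 1.0), np.clip(q, eps, 1.0)
    return float(np.sum(p * np.log(p / q)))

def subgroup_consistency_loss(prob_orig, probs_translated):
    """prob_orig: softmax vector on the original image; probs_translated: softmax
    vectors on its CycleGAN-style translations into the class's other subgroups."""
    all_preds = [prob_orig] + list(probs_translated)
    mean_pred = np.mean(all_preds, axis=0)
    return sum(kl(p, mean_pred) for p in all_preds) / len(all_preds)

def subgroup_gap(per_subgroup_losses):
    """Robustness gap within a class: worst subgroup loss minus best subgroup loss."""
    return max(per_subgroup_losses) - min(per_subgroup_losses)

# Toy usage: predictions that disagree across subgroup versions are penalized more.
consistent = subgroup_consistency_loss(np.array([0.9, 0.1]), [np.array([0.88, 0.12])])
inconsistent = subgroup_consistency_loss(np.array([0.9, 0.1]), [np.array([0.3, 0.7])])
print(round(consistent, 4), round(inconsistent, 4), subgroup_gap([0.8, 0.2, 0.5]))
```

A variant raised by one reviewer is to compare only the argmax classes instead of full distributions; the function above makes it easy to check what extra information the full KL retains.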
### Summary:
this paper presents an approach for mitigating the subgroup performance gap in images in cases when a classifier relies on subgroup specific features the authors propose a data augmentation approach where synthetically produced examples by gans act as instantiations of the real samples in all possible subgroups by matching the predictions of original and augmented examples the prediction model is forced to ignore subgroup differences encouraging invariance the proposed method of controlled data augmentations as precisely called by r4 is relevant and wellmotivated the theoretical justifications support the main claims and the experimental results are diverse and demonstrate merits of the proposed approach as rightly pointed out by r3 the appendices are also very thorough and the code is organized well in the initial evaluation the reviewers raised in unison concerns regarding overlapping subgroups per class and an imbalance problem in the subgroups when training gans there were also questions regarding theoretical justifications and empirical evaluations of the baseline methods the authors have addressed all major concerns in the rebuttal pleased to report that based on the author response with extra experiments and explanations r2 has raised the score from 6 to 7 in conclusion all four reviewers were convinced by the authors rebuttal and the ac recommends acceptance of this paper congratulations to the authors there is a colossal effort in the community addressing a goal similar to this work learning invariant representations wrt sensitive features by means of algorithmic fairness methods r1 and r3 relate to it when preparing the final version the authors are encouraged to elaborate more on the discussion and comparison to fairnessbased methods ideally including empirical evidence where possible where subgroups overlap eg celeba the ac believes this will strengthen the final revision and will have an even broader impact in the community
|
[ …input_ids elided: ~2,050 integer token ids encoding this row's text… ] |
[ …attention_mask elided: all 1s, same length as input_ids… ] |
[ …labels elided: token id list that appears to duplicate input_ids… ] |
Below is given review of a research paper from cnoference journal. Please write a summary the review.
### Review:
this paper studies dynamic pricing with contextual information under linear customer valuation model the paper considers two types of market noise distribution and proposes a unified algorithm for the two types the proposed algorithm is nearly optimal for one type and improve existing algorithm for the other the paper also establishes a new lower bound and conducts numerical experiments strengths the paper is well written and well organized for lipschitz and concavity market noise the paper improves the existing regret bound and provides a better lower bound the theoretical result is also verified by numerical experiments weaknesses the time and space complexity of the proposed algorithm is unclear which may hinder its practical applications yes docsepthis paper studied the problem of contextual pricing problem with the customers evaluation is linear in the context with an unknown market noise f the paper proposed an exucb policy to tackle this setting 1 with lipshitz and concavity smoothness assumptions the exucb algorithm achieves tildeot23 regret 2 with only lipschitz smoothness assumptions the exucb matches the best existing upper bound as tildeot34 additionally the paper gives proof of the lower bound of tildeot35 for the lipschitz and concavity smoothness assumptions strengths 1 the presentation of this paper is super clean ie with table table 2 and figure illustrationfigure 1 and the descriptions of the algorithms are clean 2 the paper carefully analyzes the behavior of exucb under different settings with different algorithm parameters which implies the generality of the proposed algorithm 3 the paper also provides a regret lower bound under the lipshitz and concavity smoothness and clearly explain the intuition behind the proof of the lower bound 4 the paper clearly states their assumptions generality with citations of recent literature 5 the paper illustrates the algorithms empirical performance weakness no conclusion section addressed in the rebuttal period minor issues 1 some of the theorem statement is incomplete line 252253 proposition 1it should be there exists positive c1 c2 emphand c3 2 section 31 lists many assumptions without connecting paragraphs between them if this paper can include a conclusion section in the end that would be great docsepthis paper studies the learning of posted prices in a bandit setting with buyer valuations having a linear structure noise they provide regret guarantees under different sets of assumptions on the distribution of the noise when the noise cdf f is only lipschitz it recovers the upperbound guarantee tildeotfrac34 of 48 if the expected revenue function under f additionally satisfies a strongconcavity property locally at its maximum this work provides a tildeotfrac23 regret guarantee that improves over the tildeotfrac23vee 1alpha from 35 finally it formalizes the omegatfrac35 lowerbound hinted in 35 for the same assumptions in my opinion the paper has mainly two contributions that concern the second set of assumptions where the expected revenue is locally strongly concave 1 improving the upperbound of 35 2 providing a lowerbound my main critic is about the significance of the first contribution indeed the algorithm follows a similar doubling trick period split as 35 and it feels like a straitforward extension 1 it is possible to perform pure exploration on oellkbeta samples per period without impacting the regret guarantee as long as beta leq frac23 so the additional regret is bounded by tildeotfrac23 2 these ellkbeta can be used to 
estimate hatthetak1 which by hoeffding concentration will lead to hatthetak theta01 oellkfracbeta2 3 remember the proof of 35 ensures that if the estimate hatthetak is such that hatthetak theta01 oellkalpha then the regret is bounded by tildeotfrac23vee 1alpha 4 using alpha fracbeta2 and choosing beta frac23 leads to an upperbound on the regret that is tildeotfrac23 except for this additional exploration the algorithm is very similar to 35 then the paper is hard to follow a couple of examples the perturbed linear bandit is introduced at the very beginning at this point its hard to make the link with the problem this link is only explained much later in the paper page 7 its a bit hard to understand the role of the discretization without reading 35 i dont see an obvious limitation negative impact that is relevant docsepthis paper studies an online contextual dynamic pricing problem with linearnoisy valuations and boolean feedback the author introduces an algorithm exucb that is a combination of epsilongreedy and upper confidence bound algorithms they show that their algorithm would achieve an tildeotfrac23 regret upper bound under lipschitz and concavitysee strengths and weaknesses session below assumptions which improves existing results substantially they also show that exucb would achieve an tildeotfrac34 regret upper bound under a lipschitz assumption only which matches existing results but with a more efficient algorithm they also have an tildeomegatfrac35 lower bound for the lipschitz and concavity setting they finally presents numerical results that matches their theories strengths 1 first of all the theoretical result on this problem especially case a is better than existing works two existing works 23 and 35 introduced epsilongreedy and ucb methods for pricing sequentially but each of them suffered suboptimality and is hard to improve this work eases these suboptimalities by subtly combining them with each other 2 they also improve the algorithm computational efficiency on case b where their result matches existing best results under additional assumptions but not improve which lets their algorithm more practical 3 their lower bound is developed based on existing methods but still meaningful 4 their numerical experiments matches their theoretical results very tightly 5 their writing quality is good i think this work should be worth at least 7 as they indeed improve existing results in an interesting way however there are still some doubts on my side ill tentatively give a 6 and promise to rise my score as long as the authors address my concerns weakness 1 this is a question but seems rather important what they claimed in assumption 7 is actually a 2ndordersmooth instead of a concavity see 15 they have both upper and lower constant bound on the 2ndorder derivatives at p by assuming both smoothness and concavity but you only have upper bound ie smooth please check if this affects any of your claims or the fairness of any comparison with existing results eg is your lower bound still unique 2 assumptions are slightly more than mild eg assumption 2 especially the condition number lower bound c0 is not assumed in some of 152326283548 assumption 5 can be implied by 4 ps assumption 3 combining with assumption 4 is indeed exactly a bounded noise assumption in 232648 3 missing of results andor discussions in some related works eg 44 has an omegatfracm12m1 lower bound that can fill in the blank of table 1 eg2 some works in contextual searching should also be briefly discussed including a1 a 
liu et al optimal contextual pricing and extensions soda21 a2 a krishnamurthy et al contextual search in the presence of irrational agents stoc21 their various assumptions on the noise distributions need discussing yes they mentioned their limitations and discussed the ethical issues related to their paper it would be better if a discussion and conclusion section appeared at the end of the main pages
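For readability, here is a compact restatement of the explore-then-estimate argument sketched by the first reviewer above. The notation (per-period length ℓ_k, exploration exponent β, estimation rate α, and the quoted regret bound of [35]) follows the review's sketch and is assumed rather than verified against the paper.

```latex
% Reviewer's sketch, restated under the review's own notation (assumed, not
% verified against the paper): \ell_k is the period length, \beta the
% exploration exponent, [35] the prior work cited in the review.
\begin{align*}
&\text{(1) Pure exploration on } O(\ell_k^{\beta}) \text{ rounds per period costs at most }
   \tilde O(T^{\beta}), \text{ negligible for } \beta \le \tfrac{2}{3}.\\
&\text{(2) Hoeffding-type concentration on those rounds gives }
   \lVert \hat\theta_k - \theta_0 \rVert_1 = O\!\left(\ell_k^{-\beta/2}\right).\\
&\text{(3) By the quoted result of [35], } \lVert \hat\theta_k - \theta_0 \rVert_1 = O(\ell_k^{-\alpha})
   \;\Rightarrow\; \text{regret } \tilde O\!\left(T^{\frac{2}{3} \vee (1-\alpha)}\right).\\
&\text{(4) Taking } \alpha = \tfrac{\beta}{2} \text{ and } \beta = \tfrac{2}{3}
   \text{ gives } 1-\alpha = \tfrac{2}{3}, \text{ hence regret } \tilde O\!\left(T^{2/3}\right).
\end{align*}
```

This is the reviewer's argument for why the T^{2/3} rate might follow from [35] with extra exploration; as the meta-review below notes, whether the execution is really this simple was disputed during the discussion period.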
### Summary:
|
the authors make multiple contributions in this work under a lipschitz and second order smoothness assumption their algorithm exucb obtains regret of order tildeot23 which is an improvement upon a recent result of luo et al 2021 that has an additional dependence on a parameter alpha in addition under a lipschitz assumption the authors algorithm matches the best known regret of tildeot34 in summary the authors give a theoretical advancement over previous work notably in the lipschitz case their improvement in terms of computation is exponential going from exponential in dimension d0 from exp4 to mathrmpolyd0 for exucb in the discussion period there was some disagreement with regards to the novelty of the design of exucb one reviewer whom i believe has sufficient experience mentions that a explorethenucb structure is standard in bandits yet another reviewer highlighted that the execution of this strategy in this particular case is considerably more complicated due to additional challenges with the exploration phase not being similar to warmup periods sometimes used for multiarmed bandit algorithms i tend to believe that there indeed are additional challenges posed here and that there is sufficient novelty in light of the advances the authors have made in terms of improved regret bounds and improved computational efficiency this work merits publication and should be accepted
|
[ …input_ids elided: ~1,740 integer token ids encoding this row's text… ] |
[ …attention_mask elided: all 1s, same length as input_ids… ] |
[ …labels elided: token id list that appears to duplicate input_ids; list truncated in the source…
5777,
625,
685,
11134,
24088,
9376,
374,
3340,
253,
1617,
1180,
2406,
3033,
260,
17,
310,
417,
8025,
275,
690,
273,
1458,
19136,
49758,
1671,
2385,
9376,
608,
476,
320,
10466,
407,
577,
3714,
9376,
495,
16248,
342,
9376,
577,
310,
6296,
4555,
247,
11542,
6046,
9376,
275,
26972,
25020,
50276,
20,
5816,
273,
1543,
285,
263,
11985,
275,
690,
2905,
2987,
24088,
7127,
556,
271,
7005,
909,
255,
1124,
78,
805,
78,
18,
2406,
3033,
326,
476,
7522,
275,
253,
9912,
273,
2829,
337,
24088,
19,
690,
2987,
275,
33876,
12203,
943,
671,
320,
13366,
5469,
1690,
50270,
66,
18,
247,
632,
86,
1162,
355,
8654,
33876,
20910,
285,
18149,
29737,
1797,
50270,
66,
19,
247,
36407,
763,
6292,
321,
24085,
1162,
355,
33876,
3186,
275,
253,
3361,
273,
33384,
6083,
331,
406,
1797,
50276,
14094,
2710,
13260,
327,
253,
6046,
10670,
878,
16585,
4754,
597,
5393,
616,
7364,
285,
5469,
253,
48538,
3374,
2905,
281,
616,
2929,
352,
651,
320,
1805,
604,
247,
5955,
585,
3444,
6874,
6634,
387,
253,
990,
273,
253,
2022,
7223,
2490,
187,
4118,
18435,
27,
783,
4477,
1056,
2709,
9021,
275,
436,
789,
762,
247,
11233,
37913,
285,
1273,
1340,
6032,
1255,
9376,
616,
5933,
385,
1028,
67,
31326,
14938,
273,
1340,
246,
6227,
302,
1508,
534,
310,
271,
7756,
2220,
247,
3332,
906,
273,
26535,
80,
1162,
355,
43425,
326,
556,
271,
3081,
10096,
327,
247,
4764,
9765,
275,
1635,
762,
247,
11233,
37913,
9376,
253,
4477,
5933,
10129,
253,
1682,
1929,
14938,
273,
246,
6227,
302,
1706,
275,
6010,
253,
4477,
1918,
247,
10527,
32992,
689,
2045,
789,
19836,
275,
253,
11233,
37913,
1083,
616,
7756,
275,
2426,
273,
13782,
310,
17619,
1469,
432,
17619,
275,
7877,
277,
17,
432,
866,
21,
281,
14168,
1109,
4818,
10120,
17,
323,
385,
1028,
67,
50276,
249,
253,
5955,
2180,
627,
369,
690,
30859,
342,
17730,
281,
253,
38135,
273,
253,
2216,
273,
385,
1028,
67,
581,
37317,
5207,
891,
2868,
556,
4209,
2793,
25957,
326,
247,
8338,
7461,
1028,
67,
2605,
310,
2629,
275,
3961,
953,
2568,
1529,
37317,
16318,
326,
253,
10636,
273,
436,
5700,
275,
436,
1798,
1083,
310,
15455,
625,
9542,
1955,
281,
3081,
7881,
342,
253,
17947,
3408,
417,
1146,
2074,
281,
5890,
484,
9894,
4536,
908,
323,
4471,
21201,
3961,
262,
11333,
891,
5257,
281,
2868,
326,
627,
6296,
403,
3081,
7881,
22691,
1060,
285,
326,
627,
310,
4209,
38135,
275,
1708,
273,
253,
16424,
253,
4477,
452,
1160,
275,
2426,
273,
5520,
14938,
14493,
285,
5520,
15180,
6733,
436,
789,
16108,
9311,
285,
943,
320,
7607
] |
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
paper contributions the paper proposes the glancing transformer glat a model for nonautoregressive generative model of language focusing on the mt domain glat incorporates several changes which result in a model which is stateoftheart in several mt categories for nonautoregressive models and without requiring iterative sampling strong points of the paper the paper clearly lays out the suggested model changes and how they related to previous work some of the results are clearly state of the art for the task the model and changes are described well and clearly weak points of the paper glat incorporates several changes from the natbase architecture they primarily compare against without clear ablations to attribute cause to the improvement most of the paper discusses the impact of architectures on inferencesampling speed but doesnt discuss training speed which seems relevant because my understanding of the model means that each step requires an additional decoder pass some of the writing is poor and overlycolloquial clearly state your recommendation accept or reject with one or two key reasons for this choice i believe this paper is marginally above the acceptance threshold though i think there is a clear avenue to improve the paper and change my rating to accept supporting arguments for your recommendation the proposed glat model essentially consists of two changes from prior work 1 using an unconditional forward pass of the decoder to determine which tokens to mask and then running the decoder a second time 2 forwarding of encoding outputs via attention instead of mask tokens in the second decoder my main criticism of the paper is that is it unclear how each of the two changes contribute to the overall improvement being seen by the model it might be the case that it is exclusively one of them and i think the paper would clearly benefit from ablations discussing the two and showing how they each contribute as a secondary criticism the paper proposes using a simple random sampling strategy to determine which tokens to mask and which to glance at while the authors defend this as the simple choice and it certainly is it feels to me that there is an obvious second alternative which is to use the probability of the tokens generated by the decoder without glancing at anything as in maskpredicts iterative sampling to pick which tokens to glance at and which not to it feels to me that this alternative is substantially obvious seeing as it is exactly how mask tokens have been predicted in the past that it should be attempted ask questions you would like answered by the authors to help you clarify your understanding of the paper and provide the additional evidence you need to be confident in your assessment i would ask that two questions be answered both ablations trying to get at the heart of the reason for improvement 1 what happens if you do not forward the decoder inputs to the second decoding pass but instead set them to a unique mask token as in prior work but keep all other aspects of training the same 2 what happens if you use maskpredicts probabilitybased sampling criterion to pick which tokens to sample instead of the default random strategy also i have two questions which i think also should be answered but dont require any further experimentation 3 what is the impact on glancing sampling to training time 4 what is the exact architecture of natbase you show the encoder of glat works better than natbase is natbase trained with the attention instead of uniformcopy or softcopy could that be 
the reason for the improvement i feel that if the authors answer these questions ideally with wmt benchmark data for the first two along with addressing the more minor points i list below that i would feel comfortable increasing my rating to accept additional feedback with the aim to improve the paper i had a few minor questions while reading that i think would be useful to address you mention your final model is based on averaging the 5 best checkpoints how are you measuring best you make a quite strong claim that we think glat is superior to at to some extent which i think needs either more qualification or more focus i struggled with understanding the paragraph starting with adaptive sampling number i think it could use some cleanup finally i think most of the writing is clear but there are a few points where it becomes quite colloquial and difficult to follow i think an additional pass cleaning up some of the structure could be quite beneficialdocsep overall comment this submission improves nonautoregressive translation nat by proposing a noniterative parallel text generation model called glancing transformer glat which includes the explicit word dependency modeling in nat via a proposed glancing language model glm compared to previous work biggest contribution of the proposed method is that it improves the training of nat model with a similar idea of curriculum learning while keeping the inference time unchanged setting a significant improvement for noniterative nat without reranking it would be a good baseline for future research on noniterative nat models however i still have the following comments and questions methods 1 why wouldnt the model gets stuck at predicting only part of the words correctly as described by the algorithm the model will sample more reference words as the inputs of the decoder if the prediction is incorrect however the loss is only calculated for the remaining words will the model only learn to predict easy words and give up learning the difficult words i guess the random sampling strategy might be the key but it would be good to know answers from the authors 2 since the model only updates in the second pass where the inputs are always mixed with reference words and source embeddings there will be a clear mismatch between the second pass and the first pass why can the first model be sure to improve while training the second pass only by sharing parameters between two decoding passes 3 hamming distance is quite weak will the proposed method also apply to other distance such as levenshtein distance or a learnable distance 4 what do you mean by ratio function ftextratio what is the input to this function steps experiments 1 the proposed training process is in fact very similar to maskpredict except the inputs encoder hidden states instead of mask and adjusting the number of reference during training it would be nice to have a fair comparison with maskpredict for at least two settings and combined a maskpredict with encoder hidden state inputs with uniform copy or attention b maskpredict with 1 decoding iteration 2 i did not see how the model handle the length with another network for length prediction or decoding with multiple lengths for the latter case how to decide these lengths missing reference this paper had an even higher oneiteration nat results 258 with imputer compared to 2521 in the submission on wmt ende although the methods are different and the difference is to be honest marginal it is important to include discussions on that or combine them in 
future work saharia chitwan et al nonautoregressive machine translation with latent alignments arxiv preprint arxiv200407437 2020 docsepthis paper proposes a nonautoregressive neural machine translation model that does not require multiple iterations to achieve a good translation quality the key difference to previous models is that during training it uses decoding to estimate the number of words to randomly sample proportionally to the error as opposed to random sampling a fixed number of them and uses this sampling strategy to mask tokens strengths the idea of using the model to come up with the number of words to sample before predicting them looks new and it appears to be helping a lot the model to do better with a single iteration the evaluation shows that the proposed model outperforms previous noniterative ones and performs on par with iterative ones in some cases while being significantly faster which is promising but it still relies on reranking based on an autoregressive model moreover the method performs much better than the nonautoregressive baseline on longer sequences a trait that is observed in autoregressive models and performs better than the autoregressive baseline on very short sequences up to 20 tokens weaknesses 1 the writing requires some more effort because it has some grammaticalspelling errors and was not clear in several parts it would help simplify wording and make more clear statements for instance there is a lot of time spent describing the glancing sampling strategy while it could be described in one paragraph random sampling strategy paragraph seems redundant 2 the proposed idea has some merits and works well judging from the results but it seems somewhat incremental and not very clearly explained see below the connection to curriculum learning was a bit handwavy and hard to follow the model does not seem to be trained on targets that are of increasing difficulty but rather the number of incorrect targets simply defines the number of random samples to be used for training was this the intended connection to this line of work having many errors in the prediction during training does not necessarily mean that the example is difficult 3 in section 33 how is the function fratio actually implemented and trained if applicable its not clear from the provided description of how this is done how are the gradients computed through the sampling process if its trainable this is an essential component of the model and it hasnt been explained well 4 another setback for the reader is the lack of discussion or acknowledgment of the computational cost that is required by the method during training performing decoding twice during training looks interesting from the modeling perspective but what is its effect on training speed the evaluation focus is mainly on inference time but the training speed factor should also play a role too when deciding which method to use 5 related to the above the nonautoregressive models rely on knowledge distillation and reranking based on a pretrained autoregressive model this already has its own training cost so increasing it more could lead to a situation where the benefits during inference are overshadowed by the computational cost from training other comments are the models trained until convergence or for a fixed number of steps i am wondering what is the impact of the glancing sampling strategy on the convergence how did the authors come up with the number of reranking candidates for each model it looks like the number is different for each 
model this should affect the quality of the model could the authors elaborate on what do they mean by the formulation of our proposed glancing language model could be maximizing the likelihood of remaining words does it actually do that this was unclear and it leads the reader to make guesses in section 42 what do you mean by bleu and speedup are more or less contradictory in figure 1 and 2 the replaced inputs use the notation h2 h3 which points to the encoded inputs but in the textual description the replaced tokens seemed to be coming from the embedding on the decoder side section 32 paragraph 2 eyt in gsyhaty yt there seems to be this inconsistency between the notation in the diagrams and the notation in text which is very confusing docsepthe authors propose glancing transformer for single step parallel text generation the approach is inspired from curriculum learning ie the training task is adaptively controlled based on the models current performance specifically the paper proposes a glancing strategy which compares the models generation and reference sentence and forms a sequence which is partially masked the number of masked tokens in this sequence depends on the similarity between models generation and reference sentence the model is then trained to complete the partially masked sequence the model achieves strong improvements on standard nonautoregressive mt baselines while not modifying inference process thus not compromising on inference speed over vanilla nonautoregressive models ves limitations of current nonautoregressive mt models nat are well explained and the approach is nicely motivated the paper is well written although there are many grammatical mistakes that can be revised by the authors and is easy to follow the experimental details are well documented many ablation studies are reported apart from comparison using standard metrics the results on standard benchmarks are strong the paper improves over vanilla nat by approx 5 bleu points on average and is only 1 bleu point less than baseline autoregressive models while being 7x faster in inference concerns while the use of curriculum learning inspired techniques to augment nat model training is new and interesting their specific technique does not seem to have sufficient novelty according to me if i understood their technique correctly the only difference in the training algorithm between maskpredict ghazvininejad et al 2019 and their method is the selection of number of tokens to mask masknum in the decoders input while maskpredict uses a uniform distribution to sample masknum they use a glancing sampler that decides this number based on the hamming distance between models prediction and reference sentence although the reported results and ablation studies show a significant impact of this simple change i think more exploration of this technique is possible and should be done in the paper eg a simple hamming distance may not be a good strategy to compare the models prediction and reference sentence a related finding has been described in detail by ghazvininejad et al 2020 httpsarxivorgabs200401655 so i believe that authors can explore more strategies to compare reference and generated sequence the authors use random sampling like maskpredict as their sampling strategy the authors argue that random sampling may not be the most efficient strategy but its the easiest one and has been shown to be powerful in models like bert while i totally agree with their argument i believe that there is some possibility here to exploit 
the fact that we have access to models prediction and reference sentence eg one possible strategy to exploit this could be selecting tokens that the model was not able to predict correctly based on the hamming distance comparison in the introduction section it is mentioned that second glm replaces the mask token in the mlm with signals from the encoder thus not only the target word dependency but also the dependencies between the input and target are strengthened while this is an interesting argument i didnt find any experiment to validate this i think the authors should include experiments to compare use of mask tokens and their approach empirically based on these concerns i am currently inclined to recommend rejection while i find the idea of incorporating curriculum learning very interesting together with strong results as demonstrated by the paper i believe that more exploration of strategies to sample number of masked tokens and sample words from the reference sentence is necessary to make the paper publishable i have described this point in more detail in the concerns above minor comments there are many grammatical errors in the current version of the paper that the authors might want to revise eg in abstract falls fall achieves achieve in section 31 authors mention this note that we use the attention mechanism to form the decoder inputs with the input x i think it might be helpful to elaborate more on what this exactly means
### Summary:
|
this work raised quite a few questions and left the reviewers somewhat divided the authors have done their best to answer these questions conducting additional experiments where needed the close relation of this work to maskpredict ghazvininejad et al 2019 was noted by several reviewers although the current version of the manuscript addresses this the introduction still frames maskpredict as an iterative model and does not explicitly make the connection between glat and singleiteration maskpredict my impression is that this understates the relationship between these models somewhat taking singleiteration maskpredict as a baseline the proposed extension is fairly simple and seemingly effective which is a potentially impactful combination however the manuscript is still held back by presentation issues including but not limited to spelling and choice of words and i concur with reviewer 2 that the connection with curriculum learning should be elucidated not just in words but with supporting experimental analysis regarding training cost given that training for glat seems to be more costly for the same number of training iterations a comparison where the total compute budget is held constant could be interesting though i appreciate that this is not a key point of the paper as the authors point out whereas inference cost is i believe the changes made by the authors in response to the reviewers comments are substantial enough that they merit a further review cycle and may still fall short of the reviewers expectations in some aspects therefore i will not recommend acceptance though i want to add that this was a tough call to make i would also like to encourage the authors to resubmit their updated manuscript
|
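The reviews in the record above center on GLAT's glancing sampling: run the decoder once without any reference tokens, compare that first pass to the reference, and reveal a number of reference tokens proportional to the Hamming distance before a second, loss-bearing pass. The toy sketch below illustrates only that sampling step; it is a minimal sketch under assumptions made here (plain Python lists rather than tensors, a fixed 0.5 ratio, random position selection, made-up function names), not the paper's actual implementation.

```python
import random

def glancing_sample(reference, first_pass_pred, ratio=0.5):
    """Choose reference positions to reveal to the decoder, proportional to how
    far the first (glancing) decoding pass is from the reference."""
    assert len(reference) == len(first_pass_pred)
    # Hamming distance between the first-pass prediction and the reference.
    distance = sum(1 for r, p in zip(reference, first_pass_pred) if r != p)
    # Number of reference tokens to reveal ("glance at") in the second pass.
    n_reveal = int(ratio * distance)
    # Simple random selection of which positions to reveal.
    return set(random.sample(range(len(reference)), n_reveal))

def second_pass_targets(reference, revealed):
    """The training loss is computed only on positions that were not revealed."""
    return [(i, tok) for i, tok in enumerate(reference) if i not in revealed]

# Toy usage: a 6-token target where the first pass got 3 tokens wrong.
ref = [5, 9, 2, 7, 7, 3]
pred = [5, 1, 2, 4, 0, 3]
revealed = glancing_sample(ref, pred)
print(revealed, second_pass_targets(ref, revealed))
```

Random selection mirrors the simple strategy the reviewers discuss; a confidence-based criterion in the style of Mask-Predict could be swapped in by ranking positions by the first pass's token probabilities instead of sampling them uniformly.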
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper introduces a novel approach to partition the clients into k groups and build personalized model for each group the approach leverage graph clustering method for learning purpose the paper also provided a list of learning bounds the experimental results demonstrate the proposed k models can produce learning results close to personalized models strengths 1 the problem is interestingly and timely it can potentially be applied to many applications 2 the approach is based on graph theoretical results 3 the paper provides rather rigorous theoretical analysis weaknesses 1 the writing can be improved and better organized 2 lack of base lines for comparison the experimental results are rather weak 3 some figures are too small to read like those in figure 5 yes docsepthis paper studies the collaborative learning in personalized setting which considers the heterogeneous distribution among clients the paper proposes a method based on graph partition to learn personalized models for each partition group which is shared by all clients within one partition group the paper first provides the generalization bound for each client from which the optimal personalized model is derived via optimizing on the contribution from the dataset of each client then a contribution graph is constructed from the contribution weight and similar clients are partitioned into one group after filtering out bad clients the personalized model for one group is learned based on the average weights of the clients within the group the paper provides theoretically analysis on the increase of the upper bound of the excess risk due to the relaxation from optimal personalized model to group personalized model experiments demonstrate the group personalized model has similar performance with the optimal personalized model and justifies the proposed method this paper has several merits first the theoretical part is clear rigorous and sound it justifies the effectiveness of the proposed framework second the proposed method which is based on graph partition of client contribution is novel and interesting to me however i am not very familiar with this research topic my major concern is on the writing of this paper as it is sometimes hard to follow for example there are no subsections within section 4 our work but to my understanding this section first talks about the optimal model then provides graph partition based learning algorithm and last gives tighter bounds with further assumptions thus i think section4 can be separated into at least 3 subsections the authors are encouraged to improve the presentation of this work i have some more questions to ask the authors please see the comment below the paper does not discuss limitations of the work it provides the discussion on social impact docsepthis paper studied collaborative learning for heterogeneous clients to achieve personalization while avoiding training personalized models for a large number of clients the authors proposed methods to detect collaboration partners and adaptively learn models for client groups the authors provide a theoretical guarantee that the expected risk of the learned group model is close to the personalized model empirical results on realworld datasets validated the theoretical result showing group model is a good approximation of the personalized model strength the motivation of learning models for client clusters to balance between computation and personalization is intuitive the two algorithms aclmm which applies modularity maximization to estimate 
client group and aclc which assumes intrinsic client clustering both are analyzed with generalization bounds weakness i only have mostly minor concerns while the two proposed algorithms are theoretically analyzed they are not directly compared it seems that with the 1 gamma epsilonapproximationstability assumption in theorem 4 aclc has a better generalization bound than aclmm when optphip is small and it would be interesting to compare the bounds in detail it is also worth empirically testing aclmm in figure 5 see weakness docsepthis paper studies the collaborative learning problem in which different clients could have noniid data compared with existing global or personalized methods this paper proposes to detect collaboration partners to achieve a better collaboration the authors formalize this problem as measuring the similarity between clients and finding an optimal partition of the collaboration federated network the authors give various theoretical analysis and provide experimental results to verify the effectiveness of the proposed method strengths 1this paper considers a challenging problem the collaboration among multiple clients which is very useful and significant in reality 2this paper mainly focuses on theoretical analysis including the similarity measurement between clients the optimization of the partition and the error bound of the approximated partition weakness motivation 1 the motivation in this paper is to find a partition for the collaboration network it is reasonable and efficient however some related work and more discussions are missing for example 12 theory 1some theoretical results im very familiar with eg theorem 1 and theorem 2 im afraid they are quite similar to existing theoretical work in domain adaptation and domain generalization for the other theoretical conclusions the authors also seems to refer to existing work that being said its not a big problem more discussions about the difference between the theoretical analysis and existing work are necessary experiments the experiments in this paper are seriously insufficient 1the idea in this paper is clustering similar or helpful clients more baselines are necessary for example 2 2the experiments focus on two image datasets more other data are necessary privacy this paper considers the problem of collaborative learning the proposed method needs to measure the similarity between clients which could cause privacy leakage in a federated learning setting could the authors discuss more about the privacy concerns 1 cui s liang j pan w et al learning to collaboratej arxiv preprint arxiv210807926 2021 2 ghosh a chung j yin d et al an efficient framework for clustered federated learningj advances in neural information processing systems 2020 33 1958619597 the authors claim that they describe the limitations but i did not found
### Summary:
|
in this submission the authors propose a novel method to handle the heterogeneous clients challenge in fl by detecting collaboration partners and adaptively learning models for client groups the authors provide the theoretical analysis of the error bound of the approximated partition which justifies the effectiveness of the proposed method the experimental results on the realworld dataset also validate that the group model is a good approximation of the personalized model due to these i recommend accepting this submission however i do have one major concern about the insufficiency of the experiments as pointed out by reviewer xfip more datasets and baselines should be adopted further this submission can also be improved based on the comments from all the reviewers and the discussion between reviewers and authors i hope the authors find these useful and make this submission a better one
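to make the kind of pipeline the reviews and the summary describe concrete, here is a minimal python sketch that clusters clients by modularity maximization over a pairwise contribution/similarity graph and then averages each group's client parameters into one group model; the similarity matrix, the thresholding rule, and all names below are illustrative assumptions, not the authors' actual aclmm/aclc algorithms or their learning bounds

```python
# hypothetical sketch, not the paper's implementation: how the pairwise
# similarity is estimated and how weak partners are filtered are assumptions
import numpy as np
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

def cluster_clients(similarity, threshold=0.0):
    """similarity: symmetric (n_clients, n_clients) array of pairwise
    contribution/similarity weights; returns a list of client groups."""
    n = similarity.shape[0]
    graph = nx.Graph()
    graph.add_nodes_from(range(n))
    for i in range(n):
        for j in range(i + 1, n):
            if similarity[i, j] > threshold:  # keep only helpful collaboration edges
                graph.add_edge(i, j, weight=float(similarity[i, j]))
    # modularity maximization over the contribution graph gives the client groups
    return [sorted(c) for c in greedy_modularity_communities(graph, weight="weight")]

def group_models(client_params, groups):
    """client_params: list of 1-d parameter vectors, one per client; each group
    model is simply the average of its members' parameters."""
    return {tuple(g): np.mean([client_params[i] for i in g], axis=0) for g in groups}
```

as a toy usage example, cluster_clients(np.ones((4, 4)) - np.eye(4)) places all four clients in a single group, and group_models then returns one averaged parameter vector shared by that group, which is the sense in which a group model stands in for per-client personalized models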
|
[
30003,
310,
1677,
2278,
273,
247,
2561,
2929,
432,
260,
2369,
1793,
6698,
15,
7764,
3630,
247,
6010,
253,
2278,
15,
187,
4118,
8439,
27,
187,
2520,
2929,
23970,
247,
4460,
2746,
281,
10883,
253,
8548,
715,
465,
2390,
285,
1973,
32339,
1566,
323,
1016,
1387,
253,
2746,
25057,
4216,
17524,
1332,
323,
4715,
4096,
253,
2929,
671,
2530,
247,
1618,
273,
4715,
14493,
253,
5661,
1543,
7568,
253,
4081,
465,
3210,
476,
4711,
4715,
1543,
2810,
281,
32339,
3210,
50274,
296,
3755,
20556,
50275,
18,
253,
1895,
310,
4722,
314,
285,
14793,
352,
476,
7826,
320,
3732,
281,
1142,
4893,
50276,
19,
253,
2746,
310,
1754,
327,
4216,
10527,
1543,
495,
253,
2929,
3400,
2581,
26565,
10527,
1783,
50276,
20881,
1255,
265,
50276,
18,
253,
4028,
476,
320,
5520,
285,
1805,
10932,
374,
3480,
273,
2613,
3104,
323,
5301,
253,
5661,
1543,
403,
2581,
5075,
495,
690,
8442,
403,
1512,
1355,
281,
1239,
751,
1110,
275,
4677,
608,
50276,
9820,
5474,
33032,
2520,
2929,
2175,
253,
27549,
4715,
275,
32339,
4758,
534,
19401,
253,
22766,
3268,
2190,
8548,
253,
2929,
29328,
247,
1332,
1754,
327,
4216,
10883,
281,
3037,
32339,
3210,
323,
1016,
10883,
1387,
534,
310,
6096,
407,
512,
8548,
1561,
581,
10883,
1387,
253,
2929,
806,
3400,
253,
26647,
3033,
323,
1016,
5268,
432,
534,
253,
8654,
32339,
1566,
310,
6012,
3066,
39793,
327,
253,
7680,
432,
253,
10895,
273,
1016,
5268,
840,
247,
7680,
4216,
310,
8818,
432,
253,
7680,
2801,
285,
2074,
8548,
403,
10883,
264,
715,
581,
1387,
846,
19690,
562,
3076,
8548,
253,
32339,
1566,
323,
581,
1387,
310,
6311,
1754,
327,
253,
3388,
13461,
273,
253,
8548,
1561,
253,
1387,
253,
2929,
3400,
28055,
1783,
327,
253,
2572,
273,
253,
5170,
3033,
273,
253,
6714,
2495,
1955,
281,
253,
17040,
432,
8654,
32339,
1566,
281,
1387,
32339,
1566,
4679,
7568,
253,
1387,
32339,
1566,
556,
2074,
3045,
342,
253,
8654,
32339,
1566,
285,
816,
7790,
253,
4081,
1332,
50274,
2520,
2929,
556,
2067,
16108,
806,
253,
10527,
629,
310,
2590,
26565,
285,
3590,
352,
816,
7790,
253,
12510,
273,
253,
4081,
7792,
1273,
253,
4081,
1332,
534,
310,
1754,
327,
4216,
10883,
273,
5268,
7680,
310,
4460,
285,
4722,
281,
479,
50274,
35529,
891,
717,
417,
1077,
7615,
342,
436,
2561,
9400,
619,
2201,
4468,
310,
327,
253,
4028,
273,
436,
2929,
347,
352,
310,
4536,
1892,
281,
956,
323,
1650,
627,
403,
642,
749,
21454,
1561,
2593,
577,
776,
789,
533,
281,
619,
4685,
436,
2593,
806,
12088,
670,
253,
8654,
1566,
840,
3400,
4216,
10883,
1754,
4715,
5933,
285,
1390,
4245,
40638,
14493,
342,
2007,
13260,
3021,
891,
1158,
2593,
21,
476,
320,
9070,
715,
387,
1878,
495,
749,
21454,
253,
4477,
403,
14659,
281,
3157,
253,
9759,
273,
436,
789,
891,
452,
690,
625,
3533,
281,
1642,
253,
4477,
4496,
923,
253,
4385,
2708,
50276,
783,
2929,
1057,
417,
2319,
7364,
273,
253,
789,
352,
3400,
253,
5955,
327,
2675,
3486,
50276,
7152,
33032,
2520,
2929,
5421,
27549,
4715,
323,
22766,
8548,
281,
5115,
3367,
1320,
1223,
17816,
3733,
32339,
3210,
323,
247,
1781,
1180,
273,
8548,
253,
4477,
4081,
3082,
281,
2736,
14448,
10471,
285,
5223,
1242,
3037,
3210,
323,
5268,
2390,
253,
4477,
2085,
247,
10527,
12215,
326,
253,
3264,
2495,
273,
253,
6311,
1387,
1566,
310,
2810,
281,
253,
32339,
1566,
16774,
1543,
327,
1524,
10186,
15302,
17618,
253,
10527,
906,
4645,
1387,
1566,
310,
247,
1175,
11193,
273,
253,
32339,
1566,
4757,
253,
16038,
273,
4715,
3210,
323,
5268,
9959,
281,
6654,
875,
13782,
285,
3367,
1320,
310,
27350,
253,
767,
11333,
247,
498,
2188,
534,
10384,
23178,
414,
11903,
1320,
281,
6642,
5268,
1387,
285,
247,
498,
68,
534,
19584,
15276,
5268,
17524,
1097,
403,
5867,
342,
26647,
14493,
50276,
20881,
1255,
891,
760,
452,
6571,
5884,
7350,
1223,
253,
767,
4081,
11333,
403,
28055,
5867,
597,
403,
417,
3587,
2429,
352,
3133,
326,
342,
253,
337,
50276,
2733,
299,
4277,
6772,
3266,
318,
296,
1430,
9376,
275,
10012,
577,
247,
498,
68,
556,
247,
1805,
26647,
3033,
685,
247,
498,
2188,
672,
1478,
545,
532,
310,
1355,
285,
352,
651,
320,
4722,
281,
7277,
253,
14493,
275,
2508,
352,
310,
671,
4409,
45190,
5175,
247,
498,
2188,
275,
4677,
608,
923,
14855,
5474,
33032,
2520,
2929,
2175,
253,
27549,
4715,
1895,
275,
534,
1027,
8548,
812,
452,
1327,
74,
301,
941,
2429,
342,
5368,
4156,
390,
32339,
3082,
436,
2929,
29328,
281,
2736,
14448,
10471,
281,
5115,
247,
1805,
14448,
253,
4477,
7473,
907,
436,
1895,
347,
10499,
253,
14259,
875,
8548,
285,
4560,
271,
8654,
10883,
273,
253,
14448,
10208,
12072,
2990,
253,
4477,
1918,
2710,
10527,
1783,
285,
2085,
5661,
1543,
281,
12654,
253,
12510,
273,
253,
4081,
1332,
50275,
296,
3755,
20556,
337,
2520,
2929,
19401,
247,
11132,
1895,
253,
14448,
2190,
2709,
8548,
534,
310,
1077,
4217,
285,
1534,
275,
6612,
50276,
19,
2520,
2929,
7194,
16633,
327,
10527,
1783,
1690,
253,
14259,
6814,
875,
8548,
253,
13757,
273,
253,
10883,
285,
253,
2228,
3033,
273,
253,
34930,
10883,
50275,
20881,
1255,
50276,
24013,
7639,
337,
253,
16038,
275,
436,
2929,
310,
281,
1089,
247,
10883,
323,
253,
14448,
2990,
352,
310,
5272,
285,
5919,
2299,
690,
2905,
789,
285,
625,
11985,
403,
5816,
323,
1650,
1249,
50274,
32525,
337,
8826,
10527,
1543,
516,
1077,
7615,
342,
24088,
10012,
337,
285,
10012,
374,
516,
9202,
597,
403,
3240,
2074,
281,
5368,
10527,
789,
275,
5028,
15644,
285,
5028,
26647,
323,
253,
643,
10527,
11815,
253,
4477,
671,
3133,
281,
3730,
281,
5368,
789,
326,
1146,
753,
697,
417,
247,
1943,
1895,
625,
11985,
670,
253,
3064,
875,
253,
10527,
1783,
285,
5368,
789,
403,
3309,
50275,
16217,
3825,
253,
4679,
275,
436,
2929,
403,
10369,
12497,
337,
783,
2934,
275,
436,
2929,
310,
17524,
2074,
390,
9371,
8548,
625,
1666,
25379,
403,
3309,
323,
1650,
374,
374,
783,
4679,
2770,
327,
767,
2460,
15302,
625,
643,
941,
403,
3309,
50275,
13552,
1974,
436,
2929,
19401,
253,
1895,
273,
27549,
4715,
253,
4081,
1332,
3198,
281,
2557,
253,
14259,
875,
8548,
534,
812,
2847,
11068,
23753,
275,
247,
10208,
12072,
4715,
4758,
812,
253,
4477,
2319,
625,
670,
253,
11068,
7350,
50276,
18,
36707,
256,
632,
606,
480,
3199,
259,
1162,
355,
4715,
281,
42124,
75,
549,
32693,
638,
3845,
549,
32693,
16899,
1438,
2787,
1731,
43425,
50276,
19,
32798,
6934,
247,
448,
1947,
480,
340,
249,
277,
1162,
355,
271,
5919,
7792,
323,
29102,
10208,
12072,
4715,
75,
16424,
275,
11454,
1491,
5162,
2718,
9169,
5922,
23627,
2691,
746,
34651,
253,
4477,
1750,
326,
597,
6266,
253,
7364,
533,
891,
858,
417,
1119,
2490,
187,
4118,
18435,
27,
249,
436,
19529,
253,
4477,
12661,
247,
4460,
1332,
281,
6016,
253,
22766,
8548,
5691,
275,
892,
407,
15549,
14448,
10471,
285,
5223,
1242,
4715,
3210,
323,
5268,
2390,
253,
4477,
2085,
253,
10527,
1783,
273,
253,
2228,
3033,
273,
253,
34930,
10883,
534,
816,
7790,
253,
12510,
273,
253,
4081,
1332,
253,
5661,
1543,
327,
253,
1524,
10186,
10895,
671,
17813,
326,
253,
1387,
1566,
310,
247,
1175,
11193,
273,
253,
32339,
1566,
1955,
281,
841,
891,
5583,
18738,
436,
19529,
50276,
35529,
891,
513,
452,
581,
2201,
4468,
670,
253,
39975,
273,
253,
3368,
347,
8042,
562,
407,
37317,
1269,
71,
532,
625,
15302,
285,
1666,
25379,
943,
320,
8671,
50275,
44295,
436,
19529,
671,
476,
320,
5520,
1754,
327,
253,
5701,
432,
512,
253,
30628,
285,
253,
5955,
875,
30628,
285,
4477,
3524,
597,
1089,
841,
4217,
285,
1056,
436,
19529,
247,
1805,
581,
209
] |
[
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1
] |
[
30003,
310,
1677,
2278,
273,
247,
2561,
2929,
432,
260,
2369,
1793,
6698,
15,
7764,
3630,
247,
6010,
253,
2278,
15,
187,
4118,
8439,
27,
187,
2520,
2929,
23970,
247,
4460,
2746,
281,
10883,
253,
8548,
715,
465,
2390,
285,
1973,
32339,
1566,
323,
1016,
1387,
253,
2746,
25057,
4216,
17524,
1332,
323,
4715,
4096,
253,
2929,
671,
2530,
247,
1618,
273,
4715,
14493,
253,
5661,
1543,
7568,
253,
4081,
465,
3210,
476,
4711,
4715,
1543,
2810,
281,
32339,
3210,
50274,
296,
3755,
20556,
50275,
18,
253,
1895,
310,
4722,
314,
285,
14793,
352,
476,
7826,
320,
3732,
281,
1142,
4893,
50276,
19,
253,
2746,
310,
1754,
327,
4216,
10527,
1543,
495,
253,
2929,
3400,
2581,
26565,
10527,
1783,
50276,
20881,
1255,
265,
50276,
18,
253,
4028,
476,
320,
5520,
285,
1805,
10932,
374,
3480,
273,
2613,
3104,
323,
5301,
253,
5661,
1543,
403,
2581,
5075,
495,
690,
8442,
403,
1512,
1355,
281,
1239,
751,
1110,
275,
4677,
608,
50276,
9820,
5474,
33032,
2520,
2929,
2175,
253,
27549,
4715,
275,
32339,
4758,
534,
19401,
253,
22766,
3268,
2190,
8548,
253,
2929,
29328,
247,
1332,
1754,
327,
4216,
10883,
281,
3037,
32339,
3210,
323,
1016,
10883,
1387,
534,
310,
6096,
407,
512,
8548,
1561,
581,
10883,
1387,
253,
2929,
806,
3400,
253,
26647,
3033,
323,
1016,
5268,
432,
534,
253,
8654,
32339,
1566,
310,
6012,
3066,
39793,
327,
253,
7680,
432,
253,
10895,
273,
1016,
5268,
840,
247,
7680,
4216,
310,
8818,
432,
253,
7680,
2801,
285,
2074,
8548,
403,
10883,
264,
715,
581,
1387,
846,
19690,
562,
3076,
8548,
253,
32339,
1566,
323,
581,
1387,
310,
6311,
1754,
327,
253,
3388,
13461,
273,
253,
8548,
1561,
253,
1387,
253,
2929,
3400,
28055,
1783,
327,
253,
2572,
273,
253,
5170,
3033,
273,
253,
6714,
2495,
1955,
281,
253,
17040,
432,
8654,
32339,
1566,
281,
1387,
32339,
1566,
4679,
7568,
253,
1387,
32339,
1566,
556,
2074,
3045,
342,
253,
8654,
32339,
1566,
285,
816,
7790,
253,
4081,
1332,
50274,
2520,
2929,
556,
2067,
16108,
806,
253,
10527,
629,
310,
2590,
26565,
285,
3590,
352,
816,
7790,
253,
12510,
273,
253,
4081,
7792,
1273,
253,
4081,
1332,
534,
310,
1754,
327,
4216,
10883,
273,
5268,
7680,
310,
4460,
285,
4722,
281,
479,
50274,
35529,
891,
717,
417,
1077,
7615,
342,
436,
2561,
9400,
619,
2201,
4468,
310,
327,
253,
4028,
273,
436,
2929,
347,
352,
310,
4536,
1892,
281,
956,
323,
1650,
627,
403,
642,
749,
21454,
1561,
2593,
577,
776,
789,
533,
281,
619,
4685,
436,
2593,
806,
12088,
670,
253,
8654,
1566,
840,
3400,
4216,
10883,
1754,
4715,
5933,
285,
1390,
4245,
40638,
14493,
342,
2007,
13260,
3021,
891,
1158,
2593,
21,
476,
320,
9070,
715,
387,
1878,
495,
749,
21454,
253,
4477,
403,
14659,
281,
3157,
253,
9759,
273,
436,
789,
891,
452,
690,
625,
3533,
281,
1642,
253,
4477,
4496,
923,
253,
4385,
2708,
50276,
783,
2929,
1057,
417,
2319,
7364,
273,
253,
789,
352,
3400,
253,
5955,
327,
2675,
3486,
50276,
7152,
33032,
2520,
2929,
5421,
27549,
4715,
323,
22766,
8548,
281,
5115,
3367,
1320,
1223,
17816,
3733,
32339,
3210,
323,
247,
1781,
1180,
273,
8548,
253,
4477,
4081,
3082,
281,
2736,
14448,
10471,
285,
5223,
1242,
3037,
3210,
323,
5268,
2390,
253,
4477,
2085,
247,
10527,
12215,
326,
253,
3264,
2495,
273,
253,
6311,
1387,
1566,
310,
2810,
281,
253,
32339,
1566,
16774,
1543,
327,
1524,
10186,
15302,
17618,
253,
10527,
906,
4645,
1387,
1566,
310,
247,
1175,
11193,
273,
253,
32339,
1566,
4757,
253,
16038,
273,
4715,
3210,
323,
5268,
9959,
281,
6654,
875,
13782,
285,
3367,
1320,
310,
27350,
253,
767,
11333,
247,
498,
2188,
534,
10384,
23178,
414,
11903,
1320,
281,
6642,
5268,
1387,
285,
247,
498,
68,
534,
19584,
15276,
5268,
17524,
1097,
403,
5867,
342,
26647,
14493,
50276,
20881,
1255,
891,
760,
452,
6571,
5884,
7350,
1223,
253,
767,
4081,
11333,
403,
28055,
5867,
597,
403,
417,
3587,
2429,
352,
3133,
326,
342,
253,
337,
50276,
2733,
299,
4277,
6772,
3266,
318,
296,
1430,
9376,
275,
10012,
577,
247,
498,
68,
556,
247,
1805,
26647,
3033,
685,
247,
498,
2188,
672,
1478,
545,
532,
310,
1355,
285,
352,
651,
320,
4722,
281,
7277,
253,
14493,
275,
2508,
352,
310,
671,
4409,
45190,
5175,
247,
498,
2188,
275,
4677,
608,
923,
14855,
5474,
33032,
2520,
2929,
2175,
253,
27549,
4715,
1895,
275,
534,
1027,
8548,
812,
452,
1327,
74,
301,
941,
2429,
342,
5368,
4156,
390,
32339,
3082,
436,
2929,
29328,
281,
2736,
14448,
10471,
281,
5115,
247,
1805,
14448,
253,
4477,
7473,
907,
436,
1895,
347,
10499,
253,
14259,
875,
8548,
285,
4560,
271,
8654,
10883,
273,
253,
14448,
10208,
12072,
2990,
253,
4477,
1918,
2710,
10527,
1783,
285,
2085,
5661,
1543,
281,
12654,
253,
12510,
273,
253,
4081,
1332,
50275,
296,
3755,
20556,
337,
2520,
2929,
19401,
247,
11132,
1895,
253,
14448,
2190,
2709,
8548,
534,
310,
1077,
4217,
285,
1534,
275,
6612,
50276,
19,
2520,
2929,
7194,
16633,
327,
10527,
1783,
1690,
253,
14259,
6814,
875,
8548,
253,
13757,
273,
253,
10883,
285,
253,
2228,
3033,
273,
253,
34930,
10883,
50275,
20881,
1255,
50276,
24013,
7639,
337,
253,
16038,
275,
436,
2929,
310,
281,
1089,
247,
10883,
323,
253,
14448,
2990,
352,
310,
5272,
285,
5919,
2299,
690,
2905,
789,
285,
625,
11985,
403,
5816,
323,
1650,
1249,
50274,
32525,
337,
8826,
10527,
1543,
516,
1077,
7615,
342,
24088,
10012,
337,
285,
10012,
374,
516,
9202,
597,
403,
3240,
2074,
281,
5368,
10527,
789,
275,
5028,
15644,
285,
5028,
26647,
323,
253,
643,
10527,
11815,
253,
4477,
671,
3133,
281,
3730,
281,
5368,
789,
326,
1146,
753,
697,
417,
247,
1943,
1895,
625,
11985,
670,
253,
3064,
875,
253,
10527,
1783,
285,
5368,
789,
403,
3309,
50275,
16217,
3825,
253,
4679,
275,
436,
2929,
403,
10369,
12497,
337,
783,
2934,
275,
436,
2929,
310,
17524,
2074,
390,
9371,
8548,
625,
1666,
25379,
403,
3309,
323,
1650,
374,
374,
783,
4679,
2770,
327,
767,
2460,
15302,
625,
643,
941,
403,
3309,
50275,
13552,
1974,
436,
2929,
19401,
253,
1895,
273,
27549,
4715,
253,
4081,
1332,
3198,
281,
2557,
253,
14259,
875,
8548,
534,
812,
2847,
11068,
23753,
275,
247,
10208,
12072,
4715,
4758,
812,
253,
4477,
2319,
625,
670,
253,
11068,
7350,
50276,
18,
36707,
256,
632,
606,
480,
3199,
259,
1162,
355,
4715,
281,
42124,
75,
549,
32693,
638,
3845,
549,
32693,
16899,
1438,
2787,
1731,
43425,
50276,
19,
32798,
6934,
247,
448,
1947,
480,
340,
249,
277,
1162,
355,
271,
5919,
7792,
323,
29102,
10208,
12072,
4715,
75,
16424,
275,
11454,
1491,
5162,
2718,
9169,
5922,
23627,
2691,
746,
34651,
253,
4477,
1750,
326,
597,
6266,
253,
7364,
533,
891,
858,
417,
1119,
2490,
187,
4118,
18435,
27,
249,
436,
19529,
253,
4477,
12661,
247,
4460,
1332,
281,
6016,
253,
22766,
8548,
5691,
275,
892,
407,
15549,
14448,
10471,
285,
5223,
1242,
4715,
3210,
323,
5268,
2390,
253,
4477,
2085,
253,
10527,
1783,
273,
253,
2228,
3033,
273,
253,
34930,
10883,
534,
816,
7790,
253,
12510,
273,
253,
4081,
1332,
253,
5661,
1543,
327,
253,
1524,
10186,
10895,
671,
17813,
326,
253,
1387,
1566,
310,
247,
1175,
11193,
273,
253,
32339,
1566,
1955,
281,
841,
891,
5583,
18738,
436,
19529,
50276,
35529,
891,
513,
452,
581,
2201,
4468,
670,
253,
39975,
273,
253,
3368,
347,
8042,
562,
407,
37317,
1269,
71,
532,
625,
15302,
285,
1666,
25379,
943,
320,
8671,
50275,
44295,
436,
19529,
671,
476,
320,
5520,
1754,
327,
253,
5701,
432,
512,
253,
30628,
285,
253,
5955,
875,
30628,
285,
4477,
3524,
597,
1089,
841,
4217,
285,
1056,
436,
19529,
247,
1805,
581,
209
] |
Below is given review of a research paper from cnoference journal. Please write a summary the review.
### Review:
the authors propose a method to fuse two sources of information for bev detection namely multiview images and lidar data in such a manner that any data defects in one source of information do not affect the network of the other method previous methods have combined the information from the two sources at different stages of the network pipeline but they are prone to getting affected in the inference results when the data is corrupted from either source this paper delays the combination of information to an even later part of the pipeline thereby mitigating the effect of badcorruptedunavailable data they do so by generating a pseudo bev point cloud just from multiview cameras and combining that information with lidar bev the combination part is based on a dynamic fusion method which selects important fused features be it from camera based bev or lidar based bev the results are shown where missing data in the lidar or camera image do not affect the detection unless both are missing for the same object in the scene eg car strength the paper addresses a problem which could be a real life problem in lidar or image based data capture where the lidar data is missing due to 3d scene materialreflectance properties or the image could be missing from a video stream these problems can cause existing networks to fail this paper addresses this problem which can make the commercial deployment of such systems more doable the paper is well written with clear explanation of the previous work the results are detailed and show scenarios where they are better than previous best results in challenging situations weakness 1 the dynamic fusion module should be explained in more detail as its one of the contributions of this paper it should be explained with a scenario where data is missing from either of the streams and how the formulation in eq1 and eq2 will still be able to select the bev information which exists to feed the final detection result 2 citations 4445 3233 5758 are repeated citations please fix it 4 grammar 92 startstarted 3 discoverdiscovered the reviewer didnt find any major limitations the authors could discuss and show some failure cases of their work
the model design the paper is the first to identify and evaluate the problem that most existing methods do not consider situations where one or both sources are unavailable and proposes a pipeline customized for this situation the proposed method is evaluated in the standard settings of object detection and compared with baseline methods both qualitatively and quantitatively strength 1 the task identification as mentioned above the paper is the first to identify the issue within the current literature and models and proposes a pipeline accordingly which reasons from two sources independently and thus more robust when data unavailability occurs in this sense the task identification itself is valuable to the community in defining and bringing attention to the task 2 extensive design choices and evaluation although the proposed pipeline is mostly based on existing methods the paper is able to evaluate various design choices to demonstrate the flexibility of the proposed framework as well as provide extensive evaluation into the results yields sota results with both sources and robust result when only one is available weakness 1 novelty and model design the paper is novel in identifying the problem which is legit and valuable however for the proposed method itself it is mostly a combination of existing methods utilization single sources without much modification thus diminishing the merit of the proposed framework also the design to handle one or two sources in the framework is naive basically running the first stage network only if only one source is available and running both stages when two are available a more sophisticated design could be when for example camera stream is dropped for a few frames is there a chance to stick to the fusion detector but utilizing temporal information to compensate the missing rgb data instead of simply drop the rgb branch and the fusion running the lidar branch alone which will likely result in a sudden drastic change to the detections 2 simulation for data corruption the paper proposes to augment the data to simulate possible data corruption scenarios via dropping points and limiting fov however more effort can be done to boost the robustness eg looking for real driving sequences in extreme weather or with bad data and trainevaluate on those data na docsepmost existing cameralidar fusion work decorates lidar points with image features and then performs detection in 3dbev space this work leverages recent liftsplatshoot work for cameras which allows one to map both camera and lidar inputs to bev space before fusing and applying the detection head strengths the proposed idea and its realization makes sense and i am not aware of such published work even though there seems to be concurrent similar work since this seems a logical next step given the existence of lss 52 details in the model seem well thought out this include the extensions to lss dualswintiny architecture adp as well as the layers in the dynamic fusion module the experimental results show that this work is close to sota on nuscenes and that it affords significant model robustness in the case of lidar information missing compared to existing methods the model details are pretty clearly explained weaknesses related work section is confusing in a few places and can be streamlined further examples 1 the camera detectors section contains a discussion of pointpillars which is a purely lidar method 2 range images are not really euclidean space see line 88 3 89 recently people start to exploit these two 
feature modalities to increase the representation power there is earlier work to do this if i understand correctly the statement eg 5 from the paper or endtoend multiview fusion for 3d object detection in lidar point clouds by yin zhou et al corl 2019 4 90 another line of work is to exploit the benefit of the birds eye view plane similar to the camera perception a lot of this work came before camera started exploiting the bev view intuitive explanations are lacking in a couple of instances 1 work does not explain the intuition why the model needs to be trained in two stages what happens if its trained in a single stage 2 14 note that we do not conduct data augmentation when multiview image input is involved while data augmentation plays a critical part in other cutting edge methods is this a limitation of camera fusion methods in general or something specifically lacking in your case can you please clarify it is unclear whether the approach is sota on nuscenes or not can you please explicitly contrast your performance relative to the nuscenes leaderboard at least for the published approaches when exploring that leaderboard myself i see mentions of a method called bevfusion that is sota but seems to be a different method assuming that method is different and already on the leaderboard your naming may be confusing too generic nuscenes is a dataset with particularly poor lidar compared to other public datasets such as waymo open dataset argoverse20 etc results on at least one more dataset with high quality and longerrange lidar are highly desirable the core issue of missing lidar points may be a lot less pertinent for more modern lidars also as range increases beyond 40m to 70200m the approach here may actually underperform lidarpainting approaches since bev view can start containing errors 10m in the camera case making fusion in bev space difficult to this effect analysis of the method performance as a function of object distance relative to sota fusion methods for long distances will help language there are minor language issues and typos in the paper it would benefit from another proofreading pass see comment on weaknesses some core potential limitations of the existing method have not been fully explored my current rating is predicated on the assumption that a similar idea has not been published yet not completely certain and that i will receive reasonable responses to my questions docseptowards the problem of current methods tend to fail at situations where hardware malfunctions this paper presents a simple yet effective lidarcamera fusion framework namely bevfusion by disentangling camera pipeline from lidar network and using a dynamic fusion module bevfusion achieves sota performance and shows robustness against lidar or camera malfunction at the same time an effective modification on the camera pipeline is also proposed to boost the final performance strength 1 the paper is well written and easy to read 2 robustness of autonomous driving algorithms should be paid more attention to this paper raises the issue and makes the attempt to addressing it 3 thorough experiments are performed claims are wellsupported sota performance is achieved on both normal and robust settings of nuscenes 4 the clean design of the framework makes it easy to use any camera or lidar framework weaknesses 1 it is nice to see a simple yet effective module dynamic fusion module being proposed but it would be nicer to provide some insights and analysis into the design itself for example by analyzing how would the 
fusion module work when facing incomplete lidar or camera inputs we might gain some insights into the module design of csf and afs 2 the experiments section does not provide runtime analysis like inference time and memory footprint and its comparison with other methods the potential negative social impact is well discussed but the limitations should be discussed more for example would the latefusion style miss the opportunity to fuse intermediate lidar and camera features and thus make the pipeline suffer a potential performance drop
### Summary:
|
the paper proposes a method to fuse two sources of information for birds eye view bev detection namely multiview images and lidar data in a way that any data defects in one source of information do not affect the other most existing cameralidar fusion works decorate lidar points with image features and then perform detection in 3dbev space this work leverages recent liftsplatshoot work for cameras which allows one to map both camera and lidar inputs to bev space before fusing and applying the detection head the reviewers appreciate the identification of the problem of present fusion methods that are susceptible to damage in one of the two sources of information the simplicity of the method and its good empirical performance they raise concerns regarding its novelty given the obvious choices of the present method the rebuttal submitted by the authors presents more empirical results and ablations most reviewers appreciate the contribution of the paper and the paper is suggested for publication
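to illustrate the fusion step the reviews describe (concatenating camera-bev and lidar-bev feature maps, then selecting channels with global average pooling and a 1x1 convolution), here is a minimal pytorch sketch of a fusion block of that general kind; the layer sizes, names, and exact composition are assumptions for illustration and not the authors' dynamic fusion module

```python
# hypothetical fusion block sketch; channel counts and layer choices are assumed
import torch
import torch.nn as nn

class DynamicFusionSketch(nn.Module):
    def __init__(self, cam_ch, lidar_ch, out_ch):
        super().__init__()
        self.reduce = nn.Sequential(            # fuse the concatenated bev streams
            nn.Conv2d(cam_ch + lidar_ch, out_ch, kernel_size=3, padding=1),
            nn.BatchNorm2d(out_ch),
            nn.ReLU(inplace=True),
        )
        self.gate = nn.Sequential(              # adaptive per-channel selection
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(out_ch, out_ch, kernel_size=1),
            nn.Sigmoid(),
        )

    def forward(self, cam_bev, lidar_bev):
        # both inputs are (batch, channels, height, width) maps on the shared bev grid
        fused = self.reduce(torch.cat([cam_bev, lidar_bev], dim=1))
        return fused * self.gate(fused)         # reweight channels before the detection head
```

the intent of a gate like this, under the assumptions above, is that when one stream carries little usable information (eg corrupted lidar or a dropped camera frame) its channels can be down-weighted while the other stream's channels still feed the detection head, which is the robustness behavior the reviews ask to see analyzed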
|
[
2929,
12453,
247,
1895,
534,
812,
320,
247,
1524,
1495,
1895,
275,
16486,
274,
390,
2460,
1754,
941,
9232,
835,
253,
16486,
274,
941,
310,
5816,
1955,
281,
495,
69,
6200,
2144,
22697,
593,
3607,
390,
253,
2460,
812,
320,
5816,
432,
247,
3492,
5542,
841,
3237,
476,
2847,
5368,
6928,
281,
1891,
436,
2929,
12453,
436,
1895,
534,
476,
1056,
253,
6264,
19007,
273,
824,
2718,
625,
513,
494,
253,
2929,
310,
973,
3542,
342,
2590,
8813,
273,
253,
2045,
789,
253,
1543,
403,
7000,
285,
921,
15216,
835,
597,
403,
1805,
685,
2045,
1682,
1543,
275,
11132,
9534,
50276,
20881,
1255,
337,
253,
7870,
11781,
6333,
943,
320,
5544,
275,
625,
2508,
347,
697,
581,
273,
253,
9021,
273,
436,
2929,
352,
943,
320,
5544,
342,
247,
10076,
835,
941,
310,
5816,
432,
2057,
273,
253,
17795,
285,
849,
253,
15895,
275,
16186,
18,
285,
16186,
19,
588,
1335,
320,
2104,
281,
3609,
253,
320,
87,
1491,
534,
4961,
281,
3997,
253,
2457,
5481,
906,
374,
30404,
7127,
1857,
495,
22187,
608,
33607,
403,
6015,
30404,
4496,
4993,
352,
577,
28146,
11266,
1265,
40324,
495,
9413,
12722,
3111,
253,
37317,
42126,
1089,
667,
2201,
7364,
253,
4477,
812,
2319,
285,
921,
690,
4433,
2219,
273,
616,
789,
5474,
33032,
2520,
2929,
5611,
247,
1332,
323,
1127,
9005,
1789,
5481,
1754,
327,
16486,
274,
6568,
11781,
50276,
783,
2022,
7680,
273,
436,
1332,
310,
253,
11781,
7792,
326,
24772,
253,
6568,
285,
16486,
274,
5542,
436,
11781,
6333,
310,
1077,
2969,
984,
352,
7194,
8414,
273,
253,
32147,
318,
273,
16486,
274,
285,
6568,
17795,
285,
247,
6867,
4735,
5438,
342,
271,
3388,
45900,
285,
337,
89,
18,
27311,
253,
1543,
921,
326,
436,
1332,
5777,
41731,
10574,
253,
643,
1332,
323,
253,
5301,
25761,
253,
31640,
1411,
6568,
390,
16486,
274,
37540,
328,
960,
310,
2011,
275,
253,
1543,
253,
28913,
1263,
2722,
326,
1016,
6333,
7091,
275,
436,
1332,
19132,
3045,
50275,
45563,
50276,
783,
3045,
310,
5777,
5520,
50275,
783,
16182,
310,
1077,
2969,
50275,
20881,
1255,
50276,
15603,
272,
253,
1355,
3045,
7756,
285,
253,
2969,
16182,
891,
651,
1158,
326,
253,
7680,
273,
436,
1332,
310,
4942,
3710,
50276,
74,
651,
1804,
4645,
690,
4433,
2219,
285,
11985,
670,
731,
984,
352,
588,
8162,
281,
253,
3114,
50276,
7152,
339,
431,
248,
2929,
29328,
247,
7792,
323,
495,
69,
5481,
432,
46206,
285,
16486,
274,
14800,
275,
26279,
6276,
13451,
253,
15722,
3797,
4858,
6928,
14720,
432,
46206,
285,
16486,
274,
14800,
10939,
285,
4648,
247,
11781,
2990,
323,
22407,
5481,
672,
1097,
4973,
403,
2130,
671,
253,
2929,
19401,
9534,
273,
941,
16933,
285,
4081,
281,
9510,
253,
31640,
275,
253,
1566,
2216,
253,
2929,
310,
253,
806,
281,
4271,
285,
7472,
253,
1895,
326,
954,
5368,
3082,
513,
417,
1908,
9534,
835,
581,
390,
1097,
4973,
403,
29356,
285,
29328,
247,
15722,
32176,
323,
436,
4112,
253,
4081,
1332,
310,
6760,
275,
253,
2629,
7533,
273,
1789,
5481,
285,
2429,
342,
8245,
3082,
1097,
36143,
285,
36878,
4757,
50276,
18,
253,
4836,
8137,
347,
5393,
1840,
253,
2929,
310,
253,
806,
281,
4271,
253,
2523,
1561,
253,
1655,
6239,
285,
3210,
285,
29328,
247,
15722,
15672,
534,
4606,
432,
767,
4973,
10939,
285,
3021,
625,
10237,
672,
941,
440,
32517,
6634,
275,
436,
3282,
253,
4836,
8137,
3139,
310,
9865,
281,
253,
3114,
275,
13947,
285,
9745,
4116,
281,
253,
4836,
50276,
19,
9470,
2216,
10165,
285,
7103,
3738,
253,
4081,
15722,
310,
6571,
1754,
327,
5368,
3082,
253,
2929,
310,
2104,
281,
7472,
2710,
2216,
10165,
281,
7568,
253,
15840,
273,
253,
4081,
7792,
347,
973,
347,
2085,
9470,
7103,
715,
253,
1543,
11026,
256,
5503,
1543,
342,
1097,
4973,
285,
10237,
906,
672,
760,
581,
310,
2130,
50276,
20881,
1255,
50276,
18,
38135,
285,
1566,
2216,
253,
2929,
310,
4460,
275,
12488,
253,
1895,
534,
310,
10972,
285,
9865,
2299,
323,
253,
4081,
1332,
3139,
352,
310,
6571,
247,
5019,
273,
5368,
3082,
19575,
2014,
4973,
1293,
1199,
11237,
3021,
48245,
253,
15785,
273,
253,
4081,
7792,
671,
253,
2216,
281,
6016,
581,
390,
767,
4973,
275,
253,
7792,
310,
27785,
10323,
3515,
253,
806,
3924,
2990,
760,
604,
760,
581,
2603,
310,
2130,
285,
3515,
1097,
8661,
672,
767,
403,
2130,
247,
625,
18144,
2216,
812,
320,
672,
323,
1650,
6568,
5542,
310,
8231,
323,
247,
1643,
13009,
310,
627,
247,
4839,
281,
7356,
281,
253,
11781,
13562,
533,
17617,
11935,
1491,
281,
23514,
253,
5816,
46206,
941,
3185,
273,
3365,
5926,
253,
46206,
7789,
285,
253,
11781,
3515,
253,
16486,
274,
7789,
3815,
534,
588,
2779,
906,
275,
247,
5982,
36927,
1818,
281,
253,
843,
20713,
50276,
19,
9864,
323,
941,
16933,
253,
2929,
29328,
281,
35919,
253,
941,
281,
26065,
1896,
941,
16933,
15216,
3066,
18752,
2792,
285,
14155,
269,
729,
2299,
625,
3434,
476,
320,
2218,
281,
9510,
253,
31640,
24088,
2819,
323,
1524,
6276,
6430,
275,
9559,
8588,
390,
342,
3076,
941,
285,
1140,
460,
1208,
6340,
327,
1110,
941,
5549,
5474,
33032,
2252,
5368,
4049,
1560,
301,
274,
11781,
789,
11482,
684,
16486,
274,
2792,
342,
2460,
3386,
285,
840,
17923,
5481,
275,
495,
5470,
1173,
2317,
436,
789,
19732,
1131,
3332,
35408,
446,
255,
40719,
789,
323,
14693,
534,
4483,
581,
281,
3711,
1097,
6568,
285,
16486,
274,
14800,
281,
320,
87,
2317,
1078,
269,
5302,
285,
9433,
253,
5481,
1481,
50276,
296,
3755,
20556,
50275,
783,
4081,
2934,
285,
697,
22786,
2789,
3282,
285,
891,
717,
417,
6600,
273,
824,
3863,
789,
1014,
2167,
627,
3133,
281,
320,
17336,
2074,
789,
1580,
436,
3133,
247,
13760,
1735,
3213,
1677,
253,
6242,
273,
298,
859,
8073,
50275,
23454,
275,
253,
1566,
1646,
973,
1869,
562,
436,
2486,
253,
18149,
281,
298,
859,
8746,
2140,
565,
5104,
10336,
519,
81,
347,
973,
347,
253,
8090,
275,
253,
7870,
11781,
6333,
50275,
783,
5661,
1543,
921,
326,
436,
789,
310,
2810,
281,
256,
5503,
327,
295,
19387,
24453,
285,
326,
352,
2438,
6565,
1534,
1566,
31640,
275,
253,
1083,
273,
16486,
274,
1491,
5816,
2429,
281,
5368,
3082,
50275,
783,
1566,
4278,
403,
3965,
4518,
5544,
50275,
20881,
1255,
265,
50275,
4919,
789,
2593,
310,
21643,
275,
247,
1643,
5053,
285,
476,
320,
5542,
12490,
2007,
6667,
50276,
18,
253,
6568,
25421,
2593,
4428,
247,
5955,
273,
1127,
81,
408,
1032,
534,
310,
247,
15846,
16486,
274,
1332,
50276,
19,
2491,
3888,
403,
417,
1663,
299,
26365,
2317,
923,
1386,
11003,
50276,
20,
11289,
4102,
952,
1265,
281,
22059,
841,
767,
4735,
33433,
281,
2572,
253,
6779,
1612,
50276,
9088,
310,
4321,
789,
281,
513,
436,
604,
891,
2096,
9113,
253,
3908,
24088,
608,
432,
253,
2929,
390,
990,
936,
423,
1554,
400,
827,
11781,
323,
495,
69,
1789,
5481,
275,
16486,
274,
1127,
16173,
407,
340,
249,
1182,
14451,
1162,
355,
944,
77,
6247,
50276,
21,
5091,
50276,
23955,
1386,
273,
789,
310,
281,
22059,
253,
5649,
273,
253,
11260,
5130,
1859,
6415,
2074,
281,
253,
6568,
13071,
50276,
66,
2257,
273,
436,
789,
2210,
1078,
6568,
3053,
38883,
253,
320,
87,
1859,
50274,
565,
48714,
22909,
403,
14999,
275,
247,
4564,
273,
10872,
50276,
18,
789,
1057,
417,
5513,
253,
30328,
2139,
253,
1566,
3198,
281,
320,
10166,
275,
767,
8661,
752,
6569,
604,
697,
10166,
275,
247,
2014,
3924,
50276,
19,
1638,
3877,
326,
359,
513,
417,
2589,
941,
42072,
672,
1554,
400,
827,
2460,
3280,
310,
3206,
1223,
941,
42072,
7120,
247,
4619,
629,
275,
643,
9968,
5024,
3082,
310,
436,
247,
12291,
273,
6568,
11781,
3082,
275,
2087,
390,
1633,
5742,
14999,
275,
634,
1083,
476,
368,
4496,
19148,
50274,
262,
310,
12744,
1880,
253,
2746,
310,
256,
5503,
327,
295,
19387,
24453,
390,
417,
476,
368,
4496,
11120,
4499,
634,
3045,
4103,
281,
253,
295,
19387,
24453,
6657,
4697,
387,
1878,
323,
253,
3863,
7274,
672,
18216,
326,
6657,
4697,
4266,
891,
923,
25957,
273,
247,
1332,
1925,
320,
87,
12213,
326,
310,
256,
5503,
533,
3133,
281,
320,
247,
1027,
1332,
7384,
326,
1332,
310,
1027,
285,
2168,
327,
253,
6657,
4697,
634,
26086,
778,
320,
21643,
50276,
15627,
12314,
50274,
79,
19387,
24453,
310,
247,
10895,
342,
3782,
4105,
16486,
274,
2429,
281,
643,
1345,
15302,
824,
347,
1039,
6972,
1527,
10895,
1736,
1189,
339,
938,
3966,
1543,
327,
387,
1878,
581,
625,
10895,
342,
1029,
3290,
285,
1048,
1000,
912,
16486,
274,
403,
4122,
11408,
253,
5161,
2523,
273,
5816,
16486,
274,
2792,
778,
320,
247,
2257,
1679,
21452,
323,
625,
4980,
16486,
1032,
671,
347,
2491,
5459,
4457,
3387,
78,
281,
5571,
1518,
78,
253,
2746,
1060,
778,
2686,
762,
32231,
16486,
5916,
404,
1076,
7274,
1580,
320,
87,
1859,
476,
1265,
4508,
6332,
50276,
740,
78,
275,
253,
6568,
1083,
2403,
11781,
275,
320,
87,
2317,
2834,
281,
436,
1055,
1783,
273,
253,
1332,
3045,
347,
247,
1159,
273,
1789,
4181,
4103,
281,
256,
5503,
11781,
3082,
323,
1048,
13849,
588,
1361,
50274,
12982,
50276,
9088,
403,
5884,
3448,
3374,
285,
963,
993,
275,
253,
2929,
352,
651,
5649,
432,
1529,
4737,
24042,
1509,
50275,
2887,
4385,
327,
32213,
690,
5161,
2442,
7364,
273,
253,
5368,
1332,
452,
417,
644,
4751,
14859,
50275,
2577,
1655,
13716,
310,
49902,
327,
253,
9376,
326,
247,
2074,
2934,
556,
417,
644,
3863,
2568,
417,
4336,
2176,
285,
326,
891,
588,
4763,
5272,
6128,
281,
619,
3533,
50276,
7152,
339,
431,
319,
2196,
253,
1895,
273,
1655,
3082,
5257,
281,
1891,
387,
9534,
835,
10309,
37540,
328,
960,
436,
2929,
10262,
247,
2969,
2568,
3576,
16486,
3178,
312,
3525,
11781,
7792,
10775,
320,
87,
12213,
407,
557,
290,
36874,
6568,
15722,
432,
16486,
274,
2990,
285,
970,
247,
7870,
11781,
6333,
320,
87,
12213,
33526,
256,
5503,
3045,
285,
2722,
31640,
1411,
16486,
274,
390,
6568,
43280,
387,
253,
1072,
673,
271,
3576,
11237,
327,
253,
6568,
15722,
310,
671,
4081,
281,
9510,
253,
2457,
3045,
50276,
45563,
337,
253,
2929,
310,
973,
3542,
285,
3477,
281,
1239,
374,
31640,
273,
26279,
6276,
11333,
943,
320,
5087,
625,
4116,
281,
436,
2929,
16540,
253,
2523,
285,
2789,
253,
3177,
281,
15974,
352,
495,
11080,
4679,
403,
2684,
3916,
403,
973,
19391,
256,
5503,
3045,
310,
6786,
327,
1097,
2622,
285,
10237,
7533,
273,
295,
19387,
24453,
577,
253,
4076,
2216,
273,
253,
7792,
2789,
352,
3477,
281,
897,
667,
6568,
390,
16486,
274,
7792,
50274,
20881,
1255,
265,
337,
352,
310,
5322,
281,
923,
247,
2969,
2568,
3576,
6333,
7870,
11781,
6333,
1146,
4081,
533,
352,
651,
320,
49482,
281,
2085,
690,
16039,
285,
1783,
715,
253,
2216,
3139,
323,
1650,
407,
18918,
849,
651,
253,
11781,
6333,
789,
672,
10268,
18464,
16486,
274,
390,
6568,
14800,
359,
1537,
6351,
690,
16039,
715,
253,
6333,
2216,
273,
260,
6091,
285,
247,
3671,
374,
253,
4679,
2593,
1057,
417,
2085,
20243,
1783,
751,
17032,
673,
285,
3541,
33257,
285,
697,
5301,
342,
643,
3082,
253,
2442,
4016,
2675,
3486,
310,
973,
5469,
533,
253,
12291,
943,
320,
5469,
625,
323,
1650,
651,
253,
3563,
12213,
3740,
38771,
253,
5107,
281,
34824,
10444,
16486,
274,
285,
6568,
3386,
285,
3021,
2789,
253,
15722,
11089,
2442,
3045,
5926,
2490,
187,
4118,
18435,
27,
783,
2929,
29328,
247,
1332,
281,
34824,
767,
4973,
273,
1491,
323,
11260,
5130,
1859,
320,
87,
5481,
10775,
1554,
400,
827,
3888,
285,
16486,
274,
941,
275,
247,
1039,
326,
667,
941,
12834,
275,
581,
2603,
273,
1491,
1057,
417,
2818,
253,
643,
954,
5368,
4049,
1560,
301,
274,
11781,
2987,
11482,
366,
16486,
274,
2792,
342,
2460,
3386,
285,
840,
1347,
5481,
275,
495,
5470,
1173,
2317,
436,
789,
19732,
1131,
3332,
35408,
446,
255,
40719,
789,
323,
14693,
534,
4483,
581,
281,
3711,
1097,
6568,
285,
16486,
274,
14800,
281,
320,
87,
2317,
1078,
269,
5302,
285,
9433,
253,
5481,
1481,
253,
30628,
11435,
253,
8137,
273,
253,
1895,
273,
1246,
11781,
3082,
326,
403,
16931,
281,
4723,
275,
581,
273,
253,
767,
4973,
273,
1491,
253,
17647,
273,
253,
1332,
285,
697,
1175,
16774,
3045,
597,
7164,
7350,
5001,
697,
38135,
1677,
50276,
783,
4755,
10165,
273,
253,
1246,
1332,
253,
30080,
22559,
9262,
407,
253,
4477,
10262,
625,
16774,
1543,
285,
490,
77,
569,
954,
30628,
11435,
253,
7680,
273,
253,
2929,
285,
50276,
783,
2929,
310,
5125,
323,
9311,
209
] |
[
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1
] |
[
2929,
12453,
247,
1895,
534,
812,
320,
247,
1524,
1495,
1895,
275,
16486,
274,
390,
2460,
1754,
941,
9232,
835,
253,
16486,
274,
941,
310,
5816,
1955,
281,
495,
69,
6200,
2144,
22697,
593,
3607,
390,
253,
2460,
812,
320,
5816,
432,
247,
3492,
5542,
841,
3237,
476,
2847,
5368,
6928,
281,
1891,
436,
2929,
12453,
436,
1895,
534,
476,
1056,
253,
6264,
19007,
273,
824,
2718,
625,
513,
494,
253,
2929,
310,
973,
3542,
342,
2590,
8813,
273,
253,
2045,
789,
253,
1543,
403,
7000,
285,
921,
15216,
835,
597,
403,
1805,
685,
2045,
1682,
1543,
275,
11132,
9534,
50276,
20881,
1255,
337,
253,
7870,
11781,
6333,
943,
320,
5544,
275,
625,
2508,
347,
697,
581,
273,
253,
9021,
273,
436,
2929,
352,
943,
320,
5544,
342,
247,
10076,
835,
941,
310,
5816,
432,
2057,
273,
253,
17795,
285,
849,
253,
15895,
275,
16186,
18,
285,
16186,
19,
588,
1335,
320,
2104,
281,
3609,
253,
320,
87,
1491,
534,
4961,
281,
3997,
253,
2457,
5481,
906,
374,
30404,
7127,
1857,
495,
22187,
608,
33607,
403,
6015,
30404,
4496,
4993,
352,
577,
28146,
11266,
1265,
40324,
495,
9413,
12722,
3111,
253,
37317,
42126,
1089,
667,
2201,
7364,
253,
4477,
812,
2319,
285,
921,
690,
4433,
2219,
273,
616,
789,
5474,
33032,
2520,
2929,
5611,
247,
1332,
323,
1127,
9005,
1789,
5481,
1754,
327,
16486,
274,
6568,
11781,
50276,
783,
2022,
7680,
273,
436,
1332,
310,
253,
11781,
7792,
326,
24772,
253,
6568,
285,
16486,
274,
5542,
436,
11781,
6333,
310,
1077,
2969,
984,
352,
7194,
8414,
273,
253,
32147,
318,
273,
16486,
274,
285,
6568,
17795,
285,
247,
6867,
4735,
5438,
342,
271,
3388,
45900,
285,
337,
89,
18,
27311,
253,
1543,
921,
326,
436,
1332,
5777,
41731,
10574,
253,
643,
1332,
323,
253,
5301,
25761,
253,
31640,
1411,
6568,
390,
16486,
274,
37540,
328,
960,
310,
2011,
275,
253,
1543,
253,
28913,
1263,
2722,
326,
1016,
6333,
7091,
275,
436,
1332,
19132,
3045,
50275,
45563,
50276,
783,
3045,
310,
5777,
5520,
50275,
783,
16182,
310,
1077,
2969,
50275,
20881,
1255,
50276,
15603,
272,
253,
1355,
3045,
7756,
285,
253,
2969,
16182,
891,
651,
1158,
326,
253,
7680,
273,
436,
1332,
310,
4942,
3710,
50276,
74,
651,
1804,
4645,
690,
4433,
2219,
285,
11985,
670,
731,
984,
352,
588,
8162,
281,
253,
3114,
50276,
7152,
339,
431,
248,
2929,
29328,
247,
7792,
323,
495,
69,
5481,
432,
46206,
285,
16486,
274,
14800,
275,
26279,
6276,
13451,
253,
15722,
3797,
4858,
6928,
14720,
432,
46206,
285,
16486,
274,
14800,
10939,
285,
4648,
247,
11781,
2990,
323,
22407,
5481,
672,
1097,
4973,
403,
2130,
671,
253,
2929,
19401,
9534,
273,
941,
16933,
285,
4081,
281,
9510,
253,
31640,
275,
253,
1566,
2216,
253,
2929,
310,
253,
806,
281,
4271,
285,
7472,
253,
1895,
326,
954,
5368,
3082,
513,
417,
1908,
9534,
835,
581,
390,
1097,
4973,
403,
29356,
285,
29328,
247,
15722,
32176,
323,
436,
4112,
253,
4081,
1332,
310,
6760,
275,
253,
2629,
7533,
273,
1789,
5481,
285,
2429,
342,
8245,
3082,
1097,
36143,
285,
36878,
4757,
50276,
18,
253,
4836,
8137,
347,
5393,
1840,
253,
2929,
310,
253,
806,
281,
4271,
253,
2523,
1561,
253,
1655,
6239,
285,
3210,
285,
29328,
247,
15722,
15672,
534,
4606,
432,
767,
4973,
10939,
285,
3021,
625,
10237,
672,
941,
440,
32517,
6634,
275,
436,
3282,
253,
4836,
8137,
3139,
310,
9865,
281,
253,
3114,
275,
13947,
285,
9745,
4116,
281,
253,
4836,
50276,
19,
9470,
2216,
10165,
285,
7103,
3738,
253,
4081,
15722,
310,
6571,
1754,
327,
5368,
3082,
253,
2929,
310,
2104,
281,
7472,
2710,
2216,
10165,
281,
7568,
253,
15840,
273,
253,
4081,
7792,
347,
973,
347,
2085,
9470,
7103,
715,
253,
1543,
11026,
256,
5503,
1543,
342,
1097,
4973,
285,
10237,
906,
672,
760,
581,
310,
2130,
50276,
20881,
1255,
50276,
18,
38135,
285,
1566,
2216,
253,
2929,
310,
4460,
275,
12488,
253,
1895,
534,
310,
10972,
285,
9865,
2299,
323,
253,
4081,
1332,
3139,
352,
310,
6571,
247,
5019,
273,
5368,
3082,
19575,
2014,
4973,
1293,
1199,
11237,
3021,
48245,
253,
15785,
273,
253,
4081,
7792,
671,
253,
2216,
281,
6016,
581,
390,
767,
4973,
275,
253,
7792,
310,
27785,
10323,
3515,
253,
806,
3924,
2990,
760,
604,
760,
581,
2603,
310,
2130,
285,
3515,
1097,
8661,
672,
767,
403,
2130,
247,
625,
18144,
2216,
812,
320,
672,
323,
1650,
6568,
5542,
310,
8231,
323,
247,
1643,
13009,
310,
627,
247,
…remainder of the preceding record's token-id column omitted (a long run of integer token ids, apparently the labels column of that record)… ] |
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper proposes a method for goalconditioned rl that combines three ideas from prior work bootstrap dqn hindsight relabeling and prioritized experience relay experiments show that the proposed method outperforms variants of hindsight relabeling significance because the underlying ideas bootstrap dqn hindsight relabeling and prioritized experience relay are well known the main criterion for evaluation is the empirical results the results are quite strong however the complexity of the method and its sensitivity to hyperparameters make me a bit skeptical whether the paper will make a lasting impact on the field correctness i had two minor comments about the correctness of the paper much of the discussion of the paper discussed how hindsight relabeling helps with exploration i dont think this is correct exploration is a problem of data collection while hindsight relabeling is a tool for performing updates using previouslycollected data hindsight relabeling doesnt directly say anything about how the data should be collected i didnt see bootstrap dqn as a baseline as prior work has found that using multiple q functions significantly improves performance eg 1 4 i think this is an important baseline to add originality the proposed method is a new combination of old ingredients clarity the paper is mostly clear i found the discussion of prioritization hard to follow for example i didnt understand what higher variance first means in fig 5 the related work section lacks structure id recommend organizing the related work thematically and making sure that the commonality is clear minor comments nonnegative reward nonzero reward the agent could be getting negative dense rewards often fail to perform well in sparse rewards environments cite a brilliant cite 2 which predates the her paper by 25 years uniformly sampling one of them as her did the her paper actually tried many different sampling strategies including strategies that sampled multiple goals check out the appendix of the her paper when discussing prior work in the introduction and in sec 41 id recommend citing 1 4 and other recent papers that use multiple q functions both these methods dont work well cite goalconditioned supervised learning cite 3 too the rewards in goalconditioned environments are sparse actually define what the reward function is in an equation subtly applies applies the application isnt subtle i found lines 5 and 8 of alg 1 unclear i found fig 2 hard to interpret are there alternative ways of presenting the same data that more succinctly convey the same message sparse and nonnegative reward of 1 contradiction the reward isnt nonnegative if it is 1 for some states and actions acorss across 1 lee kimin et al sunrise a simple unified framework for ensemble learning in deep reinforcement learning international conference on machine learning pmlr 2021 2 kaelbling leslie pack learning to achieve goals ijcai 1993 3 ding yiming et al goalconditioned imitation learning arxiv preprint arxiv190605838 2019 4 chen xinyue et al randomized ensembled double qlearning learning fast without a model arxiv preprint arxiv210105982 2021 while the empirical results of the paper are quite strong the paper doesnt compare to a number of more recent methods that also use ensembles of q functions moreover the method is rather complex and it doesnt provide a guiding explanation for why this particular combination of ideas is good after rebuttal thanks to the authors for responding to some of the points raised in the review and for running additional 
experiments my two main concerns with the paper are 1 whether most of the empirical benefits are coming from the ensemble and 2 the clarity of the writing while i am not convinced that the revised version addressed these concerns i would encourage the authors to continue revising the paper and submit to a future conference docsepthis paper aims to improve learning efficiency for hindsight experience replayher instead of learning with a fixed proportion of fake and original data like her does this paper proposes to adopt the multiple head structure used in bootstrapped dqn and then utilize the uncertainty measured by the variance of multiple estimated qvalues so as to weight the prioritization of each transition specifically the proposed bher enhances the importance of data samples with lower uncertainty and thus achieves a tradeoff between exploration and exploitation resulting in a higher sample efficiency strengths the idea is simple and easy to implement the bher can achieve a significant improvement across different tasks weakness this paper claims that bher can achieve a tradeoff between exploration and exploitation about the hindsight experience but there are not strong enough evidence empirical or theoretical to support this claim though the experiments in appendixd try to show this tradeoff the results cant convince me figure8 only shows that the multiple head principle can have a better exploration about the goal than her and the bhermultiple head principle with prioritization have the best exploration about the goal it doesnt show bher scarifices the exploration so as to improve the exploitation most of the performance improvement is due to multiple head principle the counterintuitive prioritization can only slightly improve the performance in most environments so the reason behind the counterintuitive prioritization should be further investigated some questions and typos 1 the caption of the last figure in appendix should be figure 8 instead os figure 7 2 about the experiment setting this paper first said that the reward was sparse and nonnegative however this is contrast to the setting that the agent gets a reward of 0 only when it reaches desired goal otherwise it receives a reward of 1 the paper is built on the observation that achieving different goals may need different pseduo success trajectories which are unfortunately not provided by the naive her algorithm the paper then provides a straightforward solution to this by adding a boostrapped dqn onto her so as to allow it explore deeper and be able to evaluate the goodness of a pseduo trajectory for different goals although this idea is shown work well compared to the naive her i think that further more principled solution may be needed docsep summary contributions the authors call attention to sparsereward goalbased tasks where hindsight experience replay through its provision of relabeled goals for otherwise failed experiences facilitates efficient learning to further improve upon her the paper focuses on two algorithmic innovations 1 maintaining multiple actorcritic pairs inspired by bootstrappeddqn and 2 applying a variant of prioritized experience replay that draws samples from the replay buffer inversely proportional to the current qvalue variance under the critic ensemble the authors support their approach with comparative experiments to other variants of her as well as ablation studies of their own proposed method quality strengths the authors seem to have picked up on something interesting in paying attention to 
how transitions are sampled during the course of her empirically their approach seems to provide reasonable to modest gains across various continuous control tasks weaknesses major the authors have some fundamental issues and misconceptions about the bootstrappeddqn bootdqn algorithm section 2 mentions that bootdqn addresses exploration by running multiple behavior policies in the environment if so then it would be true that the method would have no reason to solve sparse reward tasks a priori as described by the authors however the reason for bootdqns success in exploration is because of posterior sampling 8 and the fact that the ensemble is an approximate posterior over the optimal actionvalue function of the mdp conditioned on all agent interactions observed thus far it is precisely this principle that allows the algorithm to address sparsereward tasks 2 the authors description of thompson sampling in bootdqn as an adhoc heuristic seems incorrect given the rigorous theoretical guarantees that accompany randomized leastsquares valueiteration algorithms 3 the authors are also confused about the use of replay buffers in bootdqn claiming that data from each head is held separate there is exactly one replay buffer used in bootdqn and bernoulli masks are sampled and stored with each transition for implementing the statistical bootstrap overall the connection the authors draw with bootdqn in this work seem rather disingenuous the proposed algorithm is simply applying ensembles of actorcritic pairs with no connection to the statistical bootstrap unlike bootdqn renaming the algorithm and rephrasing the contribution seem appropriate a more critical issue concerning bootdqn and the proposed bher algorithm is that the former is a purely valuebased rl algorithm that maintains a posterior distribution over the optimal actionvalue function in contrast the latter is an offpolicy actorcritic algorithm where naturally the critic is meant to be an estimate of the actionvalue function induced by the actor policy while the empirical results of this paper confirm empirical benefits of this ensembling heuristic the ablation in figure 5 shows that this is responsible for most of the bher performance the authors have offered no real justification for this ensemble actorcritic algorithm what is the point of representing epistemic uncertainty over the actor and critic in this manner i dont find the socalled counterintuitive prioritization to be counterintuitive at all it seems natural that hindsight transitions will serve the agent well only when there is little uncertainty in the associated optimal behavior under the relabeled goal can the authors explain why sampling based on higher variance seems to still maintain reasonable performance in three of seven environments shown in figure 5 minor in the first paragraph of section 42 the authors describe an instance of prioritized experience replay per 6 where the variance of the critic ensemble is used to prioritize transitions sampled from the replay buffer they seem to confuse this technique with methods for intrinsic motivation 7 based on curiosity and random network distillation clarity strengths the authors provide ample details about their experimental setup for reproducibility of their results weaknesses major overall the paper is not well written there are numerous grammatical errors throughout genuinely too many for me to sensibly list them all out here oftentimes these errors are missing articles for example it uses successful trajectories generated by 
agent as expert demonstrates or incorrect phrases on the contrast we inference all the qvalues normally i wouldnt bother nitpicking at a small handful of these but there are too many throughout the entire body for what may end up being a published conference paper minor the authors should remove the phrase importance sampling that is used twice in the paper to in my reading talk about the importance of sampled goals rather than the montecarlo technique of the same name originality strengths the authors demonstrate a good instinct in examining how other techniques used in deep reinforcement learning might further improve the efficacy of her weaknesses major fundamentally this paper rests on the idea of using ensembles and prioritized experience replay together neither of which is new to deep reinforcement learning 1459 though i do not know of any prior work that has explored this specific combination it would not surprise me if such prior work already exists more importantly there are other options for leveraging such ensembles that have not been addressed in this work 14 similarly the authors only consider variancebased prioritization schemes rather than the traditional prioritization based on tderror or any recent variants of per demonstrating that the authors specific choices in the proposed approach are better than these existing approaches to ensembling and per would dramatically improve what so far seems to be a rather incremental algorithm minor while it is appropriate for the related work section to focus on her it should also acknowledge the two fundamental innovations of this paper ensembles and prioritization schemes and provide an overview of related work for these areas as well significance strengths the only positive i can glean from this paper are the empirical results which seem to support the use of ensembling in actorcritic algorithms figure 5 shows a marginal drop in performance when the proposed counterintuitive prioritization scheme is not used that said it is not clear that this paper advances our understanding of ensembling in deep rl any more than prior work weaknesses major given the lack of comparisons mentioned above its difficult to assess how impactful the proposed approach will be with the breadth of existing work on the topic im unconvinced that this will add any novel insights into how practitioners use ensemble methods in reinforcement learning the prioritization scheme while slightly interesting on the surface doesnt seem to be a critical ingredient to the proposed algorithms success based on the ablation studies shown i dont believe any of the experiments have shown results for regular ddpg without the use of her having this baseline in place is important as it communicates the extent to which her is even necessary for achieving a reasonable level of performance in each of the examined environments minor references 1 lee kimin michael laskin aravind srinivas and pieter abbeel sunrise a simple unified framework for ensemble learning in deep reinforcement learning in international conference on machine learning pp 61316141 pmlr 2021 2 osband ian and benjamin van roy why is posterior sampling better than optimism for reinforcement learning in international conference on machine learning pp 27012710 pmlr 2017 3 osband ian benjamin van roy daniel j russo and zheng wen deep exploration via randomized value functions j mach learn res 20 no 124 2019 162 4 peer oren chen tessler nadav merlis and ron meir ensemble bootstrapping for qlearning arxiv preprint 
arxiv210300445 2021 5 saphal rohan balaraman ravindran dheevatsa mudigere sasikant avancha and bharat kaul seerl sample efficient ensemble reinforcement learning in proceedings of the 20th international conference on autonomous agents and multiagent systems pp 11001108 2021 6 schaul tom john quan ioannis antonoglou and david silver prioritized experience replay in iclr 2016 7 singh satinder richard l lewis andrew g barto and jonathan sorg intrinsically motivated reinforcement learning an evolutionary perspective ieee transactions on autonomous mental development 2 no 2 2010 7082 8 strens malcolm a bayesian framework for reinforcement learning in icml vol 2000 pp 943950 2000 9 wiering marco a and hado van hasselt ensemble algorithms in reinforcement learning ieee transactions on systems man and cybernetics part b cybernetics 38 no 4 2008 930936 final remarks the authors summarize their contributions in section 6 while their fifth point is true i would respond to the other points as 1 there is actually no use or examination of the statistical bootstrap in this work as the ensembles rely solely on random initializations bootdqn also does not distinguish data sources and there are other principled means of addressing exploration through ensembles such as ucb 2 while the proposed prioritization is new there are many others that also do not depend on the environment tderror which have not been assesssed 3 the combination is largely uninteresting based on ablation studies in this work which show near negligible impact of the proposed prioritization scheme and 4 i dont believe ddpg with or without her holds stateoftheart for these mujoco domains i would suspect that lies with either soft actor critic or td3 taken together i dont believe this paper is ready for publication at this time post rebuttal i thank the authors for their response but the justifications for the utility of representing epistemic uncertainty and lack of baselines are shallow its clear that substantial revisions are needed before the submission is ready for publication docsepthis paper looks at the problem of goalconditioned reinforcement learning and focuses specifically on the problem of prioritizing replay in conjunction with hindsight experience replay her it does so by using the idea of multiple heads in the actorcritic policy and critic networks inspired by bootstrapped dqn these different heads are initialized separately and are also used to induce deep exploration in the style of bootstrapped dqn the main idea is to use the variance in the qvalues across the heads for different transitions when prioritizing transitions to use for learning the algorithm termed counterintuitive prioritization prioritizes transitions with lower variance an example on the reacher environment is used to illustrate why this counterintuitive prioritization assists learning and hypothesizes that it is based on a tradeoff between exploration and exploitation experiments are performed in two simpler environments point 2d and reacher and 5 environments simulating different robots fetch sawyer and hand they compare with two algorithms that prioritize transitions for replay based on some other criteria and show that the counterintuitive prioritization along with deep exploration due to multiple heads leads to faster improvement in success rate of reaching goals in terms of interactions with the environment ablations try to tease apart whether this improvement is due to the improved exploration due to multiple heads or the better exploitation due to 
the counterintuitive prioritization high level comments strengths incorporating the idea of multiple heads from bootstrapped dqn for goalconditioned rl gcrl and then also using those multiple heads to also prioritize replay is a sensible extension of the idea to gcrl the counterintuitive prioritization needed when using the variance of the qvalues for prioritization is an interesting effect that is evaluated empirically with a possible explanation that it balances exploration and exploitation this empirical example comparing the variance in qvalues in her vs the original samples is helpful to see why the lower variance samples should be helpful weaknesses the paper focuses on this counterintuitive prioritization in the experiments this focus is understandable since that is the novel aspect introduced in the paper however it is unclear whether the improved performance by bher is due to the deep exploration due to multiple heads or this counterintuitive prioritization the ablation that removes the prioritization seems to still perform almost as well suggesting that most of the improvement is due to the improved exploration the above reason is why i am unconvinced about the experimental validation perhaps an additional ablation of bher without the deep exploration would highlight the efficacy of the counterintuitive prioritization also see below for questions about other comparisons detailed comments since the bootstrapped heads improve the exploration of the agent i wonder how this exploration stacks up to other exploration techniques in this domain for example would it be better than the directed exploration induced by 1 i understand that this paper is very recent but comparison to some other exploration enhancing mechanisms would be good the approach of using multiple heads and the variance in qvalues for picking samples seems related to 2 could authors clarify the difference and why this method was not compared to in baselines section 42 which goes over the counterintuitive prioritization and why it is needed is enlightening but also lacking the comparison of the variance when using her and when considering just samples from the environment show that her samples have lower variance the paper points out that this discrepancy could be due to the higher proportion of successful trajectories due to hindsight this analysis is reasonable but then the authors suggest that prioritizing higher variance samples would be similar to sampling without hindsight my question is what would happen if you only sampled higher variance samples from the relabeled experience and sampled uniformly from the true experience wouldnt that ensure that original samples are not sampled out of proportion secondly by prioritizing lower variance samples does bher just sample more from relabeled experience in the experiments the fetch robot task used for evaluation is just the reach environment similar to the reacher task evaluation on some actual manipulation tasks such as push or pick and place would be a better evaluation in this domain this is a small nitpick the other experiments deal with more challenging tasks it just seems like the fetchreach task is not bringing much to the table except nominally adding a domain as mentioned earlier in the ablations bher without deep exploration would be a good one to have and in those figures the baseline ddqn should also be plotted to show the improvement minor comments more preciseness in the problem setup section 31 would be appreciated as it stands this setup would be hard 
for someone not familiar with reinforcement learning literature to parse typo section 41 line 2 1 adversarial intrinsic motivation for reinforcement learning durugkar et al 2021 2 automatic curriculum learning through value disagreement zhang et al 2020 the idea of enhancing exploration in goalconditioned rl using bootstrapped dqn is reasonable also using the multiple heads to prioritize samples for replay is a good extension which leads to the investigation in the paper the idea of counterintuitive prioritization is investigated using an empirical investigation that gives a good insight into the reasons why this approach can be expected to work however the paper does not sufficiently tease apart the effect of prioritization versus the enhanced exploration the comparison to other techniques that prioritize their samples does not take into account the enhanced exploration either and the paper does not compare to other techniques that enhance exploration
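The review above centres on the paper's key mechanism: an ensemble of critic heads (in the style of bootstrapped DQN) whose disagreement on Q(s, a, g) is used to prioritise replay, with lower-variance transitions sampled more often (the "counterintuitive prioritization"). The paper's own code is not part of this record, so the sketch below is only a minimal illustration of that idea; the function name, array shapes and the inverse-variance weighting are assumptions made for this example, not the authors' implementation.

```python
import numpy as np

def inverse_variance_priorities(q_per_head, eps=1e-6):
    """Sampling probabilities that favour transitions whose Q-value estimates
    agree across an ensemble of critic heads (low variance -> high priority)."""
    # q_per_head: shape (num_heads, num_transitions), one row per critic head
    variance = np.var(q_per_head, axis=0)        # disagreement per transition
    weights = 1.0 / (variance + eps)             # "counter-intuitive": low variance wins
    return weights / weights.sum()               # normalise to a distribution

# toy usage: 4 heads, 6 stored transitions, sample a mini-batch of 3
rng = np.random.default_rng(0)
q_estimates = rng.normal(size=(4, 6))
probs = inverse_variance_priorities(q_estimates)
minibatch = rng.choice(6, size=3, replace=False, p=probs)
print(probs.round(3), minibatch)
```

Sampling the other way round ("higher variance first", the ablation one reviewer mentions in connection with fig 5) would simply use `variance + eps` in place of its reciprocal.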
### Summary:
|
i thank the authors for their submission and active participation in the discussion the reviewers unanimously agree that this submission has significant issues including comparison to baselinesablations bnlvyx9dpta1 clarity bnlv justification of the method nx4w thus i am recommending rejection of this paper
|
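Several of the reviews in this record also restate how hindsight experience replay itself works: failed episodes are relabelled with a goal the agent actually achieved, and the sparse reward (0 at the goal, -1 otherwise) is recomputed under that new goal. As a point of reference only — the field names and the "final achieved state" strategy below are illustrative assumptions, and the original HER paper explores several other goal-sampling strategies, as one reviewer notes — a minimal relabelling sketch looks like this:

```python
import numpy as np

def relabel_with_final_goal(episode, tol=0.05):
    """Replace each transition's goal with the goal achieved at the end of the
    episode and recompute the sparse 0 / -1 reward under that hindsight goal."""
    hindsight_goal = episode[-1]["achieved_goal"]
    relabelled = []
    for step in episode:
        dist = np.linalg.norm(step["achieved_goal"] - hindsight_goal)
        reward = 0.0 if dist <= tol else -1.0
        relabelled.append({**step, "goal": hindsight_goal, "reward": reward})
    return relabelled

# toy usage: a 3-step episode in a 2-D goal space
episode = [
    {"achieved_goal": np.array([0.0, 0.0])},
    {"achieved_goal": np.array([0.4, 0.1])},
    {"achieved_goal": np.array([0.9, 0.2])},
]
for t in relabel_with_final_goal(episode):
    print(t["goal"], t["reward"])
```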
[ …column omitted: roughly two thousand integer token ids, one per line in the original dump (this appears to be the input_ids column, the tokenized copy of the record's text)… ] |
[ …column omitted: a run of 1s of the same length as input_ids (this appears to be the attention_mask column)… ] |
[ …column omitted: roughly two thousand integer token ids (this appears to be the labels column, the tokenized training target)… ] |
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this work is aimed at distributed privacy preserving training of neural networks for image processing tasks like deblocking denoising deraining and deblurring one of the most important contribution pg 2 is breaking down the neural network model into taskspecific convolutional head and tails trained on clients and a common shared across tasks transformer based feature backbone which is trained on the server the headstails and the transformer backbone are trained in an alternate manner by assuming the other model to be fixed the proposed is similar to the method splitfed thapa et al 2020 but is extended for different tasks as described above experimental results demonstrate i successful training of the neural network models with the proposed method ii bettercomparable performance to prior works on distributedprivacypreserving methods iii better performance using the vit backbone as compared to cnn backbones and also with the proposed multitask vs singletask setting strengths i the key idea of the paper is communicated effectively the method is an extension of splitfed thapa et al 2020 to multitask learning with a shared backbone on the server side ii the experiments validate that the proposed method is viable and achieves comparable performance tomarginal improvements over existing privacypreserving training methods on the tasks considered in this work iii some synergy can be seen in the performance on various tasks due to joint multitask training whereby the proposed method outperforms endtoend and singletask distributed training table 2 weaknesses i the key weakness is that the core contribution of using a transformer vit based taskagnostic backbone on the server has incremental novelty over the more standardprior works specifically thapa et al the key novelty is jointtraining of a shared backbone on different tasks in the proposed work this is a natural extension of the prior method ii it is unclear why the specific decision of using cnns as headstails on the clients and a vit as backbone on the server is so crucial could the roles be swapped vit based headstails and a cnn backbone overall this particular design choice has not been shown to be critical for operation or the performance of privacypreserving distributed learning any differentiable neural network can be placed at the serverclient sides as long as the two can be plumbed together iii the number of clients in the distributed experiments are small perhaps this is prevalent in this research community for example only 5 clients are used for 4 tasks section 4 hence it is difficult to gauge the practical deployment of this method as common distributed learning issues like clients dropping out asynchronous communication is not tested this work proposes an extension of the method splitfed of thapa et al 2020 where a common transformer based backbone is trained across different tasks this is a natural extension of the prior work this method has been shown to achieve similarmarginally better performance across all the five tasks considered in this work however the scale of experimentation is small only 5 clients and idea itself carries incremental novelty in view of the above i can recommend this paper further but perhaps with some reservation post authors response thank the authors for taking the time to address the concerns raised in the review however as detailed in individual comments their response is not convincing the key points are 1 no fundamental reason to favor transformers over other neural modules 2 no guarantees of privacy 
preservation despite claims to this effect 3 no largescale distributed study to validate their design hence i urge the authors to address these and revise the paper i am unable to recommend this work further in the current form docsepthe paper presents an architecture for image processing tasks that splits up a network into three sequential parts head body and tail head and tail parts are cnnbased and can be trained on multiple client devices using federated learning fedavg while the body part of the architecture is transformerbased and is trained on a central server head and tail parts are trained for specific tasks while the body part is trained in a taskagnostic manner by selecting clients from each task for loss optimization experimental results show benchmark and convergence results that are comparable or favorable to nondistributed models as well as comparison results to purely fl and sl approaches with a very small number of clients the submission is overall clearly written and presents an embodiment of a split architecture between clients and server that facilitates multitask learning which could be adopted by further work in the future code and models are available which greatly eases adoption however while most of the pages are spent on architecture description and experimental results there are several key omissions of discussion points which i deem essential for a paper in the distributed learning space there is no mention of any sort of privacy guarantees given the proposed architecture despite privacypreserving being part of the title privacy preservation is a strong claim that needs to be backed up rigorously moreover in their ethics statement the authors correctly lay out that transmitted hidden features may leak the raw data to some degree there is no discussion of communication cost in a federated setting the number of clients used in experiments is exceedingly small which might serve as a proof of concept but given that gradients have to be transmitted twoway or oneway for training the head and tail and body parts of the architecture a discussion on communication cost and scalability is necessary several key choices are not sufficiently motivated including the choice of both cnn and transformer architectures or the sampling of exactly one client from each task in eq 5 what makes cnn architectures less suitable for the body part in a more general framework beyond the particular architecture embodiment presented here how would the presented sampling strategy scale in the face of several orders of magnitude more clients if the focus of the paper is to position tavit as a general distributed multitask learning framework then the diversity of presented experiments for validation could have been expanded on the other hand if the focus is to present this particular architecture as a viable means to do distributed multitask learning for the presented tasks then the results of table 2 remain unconvincing in the sense that differences in results might come from essentially incomparable architectures as opposed to contributions to multitask learning or distributed learning postrebuttal i thank the authors for their valuable comments on my review and their revision the revision has improved the submission substantially my comments were addressed mainly due to the addition of section 33 and the appendix d referenced there i believe that most of the other reviewers comments were also addressed and can now recommend the submission for acceptance this is reflected by my adjusted score minor
comment the federated learning paragraph on pg 3 contains an unresolved reference due to a typo the submission presents a specific system that combines split and federated learning for multitask learning of various image processing applications it offers a good proof of concept of the proposed architecture decomposition but lacks a robust discussion on communication cost and overhead as well as privacy guarantees in particular the ethics statement relativizes what the title claims privacypreserving docsepin this work the authors present a multitask distributed learning framework called tavit the taskspecific head cnn and the tail cnn are distributed to clients with their data connected to a standard transformer body placed in the server with an alternating training scheme the heads and tails on the client side are trained by taskspecific learning while the body is trained by taskagnostic learning experiments on four different image processing tasks show the success of taskagnostic learning of the transformer body and its synergistic improvement with the taskspecific heads and tails in line 5 of the abstract what inspiration did the authors take from vision transformers and why the core motivation of this paper comes from the success of vit the authors should revise the statement the proposed method has been verified on four lowlevel tasks can this strategy or framework be applied to higherlevel tasks such as image inpainting classification and object detection note that those tasks need more semantic understanding during training and inference in the traditional federated learning framework each client commonly conducts the local training process then transfers the model update consisting of the intermediate gradient to the server however in this paper the authors transfer the datasets features to the server which may bring about unpredictable challenges for example the computation cost of homomorphic encryption for features may increase rapidly does it have any advantages research or application value as for different tasks the authors use a unified taskagnostic transformer body how do the authors bridge the domain gap caused by the knowledge of the other tasks furthermore do different data sources eg nature satellite and medical images share the same transformer body in figure 1 does it still work well the authors should provide more experimental results and indepth analysis to verify this point prior works 12 have also addressed the taskagnostic problem in federated learning whats the significant difference between those works 1 federated learning with unbiased gradient aggregation and controllable meta updating 2 taskagnostic privacypreserving representation learning via federated learning this paper is wellwritten and presented however some key experiments and designs confuse me a lot overview it approaches the borderline of the iclr community the authors should address the above concerns main review in the rebuttal period after reading the responses from the authors the authors partially solved my concerns thus i decided to increase the rating docsepthis paper presents a new distributed learning framework exploiting the vision transformer for various image processing applications it gives impressive quantitative and qualitative results on multiple image restoration tasks meanwhile keeping privacy specifically it employs a taskagnostic vision transformer to learn a universal representation at the server and several cnnbased taskspecific heads and tails to handle different image restoration tasks at the client side it also gives a training
strategy to learn this model strengths it gives a new and practical distributed learning framework for image restoration tasks it is capable of handling multiple tasks while maintaining privacy it gives stateoftheart or competitive results on the evaluated restoration tasks the paper is easy to follow weaknesses the emphasized privacypreserving property of the given framework is not experimentally or theoretically validated no explanation is given of why only deblocking denoising deraining and deblurring are chosen for the clients what about superresolution and inpainting i vote for accepting this paper its technical novelties and contributions are sufficient and the given system seems practical and effective it applies federated learning to image restoration tasks it leverages vit for universal representation learning and taskspecific heads and tails for training different tasks the results are convincing i think it is worth giving a brief study of how this framework reacts to privacy attacks
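for illustration only a minimal sketch of the client head, server body, client tail decomposition discussed in the reviews above is given below this is not the authors implementation the module sizes the token reshaping and the simplified alternating update are assumptions made for readability and the actual client to server communication is omitted

```python
import torch
import torch.nn as nn

FEAT_DIM, NUM_LAYERS, NUM_HEADS = 256, 4, 8   # hypothetical sizes, not taken from the paper

class TaskHead(nn.Module):
    """Client-side, task-specific: maps an image to a token sequence."""
    def __init__(self, in_ch=3):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(in_ch, FEAT_DIM, 3, padding=1), nn.ReLU(),
            nn.Conv2d(FEAT_DIM, FEAT_DIM, 3, padding=1))
    def forward(self, x):
        f = self.conv(x)                      # (b, c, h, w)
        return f.flatten(2).transpose(1, 2)   # (b, h*w, c) tokens for the body

class SharedBody(nn.Module):
    """Server-side, task-agnostic transformer shared across all tasks."""
    def __init__(self):
        super().__init__()
        layer = nn.TransformerEncoderLayer(FEAT_DIM, NUM_HEADS, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, NUM_LAYERS)
    def forward(self, tokens):
        return self.encoder(tokens)

class TaskTail(nn.Module):
    """Client-side, task-specific: maps tokens back to an image."""
    def __init__(self, out_ch=3):
        super().__init__()
        self.conv = nn.Conv2d(FEAT_DIM, out_ch, 3, padding=1)
    def forward(self, tokens, h, w):
        f = tokens.transpose(1, 2).reshape(tokens.size(0), FEAT_DIM, h, w)
        return self.conv(f)

def train_step(head, body, tail, x, target, opt_client, opt_server, update_server):
    """Alternating scheme: only one optimizer is stepped per call, so the
    other part of the model is effectively held fixed for that update."""
    h, w = x.shape[-2:]
    pred = tail(body(head(x)), h, w)
    loss = nn.functional.l1_loss(pred, target)
    loss.backward()
    (opt_server if update_server else opt_client).step()
    opt_client.zero_grad()
    opt_server.zero_grad()
    return loss.item()
```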
### Summary:
|
the paper aims to devise a distributed multitask privacy preserving framework for image processing in this regard the authors propose partitioning neural network models into task specific heads and tails and a common taskagnostic feature backbone body a training procedure is designed which is claimed to be privacy preserving wherein the head and tail are trained locally on the client or using federated learning when multiple clients share a task while the main backbone body is trained in a centralized manner by collecting appropriate gradients from the clients the paper is easy to follow and the released code is also highly appreciated we thank the reviewers and authors for engaging in an active discussion and also updating the paper while the new version definitely resolves some of the concerns of the reviewers some still remain privacy preserving in the title and in the main body of the paper seems misleading the proposed method doesnt provide any guarantees for privacy as also pointed out by many reviewers the author response doesnt seem to be convincing and other federated learning papers do not claim privacy unless they have some specific mechanism like adding noise secure aggregation etc the reviewers are also in consensus that the novelty as well as the scale of the empirical evaluation is limited
|
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the idea of learning feature maps for spatially adjacent slices is simple and general the reported results are very competitive and were achieved on public datasets the manuscript is clearly written and easy to understand the contributions of the paper are clear and substantiated the method could be compared to more segmentation variants eg 25d segmentation approaches utilizing feature maps from neighboring slices is not entirely novel a similar idea has been previously suggested for videos by lin et al 2019 docsepthe paper is well written two main strengths in my opinion 1 enabling 2d convolutions to make use of information from neighboring slices by a distill dsm which learns to extract information from a part of the channels and shares it with neighboring slices 2 extensive evaluations on several datasets the proposed model achieves better performance than 3d cnn for heart and prostate datasets and comparable performance on brats 2020 pancreas and hippocampus dataset with only 28 of the parameters compared to the 3d cnn model 1 another important class of efficient 2d3d approaches is not mentioned and discussed 2 some explanation of the results on brats 2020 and some clarification are needed for example the evaluation metric called specificity for brats 2020 may not be needed docsep1 the paper is written clearly the description of the method and the experiments are reasonable i can easily understand the proposed module 2 the proposed distill dsm is easy to follow and can be plugged into different 2d networks for 3d volumetric segmentation 3 the authors conducted abundant and wellorganized experiments to validate the effectiveness of the proposed distill dsm 1 the proposed distill dsm lacks theory or references supporting it it seems like an engineering application rather than an innovation in method if the authors would like to prove the novelty of their proposed method it needs more theoretical explanation and method references 2 the description and visualization of the method seem to be too simple which can easily confuse readers for instance the description depth shift module lin et al 2019 shifts part of feature channels in each frame to its neighbouring frame so that 2d convolution could handle depth information does the dsm shift the same features in each frame to its last and next frames what fig 1 presents is not the same as the description docsepthe paper is very well written and easy to follow i have seen works attempting to use neighbouring slices in the input the so called 25d but this is the first time that i see this idea applied to convolutions in medical imaging highlight should be given to the as far as i know novel idea of judging which features should be included on the forward and backward slice instead of simply shifting channels its an interesting idea with potential for providing more efficient networks according to the authors findings comparisons were made with famous architectures and a similar approach residual dsm in multiple datasets bringing more validity to the authors claims overall i do not see major weaknesses or problems in this manuscript however there are some minor problems writing needs to be improved especially in the introduction give attention to proper use of articles and verb tenses additionally the introduction contains strong claims without proper citations the explanation of how distill dsm mixes information from far away slices could be improved more minor details and suggestions in the detailed comments i see no promise to make code available but that would
make this work even stronger and easier to reproduce and be cited by future works
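as a point of reference for the channel shifting mechanism the reviewers discuss a minimal sketch of a plain depth shift over a volumetric feature map in the spirit of lin et al 2019 is given below the learned selection of which features to pass to the forward and backward slices which is the distill dsm contribution is not reproduced here and the fraction of shifted channels is an assumption

```python
import torch

def depth_shift(x: torch.Tensor, fold_div: int = 8) -> torch.Tensor:
    """x: (batch, depth, channels, h, w). A fraction of the channels is shifted
    one slice forward or backward along the depth axis so that a subsequent
    2d convolution sees information from neighbouring slices."""
    b, d, c, h, w = x.shape
    fold = c // fold_div
    out = torch.zeros_like(x)
    out[:, 1:, :fold] = x[:, :-1, :fold]                   # pushed to the next slice
    out[:, :-1, fold:2 * fold] = x[:, 1:, fold:2 * fold]   # pushed to the previous slice
    out[:, :, 2 * fold:] = x[:, :, 2 * fold:]              # remaining channels stay in place
    return out

# toy usage: shift, then fold depth into the batch so ordinary 2d convolutions apply
volume = torch.randn(2, 16, 32, 64, 64)
features = depth_shift(volume).view(-1, 32, 64, 64)
```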
### Summary:
|
the reviewers find the work of interest and there was initial consensus that the paper can be accepted this was confirmed after the rebuttal
|
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the paper introduces the concept of nodespecific information nsi to model that nodes in a graph may have private information that other nodes cannot have access to the paper uses neural relational inference nri a framework published in 2018 based on variational inference to uncover the hidden relations of nodes in the graph for instance in a driving scenario different cars can be nodes in a graph with their publicly visible trajectory and their private information about the intention eg desired destination which is not shared with other nodes the encoder and decoder in nri are modified such that nsi stays private and is not shared with other nodes the paper considers problems that require uncovering the interactions of entities in a multiagent dynamical system the evaluation is based on the accuracy of future trajectory prediction the paper demonstrates the effectiveness of the idea on three different datasets one actionconditional dataset and two goalconditional datasets strengths the concept of nodespecific information nsi seems to be novel and is interesting the paper is wellwritten all components are explained in detail the paper demonstrates the effectiveness of the idea the proposed approach performs better than reference systems that use the private information only at the decoder hence i agree with the conclusion that the approach helps to form better interaction graphs the distinction to prior works especially nri is clear it is easy for the reader to understand what the contributions of the paper are weaknesses the paper argues that in many realworld examples there is a set of hidden features that affect the way entities interact with each other however i am not sure how reasonable this is for instance in autonomous driving it seems reasonable that considering the intention of individual entities can help to generate an interaction graph however it is not the hidden features that affect the interaction of the entities but only the public nodes which are affected by the hidden features the paper uses autonomous driving as an example however in autonomous driving the intention of a car can be shared with other cars via car2car communication however sharing this information is limited to systems that are capable of this form of communication pedestrians for instance also have private intentions that cannot be shared easily i am missing an analysis of the learned graphs since the paper states that their method is better able to learn interaction graphs now i am wondering if this difference can be visualized in an example from one of the datasets or even better analyzed in a systematic way questions question 1 one perhaps crucial part is not yet entirely clear to me since the public nodes are connected and each public node is connected to its private node isnt it possible that the private information flows to the other nodes in only 2 hops especially in the decoder then the private information can be shared with other public nodes the paper states that however it is the private nodes effect that might be observed in the future time steps but as far as i understand it is not only the effect of the private information but the private information itself that can be shared it would be great if the authors or other reviewers could clarify this point question 2 furthermore i am wondering if the approach can actually be used in realworld situations for instance for autonomous driving in a realworld setting each car and therefore each model has its own set of hidden information that is not shared with
other cars which means that each model only has access to the private information of a single car however the model assumes to have access to all the private nodes ie it has a global view of the problem not only a local view which seems to not fit to the envisioned application setup to summarize the idea of introducing nodespecific private information into a graph of multiple interactive systems seems to be novel and interesting and improves the prediction performance however it is not entirely clear to me if the private information can be propagated to other nodes question 1 and if the model can even be applied in the envisioned scenario question 2 i think the answers to both questions are important to make a reasonable assessment of the paper hence my recommendation is rather tentative and i will update it as soon as the application setup becomes more clear to me after rebuttal thanks a lot for the authors replies i still think the applicability of the idea may be problematic but i think the idea of nsi is sufficently interesting and novel to be accepted to this conference hence i raise my score to accept docsepsummary this paper introduces a neural relational inference model that makes use of the hidden features of each node in a variational inference framework specifically the hiddenindividual information is modeled as private node in the graph importantly the task assumption made by the authors is that these individualized features cannot be observed by other entities contributions 1 the authors claim be to the first to study the use of individualized information for each entity in this direction 2 the proposed approach achieve stateoftheart results while only introducing minimum additional computational complexity weaknesses the task assumption of the paper does not look straightforward to me i am doubtful about the practical value of the task setting the individualized information cannot be accessed by other entities but its effect will be observed in the future is this a wellrecognized setting are there practical scenarios of this assumption given the existing work of nri and the variational inference framework the paper is not clear about the contributions for instance what are the new challenges that need to be solved after introducing the private entities besides the model architectures what are the key innovations strengths the experimental results look strong and thorough different datasets are considered and the improvements are significant overall this paper has demonstrated strong results on neural relational inference tasks the concerns i have are about the assumption of the approach and unclear novelty points docsepthis paper presents a neural relational inference model with nodespecific information experiments on realworld datasets validate the merit of the proposed method this paper presents a neural relational inference model with nodespecific information experiments on realworld datasets validate the merit of the proposed method stengths 1 this paper is well written and the idea is clear 2 the results are promising compared to various baslines weaknesses 1 the codes are not availabel it is difficut for others to reproduce the results this paper presents a neural relational inference model with nodespecific information experiments on realworld datasets validate the merit of the proposed method stengths 1 this paper is well written and the idea is clear 2 the results are promising compared to various baslines weaknesses 1 the codes are not availabel it is difficut for 
others to reproduce the results
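To make the encoder/decoder modification discussed in these reviews concrete, below is a minimal illustrative sketch (not the authors' code; the module and argument names are assumptions) of one way a message-passing layer can use node-specific information without sending it to other entities: only public-node states are exchanged along graph edges, and each private feature vector enters only its own node's state update. Whether information about the private features can still propagate indirectly over several rounds is exactly the reviewer's question 1.

```python
# Illustrative sketch only -- not the paper's architecture.
import torch
import torch.nn as nn

class PrivateAwareGraphLayer(nn.Module):
    def __init__(self, pub_dim, priv_dim, hid_dim):
        super().__init__()
        self.embed_pub = nn.Linear(pub_dim, hid_dim)
        self.embed_priv = nn.Linear(priv_dim, hid_dim)
        self.msg = nn.Sequential(nn.Linear(2 * hid_dim, hid_dim), nn.ReLU())
        self.update = nn.GRUCell(2 * hid_dim, hid_dim)

    def forward(self, pub, priv, adj):
        # pub:  (n, pub_dim)  publicly observable features per entity (e.g. trajectory)
        # priv: (n, priv_dim) node-specific information (e.g. goal), used only locally
        # adj:  (n, n)        adjacency over public nodes, adj[i, j] = 1 for edge i -> j
        h = torch.relu(self.embed_pub(pub))              # exchanged state (no priv inside)
        p = torch.relu(self.embed_priv(priv))            # stays local to each node
        send = h.unsqueeze(1).expand(-1, h.size(0), -1)  # sender states
        recv = h.unsqueeze(0).expand(h.size(0), -1, -1)  # receiver states
        msgs = self.msg(torch.cat([send, recv], dim=-1)) # messages between public nodes only
        agg = (adj.unsqueeze(-1) * msgs).sum(dim=0)      # incoming messages per node
        return self.update(torch.cat([agg, p], dim=-1), h)  # priv enters only its own update
```

A call like `layer(pub, priv, adj)` with `pub` of shape `(n, pub_dim)` returns the updated per-entity states; stacking several such layers is where the 2-hop leakage concern raised in question 1 would have to be analyzed.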
### Summary:
|
this paper extends the neural relational inference framework for probabilistic inference of interaction relations between entities to a scenario where entities may have private features which requires modifications of the standard graph encoders and decoders in nri
reviewers appreciated both the model and the overall execution of the paper the building blocks are clear the evaluation does its job well
the main doubts are about the applicability of the setting for which the authors dont provide too many examples however the construction is somewhat intuitive and even in cases where private attributes arent explicit it may be valuable to disentangle the shareable attributes this way we encourage the reviewers to discuss the applicability a bit further
typos not exhaustive please doublecheck with a spell checker multiple occurrences of gumble instead of gumbel bottom of pg 4 factorzied factorized
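Since the summary refers to the NRI-style variational machinery (and the typo note above mentions gumbel), a generic sketch of the Gumbel-softmax edge-type sampling that such encoders typically rely on may be useful. This is a standard construction rather than code from the paper; the function name and the temperature value are assumptions.

```python
import torch
import torch.nn.functional as F

def sample_edge_types(edge_logits, tau=0.5, hard=True):
    # edge_logits: (num_edges, num_edge_types) unnormalised scores from the encoder
    # returns an (approximately) one-hot, differentiable edge-type sample per edge
    gumbel = -torch.log(-torch.log(torch.rand_like(edge_logits) + 1e-10) + 1e-10)
    y = F.softmax((edge_logits + gumbel) / tau, dim=-1)
    if hard:  # straight-through: discrete forward pass, soft gradients
        index = y.argmax(dim=-1, keepdim=True)
        y_hard = torch.zeros_like(y).scatter_(-1, index, 1.0)
        y = (y_hard - y).detach() + y
    return y
```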
|
[
30003,
310,
1677,
2278,
273,
247,
2561,
2929,
432,
260,
2369,
1793,
6698,
15,
7764,
3630,
247,
6010,
253,
2278,
15,
187,
4118,
8439,
27,
187,
783,
2929,
23970,
253,
4473,
273,
7632,
29765,
1491,
295,
9245,
281,
1566,
326,
7632,
275,
247,
4216,
778,
452,
3055,
1491,
326,
643,
7632,
2550,
452,
2289,
281,
253,
2929,
4648,
11454,
5886,
17032,
295,
363,
247,
7792,
3863,
275,
4765,
1754,
327,
39762,
17032,
281,
32355,
253,
8763,
2493,
273,
7632,
275,
253,
4216,
323,
4227,
275,
247,
6276,
10076,
1027,
8458,
476,
320,
7632,
275,
247,
4216,
342,
616,
13644,
7985,
18974,
285,
616,
3055,
1491,
670,
253,
8208,
24088,
6799,
12095,
534,
310,
417,
6096,
342,
643,
7632,
253,
32049,
285,
29810,
275,
295,
363,
403,
7321,
824,
326,
295,
9245,
19931,
3055,
285,
310,
417,
6096,
342,
643,
7632,
253,
2929,
19401,
3237,
326,
2430,
32355,
272,
253,
6355,
273,
14429,
275,
247,
4471,
12788,
18525,
985,
253,
7103,
310,
1754,
327,
253,
7200,
273,
2852,
18974,
10554,
253,
2929,
14371,
253,
12510,
273,
253,
2934,
327,
1264,
1027,
15302,
581,
2250,
35428,
10895,
285,
767,
4736,
35428,
15302,
20544,
50276,
783,
4473,
273,
7632,
29765,
1491,
295,
9245,
3133,
281,
320,
4460,
285,
310,
4722,
50276,
783,
2929,
310,
973,
15720,
512,
4295,
403,
5544,
275,
2508,
50276,
783,
2929,
7568,
253,
12510,
273,
253,
2934,
253,
4081,
2746,
17923,
1805,
685,
3806,
2718,
326,
897,
253,
3055,
1491,
760,
387,
253,
29810,
7613,
891,
5194,
342,
253,
6452,
326,
253,
2746,
7729,
281,
830,
1805,
5016,
14580,
50276,
783,
13812,
281,
2720,
2987,
3340,
295,
363,
310,
2590,
352,
310,
3477,
323,
253,
9414,
281,
2096,
752,
253,
9021,
273,
253,
2929,
403,
50276,
20881,
1255,
265,
50276,
783,
2929,
8219,
326,
275,
1142,
1524,
10186,
6667,
627,
310,
247,
873,
273,
8763,
3386,
326,
2818,
253,
1039,
14429,
8008,
342,
1016,
643,
2299,
891,
717,
417,
2119,
849,
5272,
436,
310,
323,
4227,
275,
26279,
6276,
352,
3133,
5272,
326,
7296,
253,
8208,
273,
2060,
14429,
476,
1361,
281,
6635,
271,
5016,
4216,
2299,
352,
310,
417,
253,
8763,
3386,
326,
2818,
253,
5016,
273,
253,
14429,
533,
760,
253,
1345,
7632,
534,
403,
5876,
407,
253,
8763,
3386,
50276,
783,
2929,
4648,
26279,
6276,
347,
271,
1650,
2299,
275,
26279,
6276,
253,
8208,
273,
247,
1113,
476,
320,
6096,
342,
643,
8458,
3066,
1113,
19,
5546,
5511,
2299,
9628,
436,
1491,
310,
3710,
281,
985,
326,
403,
7032,
273,
436,
830,
273,
5511,
47325,
323,
4227,
671,
452,
3055,
21546,
326,
2550,
320,
6096,
4354,
50276,
74,
717,
5816,
271,
1783,
273,
253,
6311,
14580,
1580,
253,
2929,
3054,
326,
616,
1332,
310,
1805,
2104,
281,
3037,
5016,
14580,
1024,
891,
717,
12371,
604,
436,
3064,
476,
320,
27130,
275,
271,
1650,
432,
581,
273,
15302,
390,
1014,
1805,
5867,
275,
247,
12082,
1039,
50275,
34974,
50276,
19751,
337,
581,
4931,
9560,
629,
310,
417,
2568,
7094,
2590,
281,
479,
1580,
253,
1345,
7632,
403,
4802,
285,
1016,
1345,
4666,
310,
4802,
281,
697,
3055,
4666,
310,
2649,
352,
1896,
326,
253,
3055,
1491,
14221,
281,
253,
643,
7632,
275,
760,
374,
47010,
3340,
275,
253,
29810,
840,
253,
3055,
1491,
476,
320,
6096,
342,
643,
1345,
7632,
253,
2929,
3054,
326,
50276,
35529,
697,
253,
3055,
7632,
1055,
1537,
320,
2540,
275,
253,
2852,
2069,
5018,
50276,
2858,
347,
2080,
347,
891,
2096,
352,
310,
417,
760,
253,
1055,
273,
253,
3055,
1491,
533,
253,
3055,
1491,
3139,
326,
476,
320,
6096,
352,
651,
320,
1270,
604,
253,
4477,
390,
643,
30628,
812,
19148,
436,
1127,
50276,
19751,
374,
33810,
891,
717,
12371,
604,
253,
2746,
476,
2686,
320,
908,
275,
1524,
10186,
9534,
323,
4227,
323,
26279,
6276,
275,
247,
1524,
10186,
4758,
1016,
1113,
285,
3103,
1016,
1566,
556,
697,
1211,
873,
273,
8763,
1491,
326,
310,
417,
6096,
342,
643,
8458,
534,
2097,
326,
1016,
1566,
760,
556,
2289,
281,
253,
3055,
1491,
273,
247,
2014,
1113,
2299,
253,
1566,
19584,
281,
452,
50276,
10773,
281,
512,
253,
3055,
7632,
26332,
352,
556,
247,
4156,
1859,
273,
253,
1895,
417,
760,
247,
1980,
1859,
534,
3133,
281,
417,
4944,
281,
253,
44921,
2898,
9978,
281,
26799,
253,
2934,
273,
16984,
7632,
29765,
3055,
1491,
715,
247,
4216,
273,
2709,
18366,
2718,
3133,
281,
320,
4460,
285,
4722,
285,
19132,
253,
10554,
3045,
2299,
352,
310,
417,
7094,
2590,
281,
479,
604,
253,
3055,
1491,
476,
320,
46695,
281,
643,
7632,
1953,
337,
285,
604,
253,
1566,
476,
1014,
320,
3732,
275,
253,
44921,
10076,
1953,
374,
891,
1158,
253,
9172,
281,
1097,
3533,
403,
1774,
281,
1056,
247,
5272,
6803,
273,
253,
2929,
7613,
619,
17401,
310,
2581,
43095,
285,
891,
588,
5731,
352,
347,
3517,
347,
253,
2898,
9978,
4916,
625,
2590,
281,
479,
50276,
6438,
30080,
22559,
6701,
247,
2257,
323,
253,
4477,
32114,
891,
1335,
1158,
253,
30437,
273,
253,
2934,
778,
320,
20276,
533,
891,
1158,
253,
2934,
273,
295,
9245,
310,
402,
1330,
1574,
4722,
285,
4460,
281,
320,
7607,
281,
436,
8059,
7613,
891,
7164,
619,
4868,
281,
2997,
5474,
339,
793,
360,
3454,
436,
2929,
23970,
247,
11454,
38524,
17032,
1566,
326,
2789,
897,
273,
253,
8763,
3386,
273,
1016,
4666,
275,
247,
39762,
17032,
7792,
5742,
253,
8763,
27276,
1491,
310,
23115,
347,
3055,
4666,
275,
253,
4216,
15538,
253,
4836,
9376,
1160,
407,
253,
4477,
310,
326,
841,
47687,
3386,
2550,
320,
2540,
407,
643,
14429,
50275,
1987,
8303,
50276,
18,
253,
4477,
1750,
320,
281,
253,
806,
281,
1263,
253,
897,
273,
47687,
1491,
323,
1016,
10726,
275,
436,
3884,
374,
253,
4081,
2746,
5115,
1375,
23037,
14387,
1543,
1223,
760,
16984,
5927,
3081,
15180,
10454,
50275,
20881,
1255,
265,
50276,
783,
4836,
9376,
273,
253,
2929,
1057,
417,
1007,
15246,
281,
479,
891,
717,
38342,
670,
253,
8542,
1318,
273,
253,
4836,
4758,
253,
47687,
1491,
2550,
320,
19197,
407,
643,
14429,
533,
697,
1055,
588,
320,
2540,
275,
253,
2852,
310,
436,
247,
973,
35477,
4758,
403,
627,
8542,
15216,
273,
436,
9376,
50276,
28821,
253,
5368,
789,
273,
295,
363,
285,
253,
39762,
17032,
7792,
253,
2929,
310,
417,
2590,
670,
253,
9021,
323,
4227,
752,
403,
253,
747,
7881,
326,
878,
281,
320,
14042,
846,
16984,
253,
3055,
14429,
16280,
253,
1566,
35615,
752,
403,
253,
2234,
32771,
50274,
296,
3755,
20556,
50276,
783,
5661,
1543,
1007,
2266,
285,
11080,
1027,
15302,
403,
2783,
285,
253,
11701,
403,
1534,
50276,
1189,
455,
436,
2929,
556,
5183,
2266,
1543,
327,
11454,
38524,
17032,
8892,
253,
7350,
891,
452,
403,
670,
253,
9376,
273,
253,
2746,
285,
12744,
38135,
2792,
50276,
7152,
33032,
2520,
2929,
10262,
247,
11454,
38524,
17032,
1566,
342,
7632,
29765,
1491,
4679,
327,
1524,
10186,
15302,
17813,
253,
15785,
273,
253,
4081,
1332,
436,
2929,
10262,
247,
11454,
38524,
17032,
1566,
342,
7632,
29765,
1491,
4679,
327,
1524,
10186,
15302,
17813,
253,
15785,
273,
253,
4081,
1332,
50275,
296,
1746,
84,
337,
436,
2929,
310,
973,
3542,
285,
253,
2934,
310,
2590,
50276,
19,
253,
1543,
403,
12532,
2429,
281,
2710,
1666,
8737,
50275,
20881,
1255,
265,
337,
253,
11646,
403,
417,
1961,
1492,
352,
310,
2171,
280,
307,
323,
2571,
281,
18302,
253,
1543,
50276,
2520,
2929,
10262,
247,
11454,
38524,
17032,
1566,
342,
7632,
29765,
1491,
4679,
327,
1524,
10186,
15302,
17813,
253,
15785,
273,
253,
4081,
1332,
50275,
296,
1746,
84,
337,
436,
2929,
310,
973,
3542,
285,
253,
2934,
310,
2590,
50276,
19,
253,
1543,
403,
12532,
2429,
281,
2710,
1666,
8737,
50275,
20881,
1255,
265,
337,
253,
11646,
403,
417,
1961,
1492,
352,
310,
2171,
280,
307,
323,
2571,
281,
18302,
253,
1543,
50276,
187,
187,
4118,
18435,
27,
2520,
2929,
8725,
253,
11454,
38524,
17032,
7792,
323,
37851,
17032,
273,
5016,
2493,
875,
14429,
281,
247,
10076,
835,
14429,
778,
452,
3055,
3386,
534,
4419,
14586,
273,
253,
2629,
4216,
2349,
351,
398,
285,
1086,
351,
398,
275,
295,
363,
50276,
15337,
398,
14109,
1097,
253,
1566,
285,
253,
4583,
10636,
273,
253,
2929,
253,
3652,
8336,
403,
2590,
253,
7103,
1057,
697,
2628,
973,
253,
2022,
24626,
403,
670,
253,
30437,
273,
253,
4758,
323,
534,
253,
4477,
13414,
2085,
1512,
1142,
6667,
2299,
253,
5140,
310,
8489,
27350,
285,
1014,
275,
2219,
835,
3055,
12474,
403,
2649,
6843,
352,
778,
320,
9865,
281,
557,
290,
2134,
253,
3894,
494,
12474,
436,
1039,
359,
11907,
253,
30628,
281,
2319,
253,
30437,
247,
2372,
2007,
50276,
555,
993,
417,
41389,
4496,
4021,
5903,
342,
247,
15368,
2451,
254,
50275,
34263,
37102,
273,
305,
19493,
3185,
273,
305,
3561,
293,
50275,
10492,
273,
23256,
577,
2803,
91,
728,
50276,
19012,
1025
] |
[
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1
] |
[
30003,
310,
1677,
2278,
273,
247,
2561,
2929,
432,
260,
2369,
1793,
6698,
15,
7764,
3630,
247,
6010,
253,
2278,
15,
187,
4118,
8439,
27,
187,
783,
2929,
23970,
253,
4473,
273,
7632,
29765,
1491,
295,
9245,
281,
1566,
326,
7632,
275,
247,
4216,
778,
452,
3055,
1491,
326,
643,
7632,
2550,
452,
2289,
281,
253,
2929,
4648,
11454,
5886,
17032,
295,
363,
247,
7792,
3863,
275,
4765,
1754,
327,
39762,
17032,
281,
32355,
253,
8763,
2493,
273,
7632,
275,
253,
4216,
323,
4227,
275,
247,
6276,
10076,
1027,
8458,
476,
320,
7632,
275,
247,
4216,
342,
616,
13644,
7985,
18974,
285,
616,
3055,
1491,
670,
253,
8208,
24088,
6799,
12095,
534,
310,
417,
6096,
342,
643,
7632,
253,
32049,
285,
29810,
275,
295,
363,
403,
7321,
824,
326,
295,
9245,
19931,
3055,
285,
310,
417,
6096,
342,
643,
7632,
253,
2929,
19401,
3237,
326,
2430,
32355,
272,
253,
6355,
273,
14429,
275,
247,
4471,
12788,
18525,
985,
253,
7103,
310,
1754,
327,
253,
7200,
273,
2852,
18974,
10554,
253,
2929,
14371,
253,
12510,
273,
253,
2934,
327,
1264,
1027,
15302,
581,
2250,
35428,
10895,
285,
767,
4736,
35428,
15302,
20544,
50276,
783,
4473,
273,
7632,
29765,
1491,
295,
9245,
3133,
281,
320,
4460,
285,
310,
4722,
50276,
783,
2929,
310,
973,
15720,
512,
4295,
403,
5544,
275,
2508,
50276,
783,
2929,
7568,
253,
12510,
273,
253,
2934,
253,
4081,
2746,
17923,
1805,
685,
3806,
2718,
326,
897,
253,
3055,
1491,
760,
387,
253,
29810,
7613,
891,
5194,
342,
253,
6452,
326,
253,
2746,
7729,
281,
830,
1805,
5016,
14580,
50276,
783,
13812,
281,
2720,
2987,
3340,
295,
363,
310,
2590,
352,
310,
3477,
323,
253,
9414,
281,
2096,
752,
253,
9021,
273,
253,
2929,
403,
50276,
20881,
1255,
265,
50276,
783,
2929,
8219,
326,
275,
1142,
1524,
10186,
6667,
627,
310,
247,
873,
273,
8763,
3386,
326,
2818,
253,
1039,
14429,
8008,
342,
1016,
643,
2299,
891,
717,
417,
2119,
849,
5272,
436,
310,
323,
4227,
275,
26279,
6276,
352,
3133,
5272,
326,
7296,
253,
8208,
273,
2060,
14429,
476,
1361,
281,
6635,
271,
5016,
4216,
2299,
352,
310,
417,
253,
8763,
3386,
326,
2818,
253,
5016,
273,
253,
14429,
533,
760,
253,
1345,
7632,
534,
403,
5876,
407,
253,
8763,
3386,
50276,
783,
2929,
4648,
26279,
6276,
347,
271,
1650,
2299,
275,
26279,
6276,
253,
8208,
273,
247,
1113,
476,
320,
6096,
342,
643,
8458,
3066,
1113,
19,
5546,
5511,
2299,
9628,
436,
1491,
310,
3710,
281,
985,
326,
403,
7032,
273,
436,
830,
273,
5511,
47325,
323,
4227,
671,
452,
3055,
21546,
326,
2550,
320,
6096,
4354,
50276,
74,
717,
5816,
271,
1783,
273,
253,
6311,
14580,
1580,
253,
2929,
3054,
326,
616,
1332,
310,
1805,
2104,
281,
3037,
5016,
14580,
1024,
891,
717,
12371,
604,
436,
3064,
476,
320,
27130,
275,
271,
1650,
432,
581,
273,
15302,
390,
1014,
1805,
5867,
275,
247,
12082,
1039,
50275,
34974,
50276,
19751,
337,
581,
4931,
9560,
629,
310,
417,
2568,
7094,
2590,
281,
479,
1580,
253,
1345,
7632,
403,
4802,
285,
1016,
1345,
4666,
310,
4802,
281,
697,
3055,
4666,
310,
2649,
352,
1896,
326,
253,
3055,
1491,
14221,
281,
253,
643,
7632,
275,
760,
374,
47010,
3340,
275,
253,
29810,
840,
253,
3055,
1491,
476,
320,
6096,
342,
643,
1345,
7632,
253,
2929,
3054,
326,
50276,
35529,
697,
253,
3055,
7632,
1055,
1537,
320,
2540,
275,
253,
2852,
2069,
5018,
50276,
2858,
347,
2080,
347,
891,
2096,
352,
310,
417,
760,
253,
1055,
273,
253,
3055,
1491,
533,
253,
3055,
1491,
3139,
326,
476,
320,
6096,
352,
651,
320,
1270,
604,
253,
4477,
390,
643,
30628,
812,
19148,
436,
1127,
50276,
19751,
374,
33810,
891,
717,
12371,
604,
253,
2746,
476,
2686,
320,
908,
275,
1524,
10186,
9534,
323,
4227,
323,
26279,
6276,
275,
247,
1524,
10186,
4758,
1016,
1113,
285,
3103,
1016,
1566,
556,
697,
1211,
873,
273,
8763,
1491,
326,
310,
417,
6096,
342,
643,
8458,
534,
2097,
326,
1016,
1566,
760,
556,
2289,
281,
253,
3055,
1491,
273,
247,
2014,
1113,
2299,
253,
1566,
19584,
281,
452,
50276,
10773,
281,
512,
253,
3055,
7632,
26332,
352,
556,
247,
4156,
1859,
273,
253,
1895,
417,
760,
247,
1980,
1859,
534,
3133,
281,
417,
4944,
281,
253,
44921,
2898,
9978,
281,
26799,
253,
2934,
273,
16984,
7632,
29765,
3055,
1491,
715,
247,
4216,
273,
2709,
18366,
2718,
3133,
281,
320,
4460,
285,
4722,
285,
19132,
253,
10554,
3045,
2299,
352,
310,
417,
7094,
2590,
281,
479,
604,
253,
3055,
1491,
476,
320,
46695,
281,
643,
7632,
1953,
337,
285,
604,
253,
1566,
476,
1014,
320,
3732,
275,
253,
44921,
10076,
1953,
374,
891,
1158,
253,
9172,
281,
1097,
3533,
403,
1774,
281,
1056,
247,
5272,
6803,
273,
253,
2929,
7613,
619,
17401,
310,
2581,
43095,
285,
891,
588,
5731,
352,
347,
3517,
347,
253,
2898,
9978,
4916,
625,
2590,
281,
479,
50276,
6438,
30080,
22559,
6701,
247,
2257,
323,
253,
4477,
32114,
891,
1335,
1158,
253,
30437,
273,
253,
2934,
778,
320,
20276,
533,
891,
1158,
253,
2934,
273,
295,
9245,
310,
402,
1330,
1574,
4722,
285,
4460,
281,
320,
7607,
281,
436,
8059,
7613,
891,
7164,
619,
4868,
281,
2997,
5474,
339,
793,
360,
3454,
436,
2929,
23970,
247,
11454,
38524,
17032,
1566,
326,
2789,
897,
273,
253,
8763,
3386,
273,
1016,
4666,
275,
247,
39762,
17032,
7792,
5742,
253,
8763,
27276,
1491,
310,
23115,
347,
3055,
4666,
275,
253,
4216,
15538,
253,
4836,
9376,
1160,
407,
253,
4477,
310,
326,
841,
47687,
3386,
2550,
320,
2540,
407,
643,
14429,
50275,
1987,
8303,
50276,
18,
253,
4477,
1750,
320,
281,
253,
806,
281,
1263,
253,
897,
273,
47687,
1491,
323,
1016,
10726,
275,
436,
3884,
374,
253,
4081,
2746,
5115,
1375,
23037,
14387,
1543,
1223,
760,
16984,
5927,
3081,
15180,
10454,
50275,
20881,
1255,
265,
50276,
783,
4836,
9376,
273,
253,
2929,
1057,
417,
1007,
15246,
281,
479,
891,
717,
38342,
670,
253,
8542,
1318,
273,
253,
4836,
4758,
253,
47687,
1491,
2550,
320,
19197,
407,
643,
14429,
533,
697,
1055,
588,
320,
2540,
275,
253,
2852,
310,
436,
247,
973,
35477,
4758,
403,
627,
8542,
15216,
273,
436,
9376,
50276,
28821,
253,
5368,
789,
273,
295,
363,
285,
253,
39762,
17032,
7792,
253,
2929,
310,
417,
2590,
670,
253,
9021,
323,
4227,
752,
403,
253,
747,
7881,
326,
878,
281,
320,
14042,
846,
16984,
253,
3055,
14429,
16280,
253,
1566,
35615,
752,
403,
253,
2234,
32771,
50274,
296,
3755,
20556,
50276,
783,
5661,
1543,
1007,
2266,
285,
11080,
1027,
15302,
403,
2783,
285,
253,
11701,
403,
1534,
50276,
1189,
455,
436,
2929,
556,
5183,
2266,
1543,
327,
11454,
38524,
17032,
8892,
253,
7350,
891,
452,
403,
670,
253,
9376,
273,
253,
2746,
285,
12744,
38135,
2792,
50276,
7152,
33032,
2520,
2929,
10262,
247,
11454,
38524,
17032,
1566,
342,
7632,
29765,
1491,
4679,
327,
1524,
10186,
15302,
17813,
253,
15785,
273,
253,
4081,
1332,
436,
2929,
10262,
247,
11454,
38524,
17032,
1566,
342,
7632,
29765,
1491,
4679,
327,
1524,
10186,
15302,
17813,
253,
15785,
273,
253,
4081,
1332,
50275,
296,
1746,
84,
337,
436,
2929,
310,
973,
3542,
285,
253,
2934,
310,
2590,
50276,
19,
253,
1543,
403,
12532,
2429,
281,
2710,
1666,
8737,
50275,
20881,
1255,
265,
337,
253,
11646,
403,
417,
1961,
1492,
352,
310,
2171,
280,
307,
323,
2571,
281,
18302,
253,
1543,
50276,
2520,
2929,
10262,
247,
11454,
38524,
17032,
1566,
342,
7632,
29765,
1491,
4679,
327,
1524,
10186,
15302,
17813,
253,
15785,
273,
253,
4081,
1332,
50275,
296,
1746,
84,
337,
436,
2929,
310,
973,
3542,
285,
253,
2934,
310,
2590,
50276,
19,
253,
1543,
403,
12532,
2429,
281,
2710,
1666,
8737,
50275,
20881,
1255,
265,
337,
253,
11646,
403,
417,
1961,
1492,
352,
310,
2171,
280,
307,
323,
2571,
281,
18302,
253,
1543,
50276,
187,
187,
4118,
18435,
27,
2520,
2929,
8725,
253,
11454,
38524,
17032,
7792,
323,
37851,
17032,
273,
5016,
2493,
875,
14429,
281,
247,
10076,
835,
14429,
778,
452,
3055,
3386,
534,
4419,
14586,
273,
253,
2629,
4216,
2349,
351,
398,
285,
1086,
351,
398,
275,
295,
363,
50276,
15337,
398,
14109,
1097,
253,
1566,
285,
253,
4583,
10636,
273,
253,
2929,
253,
3652,
8336,
403,
2590,
253,
7103,
1057,
697,
2628,
973,
253,
2022,
24626,
403,
670,
253,
30437,
273,
253,
4758,
323,
534,
253,
4477,
13414,
2085,
1512,
1142,
6667,
2299,
253,
5140,
310,
8489,
27350,
285,
1014,
275,
2219,
835,
3055,
12474,
403,
2649,
6843,
352,
778,
320,
9865,
281,
557,
290,
2134,
253,
3894,
494,
12474,
436,
1039,
359,
11907,
253,
30628,
281,
2319,
253,
30437,
247,
2372,
2007,
50276,
555,
993,
417,
41389,
4496,
4021,
5903,
342,
247,
15368,
2451,
254,
50275,
34263,
37102,
273,
305,
19493,
3185,
273,
305,
3561,
293,
50275,
10492,
273,
23256,
577,
2803,
91,
728,
50276,
19012,
1025
] |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the authors propose modifications of wellknown quasinewton methods for solving stronglyconvexstronglyconcave saddlepoint problems with lipschitz gradients and hessians the key algorithmic novelty is in changing the matrix that is inverted unlike quasinewton methods for minimization problems where the methods approximate the inverse of the hessian the proposed methods estimate the squared hessian this modification is important for the analysis of the methods
following the analysis in the case of minimization 37 38 39 24 the authors derive twoperiod local convergence results one period with linear convergence and one period with superlinear convergence for the general random broyden family with better rates for bfgs and sr1 these rates are similar to the ones known for minimization up to the replacement of the condition number kappa by kappa^2 the worsening in terms of kappa is expected since the methods use the inverse of the approximated square of the hessian
strengths
1 novelty and significance the results are novel there are only 2 papers that also propose quasinewton methods for minmax problems for nonlinear equations but they consider a different setup and use different ideas the topic is relevant to the ml community the obtained results are significant
2 clarity the results are presented in a clear way the proofs are also detailed enough and easy to follow
weaknesses
1 bad dependence on the condition number the derived results suffer from the squared dependence on the condition number kappa although i find the proposed approach of approximating the squared hessian matrix interesting it would be interesting to see some discussion about the possibility of avoiding such dependence
2 comparison with 46 and 23 is incomplete although the authors discuss the results and assumptions from the concurrent works 23 46 the explicit comparison of rates is missing i believe such a comparison of rates should be presented in the text moreover in the experiments it would be interesting to compare with the method from 23 as well although in 46 it is shown that the method from 23 is inferior to the one from 46 it is shown on particular problems different from the one considered in this paper therefore for completeness it is important to compare the proposed methods with the one from 23 as well
all assumptions are properly formulated there is no potential negative social impact since the work is purely theoretical
docsepin this paper the authors focus on applying quasinewton methods to solve stronglyconvexstronglyconcave saddle point problems they achieved explicit local superlinear convergence rates of quasinewton methods with broyden updates they utilized the technique of estimating the square of the indefinite hessian matrix instead of the hessian matrix itself they also reach the nonasymptotic convergence rates of the bfgs method and the sr1 method the authors conduct several numerical experiments on different datasets and these empirical results are consistent with the theoretical results
this paper has the following strengths in terms of originality quality and clarity
originality this paper provides the first explicit local superlinear convergence rates of quasinewton algorithms applied to saddle point problems this is the main and independent contribution of this paper
quality the authors provide detailed proofs in the theoretical section and consistent empirical results in the numerical experiments section these empirical results show that the theoretical results are correct in practice
clarity the paper is organized with a clear and clean structure and the language is wellorganized so that readers can easily understand the content of this submission
however despite the above advantages the paper has one weakness in terms of significance this submission presents incremental results on the nonasymptotic superlinear convergence analysis of quasinewton methods applied to different problems with different backgrounds all these proof ideas and techniques are based on the previous work and references as the authors mentioned in the paper the authors extend the current superlinear convergence results of quasinewton methods applied to general convex optimization problems to stronglyconvexstronglyconcave saddle point problems although the theoretical proof is correct and the authors presented consistent empirical results this does not change the essence that this work is an incremental analysis based on the previous similar work hence in summary although this submission is of high quality and improves our understanding of explicit superlinear convergence rates of quasinewton methods it is not totally original and 100 percent independent this is a submission of good quality but neither very impressive nor very significant
some other minor weaknesses include
1 the authors should use t as the time index instead of k because k is very similar to the condition number notation kappa and this can lead to misunderstanding
2 the authors should use d as the notation for the dimension instead of n in the numerical experiment section the authors used d and n is mostly used as the number of functions in the main objective function
3 the authors should use some other uppercase constant such as k or c to denote the hessian lipschitz parameter instead of l2 this notation is much clearer and more consistent
the authors have presented the limitations of this paper in the final conclusion section
i would like to see a more extensive discussion on how to store the approximate hessian efficiently nevertheless if all ingredients fit into memory and are computable the proposed algorithm is quite efficient in theory and practice from all experiments shown in this paper the proposed algorithms work pretty well eventually
however in section 52 it is better to address more clearly that the objective function satisfies all assumptions
first this paper requires sufficient smoothness of a stronglyconvex and strongly concave function f and the proposed algorithm requires the evaluation of the squared hessian which adds computational complexity
second the rate of local superlinear convergence is highly related to the dimension n which can be very large in practice if n is large then the rate will be close to the one for local linear convergence
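To make the squared-Hessian idea from these reviews concrete, here is a small self-contained numerical sketch on a 2D strongly-convex-strongly-concave quadratic. It is an illustration only, not the algorithm analyzed in the paper: the matrix `A` is meant to track H^2, it is initialized with a crude overestimate L^2·I, and the curvature pair y = H(Hs) used in the BFGS-style update is an assumption made for this toy example (the paper's greedy and random Broyden-family updates choose their directions differently).

```python
import numpy as np

def grad(z):                     # f(x, y) = 2x^2 + xy - 1.5y^2, saddle point at the origin
    x, y = z
    return np.array([4.0 * x + y, x - 3.0 * y])

def hess(z):                     # symmetric indefinite Hessian of f
    return np.array([[4.0, 1.0], [1.0, -3.0]])

z = np.array([2.0, -1.5])
L = 5.0                          # crude upper bound on the spectral norm of the Hessian
A = (L ** 2) * np.eye(2)         # initial estimate of H^2, chosen so that A >= H^2
for it in range(15):
    g, H = grad(z), hess(z)
    z_new = z - np.linalg.solve(A, H @ g)   # z <- z - A^{-1} H g  instead of  z - H^{-1} g
    s = z_new - z
    y = H @ (H @ s)              # secant target aimed at A ~ H^2 (assumed for this sketch)
    if abs(s @ y) > 1e-12:       # standard BFGS update of A
        A = A - np.outer(A @ s, A @ s) / (s @ A @ s) + np.outer(y, y) / (y @ s)
    z = z_new
    print(it, np.linalg.norm(grad(z)))
```

The printed gradient norm should shrink roughly linearly at first and then much faster once `A` tracks H^2 well along the visited directions, which loosely mirrors the two-period behavior the reviews describe.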
### Summary:
|
thank you for your submission to neurips the reviewers unanimously found the work to address an important relevant problem and the paper to be clear and generally wellwritten all four reviewers unanimously recommend accepting the paper
the work has obvious impact for the ml community the idea of rewriting the newton update $z \leftarrow z - H^{-1} g$ in terms of the positive definite squared hessian $z \leftarrow z - (H^{2})^{-1} H g$ and then using a quasinewton scheme to approximate $H^{2}$ is immediately intuitive and applicable to a wide range of practical problems the paper provides a rigorous guarantee that such a quasinewton scheme converges superlinearly within a neighborhood of the saddle point
please incorporate reviewer feedback in preparing the camera ready version in particular please take care to include the comparison to concurrent work 23 46 in your response to reviewer mk9
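For clarity, the linear-algebra identity behind that rewriting can be spelled out; this is a standard observation stated here for the reader, not quoted from the paper:

```latex
\[
  H^{-1} = (H^{2})^{-1} H
  \qquad\text{since}\qquad
  (H^{2})^{-1} H \cdot H = (H^{2})^{-1} H^{2} = I,
\]
\[
  z^{+} = z - H^{-1} g(z) = z - (H^{2})^{-1} H\, g(z),
  \qquad
  H^{2} \succ 0 \ \text{whenever } H \text{ is symmetric and nonsingular.}
\]
```

So a quasi-Newton method can maintain a positive definite approximation of H^2 with standard updates (BFGS, SR1, Broyden family) even though H itself is indefinite at a saddle point.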
|
[
30003,
310,
1677,
2278,
273,
247,
2561,
2929,
432,
260,
2369,
1793,
6698,
15,
7764,
3630,
247,
6010,
253,
2278,
15,
187,
4118,
8439,
27,
187,
783,
4477,
12661,
247,
14586,
273,
973,
4304,
21582,
460,
88,
1299,
3082,
323,
16161,
7052,
44181,
9072,
314,
45542,
1123,
26759,
10801,
3237,
342,
11233,
37913,
3805,
2880,
2224,
285,
344,
859,
2458,
253,
2234,
5933,
280,
38135,
310,
275,
6890,
253,
4315,
326,
310,
28483,
12401,
281,
21582,
460,
88,
1299,
3082,
323,
41458,
3237,
835,
253,
3082,
16851,
253,
13737,
273,
253,
344,
859,
757,
253,
4081,
3082,
6642,
253,
30044,
344,
859,
757,
436,
11237,
310,
1774,
323,
253,
1783,
273,
253,
3082,
50276,
34814,
253,
1783,
275,
253,
1083,
273,
41458,
5345,
6480,
6931,
2164,
253,
4477,
15313,
2500,
2211,
1970,
1980,
14940,
1543,
581,
2180,
342,
4872,
14940,
285,
581,
2180,
342,
2221,
8172,
14940,
323,
2087,
3632,
1795,
90,
3354,
2021,
342,
1805,
4142,
323,
270,
71,
5943,
285,
49975,
18,
841,
4142,
403,
2074,
281,
253,
4394,
1929,
323,
41458,
598,
281,
253,
5407,
273,
465,
5596,
1617,
1180,
407,
465,
5596,
19,
253,
43685,
275,
2426,
273,
465,
5596,
310,
3264,
1580,
253,
3082,
897,
253,
13737,
273,
253,
34930,
6278,
273,
253,
344,
859,
757,
50276,
296,
3755,
20556,
337,
38135,
285,
8453,
253,
1543,
403,
4460,
627,
403,
760,
374,
9380,
326,
671,
12661,
21582,
460,
88,
1299,
3082,
323,
1054,
4090,
3237,
323,
14561,
7424,
533,
597,
1908,
247,
1027,
9978,
285,
897,
1027,
5697,
253,
9400,
310,
4623,
281,
13361,
3114,
253,
2797,
1543,
403,
1534,
374,
19843,
253,
1543,
403,
3559,
275,
253,
2590,
1039,
253,
27947,
403,
671,
7000,
2217,
285,
3477,
281,
956,
50274,
20881,
1255,
265,
337,
3076,
10096,
327,
253,
1617,
1180,
253,
6012,
1543,
11089,
432,
253,
30044,
10096,
327,
253,
1617,
1180,
465,
5596,
3738,
891,
1089,
253,
4081,
2746,
273,
4020,
839,
253,
30044,
344,
859,
757,
4315,
4722,
352,
651,
320,
4722,
281,
923,
690,
5955,
670,
253,
6387,
273,
17816,
824,
10096,
374,
5301,
342,
7904,
285,
3495,
310,
18464,
3738,
253,
4477,
2319,
253,
1543,
285,
13260,
432,
253,
7036,
86,
624,
2987,
3495,
7904,
253,
6843,
5301,
273,
4142,
310,
5816,
891,
2868,
824,
247,
5301,
273,
4142,
943,
320,
3559,
275,
253,
2505,
25761,
275,
253,
4679,
352,
651,
320,
4722,
281,
7277,
342,
253,
1332,
432,
3495,
347,
973,
3738,
275,
7904,
352,
310,
2011,
326,
253,
1332,
432,
3495,
310,
18134,
281,
253,
581,
432,
7904,
352,
310,
2011,
327,
1798,
3237,
1027,
432,
253,
581,
2783,
275,
436,
2929,
3103,
323,
29867,
352,
310,
1774,
281,
7277,
253,
4081,
3082,
342,
253,
581,
432,
3495,
347,
973,
512,
13260,
403,
6283,
26115,
627,
403,
642,
2442,
4016,
2675,
3486,
1580,
253,
789,
310,
15846,
10527,
5474,
339,
9852,
436,
2929,
253,
4477,
2770,
327,
9433,
21582,
460,
88,
1299,
3082,
281,
8415,
253,
7052,
44181,
9072,
314,
45542,
1123,
26759,
1127,
3237,
597,
6786,
253,
6843,
1980,
2221,
8172,
14940,
4142,
273,
21582,
460,
88,
1299,
3082,
273,
1795,
90,
3354,
11269,
597,
12845,
253,
5853,
273,
26230,
253,
6278,
273,
253,
44245,
344,
859,
757,
4315,
3185,
273,
253,
344,
859,
757,
4315,
3139,
597,
671,
3986,
253,
1327,
284,
40045,
3875,
14940,
2281,
273,
270,
71,
5943,
1332,
285,
49975,
18,
1332,
253,
4477,
2589,
2067,
10704,
4679,
327,
1027,
15302,
285,
841,
16774,
1543,
403,
5185,
342,
253,
10527,
1543,
436,
2929,
556,
253,
1563,
20544,
275,
2426,
273,
253,
3236,
414,
3290,
285,
19843,
50276,
19164,
414,
436,
2929,
3400,
253,
806,
6843,
1980,
2221,
8172,
14940,
4142,
273,
21582,
460,
88,
1299,
11333,
3732,
281,
253,
26759,
1127,
3237,
436,
310,
253,
2022,
285,
3907,
7680,
273,
436,
2929,
50276,
15177,
253,
4477,
3400,
7000,
4737,
275,
253,
10527,
2593,
285,
253,
5185,
16774,
1543,
275,
253,
10704,
4679,
2593,
841,
16774,
1543,
921,
326,
253,
10527,
1543,
403,
3451,
275,
3946,
50276,
498,
15752,
253,
2929,
310,
10932,
342,
2590,
285,
4076,
2605,
285,
253,
3448,
310,
973,
34092,
594,
326,
253,
10668,
476,
4354,
2096,
253,
2600,
273,
436,
19529,
50276,
35529,
5747,
253,
1840,
11361,
253,
2929,
556,
253,
581,
14855,
275,
1307,
273,
253,
8453,
50276,
2520,
19529,
3559,
32809,
1543,
273,
253,
1327,
284,
40045,
3875,
2221,
8172,
14940,
1783,
273,
21582,
460,
88,
1299,
3082,
3732,
281,
1027,
3237,
342,
1027,
24550,
512,
841,
4737,
5697,
285,
5609,
403,
1754,
327,
253,
2045,
789,
285,
3806,
347,
253,
4477,
5393,
275,
253,
2929,
253,
4477,
9017,
253,
1655,
2221,
8172,
14940,
1543,
273,
21582,
460,
88,
1299,
3082,
3732,
281,
253,
2087,
17133,
13757,
3237,
281,
253,
7052,
44181,
9072,
314,
45542,
1123,
26759,
1127,
3237,
3738,
253,
10527,
4737,
310,
3451,
285,
253,
4477,
3559,
253,
5185,
16774,
1543,
436,
1057,
417,
1818,
253,
17718,
326,
436,
789,
310,
271,
32809,
1783,
1754,
327,
253,
2045,
2074,
789,
7613,
275,
6010,
3738,
436,
19529,
310,
342,
1029,
3290,
534,
19132,
776,
4685,
273,
6843,
2221,
8172,
14940,
4142,
273,
21582,
460,
88,
1299,
3082,
352,
310,
417,
9106,
3236,
285,
2233,
3907,
436,
310,
247,
19529,
342,
1175,
3290,
533,
6747,
1077,
13943,
4543,
1077,
1534,
2217,
50276,
8826,
643,
5884,
14855,
3797,
50276,
18,
253,
4477,
943,
897,
246,
347,
253,
673,
3605,
3185,
273,
253,
465,
984,
465,
310,
1077,
2074,
281,
253,
1617,
1180,
14951,
465,
5596,
436,
476,
1421,
281,
40663,
50276,
19,
253,
4477,
943,
897,
277,
347,
253,
14951,
273,
253,
7877,
3185,
273,
253,
295,
275,
253,
10704,
3368,
2593,
253,
4477,
908,
277,
295,
310,
6571,
908,
347,
253,
1180,
273,
3470,
275,
253,
2022,
8103,
1159,
50276,
20,
253,
4477,
943,
897,
690,
643,
5170,
2437,
3638,
824,
347,
465,
390,
260,
281,
9173,
253,
344,
859,
757,
5541,
37913,
4764,
3185,
273,
298,
19,
436,
14951,
310,
1199,
625,
2590,
285,
5185,
253,
4477,
452,
3559,
253,
7364,
273,
436,
2929,
275,
253,
2457,
6452,
2593,
5474,
33032,
2520,
2929,
29328,
247,
14871,
1795,
90,
3354,
2021,
11269,
323,
7052,
44181,
9072,
314,
40886,
26759,
1127,
3237,
285,
921,
326,
253,
5933,
33526,
247,
1980,
2221,
8172,
14940,
2281,
253,
2234,
2934,
310,
281,
5731,
253,
6278,
273,
253,
344,
859,
757,
3185,
273,
253,
344,
859,
757,
3139,
10704,
4679,
921,
326,
253,
4081,
5933,
41731,
13015,
8946,
806,
2621,
3082,
50275,
783,
4477,
39256,
253,
2523,
273,
10620,
342,
271,
44245,
344,
859,
757,
534,
310,
247,
25450,
49182,
26982,
323,
26759,
1127,
3237,
407,
9433,
270,
71,
5943,
1511,
11269,
281,
253,
6278,
273,
253,
344,
859,
757,
3185,
891,
717,
417,
6600,
273,
2045,
789,
326,
452,
4232,
2074,
23632,
323,
21582,
460,
88,
1299,
3082,
327,
26759,
1127,
3237,
594,
253,
1783,
3559,
275,
436,
2929,
476,
320,
273,
1600,
281,
253,
5723,
2824,
8446,
50274,
2577,
2022,
4468,
670,
436,
2929,
310,
253,
9759,
50276,
249,
436,
1655,
830,
253,
41818,
403,
1077,
14086,
285,
2223,
5611,
1293,
1199,
16038,
2593,
4562,
310,
3782,
2834,
281,
14390,
347,
253,
4477,
23970,
247,
1180,
273,
458,
44661,
275,
2045,
9380,
1293,
4518,
23157,
562,
247,
4096,
323,
841,
458,
44661,
891,
1158,
253,
9759,
651,
320,
10260,
5520,
604,
253,
2022,
1543,
403,
5611,
806,
285,
840,
253,
3309,
458,
44661,
323,
18597,
253,
2022,
1543,
403,
3559,
342,
690,
16038,
50275,
2072,
5474,
33032,
2520,
2929,
5223,
84,
38754,
21582,
460,
88,
1299,
1332,
281,
26759,
1127,
3237,
253,
4081,
11333,
403,
1754,
327,
26230,
253,
6278,
273,
44245,
344,
859,
757,
4315,
285,
1614,
562,
281,
452,
1980,
2221,
8172,
14940,
253,
2929,
310,
973,
18872,
352,
310,
4767,
4518,
323,
253,
4081,
5933,
326,
253,
2234,
310,
253,
13418,
273,
30044,
344,
859,
757,
891,
651,
751,
281,
923,
247,
625,
17193,
5955,
327,
849,
281,
4657,
253,
16851,
344,
859,
757,
14556,
17837,
604,
512,
12696,
4944,
715,
253,
3541,
285,
403,
2475,
494,
253,
4081,
5933,
310,
3240,
5919,
275,
3762,
285,
3946,
432,
512,
4679,
2011,
275,
436,
2929,
253,
4081,
11333,
789,
3965,
973,
6524,
2299,
275,
2593,
8073,
352,
310,
1805,
281,
2953,
625,
4518,
326,
253,
8103,
1159,
12310,
512,
13260,
50276,
7053,
436,
2929,
4419,
4209,
6032,
1255,
273,
247,
7052,
44181,
285,
7052,
40886,
1159,
269,
285,
4081,
5933,
4419,
253,
7103,
273,
30044,
344,
859,
757,
534,
1543,
275,
15180,
10454,
1273,
253,
2281,
273,
1980,
2221,
4872,
14940,
310,
4122,
2905,
281,
7877,
295,
534,
476,
320,
1077,
1781,
275,
3946,
604,
295,
310,
1781,
840,
253,
2281,
588,
320,
2810,
281,
253,
581,
323,
1980,
4872,
14940,
2490,
187,
4118,
18435,
27,
47033,
368,
323,
634,
19529,
281,
5723,
2824,
253,
30628,
38350,
1119,
253,
789,
281,
2953,
271,
1774,
4623,
1895,
285,
253,
2929,
281,
320,
2590,
285,
3839,
973,
15720,
512,
1740,
30628,
38350,
5583,
18738,
253,
2929,
50276,
783,
789,
556,
4755,
3486,
323,
253,
13361,
3114,
253,
2934,
273,
294,
17695,
253,
747,
1299,
5731,
1182,
50276,
91,
50276,
73,
18,
72,
275,
2426,
273,
253,
2762,
19040,
30044,
344,
859,
757,
1182,
50276,
91,
50276,
73,
19,
288,
305,
285,
840,
970,
247,
21582,
460,
88,
1299,
6974,
281,
16851,
288,
19,
310,
4745,
27350,
285,
7763,
281,
247,
4618,
2491,
273,
8542,
3237,
253,
2929,
3400,
247,
26565,
12215,
326,
824,
247,
21582,
460,
88,
1299,
6974,
26414,
2221,
1282,
1285,
1561,
247,
9168,
273,
253,
26759,
1127,
50275,
32897,
19071,
37317,
8680,
275,
13828,
253,
6568,
4704,
2715,
275,
1798,
4496,
1379,
1557,
281,
2486,
253,
5301,
281,
17336,
789,
3495,
7904,
275,
634,
2380,
281,
37317,
36904,
26
] |
[
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1
] |
[
30003,
310,
1677,
2278,
273,
247,
2561,
2929,
432,
260,
2369,
1793,
6698,
15,
7764,
3630,
247,
6010,
253,
2278,
15,
187,
4118,
8439,
27,
187,
783,
4477,
12661,
247,
14586,
273,
973,
4304,
21582,
460,
88,
1299,
3082,
323,
16161,
7052,
44181,
9072,
314,
45542,
1123,
26759,
10801,
3237,
342,
11233,
37913,
3805,
2880,
2224,
285,
344,
859,
2458,
253,
2234,
5933,
280,
38135,
310,
275,
6890,
253,
4315,
326,
310,
28483,
12401,
281,
21582,
460,
88,
1299,
3082,
323,
41458,
3237,
835,
253,
3082,
16851,
253,
13737,
273,
253,
344,
859,
757,
253,
4081,
3082,
6642,
253,
30044,
344,
859,
757,
436,
11237,
310,
1774,
323,
253,
1783,
273,
253,
3082,
50276,
34814,
253,
1783,
275,
253,
1083,
273,
41458,
5345,
6480,
6931,
2164,
253,
4477,
15313,
2500,
2211,
1970,
1980,
14940,
1543,
581,
2180,
342,
4872,
14940,
285,
581,
2180,
342,
2221,
8172,
14940,
323,
2087,
3632,
1795,
90,
3354,
2021,
342,
1805,
4142,
323,
270,
71,
5943,
285,
49975,
18,
841,
4142,
403,
2074,
281,
253,
4394,
1929,
323,
41458,
598,
281,
253,
5407,
273,
465,
5596,
1617,
1180,
407,
465,
5596,
19,
253,
43685,
275,
2426,
273,
465,
5596,
310,
3264,
1580,
253,
3082,
897,
253,
13737,
273,
253,
34930,
6278,
273,
253,
344,
859,
757,
50276,
296,
3755,
20556,
337,
38135,
285,
8453,
253,
1543,
403,
4460,
627,
403,
760,
374,
9380,
326,
671,
12661,
21582,
460,
88,
1299,
3082,
323,
1054,
4090,
3237,
323,
14561,
7424,
533,
597,
1908,
247,
1027,
9978,
285,
897,
1027,
5697,
253,
9400,
310,
4623,
281,
13361,
3114,
253,
2797,
1543,
403,
1534,
374,
19843,
253,
1543,
403,
3559,
275,
253,
2590,
1039,
253,
27947,
403,
671,
7000,
2217,
285,
3477,
281,
956,
50274,
20881,
1255,
265,
337,
3076,
10096,
327,
253,
1617,
1180,
253,
6012,
1543,
11089,
432,
253,
30044,
10096,
327,
253,
1617,
1180,
465,
5596,
3738,
891,
1089,
253,
4081,
2746,
273,
4020,
839,
253,
30044,
344,
859,
757,
4315,
4722,
352,
651,
320,
4722,
281,
923,
690,
5955,
670,
253,
6387,
273,
17816,
824,
10096,
374,
5301,
342,
7904,
285,
3495,
310,
18464,
3738,
253,
4477,
2319,
253,
1543,
285,
13260,
432,
253,
7036,
86,
624,
2987,
3495,
7904,
253,
6843,
5301,
273,
4142,
310,
5816,
891,
2868,
824,
247,
5301,
273,
4142,
943,
320,
3559,
275,
253,
2505,
25761,
275,
253,
4679,
352,
651,
320,
4722,
281,
7277,
342,
253,
1332,
432,
3495,
347,
973,
3738,
275,
7904,
352,
310,
2011,
326,
253,
1332,
432,
3495,
310,
18134,
281,
253,
581,
432,
7904,
352,
310,
2011,
327,
1798,
3237,
1027,
432,
253,
581,
2783,
275,
436,
2929,
3103,
323,
29867,
352,
310,
1774,
281,
7277,
253,
4081,
3082,
342,
253,
581,
432,
3495,
347,
973,
512,
13260,
403,
6283,
26115,
627,
403,
642,
2442,
4016,
2675,
3486,
1580,
253,
789,
310,
15846,
10527,
5474,
339,
9852,
436,
2929,
253,
4477,
2770,
327,
9433,
21582,
460,
88,
1299,
3082,
281,
8415,
253,
7052,
44181,
9072,
314,
45542,
1123,
26759,
1127,
3237,
597,
6786,
253,
6843,
1980,
2221,
8172,
14940,
4142,
273,
21582,
460,
88,
1299,
3082,
273,
1795,
90,
3354,
11269,
597,
12845,
253,
5853,
273,
26230,
253,
6278,
273,
253,
44245,
344,
859,
757,
4315,
3185,
273,
253,
344,
859,
757,
4315,
3139,
597,
671,
3986,
253,
1327,
284,
40045,
3875,
14940,
2281,
273,
270,
71,
5943,
1332,
285,
49975,
18,
1332,
253,
4477,
2589,
2067,
10704,
4679,
327,
1027,
15302,
285,
841,
16774,
1543,
403,
5185,
342,
253,
10527,
1543,
436,
2929,
556,
253,
1563,
20544,
275,
2426,
273,
253,
3236,
414,
3290,
285,
19843,
50276,
19164,
414,
436,
2929,
3400,
253,
806,
6843,
1980,
2221,
8172,
14940,
4142,
273,
21582,
460,
88,
1299,
11333,
3732,
281,
253,
26759,
1127,
3237,
436,
310,
253,
2022,
285,
3907,
7680,
273,
436,
2929,
50276,
15177,
253,
4477,
3400,
7000,
4737,
275,
253,
10527,
2593,
285,
253,
5185,
16774,
1543,
275,
253,
10704,
4679,
2593,
841,
16774,
1543,
921,
326,
253,
10527,
1543,
403,
3451,
275,
3946,
50276,
498,
15752,
253,
2929,
310,
10932,
342,
2590,
285,
4076,
2605,
285,
253,
3448,
310,
973,
34092,
594,
326,
253,
10668,
476,
4354,
2096,
253,
2600,
273,
436,
19529,
50276,
35529,
5747,
253,
1840,
11361,
253,
2929,
556,
253,
581,
14855,
275,
1307,
273,
253,
8453,
50276,
2520,
19529,
3559,
32809,
1543,
273,
253,
1327,
284,
40045,
3875,
2221,
8172,
14940,
1783,
273,
21582,
460,
88,
1299,
3082,
3732,
281,
1027,
3237,
342,
1027,
24550,
512,
841,
4737,
5697,
285,
5609,
403,
1754,
327,
253,
2045,
789,
285,
3806,
347,
253,
4477,
5393,
275,
253,
2929,
253,
4477,
9017,
253,
1655,
2221,
8172,
14940,
1543,
273,
21582,
460,
88,
1299,
3082,
3732,
281,
253,
2087,
17133,
13757,
3237,
281,
253,
7052,
44181,
9072,
314,
45542,
1123,
26759,
1127,
3237,
3738,
253,
10527,
4737,
310,
3451,
285,
253,
4477,
3559,
253,
5185,
16774,
1543,
436,
1057,
417,
1818,
253,
17718,
326,
436,
789,
310,
271,
32809,
1783,
1754,
327,
253,
2045,
2074,
789,
7613,
275,
6010,
3738,
436,
19529,
310,
342,
1029,
3290,
534,
19132,
776,
4685,
273,
6843,
2221,
8172,
14940,
4142,
273,
21582,
460,
88,
1299,
3082,
352,
310,
417,
9106,
3236,
285,
2233,
3907,
436,
310,
247,
19529,
342,
1175,
3290,
533,
6747,
1077,
13943,
4543,
1077,
1534,
2217,
50276,
8826,
643,
5884,
14855,
3797,
50276,
18,
253,
4477,
943,
897,
246,
347,
253,
673,
3605,
3185,
273,
253,
465,
984,
465,
310,
1077,
2074,
...
] |
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
in this paper the authors study the web navigation problem they propose behavioral cloning of the random trajectory method on wikipedia graphs they further apply their method to the fever verification task their experiments show that their method is effective in learning entity embeddings and a navigation policy it also succeeds in a competitive performance in the verification task strengths 1 the proposed method is more effective than the baseline methods and scalable to large wikipedia graphs 2 the extensive analysis of results and limitations of the study are presented weaknesses major 1 the proposed trajectories are very intuitive and lack novelty they simply use random walks with the probabilities being decided based on neighboring vertices 2 the proposed method is only compared with random and greedy methods not other stateoftheart methods hence it is difficult to measure effectiveness 3 a comprehensive efficiency analysis would be helpful comparison between the running times of the proposed methods and other methods would give a better understanding of the method minor 1 there are some typing issues for example on page 8 line 301 table 2 should be table 4 yes they mentioned two limitations of their work docsepthis paper focuses on the problem of web navigation the authors proposed to train an agent which can navigate a large wikipedia graph they first generate sampled trajectories on the wikipedia graph based on different methods eg random forward trajectories reverse trajectories shortest paths with these sampled trajectories the behavioral cloning ie supervised learning method is adopted to train the model the experiments show that the proposed method is not only able to efficiently navigate wikipedia graph but also have competitive performance on the fact verification task 1 strengths the proposed method works well on navigation task and obtains competitive performance on the fact verification task this paper is generally well written 2 weaknesses nogueira and cho 2016 is the most relevant work the proposed method webnav in nogueira and cho 2016 only gets 125 success rate while the proposed method in this paper obtains more than 90 success rate in a similar setting the reason for this large difference of performance is not explained or discussed in the experiments the authors have discussed the limitation of this work docsepthe paper presents a web navigation agent that learns to reach target webpages by hopping hyperlinks the paper uses transformerbased models that encode the local content and learn to select the link the agent moves to next this model is trained by behavior cloning of randomly sampled trajectories the authors show that this approach provides improved performance on navigation and fact verification benchmarks strengths the proposed system is clean and performs better than existing systems on navigation and fact verification benchmarks good execution the paper conducts various ablation studies such as different navigation setups and different design choices of the method trajectory sampling policy features etc weaknesses this might be due to the presentation of the paper but the technical novelty or contribution appears somewhat limited to me after reading the paper the method seems to be a fairly simple application of existing techniques such as transformer encoder and behavioral cloning training the experiments also seem not quite offering significant insights or takeaways for the neurips community it would be really helpful if the authors could crisply clarify 
the main contribution with respect to existing works eg is it a new methodology or a new empirical finding if so what might be the impact to the research community or an improved performance if so what exact difference from prior works is enabling the improved performance the navigation experiments are only conducted on a wikipedia graph which is smaller than the realworld web the authors discuss the limitation in the paper
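The sampling-then-cloning recipe described in these reviews can be made concrete with a small sketch. The graph interface, hop budget, and uniform link choice below are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (assumed interfaces, not the paper's code): sample random-walk
# trajectories over hyperlinks, then turn each hop into a supervised example
# for behavioral cloning of the link-selection policy.
import random

def sample_forward_trajectory(graph, start, max_hops=8):
    """Uniform random forward walk; graph[page] is the list of outgoing links."""
    path = [start]
    for _ in range(max_hops):
        links = graph[path[-1]]
        if not links:
            break
        path.append(random.choice(links))
    return path

def to_cloning_examples(graph, path):
    """Each hop becomes (current page, candidate links, demonstrated next link)."""
    return [
        {"page": cur, "candidates": graph[cur], "label": nxt}
        for cur, nxt in zip(path[:-1], path[1:])
    ]
```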
### Summary:
|
this paper aims to use the random walk to learn an effective link selection policy on a graph version of wikipedia they navigate graphstructured web data to find a target by hyperlinks within articles to effectively navigate on the web they first construct wikipedia as a graph where nodes represent a web page and edges denote the hyperlinks in the current page and then conduct several strategies for sampling random trajectories from the graph to find the path from the start node to target node overall this paper is interesting that building wikipedia as a graph to navigate the target however the novelty of this paper is not enough multihop reasoning in the knowledge graph has achieved significant progress the main idea of this paper is similar to multihop reasoning besides the authors only use the existing pathfinding method such as random walk to navigate the target although the authors claim that difference between their navigation and knowledge graph navigation a better pathfinding method satisfied wikipedia navigation should be explored since the constructed wikipedia graph contains natural language text the authors use a transformer to learn representations for nodes and edges benefiting from semantic content the learned representations can enhance the performance of downstream tasks such a method is a common technique applied in text graphs where nodes consist of text information the novel aspects only lie in the navigation policy network which computes possible actions and defines the probability for action selection instead of using pretraining embedding for action embeddings the authors utilize learnable embeddings for actions the authors apply the proposed navigation method to wikipedia for the fact verification task and achieve a significant boost when integrated into a simple tfidf scheme results in small graph navigation show that the proposed method accomplishes an outstanding performance in terms of success rate besides results demonstrate that the learned embeddings perform better than fixed embedding from pretrained language models for example the navigation policy network with a feedforward layer performs best in most cases the experimental results are consistent with the assumption
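As a rough illustration of the policy described in this summary, the sketch below scores candidate links by comparing an encoded page representation against learnable action embeddings and trains with cross-entropy on the demonstrated link; the dimensions, module names, and feed-forward head are assumptions rather than the paper's exact architecture.

```python
# Hypothetical link-selection policy head (PyTorch), assuming the current page
# has already been encoded (e.g. by a transformer) into a single vector.
import torch
import torch.nn as nn
import torch.nn.functional as F

class LinkPolicy(nn.Module):
    def __init__(self, num_links, dim=256):
        super().__init__()
        self.link_emb = nn.Embedding(num_links, dim)  # learnable action embeddings
        self.head = nn.Linear(dim, dim)               # feed-forward layer over the page encoding

    def forward(self, page_encoding, candidate_ids):
        query = self.head(page_encoding)              # (dim,)
        cands = self.link_emb(candidate_ids)          # (num_candidates, dim)
        return cands @ query                          # one score per candidate link

# Behavioral-cloning step: cross-entropy between the softmax over candidate
# scores and the index of the demonstrated next link, e.g.
# loss = F.cross_entropy(scores.unsqueeze(0), target_index.unsqueeze(0))
```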
|
[
30003, 310, 1677, ..., 9376, 50275
] |
[
1, 1, 1, ..., 1
] |
[
30003, 310, 1677, ..., 9376, 50275
] |
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper proposes an operator splitting method which solves a convex relaxation of a leargescale nonconvex optimization problem for analyzing the worstcase performance of deep neural networks against input perturbations the proposed method is modular scalable and parallelizable experimentally the authors demonstrate that the proposed method has tighter bounds on the worstcase performance of large cnns in image classification and rl settings while this paper is wellwritten and has proposed quite an interesting idea to improve the scalability of the neural network verification problem the idea of applying operator or variable splitting to neural networks is definitely not novel although neural network training and verification might be differet tasks however due to their highly similar problem formulations i woule expect the authors to discuss the similarities and difference between the proposed method and such related papers eg taylor et al 2016 cui et al 2019 zeng et al 2019 2021 i would raise my score if the authors can explain the above in their response and include such details in the revision of the paper cui y he z pang j s 2020 multicomposite nonconvex optimization for training deep neural networks siam journal on optimization 302 16931723 zeng j lau t t k lin s yao y 2019 global convergence of block coordinate descent in deep learning in icml zeng j lin s b yao y zhou d x 2021 on admm in deep learning convergence and saturationavoidance journal of machine learning research 22199 167 this paper has proposed an interesting idea ie operator splitting to improve the scalability of neural network verification but this idea is not novel for its applications to neural networks eg neural network training the paper will better position itself if more detailed comparisons are made with such methods in the existing literature docsepthe paper proposes a novel method for the neural network verification problem the authors use variable splitting and lagrangian relaxation to relax the verification problem which is then solved with admm the method is gpu friendly similar to previous work like dvijotham et al 2018 and bunel et al 2020a and claims to solve the relaxation to optimality strengths 1 the method uses variable splitting and lagrangian relaxation which have been used in previous verification works however the proposed variable splitting is novel as it allows closedform updates for the primal variables thus enjoying faster convergence than bunel et al 2020a 2 the authors have done a good job at discussing the differences with previous related works like dvijotham et al and bunel et al in section 3 concerns 1 incomplete verification experiments not thorough are the different methods using the same intermediate bounds i found in the supplementary that you computed intermediate bounds for bunel et al using kw how are intermediate bounds computed for your method the table and details are not very clear do the different rows correspond to the same network tested with different epsilon values 2 missing baselines in incomplete verification experiments the paper is using weaker baselines the authors should compare with a which is a published work at iclr21 they use a tighter relaxation tjandraatmadja et al 2020 and solve it using a lagrangian relaxation and active set based approach furthermore they are outperforming solvers from bunel et al 2020 which has been used as a baseline in this paper this also brings into question the relevance of this work because a outperforms bunel et al 2020 can your 
method be extended to the relaxation from tjandraatmadja et al 2020 it is not clear that this work can outperform a or work with tighter relaxations the authors should also compare with fastandcomplete xu et al 2021 betacrownwang et al 2021 as they use the same lp relaxation as this paper and are very effective this should be done in both table 1 and the scalability section as these methods can scale to such networks 3 missing complete verification experiments complete verification experiments are missing this has been done in works that have been used by the authors as a baseline bunel et al 2020 even the recent propagationbased solvers provide results on complete verification see xu et al 2021 and wang et al 2021 they are quite important to judge the practical relevance of the algorithm branchandbound takes memory speed and accuracy all into account thus is crucial to see the relevance of the work see vnncomp b for recent developments in this space the authors should use a branchandbound framework from the opensourced codes from vnncomp and plug deepsplit into them you should also add these bb baselines in the speedcomparison experiment bb solvers can be used as incomplete verifiers and that should be possible here since the network size is small 4 scalability experiments representation can you provide verified accuracy here this will help in checking if the method can in fact verify this size networks or if the bounds are too large to be relevant 5 overclaiming focus on networks whose convex relaxations were previously impossible to solve exactly due to their size in introduction seems to be an overclaim since lirpa seems to be not doing that bad and lirpa is not even the stateoftheart baseline is this statement corresponding to experiments in table 1 or the scalability experiment i am willing to update my rating depending on the authors response to these concerns a scaling the convex barrier with active sets alessandro de palma harkirat behl rudy r bunel philip torr m pawan kumar iclr 2021 b httpssitesgooglecomviewvnn20vnncomp the authors have proposed a novel method for the neural network verification problem it is better than bunel et al 2020 however authors have not compared against recent stateoftheart published baselines also complete verification experiments are missing docsepthis paper proposes an efficient solver for computing the convex relaxation of neural networks which is an important component of verification methods the method is based on admm with the authors proposing a novel decomposition of the optimization problem which allows them to get closed form solutions of all the intermediate steps of the algorithm the proposed method also improves on the guaranteed convergence rate and is shown to scale to larger networks than the ones considered in other works evaluation is performed on robustness verification on cifar10 and on reinforcement learning models strengths the paper is very clearly written with a clear narrative explaining how the algorithm is derived equation 3 to 13 with a helpful summary in algorithm 1 on how things tie together the appendix is very detailed and provide clear explanations for how to implement each component of the algorithm efficiently the contribution is clearly framed with regards to previous work the introduction and related work are a good summary of the current state of neural network verification and section 3 does a great job at highlighting the differences between the proposed methods and the most closely related published work 
highlighting why this results in significant improvements the significance of the contribution is also quite important the authors give convergence guarantees and show good convergence empirically with the optimization curves in the appendix which previous methods could not offer they even show that this results in improved certified accuracy for a given time budget the evaluation is done quite rigorously on several domains vision rl and at different scales comparison to gurobi on small models evaluation against related methods on cifar10 network of a standard size and evaluation of scalability against fast linear bound on resnet models weaknesses a potential improvement that this paper could benefit from is a comparison to methods such as the optimizedlirpa bounds from the fast and complete paper which the authors cite this would not need to include comparison to the branch and bound aspect of it but simply to the bound computation which rely on fast linear bounds but iteratively optimizes the slope of the relaxation this is equivalent to solving the same problem and has been empirically shown to be quite efficient a comparison of the tightness speed tradeoff that would result would be extremely useful the method described in this paper are valuable even if they are slower as they provide convergence guarantees which optimizedlirpa bounds do not very well written paper with interesting and significant contributions and which provide a great summary of the state of the field docsepthis paper proposes admm for neural network verification problems experiments are demonstrated to show its effectiveness and scalability this paper proposes an operator splitting method ie admm to deal with neural network verification problems this is because the admm is scalable to large datasets specifically they solve the convex relaxations of problems analytically extensive experiments on network verifications demonstrate outstanding performance as well as excellent scalability this paper has several limitations which are discussed as follows 1 some statements in the paper are unclear for example in the introduction the authors state that neural networks lack formal guarantees however the meaning of formal guarantees is unclear and how does it relate to safetycritical applications should be explained better as another example the background of the neural network verification problem is missing it is unclear what are potential applications of this problem and examples should be provided to better understand required properties 2 the novelty of this paper should be justified better the authors just utilize an existing optimization framework admm to relax nonconvex problems into convex ones which lead to analytic solutions as they mentioned several papers consider similar ideas but the difference seems subtle it is unclear how admm resolves the incomplete problem ie methods are guaranteed to detect all false negatives but also produce false positives moreover the admm usually converges slowly to the solution how do authors address this issue 3 experimental details are missing even though the authors show many experimental results the background of problems in the experiments is missing and little information such as architectures and hyperparameter settings is provided so it is difficult to reproducible experimental results this paper addresses an important problem however the unclear presentation the lack of novelty justification and missing experimental details should be improved to make it an acceptable paper 
in iclr docsepthe authors present deepsplit a novel solver for a popular neural network convex relaxation relying on admm and on a careful problem reformulation the authors achieve both a o1t converge rate and closedform solutions for the inner problems differently from previously presented solvers for the same relaxation computational results are presented for incomplete neural network verification showing that deesplit scales to a resnet18 and achieves better bounds in the same time compared to relevant baselines pros as implied in section 3 deepsplit is an improvement on two closelyrelated dual solvers for the same relaxation the convex hull of elementwise activation functions dvijotham et al 2018 presented a dual solver based on the lagrangian relaxation of problem 3 which converges to the same bounds as deepsplit with a rate of o1sqrtt bunel et al 2020a first employed variable splitting and an augmented lagrangian on the same relaxation improving upon the dual solver from dvijotham et al 2018 while the augmented lagrangian is differentiable the inner problems require the use of an iterative optimisation algorithm deepsplit adds a second variable split to the formulation from bunel et al 2020a so that the inner problems enjoy a closedform solution this results in faster empirical convergence and attains the o1t rate suggestion perhaps inverting the ordering of sections 3 and 2 could improve the papers readability cons the paper repeatedly claims that deepsplit solves the considered relaxation exactly seemingly contrasting this with other approaches however the methods from dvijotham et al 2018 and bunel et al 2020a enjoy the same property the only difference being a slower empirical convergence rate as a consequence i believe the authors should tone down claims such as that they compute bounds for networks whose convex relaxations were previously impossible to solve exactly due to their size furthermore in neural network verification one is typically concerned with verifying the largest number of properties in the smallest possible time rather than solving a given relaxation exactly suboptimal yet inexpensive bounds are then potentially preferable to solving a given relaxation to optimality especially considering that running a fixed small number of branch and bound steps will significantly tighten the bounds see for instance wang et al 2021 in other words methods based on branch and bound can be employed as incomplete verifiers if they are terminated early the cost of running recent branchandbound methods to termination is nevertheless significantly smaller than what the authors claim badnb from de palma et al 2021 verifies a large number of properties in under 10s for a network of 50k activations similar or better results were obtained by the participants of vnn comp 2020 or vnn comp 2021 i believe this is in the same order of magnitude as the network size employed for table 1 for which the reported runtime of deepsplit in the appendix is around 10s as well in addition recent works such as xu et al 2021 and bunel et al nonconvex 2020 have shown that nonconvex reformulations lacking convergence guarantees to the exact solution of the relaxation for relus obtain significantly better speedaccuracy tradeoffs on the same relaxation than dvijotham et al 2018 and bunel et al 2020a these newer works seemingly scale quite well with network sizes i hence doubt that the only applicable method to a resnet18 is nonoptimized lirpa as claimed by the authors i am willing to significantly increase my 
score if the authors address the following points table 1 should report the average runtime in addition to the certified test accuracy preferably different speedaccuracy tradeoffs should be reported for iterative algorithms the certified accuracy for dual decomp adam should not be lower than the one for linear as the latter can be used an initializer to the dual problem in bunel et al 2020a are the authors taking this into account could deepsplit enjoy the same initialization compare against the more recent bounding algorithms from bunel et al nonconvex 2020 and xu et al 2021 while introduced in the context of branch and bound the alphacrown method or lirpa with optimized slopes is a valid incomplete verifier and the global optimum of its optimization problem coincides with deepsplits furthermore it might yield tigther bounds than deepsplit in case of joint optimization over intermediate bounds the code of both algorithms is available online compare speedaccuracy tradeoffs against works operating on tighter relaxations such as active set considering that the memory footprint of alphacrown with fixed intermediate bounds is only marginally larger than the one for lirpa the authors should add this to the resnet18 experiments complete verification performance is tightly linked to the quality of the speedaccuracy tradeoffs of an incomplete verifier it would be quite informative if the authors could provide complete verification experiments on the model from table 1 minor comments end of page 1 these relaxations must typically be further relaxed this statement does not apply to dvijotham et al 2018 and bunel et al 2020a section 11 typo in reference de2 sec 32 when stopping early the primal minimization must be solved to convergence in order to compute the dual function and produce a valid bound this is incorrect as a valid bound can be obtained in closedform by minimising the lagrangian rather than augmented lagrangian references bunel et al nonconvex 2020 an efficient nonconvex reformulation of stagewise convex optimization problems neurips 2020 active set scaling the convex barrier with active sets iclr 2021 deepsplit is an improvement upon two closelyrelated solvers for the same convex relaxation dvijotham et al 2018 and bunel et al 2020a the work focuses on solving the relaxation exactly however as shown in a variety of recent papers xu et al 2021 active set bunel et al nonconvex 2020 wang et al 2021 heuristics and switching to tighter relaxations possibly considering branching might verify more properties in the same time both in incomplete and complete verification these developments have been somewhat ignored by the authors the authors should put their work in perspective with more recent and scalable optimizers for the same relaxation xu et al 2021 bunel et al nonconvex 2020 and with works on tighter relaxations active set furthermore as commonly done for solvers of the considered relaxation complete verification experiments are needed to fully assess the quality of deepsplits speedaccuracy tradeoffs
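To make the operator-splitting structure that these reviews keep referring to concrete, here is a textbook scaled-dual ADMM loop for a generic two-block split; the proximal operators are placeholders under the assumption that each block admits a closed-form update, and this is not DeepSplit's actual layer-wise scheme.

```python
# Generic ADMM sketch for min_x f(x) + g(z) s.t. x = z (scaled dual form).
# prox_f and prox_g are assumed closed-form proximal operators, mirroring the
# claim that every inner sub-problem can be solved analytically.
import numpy as np

def admm(prox_f, prox_g, dim, iters=200):
    x = np.zeros(dim)
    z = np.zeros(dim)
    u = np.zeros(dim)              # scaled dual variable
    for _ in range(iters):
        x = prox_f(z - u)          # closed-form primal update for the first block
        z = prox_g(x + u)          # closed-form primal update for the second block
        u = u + x - z              # dual ascent on the consensus constraint x = z
    return x, z, u
```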
### Summary:
|
the authors propose a novel operator splitting method for solving convex relaxations of neural network verification problems and develop and validate an optimized implementation of the same on large scale networks focusing on the problem of verifying robustness to norm bounded adversarial perturbations the reviewers agree that the paper contains interesting ideas that are worthy of further development and that these ideas may prove useful eventually in pushing the envelope of what is possible in neural network verification however in its current form the paper misses some key experimental evidence to rigorously evaluate the value of the contributions made 1 comparison against sota incomplete verifiers the authors do not provide detailed and rigorous comparisons against wellknown baselines for example the incomplete verifiers from fastandcomplete xu et al 2021 betacrownwang et al 2021 2 incorporating tighter relaxations it would be valuable for the community to understand whether the proposed algorithm is compatible with tighter relaxations like those of tjandraatmadja et al 2020 even if they are not it would be interesting to understand the comparison against standard solvers for these tighter relaxations compared against the advanced solver developed by the authors applied to the weaker relaxation 3 showing performance in the context of complete verification while this is not a requirement it would be great to see how the method performs in the conjunction with a branch and bound search as this sometimes reveals surprising tradeoffs or weaknesses of incomplete verifiers as observed in the results of betacrownwang et al 2021 i encourage the authors to strengthen the paper adding these experiments and resubmit to a future venue
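For reference, the norm-bounded robustness property mentioned in this meta-review is conventionally written as the sign of a worst-case margin; the formulation below uses standard notation and is an assumption about framing, not a quotation from the paper.

```latex
% Certify robustness of network f at input x_0 with true label y:
% the property holds if the worst-case margin is positive,
\min_{\|x - x_0\|_\infty \le \epsilon} \Big( f_y(x) - \max_{j \neq y} f_j(x) \Big) > 0 ,
% and an incomplete verifier replaces the nonconvex activation constraints with
% a convex relaxation, so any positive lower bound on the relaxed problem
% certifies the property.
```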
|
[
2442, 7756, 326, ..., 2852, 18767
] |
[
1, 1, 1, ..., 1
] |
[ … (labels column: integer token IDs, presumably the tokenized text of this row; full list omitted for readability) ] |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
summary this paper is built on top of the dnc model the authors observe a list of issues with the dnc model issues with the deallocation scheme issues with the blurring of forward and backward addressing and issues in contentbased addressing the authors propose changes in the network architecture to solve all three issues with toy experiments the authors demonstrate the usefulness of the proposed modifications to dnc the improvements are also seen in the more realistic babi tasks major comments the paper is well written and easy to follow the proposed improvements seem to result in very clear gains the proposed improvements also improve the convergence of the model i do not have any major concerns about the paper i think that the contributions of the paper are good enough to accept the paper i also appreciate that the authors have submitted the code to reproduce the results i am curious to know if the authors observe similar convergence gains in the babi tasks as well can you please provide the mean learning curve on the babi task for dnc vs the proposed modifications docsep overview this paper proposes modifications to the original differentiable neural computer architecture in three ways first by introducing a masked contentbased addressing which dynamically induces a keyvalue separation second by modifying the deallocation system by also multiplying the memory contents by a retention vector before an update finally the authors propose a modification of the link distribution through renormalization they provide some theoretical motivation and empirical evidence that it helps avoid memory aliasing the authors test their approach on some algorithmic tasks from the dnc paper copy associative recall and keyvalue retrieval and also on the babi dataset strengths overall i think the paper is wellwritten and proposes simple adaptations to the dnc architecture which are theoretically grounded and could be effective for improving general performance although the experimental results seem promising when comparing the modified architecture to the original dnc in my opinion there are a few fundamental problems in the empirical section see the weakness discussion below weaknesses not all model modifications are studied in all the algorithmic tasks for example in associative recall and keyvalue retrieval only dnc and dnc masking are studied for the babi task although there is a significant improvement of 43% in the mean error rate compared to the original dnc it is important to note that performance on this task has improved a lot since the dnc paper was released since this is the only nontoy task in the paper in my opinion the authors have to discuss the current sota on it and have to cite for example the universal transformer 1 entnet 2 relational nets 3 among other architectures that have shown recent advances on this benchmark moreover the sparse dnc rae et al 2016 is already a much better performer on this task mean error dnc 16.7 ± 7.6 dncmd this paper 9.5 ± 1.6 sparse dnc 6.4 ± 2.5 although the authors mention in the conclusion that it is future work to merge their proposed changes into the sparse dnc it is hard to know how relevant the improvements are knowing that there are much better baselines for this task it would also be good if besides the mean error rates they reported best runs chosen by performance on the validation task and the number of tasks solved with less than 5% error as is standard on this dataset smaller notes 1 in the abstract i find the message for motivating the masking from the sentence content based lookup results which is
not present in the key and need to be retrieved hard to understand by itself when i first read the abstract i could not understand what the authors wanted to communicate with it later in section 3.1 it became clear 2 page 3 beta in that equation is not defined 3 the first paragraph on page 5 uses the acronyms dncms and dncmds before they are defined 4 table 1 the difference between dnc and dnc dm is not clear i am assuming it is the numbers reported in the paper vs the authors implementation 5 in sections 3.1-3.3 for completeness i think it would be helpful to explicitly compare the equations from the original dnc paper with the newly proposed ones post rebuttal update i think the authors have addressed my main concern points and i am updating my score accordingly docsepthe authors propose three improvements to the dnc model masked attention erasure of deallocated elements and sharpened temporal links and show that this allows the model to solve synthetic memory tasks faster and with better precision they also show the model performs better on average on babi than the original dnc the negatives are that the paper does not really show this modified dnc can solve a task that the original dnc could not as the authors also admit there have been other dnc improvements that have had more dramatic improvements on babi i think the paper is particularly clearly written and i would vote for it being accepted as it has implications beyond the dnc the fact that masked attention works so much better than the standard cosineweighted contentbased attention is pretty interesting in itself the insights eg figure 5 are interesting and show the study is not just trying to be a benchmark paper for some toplevel results but actually cares about understanding a problem and fixing it although most recent memory architectures do not seem to have incorporated the dncs slightly complex memory deallocation scheme any resurgent work in this area would benefit from this study
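The modifications discussed in the reviews above are easiest to see in code. Below is a minimal NumPy sketch of the first two mechanisms, masked content-based addressing and retention-based deallocation; the names, shapes, and gating here are assumptions for illustration, not the paper's actual implementation, and the sharpened temporal links are omitted.

```python
# Minimal sketch (assumptions, not the authors' code) of two DNC modifications
# described in the reviews: masked content-based addressing and retention-based
# deallocation (erasing memory rows instead of only marking them reusable).
import numpy as np

def masked_content_addressing(memory, key, mask, beta):
    """memory: (N, W) rows; key, mask: (W,) with mask in [0, 1]; beta: sharpness."""
    k = key * mask                       # the mask selects which key dimensions matter
    m = memory * mask                    # ... and which memory dimensions are compared
    sim = m @ k / (np.linalg.norm(m, axis=1) * np.linalg.norm(k) + 1e-8)
    w = np.exp(beta * sim)
    return w / w.sum()                   # content-based read weights over the N rows

def deallocate(memory, retention):
    """retention: (N,) in [0, 1]; low-retention rows are erased so stale content
    can no longer be found by a later content lookup (less memory aliasing)."""
    return memory * retention[:, None]

# toy usage with an assumed 4-row, 6-wide memory
mem = np.random.randn(4, 6)
read_w = masked_content_addressing(mem, key=np.random.randn(6),
                                   mask=(np.arange(6) < 3).astype(float), beta=5.0)
mem = deallocate(mem, retention=np.array([1.0, 1.0, 0.1, 1.0]))
```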
### Summary:
|
pros identification of several interesting problems with the original dnc model masked attention erasure of deallocated elements and sharpened temporal links an improved architecture which addresses the issues and shows improved performance on synthetic memory tasks and babi over the original model clear writing cons does not really show this modified dnc can solve a task that the original dnc could not and the babi tasks are effectively solved anyway it is still not clear whether the dnc even with these improvements will have much impact beyond these toy tasks overall the reviewers found this to be a solid paper with a useful analysis and i agree i recommend acceptance
|
[ … (input_ids column: integer token IDs for this row's text; full list omitted for readability) ] |
[ 1, 1, 1, …, 1 (attention_mask column: all ones, one per token; full run omitted for readability) ] |
[ … (labels column: integer token IDs, presumably the tokenized text of this row; full list omitted for readability) ] |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the paper proposes a largescale highquality mandarin singing dataset m4singer which consists of 704 chinese songs recorded by 19 singers the authors further propose 4 applications for this dataset scorebased svs controllable singing voice csv singing voice conversion svc and automatic music transcription amt 1 the dataset contains highquality music that is clean and with high fidelity 2 compared to existing datasets m4singer has more hours of music and contains manual alignment and musical score annotation 3 the paper is overall wellwritten and the dataset is well documented 1 since the dataset utilizes music from other singers does the dataset get the license from the original singers for legitimate use 2 compared to some existing datasets like opensinger it seems that m4singer has fewer singers which may seem to lack some diversity docsepthe paper aims to tackle the unavailability of highquality and accurately labeled datasets for singing voice synthesis svs the paper proposes m4singer a collection of multistyle multisinger mandarin singing voices recorded by 19 professional singers covering 704 chinese pop songs in four vocal parts to demonstrate the use of the dataset experiments are performed on scorebased svs controllable singing voice singing voice conversion and automatic music transcription the paper is well motivated and does a good job describing the reasoning for different choices in the creation of the dataset a lot of effort seems to have gone into ensuring the correctness of the dataset including manual annotation scoring the music etc it seems like a good contribution to a limited body of musical datasets detailed benchmarks are provided to describe the use of the dataset the manual labeling is subject to human errors but the paper rightly notes this limitation the paper has several grammatical errors however none of them significantly detract from the ability to understand the paper it would still be helpful if the paper is carefully proofread to make it more accessible to readers docsepthis is a wellwritten paper that provides a reference for subsequent work this dataset is valuable in related areas but there are still some minor issues that need to be addressed it is a novel multisinger and multistyle singing voice corpus with refined manual alignment and musical score annotation the total duration of the songs in the corpus is uneven across different singers the repo link should ideally be placed in a more visible location an abx test should be added to the svc experiment the dataset is not publicly available docsepthis paper releases a new dataset a multistyle multisinger mandarin singing collection with elaborately annotated musical scores as well as its benchmarks the paper is wellwritten the only concern is that it is somewhat incremental on opencpop https://arxiv.org/pdf/2201.07429.pdf the difference seems to be including more singers for each song as seen from table 1 it is wellwritten the scale of the released dataset is much bigger than existing datasets it has an easytouse demo limited novelty considering existing datasets it would be nice to discuss the potential usage of the dataset for example who will benefit and in which aspects docsepthe paper introduces m4singer a musicvoice dataset of mandarin songs covering a range of singers and singing styles it comprises 30 hours of vocal data and handannotated musical scores with alignment information the benchmark consists of four tasks that deploy existing models and methods on m4singer and some insights are
drawn from these experiments the dataset is collected to reflect diversity of singing vocal data the dataset is sourced from professional singers after auditioning and recorded in a studio environment which guarantees its quality the singers agree to contribute under explicit terms in an agreement uploaded by the authors immense human effort music alignment and annotation is partly completed by hand in addition to uses of existing tools the authors also employ professionals to doublecheck annotations as a postprocessing step utility of the dataset mainly framed to address the svs problem m4singers ground truth achieves high subjective scores based on two metrics and it works well with two generative models the authors also demonstrate how the dataset can be used on new and different tasks dataset imbalance despite its wide coverage the dataset collects much more vocal data from female alto singers 10 hours compared to soprano 4 hours and from more male tenor singers compared to bass the authors should acknowledge this disparity and explain how this does or does not constitute an inherent limitation based on the experiments this imbalance could affect the finegrained performance of every task especially on voice conversion svc from a dataabundant class to a lessabundant one code and dataset availabilityaccessibility the authors provided a demo website with audio samples but the code repository is almost empty it is understandable that the dataset administrators screen users before giving access to the full resources however in their current state the provided resources are too skeletal what i am expecting as a minimum code for loading and preprocessing the data scripts to reproduce the experiments if given data or at least links to the existing implementations of baselines and scripts for generating the visualizations of the dataset for exploration purposes experimental details in writing the authors answer yes to questions 3a and 3b in the checklist but i do not think this is an accurate response i suggest that more details be added on the experimental procedure which includes data splits hyperparameters random seeds etc this can be done in either section 4 or the appendix without error bars it seems that the experiments are run for only one seed the authors should clarify andor justify this dataset impact if possible there could be more discussion on how the benchmark tasks help with downstream applications and their contribution to the asr field as of now only the svs problem seems significant based on previous works docsepthis paper presents m4singer a freetouse multistyle multisinger mandarin singing collection with elaborately annotated musical scores as well as its benchmarks this is very helpful for research related to singing voice synthesis the dataset contains a multisinger and multistyle mandarin singing voice corpus with refined manual alignment and musical score annotation 1 what is the specific process of coarse annotation and fine annotation it is very important for a singing voice synthesis dataset the authors should describe them in detail 2 the dataset is not publicly available this will limit contributions to singing voice synthesis 3 the data set can be obtained after application but it is uncertain whether it is consistent with the description at https://github.com/m4singer/m4singer
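One review above asks, as a minimum for accessibility, for code to load and preprocess the data. The following is a hypothetical sketch of what such a loader could look like; the folder layout, file names, and score fields are assumptions for illustration only, not M4Singer's actual release format.

```python
# Hypothetical loader sketch (assumed layout: one folder per clip containing a
# 16-bit mono PCM "vocal.wav" and a "score.json" with note/phoneme annotations).
import json
import wave
from pathlib import Path
import numpy as np

def load_clip(folder: Path):
    """Return (waveform, sample_rate, note_events) for one assumed clip folder."""
    with wave.open(str(folder / "vocal.wav"), "rb") as f:
        sr = f.getframerate()
        audio = np.frombuffer(f.readframes(f.getnframes()), dtype=np.int16)
    audio = audio.astype(np.float32) / 32768.0          # normalize 16-bit PCM to [-1, 1]
    # assumed schema: [{"pitch": ..., "onset": ..., "offset": ..., "phoneme": ...}, ...]
    notes = json.loads((folder / "score.json").read_text())
    return audio, sr, notes

# usage sketch with a made-up path
# audio, sr, notes = load_clip(Path("m4singer/Alto-1#SomeSong/0000"))
```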
### Summary:
|
the paper presents m4singer a mandarin singing dataset with diversity in multiple aspects with a few demonstrated application use cases the majority of reviewers gave positive ratings and acknowledged that the data is carefully collected and balanced the data is of high quality with diverse content and relatively large in scale therefore i recommend acceptance to this conference track my main concern is the accessibility of the dataset the call for papers page explicitly mentioned that a key criterion is accessibility datasets should be available and accessible ie the data can be found and obtained without a personal request to the pi and any required code should be open source however the dataset requires an application and reviewing process to be downloaded i can imagine this could severely hurt the adoption of the dataset in future research i recommend the authors find ways to truly release the dataset to the public eg by obtaining further consent from the selected singers the readme page at https://github.com/m4singer/m4singer also needs improvement for it to be really accessible
|
[ … (input_ids column: integer token IDs for this row's text; full list omitted for readability) ] |
[ 1, 1, 1, …, 1 (attention_mask column: all ones, one per token; full run omitted for readability) ] |
[labels column: token-id list of 1,486 values omitted] |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the paper proposes to use highdimensional sparse representation as opposed to lowdimensional dense representation for representing input text this can be useful in information retrieval applications the proposed model is evaluated on a product to product recommendation dataset and shows superior performance compared to the prior work the papers strengths 1 studying an important and interesting research problem ie learning sparse representation for information retrieval 2 the paper is wellwritten and easy to follow it is also wellstructured the results are well presented the papers weaknesses 1 in several places there are some overclaims that should be fixed for example in abstract and introduction when describing the papers contributions it is mentioned that in this paper we make a novel and unique argument that sparse high dimensional embeddings are superior to their dense low dimensional counterparts this is simply not true as we go forward in the related work we can see that the authors mention that snrm 40 has exactly the same idea with empirical evidences i think the authors should be more careful in claiming what is novel in their paper and what is borrowed from prior work 2 i think the weakest part of the paper is its evaluation here is the detailed comments 21 first of all it has been only evaluated on a single dataset it would be nice to see how the results can be generalized across datasets second the task they used for evaluating their model is pretty artificial there is no such realworld task as product to product recommendation and the experimental setting is not realistic at all i believe using such synthetic datasets for evaluating models is acceptable when there is no good alternative publicly available however in this case there are numerous largescale datasets that can be used to evaluate the model for example ms marco dataset from microsoft has been widely adopted by the research community and can be used for evaluating the model a lot of opendomain qa datasets are available too 22 as mentioned multiple times in the paper snrm is the only paper with the same idea of learning highdimensional sparse representation however the results reported in the table are strangely low looking at the snrm paper which was published in 2018 so its not an old paper we can see that it outperformed a lot of strong baselines on a number of benchmark i understand that sometimes it is difficult to reproduce the results from some papers however the snrm paper seems to be cited by over 60 papers and some of them reported its results on different datasets and they all reported a reasonable performance for the model i personally think that the authors did not carefully select the sparsity hyperparameter in the snrm model i also recommend the authors to include some nonneural baselines such as bm25 and rm3 as they sometimes outperform neural network models on information retrieval related tasks 3 it is not clear how scalable the model is and actually i have some doubt about its scalability what will happen when the number of items goes beyond tens of millions or even billions these questions are important to be answered when a model is proposed for information retrieval or recommender systems all in all i believe the authors are working on an interesting problem but the experimental results are not convincing and some claims should be changed some questions about scalability remain unanswereddocsepthe work describes an extreme classification strategy which leverages the computational efficiency of 
multiclass and multilabel efficiency on small label sets circa 10k composed with the nearorthogonality of random sparse embeddings while exploiting inherent parallelism of repeated independent instantiations of this primitive technique to mitigate statistical issues the use of fixed nearorthogonal label embeddings is elegantly motivated the computational properties of the technique are favorable especially amenability to inverted indexing and the statistical performance is competitive inference only gets passing treatment in the entire exposition without any supporting rationale for instance this reviewer was surprised that inference involves summing over predicted probabilities rather than summing logs of predicted probabilities essentially there are a collection of k independent predictors that we are trying to ensemble a problem that has received lots of attention in the literature making these connections would both help the reader and also potentially improve the techniquedocsepthe paper studies the problem of document retrieval using embedding based models it argues that performing nearneighbour search on a large number of dense embeddings hurts performance and accuracy as an alternative the paper proposes solar sparse orthogonal learned and random embeddings a model which uses highdimensional ultra sparse embeddings on which nearneighbour search can be done using simple lookup operations in solar the document labels are divided into equal chunks of sparse vector and independent models are learned for mapping the query to each chunk hence solar could be trained on multiple gpus in an embarrassingly parallel way without requiring any communication between the gpus the paper demonstrates the effectiveness of solar by comparing it against strong baselines on various recommendation and extreme classification datasets it also provides theoretical justification for solar by showing how onesided learning ie fixing label embedding but learning mapping from query to label is mathematically equivalent to twosided learning ie jointly mapping the label and query to a common space strong points wellmotivated problem the paper does a very good job in motivating the problem and highlighting the potential issue with dense embedding based models for large scale problems interesting result the paper argues the current convention for document retrieval is of using low dimensional dense embeddings it is thus interesting that a method that against the standard notions could perform so well practical approach the proposed method seems to be simple to implement and easily scalable to a large number of parallel processes this makes it practically valuable especially in largescale problems empirical evidence the paper provides strong empirical evidence solar outperforms industry standard embedding models while taking less query time on a product to product recommendation dataset solar also performs equally well on various extreme classification benchmarks the paper is well written and well structured the theoretical results are also presented in an easytofollow format with consistent notations weak points potential issuesquestions about the baselines i was wondering why are the baselines in table 1 trained for only 5 epochs and not until convergence can the performance of the baselines be improved by training more similarly i could not find information about the compute used during inference for the baselines and solar can the inference speed of the baselines and solar be improved by simply using more 
compute how sensitive is solar with respect to the random seeds chosen for the k models in other words how much would the performance vary if we start with a different set of random seeds overall recommendation although some information about the baselines is missing overall i feel that the paper presents an interesting and practical method with strong empirical evidence the paper is also well presented and is easy to follow i currently support acceptance but would form a final opinion based on the authors response docsepthis submission addresses the problem of learning document embeddings for document retrievalrecommendation tasks such tasks are characterized by a large number of documents and a large set of semantic class labels in contrast to the now standard approach of representing documents and their labels as dense low dimensional embeddings this work proposes to use sparse high dimensional embeddings for documentslabels and claims that the proposed approach has certain advantages over state of the art 1 it results in memory efficient and load balanced inverted index that facilitates fast index based lookup for retrieval 2 it enables distributed training on disjoint parts of the label vector and thereby speeding up training unlike the common approach of jointly learning the document and label embeddings the proposed approach employs random binary embeddings for labels the label embeddings are high dimensional sparse and mostly orthogonal by design each label in the training data is assigned a random ksparse only k 0 bits are 1 and rest are 0 binary embedding independent of other labels as well as documents associated with the label and any semantic features such as label texttitle the label might have when they are sufficiently high dimensional random embeddings ensure that the resulting inverted index is both balanced and each bucket in the index has only a small number of labels as the label embeddings are random and each bit in the embedding is independent of all other bits by design the embeddings can be partitioned into multiple disjoint blocks and each block can be used to independently learn partitioned embeddings for the documents as this requires no communication between the different threads working on different blocks training can be very fast to keep the model size low the work proposes to use a simple feedforward network with only one hidden layer there is one such network for each block and as they are trained independently there is no sharing among the models also unlike other approaches the proposed work doesnt make use of pretrained word embeddings for training document embeddings instead it uses a feature hashing scheme to map the very high dimensional text data to reasonably low dimension this eliminates the need to store word embeddings in memory as feature hashing is memory efficient the only structures that need to be in memory are the parameters of the feedforward network and the inverted index thereby achieving memory efficiency at query time the submission presents experimental results comparing the proposed approach with a few baselines the results are promising in terms of retrieval efficacy on multiple data sets while the proposed approach is well engineered to achieve its twin goals of memory efficient and load balanced inverted index and distributed training there are a few concerns about the methodology first is about the rationale behind using random binary embeddings for labels as observed earlier such an embedding is independent of other labels as 
well as documents associated with the label and any semantic features such as label texttitle the label might have this seems to be a rather poor utilization of the rich semantic relatedness in the data for the sake of engineering gains it can be argued that methods that ignore label correlations and other semantic relationships in the data are suboptimal second concern is about the harmful consequences of random embeddings as authors note in the appendix unrelated labels fall into the same bucket as each bucket has only a small number of labels it is unlikely that a significant number of the labels assigned to a bucket are related therefore it is unclear to this reviewer what does a bucket represent semantically other than being a collection of unrelated labels would this not affect the training of the classifier mapping documents to the bucket adversely empirical results seem to suggest otherwise but this is an issue that needs both deeper theoretical and empirical investigation further in the proposed approach semantically related labels are likely to have dissimilar embeddings this goes against the traditional view particularly in semantic hashing that similar labels should have similar embeddings third authors claim that onesided and twosided learning are mathematically equivalent this is true only when label embeddings are orthogonal when the embeddings are low dimensional as in other methods orthogonality of label embeddings is neither strictly guaranteed nor necessary fourth though authors dont say it k and b are hyper parameters in the proposed approach the tuning of these hyper parameters does add a significant overhead to training which is not shown in the experiments so the claim on significant training speed gains over baselines is suspect fifth it is not clear whether the proposed work uses the same tokenization scheme as used by the authors of dssm viz unigramsbigramschar trigramsoov in the experimental comparison with dssm as observed in dssm paper the choice of tokenization affects the retrieval accuracy significantly and a fair comparison would necessarily need to use the best tokenization scheme some suggested corrections 1 in figure 1 classifier 1 appears twice 2 in figures 1 and 3 k should be k 3 in the feature ablation study the best performing setting is b 30k k 32 based on p1 p3 p5 reported in table 3 whereas the submission claims that b 20k k 32 is the optimal setting based on author feedback here are some additional comments i would like to thank the authors for their response to the reviews as noted in my original review the proposed method is a wellengineered method for a particular type of document recommendation problem empirical performance on the chosen data sets is impressive compared to the baselines the claim on gain in training speed is suspect as there is a hyper parameter tuning step that has been not taken into account while reporting the speed gain over the baselines label correlations are not taken into account for label embeddings and might hurt the performance when there are not many documents in the training data for the long tail of labels in many realworld applications random embedding puts unrelated labels into the same bucket though this doesnt seem to hurt retrieval performance in the experiments reported in the submission thanks to filtering it is not clear how training will be affected by this nonsemantic bucketing of labels overall i think the submission has several things going in its favor though there is substantial scope for 
strengthening ive updated the rating
### Summary:
|
the paper proposes an approach to learn sparse embeddings for documentslabels which can be trained by using multiple gpus in parallel and are more amenable to nearest neighbor search the paper certainly seemed to have botched comparison to snrm and requires to fix the claims in section 51 but the impressive performance on extreme classification tasks is quite convincing also reviewers in general are quite enthusiastic about the paper so we would recommend the paper for acceptance but authors certainly need to take comments of reviewers into account especially around baselines and comparison to snrm
|
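The reviews and meta-review in the row above keep returning to one mechanism: each label gets a fixed random k-sparse binary embedding, the near-orthogonality of those embeddings makes one-sided learning workable, and retrieval reduces to lookups in a load-balanced inverted index. A minimal sketch of that idea, assuming illustrative sizes and names throughout (EMBED_DIM, K, the scoring rule, and the toy query are not the paper's actual implementation):

```python
import numpy as np

# Illustrative sizes only; not the values used in the paper under review.
NUM_LABELS = 1_000    # catalogue of labels/documents
EMBED_DIM = 20_000    # width of the sparse embedding (the reviews' B)
K = 32                # non-zero bits per label embedding (the reviews' k)

rng = np.random.default_rng(0)

# Fixed random k-sparse binary embedding per label: K positions set to 1,
# drawn independently of the label's text and of every other label.
label_bits = [rng.choice(EMBED_DIM, size=K, replace=False) for _ in range(NUM_LABELS)]

# Inverted index: for each embedding position, the labels active there.
inverted_index = [[] for _ in range(EMBED_DIM)]
for label, bits in enumerate(label_bits):
    for b in bits:
        inverted_index[b].append(label)

def retrieve(query_scores, top_n=5):
    """Score labels by summing the query's activations over their K active
    positions, touching only the buckets the query cares about."""
    active = np.argsort(query_scores)[-K:]          # keep the query sparse too
    scores = np.zeros(NUM_LABELS)
    for pos in active:
        for label in inverted_index[pos]:
            scores[label] += query_scores[pos]
    top = np.argsort(scores)[::-1][:top_n]
    return top, scores[top]

# Toy query: a random score vector standing in for the trained model's output.
top_labels, top_scores = retrieve(rng.random(EMBED_DIM))
print(top_labels, top_scores)
```

Because each label touches only K buckets chosen uniformly at random, buckets stay small and roughly balanced, which is what makes the index lookup cheap at query time.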
[input_ids column: token-id list of 2,048 values omitted] |
[attention_mask column: 2,048 ones omitted] |
[labels column: token-id list of 2,048 values omitted] |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
This paper proposes a new training-free NAS method in which it is not necessary to optimize the weight parameters of the target networks during architecture search. To achieve this, the paper exploits the capability of the NTK to estimate the performance of candidate architectures at weight initialization; the proposed method can therefore avoid network training during the search and achieve a much more efficient architecture search. The experimental results show that the proposed method achieves competitive performance with existing methods and can also adapt to label- and data-agnostic scenarios.

Strengths: the proposed method does not require model training for architecture search, so it is much more efficient than previous NAS methods. In addition to the efficiency, the proposed method can adapt to label- and data-agnostic situations.

Concerns: although NTK theory assumes infinite-depth DNNs, the proposed method does not seem to meet that condition; it would be nice to explain why the proposed method still works. I am also interested in the lower bound on the network depth at which the proposed method works well. The main concern about this method is whether it actually finds good architectures. Firstly, recent studies have claimed that NAS methods find architectures with good performance, but the rank of the found architecture is far from the best (Yu, ICLR'20); this comes from the fact that many candidate CNN architectures in the search space can achieve good performance on benchmark datasets such as CIFAR-10. In that sense, even though the proposed method achieves good performance on the benchmark datasets, I wonder whether the found architectures are really good; to check the rank of the architectures, it would be nice to use NAS-Bench-101 (Ying, ICML'19). Secondly, a recent study has argued that training protocols (e.g., data augmentation, training epochs) affect performance more than the search algorithm does (Yang, ICLR'20). Since the training protocol strongly affects the performance of DNNs, I wonder whether it also affects the proposed method's estimation of candidate-architecture performance. Moreover, the proposed method is inferior to another training-free method according to Table 4 in the appendix, but looks better in the main paper; it would be nice to provide the cause and a deeper analysis to better understand the proposed method. Finally, is it possible to directly apply the proposed method to a large dataset such as ImageNet?

References: Yu, ICLR'20, Evaluating the search phase of neural architecture search; Ying, ICML'19, NAS-Bench-101: Towards reproducible neural architecture search; Yang, ICLR'20, NAS evaluation is frustratingly hard.

The proposed method shows promising results. My major concern is about the empirical and theoretical analysis (see the concerns above); hopefully the authors can address it in the rebuttal period.

docsep

This paper proposes a training-free NAS method called NASI, which exploits the neural tangent kernel (NTK) to characterize the performance of candidate architectures at initialization. To alleviate the costly evaluation of the NTK, the authors apply an approximation with a form similar to gradient flow. Moreover, they combine this NTK trick with a gradient-based NAS algorithm via Gumbel-softmax to solve the NAS problem efficiently. The experimental results on various benchmarks illustrate the effectiveness of NASI.

Strengths: training-free NAS is interesting and arouses a lot of interest in the NAS field. Based on the unchanged characteristic of the NTK, this paper utilizes the NTK to evaluate candidate architectures at initialization. NASI is well supported by NTK theory and achieves competitive results empirically.

Weaknesses: my main concern is the approximations made in the paper. Since the NTK relies on certain assumptions about the neural network, the unchanged characteristic may not hold for some architectures. To calculate the trace norm of the NTK and to solve the NAS problem efficiently, the authors apply further approximations; since three approximations are applied in total, the final results may deviate substantially. Besides, I also have the following questions about the paper. 1) NASI is based on NTK theory, which has the infinite-width assumption, and there are additional approximations in Sections 3.2 and 3.3; can you discuss the bound of the approximation error, or at least verify it empirically? 2) In Proposition 1 you discuss the upper bound of the training loss; however, when the architecture converges, we focus on the minimum value of the training loss, so I am confused by the claim "we can simply minimize the upper bound of $\mathcal{L}_t$" below Proposition 1. 3) There are other training-free NAS methods, e.g., [1]; can you compare NASI with them in a fair manner? [1] Mellor, Joe, et al. Neural architecture search without training. International Conference on Machine Learning, 2021.

NASI applies approximations for neural architecture search at initialization based on the NTK; however, some of them are not convincing to me (see the main review above). If the authors can provide detailed explanations, I would consider raising my score.

docsep

This work proposes to search for good candidate neural architectures at initialization (NASI), so that model training can be completely avoided during the search.

Strengths: the target problem of searching for neural architectures at initialization is important and promising. The paper shows that NASI is guaranteed to be label- and data-agnostic under mild conditions, which demonstrates that the searched architecture can be transferred to different datasets or tasks. The retrained accuracy is promising and comparable to, or even better than, other gradient-based search methods.

Weakness: the authors claim competitive effectiveness on both CIFAR and ImageNet, while I only find CIFAR results in Table 1 and ImageNet-16-200 in Table 4; also, "16" and "200" are not defined. How comparable is the achieved accuracy to other NAS methods, e.g., FBNetV3 or EfficientNet?

This work proposes to search for good candidate neural architectures at initialization (NASI) so that model training can be completely avoided during the search, and a theoretical analysis based on NTK tools is provided to analyze the optimization. The authors replied to my previous major concern with clarity, so I raised my score to 6.

docsep

This paper casts NAS as a training-free evaluation process using the neural tangent kernel (NTK). Specifically, the paper argues that the training dynamics and the performance of a DNN can be determined by the constant NTK of its linearization. Moreover, to efficiently evaluate the constant NTK of any architecture, the paper proposes to use the trace norm of the NTK at initialization as an approximation. Using the proposed NAS method, one can search high-quality architectures within a few GPU-hours. Interestingly, the proposed method is robust when applied in a data/label-free search setting. Extensive experiments show that the searched networks perform well and transfer well to other datasets.

Pros: the idea of using the NTK as a network performance predictor for NAS is interesting, and the detailed method is nontrivial and plausible, including the relaxation and sampling used to solve the intractable optimization. The paper is well written and easy to follow, and it provides sufficient theoretical proof of its claims, including the assumptions made. The search costs are very appealing, and the performance of the searched architectures is good. The proposed method can be data/label-free, which is a useful property that most NAS methods do not have.

Cons: the performance of NASI is worse than TE-NAS on NAS-Bench-201 (Table 4), and the advantage of NASI on ImageNet over the state of the art is not significant (Table 5). How do the results change if NASI is directly applied on ImageNet instead of transferring the best architectures searched on CIFAR-10?

In general, this paper is of good quality and exceeds the bar for acceptance at ICLR. Although I have some minor concerns about the performance, I think the paper makes valuable contributions to the NAS community and should be recommended.
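To make the training-free scoring idea discussed in the reviews above concrete, the following is a minimal, hypothetical sketch of an NTK-trace score computed at initialization; it is not the authors' NASI implementation, and the model and probe-batch names are placeholders. It relies on the fact that, for a scalar-output network, the trace of the empirical NTK on a batch equals the sum of squared parameter gradients of the outputs, so no weight training is required.

```python
# Minimal sketch (assumptions: a PyTorch model with scalar-reducible outputs and
# a small unlabeled probe batch). Not the paper's actual code.
import torch

def ntk_trace_score(model, probe_batch):
    """Approximate trace of the empirical NTK of `model` at initialization:
    sum_i ||grad_theta f(x_i)||^2 over a small probe batch."""
    params = [p for p in model.parameters() if p.requires_grad]
    score = 0.0
    for x in probe_batch:
        out = model(x.unsqueeze(0)).sum()          # reduce outputs to a scalar
        grads = torch.autograd.grad(out, params, allow_unused=True)
        score += sum(float(g.pow(2).sum()) for g in grads if g is not None)
    return score

# Hypothetical usage: rank randomly initialized candidates without any training.
# `sample_architecture` and `probe_batch` stand in for a real search space/data.
# candidates = [sample_architecture() for _ in range(100)]
# best = max(candidates, key=lambda m: ntk_trace_score(m, probe_batch))
```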
### Summary:
|
This paper proposes an efficient training-free NAS method, NASI, which exploits the neural tangent kernel's (NTK's) ability to estimate the performance of candidate architectures. Specifically, the authors provide a theoretical analysis showing that NAS can be realizable at initialization and propose an efficient approximation of the trace norm of the NTK, with a form similar to gradient flow, to alleviate the prohibitive cost of computing the NTK. Since the method is training-free, NASI is also label- and data-agnostic. The experimental validation shows that NASI either outperforms or performs comparably to existing training-based and training-free NAS methods while being significantly more efficient. Below is a summary of the pros and cons of the paper.

Pros: the idea of using the NTK to predict the performance of candidate neural architectures is both novel and promising, and the proposed analysis and efficient approximation are nontrivial. The paper provides sufficient theoretical proof of its claims, including the assumptions made. The method is highly efficient in terms of search cost, and the searched architectures obtain good performance on benchmark datasets. The method is data/label-free and thus allows transferring architectures across tasks. The paper is well written.

Cons: there is no result on ImageNet obtained by directly applying NASI to it.

The initial reviews were split due to further concerns regarding whether the proposed method finds good architectures, a missing comparison against certain training-free baselines, and some unclear descriptions; however, these were addressed by the authors during the rebuttal period, which led to a consensus to accept the paper. In sum, this is a strong paper that proposes a novel idea for training-free NAS, and the proposed method appears to be effective, efficient, and to generalize well across tasks. One remaining concern is the computational cost of running the method on larger datasets such as ImageNet, and I suggest the authors report those results and the running time in the final paper. Another suggestion is to include a discussion of, or comparison to, other efficient NAS methods based on meta-learning, such as MetaD2A (Lee et al., '21), which is not training-free but is more efficient than the proposed NASI. (Lee et al., '21: Rapid neural architecture search by learning to generate graphs from datasets, ICLR 2021.)
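As a further illustration of the gradient-based relaxation mentioned in the reviews, here is a short, generic Gumbel-softmax mixed-operation sketch of the kind commonly used to make architecture choices differentiable. It is illustrative only, not the paper's search code; the list of candidate operations `ops` is a placeholder.

```python
# Generic sketch of a Gumbel-softmax relaxed operation choice (assumption:
# `ops` is a list of candidate nn.Module layers with matching shapes).
import torch
import torch.nn as nn
import torch.nn.functional as F

class MixedOp(nn.Module):
    def __init__(self, ops):
        super().__init__()
        self.ops = nn.ModuleList(ops)
        self.alpha = nn.Parameter(torch.zeros(len(ops)))  # architecture logits

    def forward(self, x, tau=1.0):
        # Near-one-hot sample that still passes gradients back to self.alpha,
        # so architecture parameters can be optimized without training weights
        # to convergence for every candidate.
        w = F.gumbel_softmax(self.alpha, tau=tau, hard=True)
        return sum(wi * op(x) for wi, op in zip(w, self.ops))
```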
|
[
30003,
310,
1677,
2278,
273,
247,
2561,
2929,
432,
260,
2369,
1793,
6698,
15,
7764,
3630,
247,
6010,
253,
2278,
15,
187,
4118,
8439,
27,
187,
2520,
2929,
29328,
247,
747,
3733,
4924,
13332,
1332,
835,
352,
310,
417,
3309,
281,
22318,
253,
2801,
3602,
273,
2303,
6928,
323,
10336,
3186,
281,
5115,
436,
253,
2929,
40725,
253,
14603,
273,
295,
17922,
323,
26230,
253,
3045,
273,
7431,
35615,
387,
2801,
31850,
3021,
253,
4081,
1332,
476,
3693,
2990,
3733,
1309,
253,
3186,
285,
5115,
247,
1199,
5919,
10336,
3186,
253,
5661,
1543,
921,
326,
253,
4081,
1332,
33526,
12085,
3045,
342,
5368,
3082,
285,
671,
476,
5223,
281,
253,
5203,
285,
941,
1530,
6932,
15216,
50275,
783,
4081,
1332,
1057,
417,
2430,
253,
1566,
3733,
323,
10336,
3186,
3021,
352,
310,
1199,
625,
5919,
685,
253,
2045,
13332,
1332,
50276,
249,
1635,
281,
253,
6733,
253,
4081,
1332,
476,
5223,
281,
253,
5203,
285,
941,
1530,
6932,
9534,
50275,
585,
1209,
2224,
50276,
20261,
295,
17922,
19584,
38353,
959,
554,
394,
277,
79,
2224,
253,
4081,
1332,
1057,
417,
1646,
281,
2525,
253,
1617,
352,
651,
320,
5322,
281,
2085,
253,
1921,
2139,
253,
4081,
1332,
476,
789,
671,
891,
717,
6110,
275,
253,
2406,
3033,
273,
253,
2990,
6864,
835,
253,
4081,
1332,
476,
789,
973,
50275,
783,
2022,
4468,
670,
436,
1332,
310,
1880,
253,
4081,
1332,
9010,
1175,
35615,
41005,
3332,
2175,
452,
7558,
326,
13332,
3082,
1089,
35615,
4645,
1175,
3045,
533,
253,
5958,
273,
253,
1119,
10336,
310,
2080,
432,
253,
1682,
340,
86,
17857,
32888,
938,
436,
3249,
432,
253,
958,
326,
1142,
9183,
273,
260,
9866,
35615,
275,
253,
3186,
2317,
476,
5115,
1175,
3045,
327,
253,
22791,
10895,
824,
347,
260,
338,
274,
740,
275,
326,
3282,
1014,
2167,
253,
4081,
1332,
33526,
1175,
3045,
327,
253,
22791,
15302,
891,
717,
12371,
604,
253,
1119,
35615,
403,
1663,
1175,
390,
417,
281,
2451,
253,
5958,
273,
253,
35615,
352,
651,
320,
5322,
281,
897,
13332,
31591,
6903,
340,
272,
17857,
1686,
746,
1273,
314,
247,
3332,
1263,
556,
9125,
326,
3733,
14238,
24088,
941,
42072,
3733,
44540,
2818,
625,
327,
3045,
2581,
685,
3186,
11333,
30966,
17857,
32888,
938,
275,
958,
253,
3733,
7241,
11852,
247,
2257,
327,
253,
3045,
273,
277,
79,
2224,
594,
891,
717,
12371,
604,
253,
3733,
7241,
11852,
253,
13418,
273,
253,
3045,
273,
7431,
35615,
407,
253,
4081,
1332,
390,
417,
50275,
783,
4081,
1332,
310,
18134,
281,
1529,
3733,
4924,
1332,
2556,
281,
2829,
577,
275,
253,
30762,
533,
2722,
1805,
275,
253,
2022,
2929,
352,
651,
320,
5322,
281,
2085,
697,
2847,
285,
12861,
1783,
281,
1805,
2096,
253,
4081,
1332,
50275,
261,
352,
1896,
281,
3587,
4647,
253,
4081,
1332,
281,
247,
1781,
10895,
824,
347,
4440,
257,
292,
50276,
30838,
17857,
32888,
938,
16344,
253,
3186,
3408,
273,
11454,
10336,
3186,
17857,
32888,
938,
50276,
3184,
17857,
1686,
746,
13332,
31591,
6903,
4404,
41374,
11454,
10336,
3186,
17857,
1686,
746,
50276,
31524,
17857,
32888,
938,
13332,
7103,
310,
29125,
314,
1892,
253,
4081,
1332,
2722,
12532,
1543,
619,
2201,
4468,
310,
670,
253,
16774,
285,
10527,
1783,
923,
7350,
1840,
18670,
253,
4477,
476,
2953,
619,
4468,
275,
253,
30080,
22559,
2180,
5474,
33032,
2520,
2929,
29328,
247,
3733,
4924,
13332,
1332,
1925,
295,
9720,
534,
40725,
253,
11454,
28196,
10295,
295,
17922,
281,
17710,
253,
3045,
273,
253,
7431,
35615,
387,
31850,
281,
33623,
253,
19983,
7103,
323,
295,
17922,
253,
4477,
4647,
247,
2074,
830,
281,
11786,
2685,
281,
16851,
295,
5751,
25761,
597,
5678,
616,
295,
17922,
10480,
342,
11786,
3169,
13332,
5933,
3066,
305,
3561,
293,
5530,
4090,
281,
8415,
13332,
1895,
14556,
253,
3368,
1543,
327,
2710,
49602,
17093,
253,
1055,
273,
295,
9720,
20544,
3733,
4924,
13332,
1332,
310,
4722,
285,
549,
15011,
247,
2257,
273,
6284,
275,
13332,
1673,
1754,
327,
253,
19965,
8847,
273,
295,
17922,
436,
2929,
29820,
295,
17922,
281,
7472,
253,
7431,
35615,
387,
31850,
295,
9720,
310,
973,
19391,
407,
253,
10012,
273,
295,
17922,
285,
33526,
12085,
1543,
45190,
50276,
20881,
1255,
253,
2022,
4468,
310,
253,
34754,
275,
253,
2929,
1580,
295,
17922,
556,
247,
2176,
9376,
670,
253,
11454,
2990,
253,
19965,
8847,
778,
417,
320,
10048,
323,
690,
11454,
35615,
281,
10173,
253,
10711,
5222,
273,
295,
17922,
285,
281,
8415,
253,
13332,
1895,
14556,
253,
4477,
1097,
4647,
34754,
1060,
1580,
1264,
34754,
403,
3732,
253,
2457,
1543,
778,
452,
247,
1781,
11254,
50276,
67,
11587,
891,
671,
452,
253,
1563,
3533,
670,
253,
2929,
337,
295,
9720,
310,
1754,
327,
253,
3762,
273,
295,
17922,
534,
556,
253,
23579,
3429,
9376,
671,
627,
403,
34754,
275,
2593,
4567,
285,
2593,
5922,
476,
368,
2319,
253,
3033,
273,
253,
11193,
390,
816,
12654,
352,
45190,
374,
275,
13989,
337,
368,
2319,
253,
5170,
3033,
273,
253,
3733,
2957,
2299,
672,
253,
10336,
5975,
1541,
707,
359,
2770,
327,
253,
5927,
1318,
273,
253,
3733,
2957,
891,
717,
13477,
670,
253,
1750,
359,
476,
3365,
15338,
253,
5170,
3033,
273,
14168,
4065,
85,
2708,
253,
13989,
337,
50276,
20,
627,
403,
690,
643,
3733,
4924,
13332,
3082,
24088,
337,
476,
368,
7277,
295,
9720,
342,
731,
275,
247,
4344,
5133,
50276,
18,
21648,
263,
3371,
70,
1162,
355,
11454,
10336,
3186,
1293,
3733,
5213,
8059,
327,
5145,
4715,
43425,
50276,
79,
9720,
10384,
34754,
323,
11454,
10336,
3186,
387,
31850,
1754,
327,
295,
17922,
2299,
690,
273,
731,
403,
417,
21414,
281,
479,
923,
253,
2022,
2278,
1840,
604,
253,
4477,
476,
2085,
7000,
22909,
670,
326,
891,
651,
1908,
12976,
619,
4868,
50276,
7152,
33032,
2520,
789,
29328,
281,
3186,
323,
1175,
7431,
11454,
35615,
387,
31850,
295,
9720,
594,
326,
359,
476,
4336,
3693,
1566,
3733,
1309,
253,
3186,
20544,
253,
12262,
1895,
273,
12203,
11454,
35615,
387,
31850,
310,
1774,
285,
12532,
253,
2929,
9010,
326,
824,
295,
9720,
310,
16293,
281,
320,
5203,
285,
941,
1530,
6932,
762,
11134,
2515,
534,
14371,
326,
253,
16113,
10336,
476,
320,
9495,
281,
1027,
15302,
390,
8892,
253,
851,
11273,
7200,
310,
12532,
285,
10870,
390,
1014,
1805,
685,
643,
11786,
3169,
12203,
3082,
50275,
20881,
1255,
253,
4477,
1750,
12085,
12510,
327,
1097,
260,
338,
274,
285,
4440,
257,
292,
1223,
891,
760,
1089,
260,
338,
274,
1543,
275,
2829,
337,
285,
4440,
257,
292,
1036,
1518,
275,
2829,
577,
671,
1668,
285,
1052,
403,
417,
2931,
849,
10870,
403,
253,
6786,
7200,
281,
643,
13332,
3082,
24088,
49962,
3024,
87,
20,
5919,
3024,
50276,
2520,
789,
29328,
281,
3186,
323,
1175,
7431,
11454,
35615,
387,
31850,
295,
9720,
594,
326,
359,
476,
4336,
3693,
1566,
3733,
1309,
253,
3186,
253,
10527,
1783,
310,
671,
2530,
281,
12106,
253,
13757,
3066,
295,
17922,
5657,
50276,
783,
4477,
10017,
619,
2045,
2201,
4468,
342,
19843,
594,
891,
5439,
619,
4868,
281,
721,
50276,
7152,
33032,
2520,
2929,
43603,
253,
1895,
273,
13332,
715,
247,
3733,
4924,
7103,
1232,
407,
970,
11454,
28196,
10295,
295,
17922,
5742,
253,
2929,
8219,
326,
253,
3733,
8062,
285,
253,
3045,
273,
247,
277,
9866,
476,
320,
3413,
407,
253,
3638,
295,
17922,
273,
697,
4872,
1320,
25761,
281,
14556,
7472,
253,
3638,
295,
17922,
273,
667,
2990,
35615,
253,
2929,
29328,
281,
897,
50276,
783,
43944,
5222,
273,
295,
17922,
387,
31850,
347,
271,
11193,
970,
253,
13332,
1332,
4081,
275,
436,
2929,
581,
476,
3186,
1029,
15177,
35615,
342,
1652,
31025,
6968,
2108,
1600,
304,
314,
253,
4081,
1332,
310,
10237,
672,
3732,
275,
247,
2856,
267,
357,
813,
658,
3186,
4758,
9470,
4679,
921,
326,
253,
16113,
6928,
452,
1175,
3045,
285,
476,
320,
973,
17338,
433,
281,
643,
15302,
5847,
50276,
783,
2934,
273,
970,
295,
17922,
347,
247,
2990,
3045,
23403,
323,
13332,
310,
4722,
285,
253,
7000,
1332,
310,
37825,
285,
21541,
1690,
253,
17040,
285,
10491,
281,
8415,
540,
44374,
13757,
50276,
783,
2929,
310,
973,
15720,
285,
3477,
281,
956,
50276,
783,
2929,
3400,
4209,
10527,
4737,
273,
697,
1750,
1690,
253,
13260,
1160,
50276,
783,
3186,
4815,
403,
1077,
23176,
285,
253,
3045,
273,
253,
16113,
35615,
403,
1175,
50276,
783,
4081,
1332,
476,
320,
2856,
267,
357,
813,
658,
534,
310,
247,
1175,
2867,
326,
954,
13332,
3082,
513,
417,
452,
50276,
5040,
50276,
783,
3045,
273,
295,
9720,
310,
7197,
685,
3578,
284,
327,
13332,
31591,
1252,
2829,
577,
50276,
783,
5750,
273,
295,
9720,
327,
4440,
257,
292,
310,
417,
1534,
10941,
281,
256,
5503,
2829,
608,
50276,
5430,
670,
253,
1543,
604,
295,
9720,
310,
3587,
3732,
327,
4440,
257,
292,
3185,
273,
27090,
253,
1682,
35615,
16113,
327,
260,
338,
274,
740,
275,
2087,
436,
2929,
310,
273,
1175,
3290,
285,
8268,
253,
2534,
273,
1146,
7607,
407,
17857,
32888,
3738,
891,
452,
690,
5884,
7350,
670,
253,
3045,
891,
1158,
253,
2929,
556,
1270,
9021,
281,
253,
13332,
3114,
285,
943,
320,
8521,
2490,
187,
4118,
18435,
27,
2520,
2929,
29328,
271,
5919,
3733,
4924,
13332,
1332,
295,
9720,
534,
40725,
11454,
28196,
34501,
34900,
661,
3745,
281,
6642,
253,
3045,
273,
7431,
35615,
5742,
253,
4477,
2085,
247,
10527,
1783,
4645,
326,
13332,
476,
320,
1524,
12729,
387,
31850,
285,
12661,
271,
5919,
11193,
273,
253,
10711,
5222,
273,
295,
17922,
326,
556,
247,
2074,
830,
281,
11786,
2685,
281,
33623,
253,
9419,
1483,
2105,
273,
12672,
295,
17922,
1580,
253,
1332,
310,
3733,
4924,
295,
9720,
310,
671,
5203,
285,
941,
1530,
6932,
253,
5661,
12820,
2722,
326,
295,
9720,
2057,
41731,
13015,
390,
17923,
3294,
1598,
281,
5368,
3733,
3169,
285,
3733,
4924,
13332,
3082,
1223,
1146,
3012,
625,
5919,
50276,
783,
2708,
310,
253,
6010,
273,
5847,
285,
772,
273,
253,
2929,
846,
50275,
856,
84,
50276,
783,
2934,
273,
970,
295,
17922,
281,
3283,
253,
3045,
273,
7431,
11454,
35615,
310,
1097,
4460,
285,
12532,
285,
253,
4081,
1783,
285,
5919,
11193,
403,
37825,
50276,
783,
2929,
3400,
4209,
10527,
4737,
273,
697,
3916,
1690,
253,
13260,
1160,
50276,
783,
1332,
310,
4122,
5919,
275,
2426,
273,
3186,
2105,
285,
253,
16113,
35615,
4044,
1175,
3045,
327,
22791,
15302,
50276,
783,
1332,
310,
2856,
267,
1492,
1959,
285,
3021,
4483,
3700,
35615,
2439,
8892,
50276,
783,
2929,
310,
973,
15720,
50276,
5040,
50276,
9088,
310,
642,
906,
327,
4440,
257,
292,
2797,
407,
3587,
9433,
295,
9720,
327,
352,
50275,
783,
3302,
10123,
497,
8085,
1955,
281,
643,
7350,
5001,
1880,
253,
4081,
1332,
9010,
1175,
35615,
5816,
5301,
1411,
2176,
3733,
4924,
1666,
25379,
285,
690,
12744,
20121,
2299,
597,
497,
9713,
1977,
407,
253,
4477,
1309,
253,
30080,
22559,
2180,
534,
3977,
281,
247,
13969,
281,
2997,
253,
2929,
50275,
249,
2020,
436,
310,
247,
2266,
2929,
326,
29328,
247,
4460,
2934,
323,
3733,
4924,
13332,
285,
253,
4081,
1332,
3133,
281,
320,
1097,
3576,
5919,
285,
2087,
4219,
973,
2439,
8892,
581,
5780,
4468,
310,
253,
15180,
2105,
273,
3515,
253,
1332,
327,
4067,
15302,
824,
347,
4440,
257,
292,
285,
891,
1804,
253,
4477,
1304,
253,
1543,
285,
253,
3515,
673,
275,
253,
2457,
2929,
50276,
23955,
14876,
310,
281,
2486,
5955,
273,
390,
5301,
281,
643,
5919,
13332,
3082,
1754,
327,
5148,
613,
920,
824,
347,
1313,
324,
19,
66,
458,
70,
1162,
355,
3127,
534,
310,
417,
3733,
4924,
533,
310,
625,
5919,
685,
253,
4081,
295,
9720,
50275,
14906,
1162,
355,
3127,
5233,
11454,
10336,
3186,
407,
4715,
281,
6635,
14580,
432,
15302,
17857,
32888,
43425
] |
[
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1
] |
[
30003,
310,
1677,
2278,
273,
247,
2561,
2929,
432,
260,
2369,
1793,
6698,
15,
7764,
3630,
247,
6010,
253,
2278,
15,
187,
4118,
8439,
27,
187,
2520,
2929,
29328,
247,
747,
3733,
4924,
13332,
1332,
835,
352,
310,
417,
3309,
281,
22318,
253,
2801,
3602,
273,
2303,
6928,
323,
10336,
3186,
281,
5115,
436,
253,
2929,
40725,
253,
14603,
273,
295,
17922,
323,
26230,
253,
3045,
273,
7431,
35615,
387,
2801,
31850,
3021,
253,
4081,
1332,
476,
3693,
2990,
3733,
1309,
253,
3186,
285,
5115,
247,
1199,
5919,
10336,
3186,
253,
5661,
1543,
921,
326,
253,
4081,
1332,
33526,
12085,
3045,
342,
5368,
3082,
285,
671,
476,
5223,
281,
253,
5203,
285,
941,
1530,
6932,
15216,
50275,
783,
4081,
1332,
1057,
417,
2430,
253,
1566,
3733,
323,
10336,
3186,
3021,
352,
310,
1199,
625,
5919,
685,
253,
2045,
13332,
1332,
50276,
249,
1635,
281,
253,
6733,
253,
4081,
1332,
476,
5223,
281,
253,
5203,
285,
941,
1530,
6932,
9534,
50275,
585,
1209,
2224,
50276,
20261,
295,
17922,
19584,
38353,
959,
554,
394,
277,
79,
2224,
253,
4081,
1332,
1057,
417,
1646,
281,
2525,
253,
1617,
352,
651,
320,
5322,
281,
2085,
253,
1921,
2139,
253,
4081,
1332,
476,
789,
671,
891,
717,
6110,
275,
253,
2406,
3033,
273,
253,
2990,
6864,
835,
253,
4081,
1332,
476,
789,
973,
50275,
783,
2022,
4468,
670,
436,
1332,
310,
1880,
253,
4081,
1332,
9010,
1175,
35615,
41005,
3332,
2175,
452,
7558,
326,
13332,
3082,
1089,
35615,
4645,
1175,
3045,
533,
253,
5958,
273,
253,
1119,
10336,
310,
2080,
432,
253,
1682,
340,
86,
17857,
32888,
938,
436,
3249,
432,
253,
958,
326,
1142,
9183,
273,
260,
9866,
35615,
275,
253,
3186,
2317,
476,
5115,
1175,
3045,
327,
253,
22791,
10895,
824,
347,
260,
338,
274,
740,
275,
326,
3282,
1014,
2167,
253,
4081,
1332,
33526,
1175,
3045,
327,
253,
22791,
15302,
891,
717,
12371,
604,
253,
1119,
35615,
403,
1663,
1175,
390,
417,
281,
2451,
253,
5958,
273,
253,
35615,
352,
651,
320,
5322,
281,
897,
13332,
31591,
6903,
340,
272,
17857,
1686,
746,
1273,
314,
247,
3332,
1263,
556,
9125,
326,
3733,
14238,
24088,
941,
42072,
3733,
44540,
2818,
625,
327,
3045,
2581,
685,
3186,
11333,
30966,
17857,
32888,
938,
275,
958,
253,
3733,
7241,
11852,
247,
2257,
327,
253,
3045,
273,
277,
79,
2224,
594,
891,
717,
12371,
604,
253,
3733,
7241,
11852,
253,
13418,
273,
253,
3045,
273,
7431,
35615,
407,
253,
4081,
1332,
390,
417,
50275,
783,
4081,
1332,
310,
18134,
281,
1529,
3733,
4924,
1332,
2556,
281,
2829,
577,
275,
253,
30762,
533,
2722,
1805,
275,
253,
2022,
2929,
352,
651,
320,
5322,
281,
2085,
697,
2847,
285,
12861,
1783,
281,
1805,
2096,
253,
4081,
1332,
50275,
261,
352,
1896,
281,
3587,
4647,
253,
4081,
1332,
281,
247,
1781,
10895,
824,
347,
4440,
257,
292,
50276,
30838,
17857,
32888,
938,
16344,
253,
3186,
3408,
273,
11454,
10336,
3186,
17857,
32888,
938,
50276,
3184,
17857,
1686,
746,
13332,
31591,
6903,
4404,
41374,
11454,
10336,
3186,
17857,
1686,
746,
50276,
31524,
17857,
32888,
938,
13332,
7103,
310,
29125,
314,
1892,
253,
4081,
1332,
2722,
12532,
1543,
619,
2201,
4468,
310,
670,
253,
16774,
285,
10527,
1783,
923,
7350,
1840,
18670,
253,
4477,
476,
2953,
619,
4468,
275,
253,
30080,
22559,
2180,
5474,
33032,
2520,
2929,
29328,
247,
3733,
4924,
13332,
1332,
1925,
295,
9720,
534,
40725,
253,
11454,
28196,
10295,
295,
17922,
281,
17710,
253,
3045,
273,
253,
7431,
35615,
387,
31850,
281,
33623,
253,
19983,
7103,
323,
295,
17922,
253,
4477,
4647,
247,
2074,
830,
281,
11786,
2685,
281,
16851,
295,
5751,
25761,
597,
5678,
616,
295,
17922,
10480,
342,
11786,
3169,
13332,
5933,
3066,
305,
3561,
293,
5530,
4090,
281,
8415,
13332,
1895,
14556,
253,
3368,
1543,
327,
2710,
49602,
17093,
253,
1055,
273,
295,
9720,
20544,
3733,
4924,
13332,
1332,
310,
4722,
285,
549,
15011,
247,
2257,
273,
6284,
275,
13332,
1673,
1754,
327,
253,
19965,
8847,
273,
295,
17922,
436,
2929,
29820,
295,
17922,
281,
7472,
253,
7431,
35615,
387,
31850,
295,
9720,
310,
973,
19391,
407,
253,
10012,
273,
295,
17922,
285,
33526,
12085,
1543,
45190,
50276,
20881,
1255,
253,
2022,
4468,
310,
253,
34754,
275,
253,
2929,
1580,
295,
17922,
556,
247,
2176,
9376,
670,
253,
11454,
2990,
253,
19965,
8847,
778,
417,
320,
10048,
323,
690,
11454,
35615,
281,
10173,
253,
10711,
5222,
273,
295,
17922,
285,
281,
8415,
253,
13332,
1895,
14556,
253,
4477,
1097,
4647,
34754,
1060,
1580,
1264,
34754,
403,
3732,
253,
2457,
1543,
778,
452,
247,
1781,
11254,
50276,
67,
11587,
891,
671,
452,
253,
1563,
3533,
670,
253,
2929,
337,
295,
9720,
310,
1754,
327,
253,
3762,
273,
295,
17922,
534,
556,
253,
23579,
3429,
9376,
671,
627,
403,
34754,
275,
2593,
4567,
285,
2593,
5922,
476,
368,
2319,
253,
3033,
273,
253,
11193,
390,
816,
12654,
352,
45190,
374,
275,
13989,
337,
368,
2319,
253,
5170,
3033,
273,
253,
3733,
2957,
2299,
672,
253,
10336,
5975,
1541,
707,
359,
2770,
327,
253,
5927,
1318,
273,
253,
3733,
2957,
891,
717,
13477,
670,
253,
1750,
359,
476,
3365,
15338,
253,
5170,
3033,
273,
14168,
4065,
85,
2708,
253,
13989,
337,
50276,
20,
627,
403,
690,
643,
3733,
4924,
13332,
3082,
24088,
337,
476,
368,
7277,
295,
9720,
342,
731,
275,
247,
4344,
5133,
50276,
18,
21648,
263,
3371,
70,
1162,
355,
11454,
10336,
3186,
1293,
3733,
5213,
8059,
327,
5145,
4715,
43425,
50276,
79,
9720,
10384,
34754,
323,
11454,
10336,
3186,
387,
31850,
1754,
327,
295,
17922,
2299,
690,
273,
731,
403,
417,
21414,
281,
479,
923,
253,
2022,
2278,
1840,
604,
253,
4477,
476,
2085,
7000,
22909,
670,
326,
891,
651,
1908,
12976,
619,
4868,
50276,
7152,
33032,
2520,
789,
29328,
281,
3186,
323,
1175,
7431,
11454,
35615,
387,
31850,
295,
9720,
594,
326,
359,
476,
4336,
3693,
1566,
3733,
1309,
253,
3186,
20544,
253,
12262,
1895,
273,
12203,
11454,
35615,
387,
31850,
310,
1774,
285,
12532,
253,
2929,
9010,
326,
824,
295,
9720,
310,
16293,
281,
320,
5203,
285,
941,
1530,
6932,
762,
11134,
2515,
534,
14371,
326,
253,
16113,
10336,
476,
320,
9495,
281,
1027,
15302,
390,
8892,
253,
851,
11273,
7200,
310,
12532,
285,
10870,
390,
1014,
1805,
685,
643,
11786,
3169,
12203,
3082,
50275,
20881,
1255,
253,
4477,
1750,
12085,
12510,
327,
1097,
260,
338,
274,
285,
4440,
257,
292,
1223,
891,
760,
1089,
260,
338,
274,
1543,
275,
2829,
337,
285,
4440,
257,
292,
1036,
1518,
275,
2829,
577,
671,
1668,
285,
1052,
403,
417,
2931,
849,
10870,
403,
253,
6786,
7200,
281,
643,
13332,
3082,
24088,
49962,
3024,
87,
20,
5919,
3024,
50276,
2520,
789,
29328,
281,
3186,
323,
1175,
7431,
11454,
35615,
387,
31850,
295,
9720,
594,
326,
359,
476,
4336,
3693,
1566,
3733,
1309,
253,
3186,
253,
10527,
1783,
310,
671,
2530,
281,
12106,
253,
13757,
3066,
295,
17922,
5657,
50276,
783,
4477,
10017,
619,
2045,
2201,
4468,
342,
19843,
594,
891,
5439,
619,
4868,
281,
721,
50276,
7152,
33032,
2520,
2929,
43603,
253,
1895,
273,
13332,
715,
247,
3733,
4924,
7103,
1232,
407,
970,
11454,
28196,
10295,
295,
17922,
5742,
253,
2929,
8219,
326,
253,
3733,
8062,
285,
253,
3045,
273,
247,
277,
9866,
476,
320,
3413,
407,
253,
3638,
295,
17922,
273,
697,
4872,
1320,
25761,
281,
14556,
7472,
253,
3638,
295,
17922,
273,
667,
2990,
35615,
253,
2929,
29328,
281,
897,
50276,
783,
43944,
5222,
273,
295,
17922,
387,
31850,
347,
271,
11193,
970,
253,
13332,
1332,
4081,
275,
436,
2929,
581,
476,
3186,
1029,
15177,
35615,
342,
1652,
31025,
6968,
2108,
1600,
304,
314,
253,
4081,
1332,
310,
10237,
672,
3732,
275,
247,
2856,
267,
357,
813,
658,
3186,
4758,
9470,
4679,
921,
326,
253,
16113,
6928,
452,
1175,
3045,
285,
476,
320,
973,
17338,
433,
281,
643,
15302,
5847,
50276,
783,
2934,
273,
970,
295,
17922,
347,
247,
2990,
3045,
23403,
323,
13332,
310,
4722,
285,
253,
7000,
1332,
310,
37825,
285,
21541,
1690,
253,
17040,
285,
10491,
281,
8415,
540,
44374,
13757,
50276,
783,
2929,
310,
973,
15720,
285,
3477,
281,
956,
50276,
783,
2929,
3400,
4209,
10527,
4737,
273,
697,
1750,
1690,
253,
13260,
1160,
50276,
783,
3186,
4815,
403,
1077,
23176,
285,
253,
3045,
273,
253,
16113,
35615,
403,
1175,
50276,
783,
4081,
1332,
476,
320,
2856,
267,
357,
813,
658,
534,
310,
247,
1175,
2867,
326,
954,
13332,
3082,
513,
417,
452,
50276,
5040,
50276,
783,
3045,
273,
295,
9720,
310,
7197,
685,
3578,
284,
327,
13332,
31591,
1252,
2829,
577,
50276,
783,
5750,
273,
295,
9720,
327,
4440,
257,
292,
310,
417,
1534,
10941,
281,
256,
5503,
2829,
608,
50276,
5430,
670,
253,
1543,
604,
295,
9720,
310,
3587,
3732,
327,
4440,
257,
292,
3185,
273,
27090,
253,
1682,
35615,
16113,
327,
260,
338,
274,
740,
275,
2087,
436,
2929,
310,
273,
1175,
3290,
285,
8268,
253,
2534,
273,
1146,
7607,
407,
17857,
32888,
3738,
891,
452,
690,
5884,
7350,
670,
253,
3045,
891,
1158,
253,
2929,
556,
1270,
9021,
281,
253,
13332,
3114,
285,
943,
320,
8521,
2490,
187,
4118,
18435,
27,
2520,
2929,
29328,
271,
5919,
3733,
4924,
13332,
1332,
295,
9720,
534,
40725,
11454,
28196,
34501,
34900,
661,
3745,
281,
6642,
253,
3045,
273,
7431,
35615,
5742,
253,
4477,
2085,
247,
10527,
1783,
4645,
326,
13332,
476,
320,
1524,
12729,
387,
31850,
285,
12661,
271,
5919,
11193,
273,
253,
10711,
5222,
273,
295,
17922,
326,
556,
247,
2074,
830,
281,
11786,
2685,
281,
33623,
253,
9419,
1483,
2105,
273,
12672,
295,
17922,
1580,
253,
1332,
310,
3733,
4924,
295,
9720,
310,
671,
5203,
285,
941,
1530,
6932,
253,
5661,
12820,
2722,
326,
295,
9720,
2057,
41731,
13015,
390,
17923,
3294,
1598,
281,
5368,
3733,
3169,
285,
3733,
4924,
13332,
3082,
1223,
1146,
3012,
625,
5919,
50276,
783,
2708,
310,
253,
6010,
273,
5847,
285,
772,
273,
253,
2929,
846,
50275,
856,
84,
50276,
783,
2934,
273,
970,
295,
17922,
281,
3283,
253,
3045,
273,
7431,
11454,
35615,
310,
1097,
4460,
285,
12532,
285,
253,
4081,
1783,
285,
5919,
11193,
403,
37825,
50276,
783,
2929,
3400,
4209,
10527,
4737,
273,
697,
3916,
1690,
253,
13260,
1160,
50276,
783,
1332,
310,
4122,
5919,
275,
2426,
273,
3186,
2105,
285,
253,
16113,
35615,
4044,
1175,
3045,
327,
22791,
15302,
50276,
783,
1332,
310,
2856,
267,
1492,
1959,
285,
3021,
4483,
3700,
35615,
2439,
8892,
50276,
783,
2929,
310,
973,
15720,
50276,
5040,
50276,
9088,
310,
642,
906,
327,
4440,
257,
292,
2797,
407,
3587,
9433,
295,
9720,
327,
352,
50275,
783,
3302,
10123,
497,
8085,
1955,
281,
643,
7350,
5001,
1880,
253,
4081,
1332,
9010,
1175,
35615,
5816,
5301,
1411,
2176,
3733,
4924,
1666,
25379,
285,
690,
12744,
20121,
2299,
597,
497,
9713,
1977,
407,
253,
4477,
1309,
253,
30080,
22559,
2180,
534,
3977,
281,
247,
13969,
281,
2997,
253,
2929,
50275,
249,
2020,
436,
310,
247,
2266,
2929,
326,
29328,
247,
4460,
2934,
323,
3733,
4924,
13332,
285,
253,
4081,
1332,
3133,
281,
320,
1097,
3576,
5919,
285,
2087,
4219,
973,
2439,
8892,
581,
5780,
4468,
310,
253,
15180,
2105,
273,
3515,
253,
1332,
327,
4067,
15302,
824,
347,
4440,
257,
292,
285,
891,
1804,
253,
4477,
1304,
253,
1543,
285,
253,
3515,
673,
275,
253,
2457,
2929,
50276,
23955,
14876,
310,
281,
2486,
5955,
273,
390,
5301,
281,
643,
5919,
13332,
3082,
1754,
327,
5148,
613,
920,
824,
347,
1313,
324,
19,
66,
458,
70,
1162,
355,
3127,
534,
310,
417,
3733,
4924,
533,
310,
625,
5919,
685,
253,
4081,
295,
9720,
50275,
14906,
1162,
355,
3127,
5233,
11454,
10336,
3186,
407,
4715,
281,
6635,
14580,
432,
15302,
17857,
32888,
43425
] |
Below is a given review of a research paper from a conference journal. Please write a summary of the review.
### Review:
This paper investigates the effectiveness of a language model for training the policy in embodied environments. The authors use a pretrained GPT-2 to initialize the policy and then show the generalization effect in policy learning. In the experiments, the authors demonstrate that the language model yields a better generalization effect compared with a simple baseline and in ablation studies.

The authors empirically demonstrate the effectiveness of initialization with the pretrained language model, but this result does not give new insight beyond existing knowledge about the effect of pretrained language models. Moreover, some results leave concerns and questions, as follows. 1) The authors assume string-based inputs (goal predicates, history actions, observation) obtained through serialization, but I cannot see why it is necessary to consider this setting; moreover, the comparison with simple baselines in Figure 4 (Experiment 1) seems a natural result that is not different from existing knowledge about the effectiveness of pretrained language models. 2) For this work to be meaningful, it should show better performance than a model trained without a pretrained language model on the original, non-serialized inputs; however, Figure 6 (Experiment 2b) shows a negligible impact on performance compared with a learned encoding of the original inputs. Based on these results, the experiments do not sufficiently support the claim that using a pretrained language model is helpful for generalization.

This paper provides various experiments to show the generalization effect of the pretrained language model in policy learning; however, most results cannot clearly support the main claim of the paper, and some concerns and questions remain.

docsep

The paper studies the effect of using pretrained components in a neural policy network for an embodied agent in a 3D environment called VirtualHome (VH). The empirical results present three main observations: 1) a model with components initialized from a pretrained language model generalizes better in zero-shot settings than a model with all components randomly initialized; 2) a poorly designed (i.e., random string) encoding removes these generalization benefits; and 3) an effective encoding layer can be learned from scratch in the absence of a string-based goal and observation representation.

While the empirical results in Fig. 4 do show benefits of initializing the model components from a pretrained language model, the following points are quite confusing/unclear from reading the paper. 1) The main motivation of the paper is to explore the effect of LM pretraining on a non-linguistic task; however, the task chosen in this work is inherently linguistic, so much so that the visual aspects of the task (i.e., observations from the environment) are represented as English tokens. Whether the tokens are structured in a formulaic representation, e.g., inside(fridge, apple), or in traditional English sentences seems unrelated to classifying a task as linguistic vs. non-linguistic. 2) It is unclear why the goal and history are encoded as English sentences before being fed to the embedding layer, while the observations are input as graph entities, similar to the original formulaic representation. Is there any difference between encoding observations as graph entities vs. as English sentences? Indeed, from Experiment 7.2 it is clear that there is an almost negligible difference between representing goal/history/observations as English sentences and as the original formulaic entities. 3) From Fig. 5 it can be observed that LM-scratch is worse than LM-FT (random). Given that LM-FT (random) is trying to learn the task using random strings, it is concerning that LM-scratch is unable to outperform even that; does this mean LM-scratch is not properly tuned and/or trained until convergence?

Overall, this work presents limited empirical studies and has the above-mentioned weaknesses. This work presents limited empirical studies and has several critical weaknesses.

docsep

After rebuttal: I am keeping my score, but I will not fight against rejecting the paper. I think the results are promising, but the scope of the experiments is limited and the claims need to be more precise, as pointed out in my discussion with the authors. There are also several important missing details that make it hard to understand and appreciate the experimental settings. I like the idea of Experiment 2b, but the change from the string-based representation to the one-hot representation does not seem to be a significant change, as they both give the language model a sequence of word vectors; this may just show that the order of the words in the goal and history does not matter for the navigation decisions, which is kind of expected given the simple templates used to generate the string-based representation. The paper's claims would also be strengthened by reasonable explanations for the observed phenomena; the paper currently treats the experimental observations as conclusions, which I think may be an overgeneralization given the limited scope of the experiments (one simulator, one type of current-state input representation).

Before rebuttal: the paper proposes and studies the effectiveness of using pretrained LMs to solve a sequential decision-making problem where the observations, goals, and actions are not originally represented in language. The authors design three experiments: 1) convert the problem to language modeling and measure the effectiveness of pretraining the LM; 2) compare the performance of the language input representation versus a random-string input representation; and 3) determine whether converting the inputs to text is necessary. The paper concludes that 1) language modeling improves generalization in policy learning, 2) language-based environment encodings are not needed to benefit from LM pretraining, and 3) the results point to the possible effectiveness of language modeling as a general-purpose pretraining scheme.

Overall, I find the paper very well written. It is largely empirical, and the experiments are carefully set up and deliver insightful results. The claims in the introduction are supported by the experiments but need to be further restricted to reflect the limited scope of the paper. I would like to ask whether the visual observations are used for training the models; it seems like they are not, and if not, why? In general, it would be interesting to see whether the LM-learned representation can complement other rich representations, e.g., visual representations. The symbolic graph-based representation may be too hard for the model to learn from, which makes it easier to observe improvement upon this representation. I suggest the authors revise the claims, emphasizing that they apply to a specific task/environment and a non-visual input representation. The paper offers novel, interesting results that contribute new knowledge about pretrained LMs; I recommend acceptance.

docsep

This paper takes a transformer-based language model pretrained on a large text corpus (in this case GPT-2) and uses it for the symbolic version of the VirtualHome environment. The observations, goals, and action history of the agent are encoded as text strings in a few different ways and fed as input to the transformer, and the output of the model is pooled to predict the agent's action. The paper demonstrates that, in cases where the test distribution differs in some way from the training distribution, using pretrained transformers greatly improves performance on the task.

Positives: the effect of language pretraining is quite large and well supported by the experimental evidence. The randomized-word experiment is quite interesting and, I think, quite clearly demonstrates that the language pretraining is injecting information about the world: when the words are replaced by nonsense words, that knowledge can no longer be effectively transferred.

Negatives: I think the specific claim of novelty in this paper is quite weak, and compared to other relevant work (which should be cited but is not) this paper is not a good contribution to the field. Specifically, the claim is that this paper is the first to demonstrate improved generalization in a non-linguistic problem over a standard neural-network baseline using a pretrained language model. Either the claim is mistaken, or it is being drawn deliberately narrowly to exclude work that I think is clearly related. In particular, excluding methods that have been applied to text adventure games is either an accidental oversight or makes this claim of novelty uninteresting. There has been quite a lot of work in this area, but the paper that most clearly makes the claims of novelty less convincing is [1]. That paper uses a pretrained GPT-2 model on text adventure games (specifically the text games from the Jericho framework) and also demonstrates how the pretraining improves performance, including on held-out, unseen games. With [1] as context, I do not think this specific novelty claim is correct, or at best it is not an impressive claim of novelty.

This begs the question of what the differences are between this environment and text adventure environments, and I would argue that in pretty much every way this environment is less interesting. One thing that might have been interesting is if this paper had used the 3D environment version of VirtualHome; then there would be a good argument that this environment presents new challenges over text adventure games, because there is an added perception problem on top of the planning/text-understanding one. But since this paper exclusively used the graph-based observations and discrete symbolic actions, this is not true. I think, actually, that with this observation and action space it is less interesting than text-world games. First, in this version everything is templated symbols: there is a very fixed set of actions, predicates, and objects that always appear written in exactly the same templated way. Contrast this with text adventure games, where the space of actions is huge, essentially limited only by the original game designers' imagination; there are any number of objects or other entities to interact with; and the games were designed to be played by humans, so they use much more natural language and incorporate a potentially broader set of scenarios. In VirtualHome it is just a fixed set of objects and actions, the goals are just to satisfy a fixed set of predicates, and many of them simply involve placing various objects into other objects in a home. The fact that a symbolic planner can be used to generate ground truth, whereas you cannot do this straightforwardly in text adventure games without providing a lot of hidden information, to me indicates a less natural, less interesting setting.

In general, and specifically compared to the generalization in [1], I think the generalization in this paper is pretty limited. Here the test-time generalization includes novel combinations of known objects in known predicates, but, for instance, it does not include novel objects or novel predicates; there is also no novelty in actions or in the general environment (at most, objects are put in unusual places). Contrast this with [1], where entire games are held out, meaning that actions, goals, and objects can all be different in the testing environments, making that form of generalization much more interesting.

Putting aside this specific claim of novelty, I think there are actually not a lot of new ideas in this paper. The architecture is not particularly novel (just pooling the outputs of a transformer model), there is not much novelty in training (it is just behavioral cloning), and the environment is not new and, as discussed, is not as interesting as text games. The only real claim to novelty is that pretrained LMs are used on this specific kind of environment.

This is either a minor clarification or kind of a big problem: is there an experiment where the raw observations/goals/actions from the environment are fed into the network, compared against the transformer trained from scratch? I know there is the experiment for the first one, but I am not seeing a comparison to a from-scratch model. In general the placement of experiments is confusing: why is the major experiment introduced and shown in the introduction? I might have just missed this; if it is just placement, I would recommend cleaning up the experiments section. Otherwise, I think it is important that this comparison is missing, because it would mean the pretraining only works because the authors have essentially handcrafted these templates to get GPT-2 to have useful pretraining.

I also found the architecture used pretty strange. Usually when people use GPT models, they use the string-token output head and simply fine-tune it to produce the correct output; so in Figure 3 you would have all of the same inputs, but instead of generating outputs at every timestep, you would just generate a token sequence after the last input. This gives a couple of advantages: the output heads actually contain useful training, since they are trained to generate text, and it lets you do zero-shot and few-shot experiments because the model can generate arbitrary string outputs. There are quite impressive results in these settings using large language models, so it is odd that this was not done. Was there a reason, or was this tried and found not to be as effective?

Minor: the choice of reaching the goal within 70 steps seems somewhat arbitrary to me; is this the default in the original environment? ALFRED should always be in caps.

[1] Keep calm and explore: language models for action generation in text-based games. Shunyu Yao, Rohan Rao, Matthew Hausknecht, Karthik Narasimhan. EMNLP 2020.

The paper has some interesting results, but it lacks novelty: pretrained LMs have already been shown to be effective in similar text/symbolic RL environments (text adventure games), and the paper is not as impressive even as some earlier work (see [1]). It does not provide new insights/methods/architectures compared to prior literature, and so I think it would not be accepted.
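To illustrate the pipeline the reviews describe, here is a rough, hypothetical sketch in which a symbolic goal, action history, and observation are serialized into a text string, encoded with a pretrained GPT-2, and mapped through a pooled head to action logits. This is not the paper's code; the serialization templates, markers, pooling choice, and action-space size are all assumptions made for illustration.

```python
# Minimal sketch of an LM-initialized policy over serialized symbolic inputs.
import torch
import torch.nn as nn
from transformers import GPT2Model, GPT2Tokenizer

class LMPolicy(nn.Module):
    def __init__(self, num_actions):
        super().__init__()
        self.tok = GPT2Tokenizer.from_pretrained("gpt2")
        self.lm = GPT2Model.from_pretrained("gpt2")          # pretrained weights
        self.head = nn.Linear(self.lm.config.n_embd, num_actions)

    def serialize(self, goal, history, observation):
        # e.g. goal=["inside(apple, fridge)"], history=["grab apple"],
        #      observation=["apple on table", "fridge is closed"]
        return ("[GOAL] " + " ".join(goal)
                + " [HIST] " + " ".join(history)
                + " [OBS] " + " ".join(observation))

    def forward(self, goal, history, observation):
        ids = self.tok(self.serialize(goal, history, observation),
                       return_tensors="pt")
        hidden = self.lm(**ids).last_hidden_state            # (1, seq_len, n_embd)
        pooled = hidden.mean(dim=1)                           # simple mean pooling
        return self.head(pooled)                              # action logits
```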
### Summary:
|
The paper studies the use of pretrained language models (LMs) for training the policy in embodied environments. Specifically, a pretrained GPT-2 LM is used to initialize the policy; environment observations, goals, and actions are encoded appropriately (e.g., converted into text strings) to apply the LM-based policy. The experiments study the generalization effect of initializing with pretrained LMs. Reviewers found that the paper makes limited contributions; in particular, prior works on text adventure games have already explored the use of pretrained LMs for playing games and studied the generalization effect, such as [1]. It is also suggested that the paper should revise the claims made in the experiments, given the limited experimental scope and results. [1] Keep calm and explore: language models for action generation in text-based games. Shunyu Yao, Rohan Rao, Matthew Hausknecht, Karthik Narasimhan. EMNLP 2020.
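Since the reviews note that training amounts to behavioral cloning, the following is a tiny, generic behavioral-cloning loop over such an LM-based policy. It is illustrative only: the `dataset` of (goal, history, observation, expert_action) tuples and the `LMPolicy` sketch above are assumptions, not the paper's implementation.

```python
# Generic behavioral-cloning sketch for an LM-based policy (assumed interfaces).
import torch
import torch.nn.functional as F

def train_bc(policy, dataset, epochs=1, lr=1e-5):
    opt = torch.optim.Adam(policy.parameters(), lr=lr)
    for _ in range(epochs):
        for goal, history, obs, expert_action in dataset:
            logits = policy(goal, history, obs)               # (1, num_actions)
            loss = F.cross_entropy(logits, torch.tensor([expert_action]))
            opt.zero_grad()
            loss.backward()
            opt.step()
    return policy
```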
|
[
983,
7083,
1506,
310,
7197,
685,
298,
78,
649,
3632,
1677,
326,
298,
78,
649,
3632,
310,
2820,
281,
3037,
253,
4836,
970,
3632,
11559,
352,
310,
8664,
326,
298,
983,
7083,
1506,
310,
7591,
281,
562,
32231,
1014,
326,
1057,
436,
1599,
298,
983,
7083,
1506,
310,
417,
6283,
24251,
285,
263,
10166,
7357,
14940,
50276,
1189,
455,
436,
789,
10262,
3710,
16774,
2175,
285,
556,
253,
1840,
5393,
32213,
436,
789,
10262,
3710,
16774,
2175,
285,
556,
2067,
4619,
32213,
5474,
339,
4904,
699,
30080,
22559,
891,
717,
7562,
619,
4868,
533,
891,
588,
417,
3819,
1411,
33944,
253,
2929,
891,
1158,
253,
1543,
403,
12532,
533,
253,
7990,
273,
253,
4679,
403,
3710,
285,
253,
3916,
878,
281,
320,
625,
10799,
347,
8042,
562,
275,
619,
5955,
342,
253,
4477,
627,
403,
671,
2067,
1774,
5816,
4278,
326,
2789,
352,
1892,
281,
2096,
285,
11435,
253,
5661,
7533,
891,
751,
253,
2934,
273,
3368,
374,
67,
533,
253,
1818,
432,
253,
2876,
3169,
6779,
281,
253,
581,
12022,
6779,
1057,
417,
1646,
281,
320,
247,
1534,
1818,
347,
597,
1097,
1918,
253,
3448,
1566,
247,
3425,
273,
3159,
11390,
436,
778,
816,
921,
326,
253,
1340,
273,
253,
3000,
275,
253,
4736,
285,
2892,
1057,
417,
2647,
323,
253,
15034,
7089,
534,
310,
2238,
273,
3264,
1677,
253,
2969,
20665,
908,
281,
6635,
253,
2876,
3169,
6779,
253,
9380,
3916,
651,
671,
320,
34615,
342,
5272,
22909,
323,
253,
2540,
16958,
253,
2929,
4390,
26574,
253,
3368,
7313,
347,
11815,
534,
891,
1158,
778,
320,
689,
16691,
1320,
1677,
253,
3710,
7990,
273,
253,
4679,
581,
40022,
581,
1511,
273,
1655,
3409,
3280,
6779,
50276,
9131,
30080,
22559,
253,
2929,
29328,
285,
2175,
253,
12510,
273,
970,
3215,
11273,
298,
983,
281,
8415,
247,
22453,
3061,
11849,
1895,
835,
253,
7313,
7342,
285,
5231,
403,
417,
8927,
6607,
275,
3448,
253,
4477,
2216,
1264,
4679,
337,
6455,
253,
1895,
281,
3448,
14053,
285,
2557,
253,
12510,
273,
3215,
26208,
253,
298,
78,
50276,
19,
7277,
3045,
273,
970,
253,
3448,
3280,
6779,
7147,
247,
3632,
2703,
3280,
6779,
50276,
20,
3653,
1880,
22022,
253,
14800,
281,
17438,
310,
3309,
50275,
783,
2929,
20097,
326,
337,
3448,
14053,
19132,
26647,
275,
3646,
4715,
374,
3448,
3169,
3126,
2349,
351,
723,
403,
417,
3058,
281,
5649,
432,
298,
2503,
1221,
26208,
285,
495,
253,
1543,
1127,
253,
1896,
12510,
273,
3448,
14053,
347,
247,
2087,
27299,
3215,
26208,
6974,
50275,
1189,
455,
891,
1089,
253,
2929,
1077,
973,
15720,
352,
310,
8127,
16774,
285,
253,
4679,
403,
9257,
9978,
285,
7257,
47860,
1543,
253,
3916,
275,
253,
10199,
403,
4516,
407,
253,
4679,
533,
3198,
320,
2007,
11096,
281,
4887,
253,
3710,
7990,
273,
253,
2929,
891,
651,
751,
281,
1642,
1880,
253,
5304,
7313,
403,
908,
323,
3733,
253,
3210,
352,
3133,
751,
597,
403,
417,
285,
604,
417,
2139,
275,
2087,
352,
69,
320,
4722,
281,
923,
1880,
298,
78,
29343,
264,
6779,
476,
13503,
643,
6793,
14237,
24088,
5304,
6779,
253,
24762,
4216,
3169,
6779,
778,
320,
1512,
1892,
323,
253,
1566,
281,
3037,
432,
285,
3021,
352,
310,
6927,
281,
10018,
7756,
2220,
436,
6779,
891,
1804,
253,
4477,
49620,
253,
3916,
43962,
326,
597,
4647,
281,
247,
2173,
4836,
20034,
285,
247,
1327,
34309,
3280,
6779,
50276,
783,
2929,
6131,
4460,
4722,
1543,
326,
8162,
747,
3640,
670,
3215,
11273,
298,
983,
891,
5583,
14924,
50276,
7152,
33032,
2520,
2929,
3936,
247,
39707,
3169,
3448,
1566,
3215,
11273,
327,
247,
1781,
2505,
20689,
275,
436,
1083,
305,
431,
19,
285,
4648,
352,
323,
253,
24762,
2715,
273,
253,
7503,
9511,
3126,
253,
8310,
7342,
285,
2250,
2892,
273,
253,
5570,
403,
16202,
347,
2505,
11559,
275,
247,
1643,
2710,
4088,
285,
10208,
347,
3280,
281,
253,
39707,
285,
253,
3453,
273,
253,
1566,
310,
24462,
281,
3283,
253,
5570,
2250,
253,
2929,
14371,
326,
275,
2219,
835,
253,
1071,
3268,
19986,
275,
690,
1039,
432,
253,
3733,
… (remaining token ids elided) ] |
[ 1, 1, 1, … (attention_mask: 2,048 ones) …, 1 ] |
[ 983, 7083, 1506, … (2,048 token ids in total; middle elided) …, 13307, 81, 9169 ] |
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper provides a novel dataset and its code to increase the number of samples the dataset is named gamified association benchmark to challenge visionandlanguage models winogavil inspired by winograd schema challenge 9 the dataset is collected via an online game resembling the wellknown codename game in the game given image candidates a spymaster associates a cue word for a subset of them spymasters goal is to come up with a set of images and their cue word that ai cannot guess and that humans can guess correctly this benchmark is interesting since it highlights the gap of ability for commonsense reasoning between humans and ais the authors provide a dataset and the online game allowing researchers to collect further data experimental results show that the benchmark is challenging for ais while humans can solve the game though inspired by codename is a different one the reviewer is a little concerned that the incentive for spymasters to look for a unified representation for more images may not appear for example if the spymaster comes up with a word for two images and the ai misses them all the jaccard index is of course 0 if the spymaster thinks of a word for three images and the ai misses them all the jaccard index is still 0 in the original codename the number of cards guessed by other than the spymaster is directly scored so the spymaster thinks of abstract expressions suitable for more cards however in the game in this paper no such incentive is created through the jaccard index in order to address human commonsense reasoning incentives to assign words to more images would have been necessary although the title includes visionandlanguage it is slightly overclaimed since the associated representation seems to be a single word if the authors mean that the dataset includes not only words but also sentences as a cue the reviewer would like them to point this out cues did not have to be limited to words only as in codename in the sense of association and commonsense reasoning spymasters may generate a more flexible representation for a subset of candidates some relations between cue length and performance of ais and humans would also be interesting docsepthis paper introduces a new visionandlanguage vl benchmark based on gamification the gamification framework is very similar to codenames a board game and in this benchmark the aim is to associate k images from a set of candidates for given input concept first the spymaster selects k images from a set of candidates and the solvers try to predict selected images to do so a game is designed for amts and a dataset is collected from the logs of this game finally a couple of known vl models are evaluated on this dataset the idea is quite novel and interesting i am certain that this benchmark will be a beneficial resource for future researchers nlp community is interested in problems that require commonsense reasoning andor external knowledge recently and this benchmark falls into this category the problem is easy for humans and challenging for machines which should be the case for all created datasets the paper includes many different categorical analyses eg table 2 figure 4 overall the paper is wellwritten i cant find any major weakness but i think the paper has minor issues which can be improved see the correctness section docsepthis paper proposes an online game winogavil to collect vision and language associations eg werewolves to a full moon used as a dynamic benchmark to evaluate stateoftheart models specifically the paper uses the 
game to collect 35k instances finding that they are intuitive for humans 90 jaccard index but challenging for stateoftheart ai models where the best model vilt achieves a score of 52 succeeding mostly where the cue is visually salient which is interesting and insightful moreover those experimental results in this paper also indicate that the collected associations require diverse reasoning skills including general knowledge common sense abstraction and more overall this paper is wellmotivated and wellwritten i believe it can promote the development of visuallanguage models 1 this paper proposes winogavil an online game to collect vision and language associations used as a dynamic benchmark to evaluate stateoftheart models to me the proposed approach is novel and has a great contribution 2 this paper is wellwritten and wellmotivated and has insightful discussions 3 this paper also releases the dataset the code and the interactive game aiming to allow future data collection that can be used to develop models with better association abilities i do not find any weaknesses in this paper docsepthe authors introduce winogavil a game which acts as a framework to collect languagevision associations that are difficult for existing visionlanguage models to solve but solvable by humans the authors design winogavil in the style of codenames where the spymaster is provided a set of images and has to come up with a language cue that is related to a certain subset of the images the authors initially collect a set of 35k association instances but more instances can be flexibly added to this the authors analyze the collected associations and the challenges involved in solving them and analyze how existing visionlanguage models perform on this dataset this paper has one of the most novel and intriguing ideas i have seen in visionlanguage data collection but after reading the paper and playing around with the demo myself i have a few doubts that make me skeptical of this work until they are addressed the strengths of this paper are very strong ones but unfortunately the weaknesses in certain methods are also glaring if the questions listed are adequately answered i would be willing to change to clear accept strengths 1 the idea is a very clever one and the codenamesstyle data collection game is well formulated to collect difficult visionlanguage associations 2 the framework and game are made public so that people can continually add associations 3 the evaluation of the collected associations is well done and the results are convincing that these associations are harder to solve for existing vl models but highly solvable by humans but still less than the swow baseline 4 the paper is very well written and structured it was honestly a pleasure to read 1 i find it odd that the authors collect only one image per concept from google images l100 collecting only one image per concept results in cues for concepts that are too specific to the single image collected for it this means that the cues corresponding to the various associations that a concept occurs in largely contain information about the specific visual context of the concepts image and so the concepts evaluation overfits to the visual properties of that image while collecting a larger set of images say 5 per concept would also result in cues that are specific to the images that in their instances since the same image is not used for each instance the overall evaluation for that concepts associations does not overfit to a single image 2 the various visual 
skills in table 2 only make it more confusing exactly what we are evaluating with the collected associations do snowmen stare is this a general association that we want the model to know it seems like the associations being collected are too imagespecific related to point 1 above even if we only want to evaluate the associations that the model can express conditioned on a visual context ie not a general conceptlevel association like snowmen are white but an imagespecific one like snowmen staring in that particular image the overall evaluation still only evaluates associations specific to that one image of the snowmen how do i know if the model will express snowmanrelated associations for a different image of the snowman 3 further its not clear what visual associations for all concepts derived from swow will look like for instance what do the images of green and dirty concepts from the first row of table 2 look like how does that affect the particular associations that we collect for them 4 the choice to not discard associations that are solvable by the ai l148 is confusing the whole point of the framework is to collect associations that are difficult for models but solvable by humans so why keep samples that do not meet the first criteria the authors could average model performances from a few different ai models if they are concerned about model dependence of their collected data 5 its not clear in section 31 how the swow baseline dataset is different from the winogavil one specifically what are properties of how the baseline is collected that winogavil does better the explanation in l253258 would go well in s31 because its not clear how these datasets differ till we compare the results on them some visual examples that compare associations from these two datasets would be nice 6 in l274276 when filtering visually salient cues i dont understand why the authors do not retain only those cases where both the annotators agree 88 is still a sizable number and gives us more confidence about the results on this subset in the current setup the results appear to be muddled by those examples where annotators disagreed on the visual salience of the cue docsepthis paper proposed a new benchmark called winogavil to evaluate the commonsense reasoning ability of the stateoftheart visionandlanguage models in winogavil each task is composed of a textual cue a number k and a set of candidate images and its goal is to select the k images most associated with the textual cue to collect novel and challenging associations this work proposed a web gamification framework where a spymaster composes a new association cue that can be easily solved by other human players but can easily fool the ai model such as cliprn50 experiments showed that current visionandlanguage largely fall behind human performance 1 the winogavil is an interesting benchmark for multimodal commonsense reasoning and it can be dynamically updated with the proposed gamification framework 2 the experiments well demonstrated the gap between human and machine performances implying the challenges of the designed tasks for current ai methods 3 the analysis of the tasks in the winogavil dataset reveals that a diverse set of reasoning skills are required to achieve high performance 4 an alternative approach to collecting the data using the swow dataset is presented in this work to show that the gamification framework creates more challenging associations 1 it is unclear why not use the model scores in the data selection my intuition is that removing 
data that can be easily solved by the ai model ie the cliprn50 will make the benchmark more challenging to current visionandlanguage models is that true if not i wonder how to explain this phenomenon and whether there is any experiment to justify it 2 in experiments it would be more interesting to show how various methods behave differently across tasks that require different reasoning skills it is an extension of the existing experiments of comparing nonvisual and visual settings to more finegrained observed patterns 3 what are the chance performances in table 4 i am curious how the methods compare with the random guess
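the reviews above reason about the game's jaccard-index scoring, e.g. that a solver who misses every selected image scores 0 whether the spymaster picked two images or three. a minimal sketch of that metric (illustrative only, not the paper's actual code; it assumes both the spymaster's selection and the solver's guess are plain sets of image ids):

```python
# illustrative sketch only (not the authors' code): jaccard index between the
# spymaster's selected images and the solver's guess, both given as sets of image ids
def jaccard(spymaster_selection, solver_guess):
    a, b = set(spymaster_selection), set(solver_guess)
    if not a and not b:
        return 1.0  # degenerate case: two empty sets count as a perfect match
    return len(a & b) / len(a | b)

# missing every image scores 0 whether the spymaster picked 2 or 3 images,
# which is the incentive concern raised in the first review
print(jaccard({"img1", "img2"}, {"img5", "img6"}))          # 0.0
print(jaccard({"img1", "img2", "img3"}, {"img5", "img6"}))  # 0.0
print(jaccard({"img1", "img2"}, {"img1", "img3"}))          # ~0.33 (1 shared of 3 total)
```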
### Summary:
|
the paper presents a new visionandlanguage benchmark that associates images and text collected using a crowdsourced game similar to codemaster all reviewers were positive on the paper they praised the cleverness and novelty of the idea the clarity of the writing that the benchmark and game are publicly available and the quality of the experimental evaluation reviewer o1pk initially raised some issues but was largely convinced by the authors response overall the ac agrees that the paper is interesting and wellwritten and will be a welcome addition to the neurips datasets and benchmarks track this year
|
[ 1027, 581, 253, … (2,048 token ids in total; middle elided) …, 3540, 436, 807 ] |
[ 1, 1, 1, … (attention_mask: 2,048 ones) …, 1 ] |
[ 1027, 581, 253, … (middle token ids elided) …, 4081, 247, 747,
22791,
1925,
3330,
462,
580,
300,
281,
7472,
253,
764,
49235,
14720,
3745,
273,
253,
1375,
23037,
14387,
8113,
395,
12982,
3210,
275,
3330,
462,
580,
300,
1016,
4836,
310,
9924,
273,
247,
45860,
30129,
247,
1180,
465,
285,
247,
873,
273,
7431,
3888,
285,
697,
4736,
310,
281,
3609,
253,
465,
3888,
954,
2330,
342,
253,
45860,
30129,
281,
4822,
4460,
285,
11132,
12485,
436,
789,
4081,
247,
4384,
18814,
1877,
7792,
835,
247,
653,
1105,
2237,
509,
4863,
247,
747,
5864,
30129,
326,
476,
320,
4354,
14042,
407,
643,
1966,
3773,
533,
476,
4354,
11213,
253,
23105,
1566,
824,
347,
17230,
30930,
1235,
4679,
2692,
326,
1655,
8113,
395,
12982,
8127,
2965,
3212,
1966,
3045,
50276,
18,
253,
3330,
462,
580,
300,
310,
271,
4722,
22791,
323,
23390,
26306,
764,
49235,
14720,
285,
352,
476,
320,
23043,
9300,
342,
253,
4081,
18814,
1877,
7792,
50275,
19,
253,
4679,
973,
5183,
253,
8037,
875,
1966,
285,
5145,
16226,
27594,
253,
7881,
273,
253,
4158,
8892,
323,
1655,
23105,
3082,
50275,
20,
253,
1783,
273,
253,
8892,
275,
253,
3330,
462,
580,
300,
10895,
12957,
326,
247,
11117,
873,
273,
14720,
6936,
403,
2424,
281,
5115,
1029,
3045,
50275,
21,
271,
5795,
2746,
281,
17055,
253,
941,
970,
253,
1863,
319,
10895,
310,
3559,
275,
436,
789,
281,
921,
326,
253,
18814,
1877,
7792,
10513,
625,
11132,
12485,
50275,
18,
352,
310,
12744,
2139,
417,
897,
253,
1566,
7363,
275,
253,
941,
5438,
619,
30328,
310,
326,
11922,
941,
326,
476,
320,
4354,
14042,
407,
253,
23105,
1566,
26332,
253,
17230,
30930,
1235,
588,
1056,
253,
22791,
625,
11132,
281,
1655,
8113,
395,
12982,
3210,
310,
326,
2032,
604,
417,
891,
4282,
849,
281,
5513,
436,
11562,
285,
1880,
627,
310,
667,
3368,
281,
15249,
352,
50276,
19,
275,
4679,
352,
651,
320,
625,
4722,
281,
921,
849,
2710,
3082,
21319,
13359,
2439,
8892,
326,
2430,
1027,
14720,
6936,
352,
310,
271,
6880,
273,
253,
5368,
4679,
273,
10941,
1327,
34309,
285,
5304,
7533,
281,
625,
4030,
72,
11273,
2540,
6127,
50275,
20,
752,
403,
253,
4839,
16226,
275,
2829,
577,
891,
717,
14338,
849,
253,
3082,
7277,
342,
253,
3632,
5476,
50276,
187,
187,
4118,
18435,
27,
783,
2929,
10262,
247,
747,
8113,
395,
12982,
22791,
326,
26624,
3888,
285,
2505,
5728,
970,
247,
24597,
47549,
2165,
2074,
281,
12738,
358,
2237,
512,
30628,
497,
2762,
327,
253,
2929,
597,
26108,
253,
19080,
1255,
285,
38135,
273,
253,
2934,
253,
19843,
273,
253,
4028,
326,
253,
22791,
285,
2165,
403,
13644,
2130,
285,
253,
3290,
273,
253,
5661,
7103,
37317,
258,
18,
27905,
8523,
5439,
690,
3374,
533,
369,
8127,
13762,
407,
253,
4477,
2380,
4583,
253,
913,
18726,
326,
253,
2929,
310,
4722,
285,
973,
15720,
285,
588,
320,
247,
10112,
1635,
281,
253,
5723,
2824,
15302,
285,
49602,
3540,
436,
807
] |
Below is given review of a research paper from conference journal. Please write a summary of the review.
### Review:
the paper leverages modern scientific computing techniques specifically the meshless radial basis functionfinite difference rbffd method to accelerate the training of pinns in detail the paper replaces the computationally expensive automatic differentiation by rbffd with sparsematrix vector multiplication allowing a high efficiency and accuracy computation of the partial derivative terms and a fp64 training on gpu several numerical experiments are carried out including linear nonlinear and spacetime problems showing that the proposed method is significantly faster than fp32 vanillapinns with comparable accuracy strengths the idea of the paper is straight forward replacing the automatic differentiation by rbffd and sparse matrixvector multiplication the theoretical grounding is explained clearly several empirical evaluations are used to validate the proposed method the novelty of the contribution is fair although both the pinns and rbffd are well developed techniques this is the first work combining them together to accelerate the pinns training the topic is closely related to the neurips community weaknesses the weakness of the paper is mainly that the evaluation cases considered are somehow naive which could not demonstrate the advantages of the proposed method sufficiently the paper barely talks about the motivation of choosing fp64 over fp32 in the first place and gives little reasons and evidences about why it would cause a training speedup and this leads to my suspicion on whether choosing fp64 would become a key point this method offers limited speedup if the nonlinear terms not invoking differential operations dominates the calculation time docsepthe paper uses a linear combination of radial bias functions rbfs with additive odd polyharmonics to discretise partial differential equations in a meshless manner a deep neural network is used as trial function to obtain the approximate solutions through training over the data points with the equation to be solved as soft constraints in the rbffd finite differences method the calculation of derivatives is performed with precomputed stencils restricting the approximation to nearest neighbors reduces the computational load significantly the paper displays a solutions of poisson and heat dissipation equations as examples of the strengths of the method it is found that the using a 4ayer deep neural network that is trained to fulfill the equations using gpu fp64 is very efficient in solving the problem compared to the state of the art the paper goes trough in a clear way the theory of the physics informed neural networks pinn and then builds upon that to introduce rbfdf in a way that is pretty easy to follow it described system addresses many of the problems that arise in the work of solving pdes for complicated and practical geometries the meshless nature of the method makes its able to handle complex cases also as it lacks the usual meshing of the geometry it is much faster to set up ready for the computation in addition to being efficient in its calculations of course one would like that pytorch would have all the required autograd functions already implemented for even simpler scripting but this is no way fault in the paper but comment on paradigm shift where ai is providing new capabilities and better and better tools for problem solving the appendix and submitted materials are valuable and greatly appreciated by those planning to use the solution method the figures do not have error bars so one thinks the performance gains are reported 
from a single training run albeit there is a good set of comparisons done to state of the art i would like to see results gathered from multiple runs for each of the points i do not think there are any potential negative societal impacts of this work docsep the submitted work proposes the use of a discrete finite difference fd method for a faster computation of spatial derivatives ie with respect to the signal domain in physics informed neural networks pinns more specifically the use of radialbasisfunction fd rbffd is proposed which approximates an operator at a given position xi via a weighted linear combination of a position xi and its neighbors the weights can be learned by solving a dense block linear system and an arbitrary point cloud may be used the underlying implicit interpolation between the points is a combination of rbfs and legendre polynomials the linear operators are precomputed so that during the pinn training they can be directly evaluated on the collocation points experiments show the application of rbffd for the linear and nonlinear poisson equation as well as the twodimensional heat equation for the heat equation the temporal derivatives are still computed using autograd and only the spatial derivatives are approximated via rbffd results show that the proposed method is able to accurately approximate the derivatives of neural networks with a significant speedup compared to automatic differentiation the speedup depends on the complexity of the approximated operator applied to pinns this results in faster training of the networks with comparable accuracy to vanilla pinns furthermore the use of rbffd required fewer training epochs than vanilla pinns however the exact causes for this are not known contributions the use of rbffd for approximating differential operators in pinns is proposed as an alternative to autograd it is empirically shown that with a sufficiently high order rbffd provides accurate and fast approximations to the laplacian of a neural network on experiments for the nonlinear poisson equation and the heat equation a significant speedup of pinns is demonstrated fewer epochs were required for reaching a similar error as vanilla pinns significance high the authors provide an efficient alternative to autograd for computing pde losses in pinns low the proposed method results in fewer required epochs for attaining a similar loss compared to vanilla pinns low the proposed method appears to be very general and orthogonal to existing methods for speeding up pinns allowing a seamless combination with existing work originality the usage of rbffd as an alternative to autograd for pinns is to the best of my knowledge novel as far as i can judge rbffd methods seem to be wellestablished within the numerical pde literature but have not yet been explored much within machine learning quality the submission and experimental settings appear to be sound to the best of my knowledge while limited in their extent in terms of different pdes the experiments serve as a valuable proof of concept it is not explained why the approximation is only used for spatial derivatives whereas autograd is used for temporal derivatives it is unclear whether this serves as an example for mixed usage of autograd and rbffd or is due to some limitation of the underlying method well maintained and readable code is provided the appendix provides implementation details for the autograd implementation of sparse matrix operations that are required for the proposed method no error bars are provided checklist 
3c the authors argue that this is the norm for pinns which in my opinion does not warrant ignoring general best practices within the ml community however another argument made is the limited computational resources which is reasonable for the pinn experiments experiments concerning figure 1 autograd vs rbffd should not be very computeintensive clarity the paper is generally well written and argued for minor the plots are at times a bit confusing the x and y labels could be a bit bigger and more expressive eg network depth s instead of just s in figure 1 similar to n in figures 24 and p in figure 5 its hard to quickly distinguish the vanilla pinn from the proposed method maybe a different style or color coding would help to summarize the paper provides an interesting and feasible alternative for computing spatial derivatives in pinns bridging the gap between wellestablished meshfree methods for pdes and pinns the proposed method is novel and to the best of my knowledge technically sound experiments are reasonable and reproducible and the paper is well written the authors highlight that the speedup depends on the complexity of the approximated operator ie they may not offer much speedup if nonlinear terms not involving differential operators dominate training times the approximations to the operators have to be precomputed
### Summary:
|
all reviewers agreed that this paper has several strengths such as a convincing motivation a well structured and wellformulated model and solid theoretical grounding while two reviewers had a very positive general impression of the paper emphasizing in particular the novelty and originality of this work one reviewer raised some concerns about the application cases being too simplistic and not well suited for demonstrating potential strengths or weaknesses of the method in my opinion however these concerns and further questions could be addressed reasonably well in the rebuttal and therefore i recommend to accept this paper
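For readers less familiar with the setup the reviews above describe, here is a minimal PyTorch-style sketch of the core idea (a precomputed sparse RBF-FD operator standing in for autograd when forming the spatial PDE residual of a PINN). The placeholder Laplacian matrix, layer sizes and function names are assumptions for illustration only, not the paper's code.

```python
# Illustrative sketch only (not the authors' code): replace autograd with a
# precomputed sparse RBF-FD operator when forming the spatial PDE residual of a
# PINN.  The Laplacian matrix below is a zero placeholder; in the method the
# reviews describe it would be assembled once from RBF-FD stencil weights.
import torch

def poisson_residual_loss(model, xy, lap_matrix, source):
    """xy: (N, 2) collocation points; lap_matrix: (N, N) sparse approximation
    of the Laplacian; source: (N,) right-hand side f of  -lap(u) = f."""
    u = model(xy).squeeze(-1)                                        # u_theta at all points
    lap_u = torch.sparse.mm(lap_matrix, u.unsqueeze(1)).squeeze(1)   # SpMV instead of autograd
    return torch.mean((-lap_u - source) ** 2)                        # least-squares PDE residual

# Hypothetical usage with fp64 tensors, as in the reviewed setup:
n = 1000
model = torch.nn.Sequential(torch.nn.Linear(2, 64), torch.nn.Tanh(),
                            torch.nn.Linear(64, 1)).double()
xy = torch.rand(n, 2, dtype=torch.float64)
lap_matrix = torch.sparse_coo_tensor(torch.zeros((2, 1), dtype=torch.long),
                                     torch.zeros(1, dtype=torch.float64), (n, n))
loss = poisson_residual_loss(model, xy, lap_matrix, torch.zeros(n, dtype=torch.float64))
loss.backward()  # gradients w.r.t. the network weights still come from autograd
```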
|
[ …input_ids token-id list omitted… ] |
[ …attention_mask (all 1s) omitted… ] |
[ …labels token-id list omitted… ] |
Below is given review of a research paper from conference journal. Please write a summary of the review.
### Review:
this paper proposes a method called optimizer grafting it uses two optimizers in one training session one is to decide the update direction of parameters and the other is to decide the update stride of parameters this paper proposes a new optimizing mode and take a large amount of experiment exploration s1 a large amount of experiment is conducted and plenty of result is shown in appendix s2 a novel optimizing mode of grafting two different optimizers is proposed w1 the paper structure is strange i recommend to read some published proceedings to try to make this paper more clearly w2 some format maybe not legal such as the caption of table and content in page 16 w3 the theory is not reasonable in other word you just tell me you do it like this but not why its reasonable actually i dont think adamsgd will be better than adam adam calculates the update direction according to loss function in a multidimensional space this direction is composed of the value of each gradient and positive or negativesymbol of each gradient however you change the symbol of some parameters gradient according to sgd this is why im confused in my view this method is more like a sgd with multiplying a large const to its gradient w4 i have a question how to compute the norms wmwt wdwt is wmwt calculated with all the parameters in neural network if not i think figure1 is a wrong example cause md will step to different direction with d in multidimensional space w5 the results shown in tables are not strong enough even though some idea is interesting the theoretical work in this paper is insufficinet docsepthe authors report on a technique to address learning rate hyperparameter tuning for deep learning referred to as optimizer grafting specifically the paper proposes a metaalgorithm referred to as md that blends the steps of two optimizers by combining the step magnitude of one m with the direction of the other d the technique of optimizer grafting allows for the transfer of the overall implicit step size schedule to a new optimizer resulting in reductions in computational cost of optimizer hyper parameter search the second primary result is leveraging the technique to identify a nonadaptive perlayer learning rate correction to sgd which allows it to train a bert model to stateoftheart performance analogous results are presented for vision models for global nonperlayer schedules for adagrad the authors describe grafting meta algorithm md as at each iteration md feeds the same input wt gt to both m and d which manage their states independently and produce wm wd then the norms of the steps each would have taken is computed and used to combine ms magnitude update with ds direction update partitioning is managed to implement global versus perlayer grafting2 optimizer hyperparameter searches with the same computational budget but different performances the authors present an empirical study on the transfer of implicit step size schedules between optimizers comparing sgd and adam to adamsgd for task of bert pretraining they show that adamsgd is able to achieve performance atnear adam the paper also presents results for image classification for imagenet and cifar10 for adagrad sgd and sgdadagrad showing sgdadagrad outperforms sgd and adagrad however without error bars it is hard assess the actual results finally the paper shows results for grafting distilling a nonadaptive correction to d eliminating the need to run m in parallel that is transferring a global timedependent nonadaptive multipliers for the learning rate the 
results show for the global variant for resnet sgd adagrad the discovered learning rate is comparable to the one used on sgd and achieves a top1 accuracy of 7246 for the perlayer variant learning rate schedule enables a simple perlayer step size correction without adaptive preconditioning the authors present proofofconcept results for simplifying the discovered schedule as a way to support the robustness of their transfer approach the paper presents an interesting technique of grafting for the problem of step size hyperparameter tuning and opens up questions as to the power of simple perlearning rate schedules the strengths of the paper include performing analysis on state of the art benchmarks imagenet cifar10 for image classification and wikipedia books for bert pertaining another strength is assessing the context of transferring implicit step size schedules to another optimizer for assessing global and perlayer variants and for assessing simple learning rate discovery however empirical evidence was not sufficient to be conclusive on the methods presented for example the results were on 2 tasks image classification with resnets and bert pretraining with 2 datasets each and 2 batch sizes 8192 and 32768 further for example figure 2 displays results for the best trial performance but does not include error bars adding error bars and the empirical computational cost reductions would be useful in better supporting the claims on several steps potentially concerning results were identified but not adequately justified and without more sufficient empirical results the reader is uncertain if these are indications of more serious flaws in the approach eg for bert p6 the global version of grafting were signficantly worse and have been omitted and p 14 despite these exponentially large correction ratios the grated optimizer converges but this curious phenomenon was left to future work the study also does not present theoretical underpinnings for the technique that would be helpful for understanding if indeed the results could be more widely applicable the study presents the approach md but does not give guidance on how m or d might be selected more generally for other tasks or how one might assess tasks to decide on m and d there are also decisions for parameters that are not explained and it is not clear now sensitive results may be to these decisions for example for learning rate discovery p8 authors describe choosing the layerwise multiplier as the median of individual corrections for the first 2000 iterations or discretizing to the nearest power of ten providing additional guidance andor theoretical underpinnings for such choices would be useful overall the paper presents an interesting technique and opens interesting questions but without further empirical results or theoretical underpinnings the results presented are insufficient to assess how generalizable the findings might be docsepthe authors propose learning rate grafting as a method to explore the power and dynamics of optimizers learning rate grafting partitions the parameters of the networks into groups and for each group takes the direction of the weight update from one optimizer and the magnitude from another optimizer the paper then shows that grafting allows for achieving the performance of a tuned optimizer using that tuned optimizers groupwise magnitudes along with an untuned optimizers groupwise directions as a note i reviewed a prior submission of this paper the bulk of my review is the same as in the prior submission with 
modifications corresponding to the modifications in the paper originality the paper is original performing an experiment that i have not seen in the literature and satisfactorily discusses background and related work quality the experiments performed within the paper are of high quality satisfactorily demonstrating the main claims of the paper that groupwise grafting results in performance equivalent to the bettertuned optimizer that the learning rates are derived from my main qualm with the experiments is the lack of motivation for the specific experimental configurations instead the results of several disparate experiments in disparate settings are reported making it hard to reason about the generality of the results specifically there are no reported results for global grafting on bert other than a statement that it performs significantly worse it would be helpful to see these results in the main body or appendix to understand how much worse and the tradeoffs between global and layer wise grafting the simplifying the discovered schedule section also feels arbitrary the correction scheme is described as simple which it is but it is also a bespoke correction for a single experiment which is not validated to transfer to other settings its not obvious what the contribution of this section is beyond that of the rest of the paper i would also appreciate further validation that the differences in directions between m and d are significant on the layerwise and global scale given that m d m when using perweight groups it seems possible that when using smaller groups that are eg the size of layers m d approx m it would be worthwhile to show that the groupwise directions of m and d are significantly different my expectation is that they are essentially orthogonal but this is worth validating clarity on the whole the paper is very clear with enough details to easily understand and reproduce the experiments reported in the paper however i do have the following questions and suggestions for improvements an explicit statement of the precise hypothesis being tested would significantly improve the paper asis the results are possible to interpret in many different ways eg that grafting either gives the performance of m gives the better between m and d or something else and explicit statement of the hypothesis along with additional experiments to precisely test the bounds of that hypothesis would result in a much more precise takeaway from the paper section 21 how does this deal with stochasticity in the gradient estimate based on eq 1 it seems like gt is the gradient for a minibatch but the algorithm does not specify minibatch selection the notation of md and granularity could be improved i kept having to scroll up and try to figure out which refers to magnitude and which refers to direction and the granularity is specified separately in text the plots specifically fig 23 are very small and hard to read significance though i am not an expert in optimizers the results seem like a useful step towards empirically understanding the contrast and resultant difference in realworld performance between different optimizers update after author response thanks to the authors for the response especially the explicit statement of the research question i do understand that the two experiments that i raised slight concerns with simplifying the discovered schedule and global grafting for bert are not intended to be reusable methodologies for training networks in the future but my concern is that without clearly stated 
methodologiesjustifications for these oneoff experiments or broader replication the results may just be the result of chance having continued to try variants of the approach until one worked rather than supporting the broader claim that the effective learning rate schedule is the primary driver of accuracy regardless my critiques are ultimately minor and i do still think that the paper should be accepted my main point of disagreement from the other reviewers is that i believe this paper is a significant contribution even if nobody ever uses the technique to train a network to deploy to endusers because the papers contribution is to the empirical understanding of optimizers rather than to the actual usage of optimizers because of that i think that a slightly lower methodological bar is reasonable the paper is presenting a new finding moreso than a new technique and we should not require that the approach works in all settings or even that the authors describe methodologies for finding similar results in new settings eg with different hyper parameters or choices of m and d accept while i am not entirely satisfied with the motivation behind the choices of reported experiments and the missing statement of a falsifiable hypothesis to test the paper proposes an interesting experiment and provides a technically correct evaluation of the claims in the paper docsepthe authors investigate the entanglements between the optimizer and the learning rate schedule and propose the technique of optimizer grafting which allows for the transfer of the overall implicit step size schedule from a tuned optimizer to a new optimizer preserving empirical performance this provides a robust plugandplay baseline for optimizer comparisons leading to reductions to the computational cost of optimizer hyperparameter search using grafting they discover a nonadaptive learning rate correction to sgd which allows it to train a bert model to stateoftheart performance the algorithm and the method are quite simple magnitudes are taken from one and directions are taken from another when you take d and md in the results m and md are comparable in performance so there is no clear benefit to using md when md is comparable to m a consequence of using md they claim is transferring the schedules but do schedules really generalize well generalize to what suppose i change the data and architecture can i use this new rate scheduler this to be is the question they should address thoroughly i think what the authors are proposing is this if the change is in the optimizer only how should i change the learning rate schedule but there are additional questions q1 what is the impact of this method on generalization error if yes with what confidence q2 does this new schedule improve the order of convergence if yes how q3 can you discuss general properties of the grafting operator q4 whats the intuition behind transferring schedules either theoretical or empirical and usefulness of grafting q5 why should we expect ab to perform better than ba when there are two optimizers a and b say the paper although is interesting lacks the technicalempirical novelty to merit publication
### Summary:
|
the paper proposed trained ml oracles to find the descent direction and step size in optimization the process they call grafting reviewers raised several concerns about the reliability of ml oracles in general settings which is valid the rebuttal could not convince the reviewers to change their opinion ideally for an empirical only paper with heavy reliance on ml for critical decisions to meet the high bar of iclr there must be several experiments 510 datasets or more on diverse datasets and settings also there should be discussions on when and how the method fails and related discussions in that sense the paper does not meet the bar for publication
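The grafted update the reviews describe is simple enough to sketch. The following is a hedged illustration assuming PyTorch tensors; the function name and the eps guard are my own additions, not the paper's implementation.

```python
# Minimal sketch of one grafted update step "M#D" as described in the reviews:
# the step *magnitude* comes from optimizer M and the step *direction* from D.
import torch

def grafted_step(w, step_m, step_d, eps=1e-12):
    """w: parameter tensor; step_m / step_d: updates that M and D would apply to w."""
    scale = step_m.norm() / (step_d.norm() + eps)   # transfer M's implicit step size
    return w + scale * step_d                        # but move in D's direction

# e.g. step_m could come from Adam and step_d from SGD, both computed from the
# same (w_t, g_t); per-layer grafting applies this rescaling to each layer's
# parameter group separately, while global grafting uses one scale for the model.
```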
|
[ input_ids omitted: 2,048 integer token ids for this row ] |
[ attention_mask omitted: 2,048 ones ] |
[ labels omitted: 2,048 integer token ids mirroring the input_ids above ] |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the paper considers the fundamental problem of model selection for univariate gaussian mixture models gmms which asks for the minimum number of components needed to fit the given data the approach to do this is via robust proper learning of gmms where given access to a distribution that is epsilon-close to a k-gmm we want to find a k'-gmm with ideally k' close to k that is tilde-o(epsilon)-close to that distribution the paper presents the first algorithm for this task with k' = tilde-o(k) that works for arbitrary gmms and has polynomial sample complexity and runtime using that result it obtains an algorithm for the model selection problem finally the paper utilizes the techniques to obtain a result for fourier sparse interpolation that enjoys better bounds on the sparsity of the output signal than prior work at the core of the gmm algorithm is a procedure that works for gmms with wellconditioned components and a localization technique allowing one to reduce the general case to the wellconditioned one the former is based on considering the taylor expansions of the gaussian components and associating the wellconditioned mixtures to points in a convex body the latter uses gaussian multipliers to reweight the mixtures so that they become more localized and constructs tilde-o(k) such multipliers that allow reconstruction of the original mixture questions regarding learning of gaussians and mixture models are arguably among the most fundamental problems in statistics the paper presents an algorithm that enjoys polynomial runtime while working under no additional assumptions on the input the paper is technically interesting and employs an interesting suite of techniques the solution deviates from prior work ideas and requires substantial technical work the paper is also wellwritten and despite the fact that the arguments used are involved the authors managed to present their ideas in a clean way i have not carefully checked the details in the supplementary material but the outline of the main submission seemed reasonable i recommend that the paper be accepted sufficiently addressed
docsepin learning theory we often assume that samples are drawn from a finite mixture model and that the number of components in the finite mixture model is known however in many applications this number may not be known in advance the problem of finding the number of components is called model selection the authors study the problem of robust model selection for one dimensional gaussian mixture models with k components k-gmm formally it is formulated as follows given samples drawn from a distribution that is epsilon-close to a k-gmm in total variation we want to output a k'-gmm where k' is not much larger than k there is a long line of existing work on learning gmms however these works have different shortcomings in this setting such as requiring a number of samples exponential in k running in time exponential in k or outputting a number of gaussians k' much larger than k etc in this paper the authors give an algorithm for learning arbitrary one dimensional k-gmms that outputs an approximation with tilde-o(k) components the algorithm works as follows recall that we are given samples drawn from a distribution f and there is a k-gmm m that is epsilon-close to f in total variation the authors first consider the following case when the underlying gmm m satisfies the property that the means of the gaussians are all not too far away and the variances of the gaussians are all of constant scale then we can find an o(log(1/epsilon))-gmm tilde-m that is also close to f in total variation this is what the authors call wellconditioned since the gaussians are wellconditioned achieving this is rather easy by examining the taylor expansion then the authors reduce the general case to the wellconditioned case via localization the high level idea is to look for intervals such that their union covers the union of the significant intervals of the gaussians in m where the significant interval of a gaussian is a long interval covering most of the probability mass of this gaussian for each interval i the mixture of gaussians in m whose significant interval intersects i forms a mixture of wellconditioned gaussians on the other hand the gaussians in m whose significant interval does not intersect this interval i are negligible with respect to this mixture of wellconditioned gaussians hence the authors use dynamic programming to find these intervals from the samples and use the algorithm for the wellconditioned case to output a mixture of tilde-o(k) gaussians since the number of these intervals is tilde-o(k) and there are log(1/epsilon) gaussians in the mixture for the wellconditioned case furthermore the authors apply the result to solve hypothesis testing for the model selection problem and the techniques to solve the sparse fourier problem strengths the paper is wellwritten the ideas and the proofs are easy to follow the results are interesting the number of gaussians in the output is k (log(1/epsilon))^{o(1)} while this number is k (1/epsilon)^{o(1)} in the previous works this improvement of the dependence on 1/epsilon from a polynomial bound to a logarithmic bound is interesting weaknesses it may just be a minor weakness in terms of proof techniques it seems that the techniques used in this paper are rather standard it seems the tools used in the proof are not very surprising na
docsepthis paper considers density estimation of 1d gaussian mixtures with an unknown number of components k the main point of this submission is to find a 1d gaussian mixture hypothesis with at most tilde-o(k) = k * poly(log(1/epsilon)) components such that the returned hypothesis is o(epsilon)-close in tv distance to the ground truth previous work on the same problem used either a larger number of components o(1/epsilon^2) or used a non gaussian-mixture hypothesis the main tools used here are polynomial approximations of the gaussian pdf up to degree o(log^2(1/epsilon)) it is shown that the techniques developed for the 1d gaussian mixture can also be used for sparse fourier reconstruction in my understanding the proposed algorithm proceeds in two steps 1 run any improper learning algorithm that outputs a gmm hypothesis with a much larger number of components the authors pick chan et al 2013 which uses tilde-o(k/epsilon^2) samples and 2 reduce the number of used components using polynomial approximations of the gaussian pdfs the main subtlety in step 2 is that if the variances of individual gaussians across different components vary a lot then it is trickier to find polynomial functions that can approximate all pdfs of the gaussian components the authors propose a localization technique to handle this strength presentation of the paper is very clear it would be even nicer to have an algorithm box or enumeration block that summarizes the full algorithm the paper also provides a nice application sparse fourier reconstruction of their theoretical findings weakness techniques used in this paper are very specific to the 1d gmm case and thus the significance of the results is limited the paper is mostly about the postprocessing to reduce the number of components i wonder if there is anything improved in the density estimation step compared to benchmarks eg li and schmidt 2017 if there is any interesting challenge that did not appear in the 1d gmm setting but should be considered in the sparse fourier problem then it might be better fleshed out in the main text i dont see any negative societal impact
docsepthis paper addresses the problem of proper learning of 1d gmms wrt total variation distance where k the number of components is unknown and we are allowed to output a mixture with more than k components at most a polylogarithmic factor more for nonproper learning it is already known how to learn 1d gmms using piecewise polynomials therefore the main remaining challenge for proper learning is a computational one not a statistical one the techniques used to solve the problem are novel and interesting and compared to the previous work the number of components of the mixture they find is significantly smaller proof overview we can assume we have an approximation to the distribution in the form of a piecewise polynomial based on the previous work the authors prove that under some conditions the components are not very thin there are gaussian multipliers that can make the mixture wellconditioned concentrated around a mean with similar variances they then prove that the indicator function of every interval can be written as a mixture of gaussian multipliers therefore the mixture over that interval can be written as the product of the mixture and the indicator if the interval is such that the conditions for thin gaussians and the multipliers are met then the product is a wellconditioned mixture they then prove that there are intervals that satisfy the condition and that most of the mass of the mixture is concentrated on those intervals strengths the paper addresses an important and basic problem in the area of density estimation proper learning of 1d gmms there have been multiple previous attempts on this problem but this paper significantly improves over them the algorithmic techniques as well as the structural results are novel and interesting the presentation is nice weaknesses minor issues sometimes the authors jump into the technical steps of the proofs without explaining the big picture adding a few sentences at the beginning of these subsections can help readability for example in the localization section it is not clear from the beginning that the localization is going to be centered at different components the reader may assume it is going to be a single localization for the whole mixture i was not able to understand the techniques thoroughly until i read the instructions in the appendix missing discussion of a seemingly very relevant paper by x wu and c xie improved algorithms for properly learning mixture of gaussians i dont foresee direct negative impact of this theoretical work
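as a purely illustrative aside on the model selection task these reviews discuss: the papers actual algorithm (taylor expansions, gaussian multipliers, localization, dynamic programming over intervals) is not reproduced here. the sketch below only shows the everyday version of the question, fitting 1d gmms with increasing numbers of components and picking one by a score, using scikit-learns GaussianMixture with bic as a stand-in selection criterion; the synthetic data, the candidate range, and the use of bic are assumptions for illustration and carry none of the papers guarantees.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# synthetic 1d samples from a 3-component mixture (illustrative assumption)
samples = np.concatenate([
    rng.normal(-4.0, 0.5, 400),
    rng.normal(0.0, 1.0, 400),
    rng.normal(5.0, 0.8, 200),
]).reshape(-1, 1)

# fit candidate k'-gmms and score each fit; bic stands in for a proper selection rule
candidates = list(range(1, 8))
fits = [GaussianMixture(n_components=k, random_state=0).fit(samples) for k in candidates]
bics = [model.bic(samples) for model in fits]

best_k = candidates[int(np.argmin(bics))]
print("bic per candidate k:", dict(zip(candidates, np.round(bics, 1))))
print("selected number of components:", best_k)
```

with well-separated components a criterion like bic typically recovers a small k, but unlike the papers result it gives no worst-case guarantee on either the number of components or the tv distance of the returned mixture.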
### Summary:
|
this paper gives significantly improved guarantees for improper learning of gaussian mixtures in one dimension given samples from a distribution that is close to a mixture of gaussians it gives a polynomial time and sample algorithm that outputs a mixture with slightly more components that is also close the papers contributions are solid both in terms of the result and the techniques involved so this is a clear accept
|
[ input_ids omitted: 1,989 integer token ids for this row ] |
[ attention_mask omitted: 1,989 ones ] |
[ 30003, 310, 1677, 2278, … ]
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper claims that through canonical correlation analysis on the representations learnt by popular deep models we can show that the representations learnt on the same dataset with same model using different sets of parameters the representations are linearly identifiable this seems like a nice thing to know but i however am not sure about few things 1 i think the paper doesnt really discuss the implications of this enough that is i would like to see few example applications where we take advantage of this fact as is this contribution in my opinion remains as a cute thing to know 2 it would be interesting to how much would change in initializations would effect the cca curves that is if i initialize my network within a wider range the identifiability result would still hold in practice this would be interesting to add in my opinion 3 ideally i would like to see if this statement hold for different hyperparameters also overall the conclusion is interesting and i think that it could be published docsep postrebuttal the authors replied to my comments related to the diversity condition in 32 and theorem 1 their answers did not fully clarify my concerns or misunderstandings and it seems the authors didnt make any changes in that regard in the revised version except if i missed something in the review system they did not answer my other comments the paper discusses identifiability of learnt representations in supervised settings and under canonical discriminative modelling it provides conditions under which two learnt representations are equivalent up to a linear transform the paper discusses an important topic related in particular to explicability of dnns and appears to provide valuable results but is difficult to follow the paper is packed with results and difficult to read as important details are missing a longer journal version would be more appropriate comments i got lost from the paragraph diversity condition i dont understand the meaning of ya and yb what do subscripts a and b refer to then what do yai and ybi refer to what are they exactly samples of using illustrative examples would be very useful there for example what do they mean in classification where y is a label because i didnt understand the meaning of ya and yb i failed to understand the impact of theorem 1 in particular i dont understand why eq 5 holds in the proof i am mildly familiar with the canonical discriminative formulation can all dnn tasks but represented as such how does it relate to more traditional forms of deep learning such as based on minimisation of cross entropy or quadratic loss i assume that the sum in eq 1 is an integral when y is continuous i got the idea of using cca for comparing two different representations but the exact definition of ci is unclear is it a vector or matrix besides why using only a subset b of d in the experiments you basically compare two learnt representations and show they are essentially similar up to a linear transform however its not clear how the two representations have been obtained is it only a matter of different initialisations i didnt understand how are the labels constructed in section 51 do you simply divide the unit circle in k18 vectors that form the mean of each cluster i didnt get the model misspecification argument and the need to use dnn on such a simple task this show that increasing model size correlates strongly with increase in linear similarity of learned representations i understand why this can be true when increasing number of samples but i dont 
understand why this should be expected with model capacity what is it in theorem 1 that reflects this minor homogenise use of equation x and equation x missing compiled reference at the beginning of section 4 it looks like you are using two different fonts for ftheta in the paper or are those different variables many typos share parameters but of the network a the docsepthis paper investigated the identifiability of the learned representations in pretrained dnn models that fall into a general class of function space defined by the canonical mathematical form the identifiability of learned representations in this paper is defined as the representations are reproducible on the same data distribution regardless of the randomness in the training procedure such as the random initialization of parameters and the stochastic optimization procedure the authors first proved that in the limit of infinite data learned representations in this family are asymptotic identifiable in function space up to a linear transformation with the additional assumption of the diverse condition they further showed that this property is applicable to several stateoftheart pretrained models including ctc bert and gpt2 and gpt3 at last the authors conducted experiments to empirically investigate the linear identifiability of dnn models in a practical setting finite data and partial optimization they adapted canonical correlation analysis cca and svcca for highdimensional space to measure the linear similarity between two learned representations results from three sets of experiments including classification selfsupervised learning for images ctc and for texts gpt2 results show that the learned representations after mapping through the optimal linear transformation from cca have a strongly linear relationship overall this paper is wellwritten and wellmotivated from the theoretical aspect this paper proved the linear identifiability of a large class of dnn models in an ideal setting from the empirical aspect they provided experimental results to demonstrate the theorem in a practical setting docsepin this paper the authors address the model identifiability in a general setting that can be adapted to several recent deep learning models dnn supervised learning cpc bert and gpt since model parameters nn weights are not identifiable the authors hypothesize that vector f and g can be identifiably up to a linear transformation although the purpose of the work is appealing there are some issues related to the current structure of the paper proposed theory and its relationship to the provided experiments see my detailed comments and questions below figure 1 this figure is not referenced in the text it should be referenced as a motivating fact figure 1 it is mentioned in the appendix that the subset of words was selected how these subsets were chosen randomly i think this is important to clarify theoretical claims this figure clearly illustrate a strong relationship between f and f which is close to a linear mapping but it is not exactly linear the main result of the paper is that if diversity condition is met then models are linearly identifiable but nothing is said about approximately linearly identifiable models as the ones shown in figure 1 one question that should be addressed is how a perturbation in the diversity condition is translated to the identifiability property it would be good that the authors add some theory about it it is suggested that eq 1 holds for arbitrary supervised learning problem but in the appendix the 
equivalence is not shown for this case also the example provided in fig 2 seems to be a very arbitrary case is it possible to write any supervised learning problem as eq 1 could you please include its derivation is the form of eq 1 already used as a general approach for different ml problems as the authors claim if so could you please cite relevant references diversity condition means that such invertible matrices l and l can be constructed from data samples what can be said if those matrices are invertible but illconditioned ie smallest singular value close to zero for supervised learning the condition implies that k m1 is it not a very restrictive condition for example it is possible to apply this analysis to a simple 2classes supervised classification setting agreement between theory and experimental results with exception of the first experiment supervised learning the rest of the experiments show that by increasing iterations dataset size and number of hidden units then functions f and g tends to be linearly related but not closely linear related i dont see why the main theoretical result of the paper theorem 1 is related to the experimental observations i disagree with the authors claim that these experiments validate theorem 1
### Summary:
|
this paper presents novel results on linear identifiability in discriminative models with three of the four reviewers arguing for acceptance the paper went through an extensive round of edits which incorporated detailed responses to issues raised by the reviewers while this paper would be a nice contribution to the conference some reviewer concerns remain unresolved so we encourage the authors to revise and resubmit to a future venue
|
[ 30003, 310, 1677, 2278, … ]
[ 1, 1, 1, … ]
[ 30003, 310, 1677, 2278, … ]
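The reviews and decision in the row above hinge on measuring whether two learned representations agree up to a linear transformation, typically with CCA. As a hedged illustration only — not code from the paper under review, and with the matrix sizes, variable names, and use of scikit-learn all being assumptions of this sketch — such a measurement could be set up like this:

```python
import numpy as np
from sklearn.cross_decomposition import CCA

def mean_cca_similarity(reps_a, reps_b, n_components=10):
    """Average canonical correlation between two (n_samples, n_features) activation matrices."""
    cca = CCA(n_components=n_components, max_iter=1000)
    za, zb = cca.fit_transform(reps_a, reps_b)
    corrs = [np.corrcoef(za[:, i], zb[:, i])[0, 1] for i in range(n_components)]
    return float(np.mean(corrs))

# Toy check: two representations related by a near-linear map should score close to 1.
rng = np.random.default_rng(0)
h1 = rng.normal(size=(500, 64))
h2 = h1 @ rng.normal(size=(64, 64)) + 0.1 * rng.normal(size=(500, 64))
print(mean_cca_similarity(h1, h2))
```

An average canonical correlation near 1 indicates representations that match after a fitted linear map, which is the notion of linear identifiability the reviews discuss.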
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the author focuses on exploring the xai methods that none of the methods discussed highlighting specific parts in the classification to enhance the explanation the author proposes a method critical classification regions ccrs to do this ccrs use a nearestneighbor example to highlight similar important parts in the image explanation the author performed a user study on a subset of the imagenet to show improvement of the ccr method strengths the main reason to accept this paper is empirical results showing performance on the variants of the proposed method the author has evaluated performance on a standard dataset the problem statement is exciting but not the solution weaknesses 1 the author has proposed the critical classification regions ccr method but it is not discussed in the paper how does the proposed method help to find critical regions does the region help in explanation 2 the core idea is not novel nearestneighbor classification12 already appears as an explanation for visual object recognition nearestneighborbased explanations are already presented in the literature so the main contribution is not novel and seems to be very limited in the past patro et al have proposed an attentionbased nearest neighbor technique to highlight specific parts of the image and try to improve attention and classification score 3 what would be the motivation for the nearestneighbor example for the proposed task can it highlight specific parts in the classification to enhance the explanation what is its requirement is this the only method to solve the task the author should explain this 4 the core idea is not novel nearestneighbor classification12 already appears as an explanation for visual object recognition nearestneighborbased explanations are already presented in the literature so the main contribution is not novel and seems to be very limited 5 what would be the motivation for the nearestneighbor example for the proposed task can it highlight specific parts in the classification to enhance the explanation what is its requirement is this the only method to solve the task the author should explain this 6 selection nn for test samples this part is not discussed with a detailed analysis the method section is not clearly discussed the author should rewrite these sections 7 how does speedingup twinsystems help improve explanation if not please remove that from the method section it creates confusion 8 what happens when the method gets a different patternobjectclass in the nearest neighbor is there any way to tackle this issue have you tried multiple nearest neighbors like k1234 9 patro et al3 have tried different values of k and used a clustering technique to select the optimal nearest neighbor they also tried contrastive techniques using positive examples negative examples to anchor to improve the critical regions 10 the proposed method has not been compared with the latest explanation methods the author should provide both quantitative and qualitative comparison results with the latest explanation methods and also visualize the proposed explanation method with the latest methods 11 the author should explain figure6 and its attributes it is not clear from the caption what do you mean by those plots and the significance of those bars 12 cub and imagenet datasets are mainly focused on single instances what happens to your algorithm for multiple instance data like mscoco images 13 however the paper misses one of the core aspects of machine learning practice readability and reproducibility of results the 
author should provide an algorithm or pseudocode to reproduce the results missing in this paper ref 1 chen george h and devavrat shah explaining the success of nearest neighbor methods in prediction foundations and trends in machine learning 10 no 56 2018 337588 2 ma wei kendall nowocin niraj marathe and george h chen an interpretable produce price forecasting system for small and marginal farmers in india using collaborative filtering and adaptive nearest neighbors in proceedings of the tenth international conference on information and communication technologies and development pp 111 2019 3 patro badri and vinay p namboodiri differential attention for visual question answering in proceedings of the ieee conference on computer vision and pattern recognition pp 76807688 2018 overall i do not feel like this paper is above the bar for acceptance because the core idea is not terribly exciting and has a lot of technical issues the author should rewrite the paper and provide all other details mention in the weakness section docsepthe authors proposes an approach to improve explanationbyexample techniques by picking nearest neighbors nns based on finegrained image content that are critical for the classification decisions four variations to select these nns are explored and compared against each other the main advantage claimed is the causal role of the generated explanation the authors support this claim via quantitative evaluation and human evaluation i appreciate the wide array of experiments conducted by the author the manuscript reflects the thought process the authors had while developing their ideas touching multiple times at the hypotheses they had and the rational behind their algorithmic choices and experiment design i like the fact that the authors algorithm help explore different explanations based on different ccrs figure 10 and figure 11 does the algorithm support finding nns based on multiple ccrs at once so that two images can have a high match if they share multiple ccrs if this is the case i strongly encourage the authors to provide examples because such ability helps make their method more generic many classification decisions are based on multiple cues in the input image not a single ccr i would be helpful if the authors can describe how their solution can be used in practice in particular do the users have to pick alpha and beta the authors provide some analysis on imagenet and cub200 highlighting the compromise between higher values and lower values and concluding that 125 is the optimal value for imagenet while 11 is the optimal value for cub200 i do not fully understand how the authors came to these values and whether the users of their algorithms will be able to make informed decisions about these values especially also due to the computational overhead of scanning a range of values i miss a comparison with influence functions koh and liang icml17 the authors mentioned that such comparison would be very computationally expensive however there are two algorithms that allow computing influential instances more efficiently described in estimating training data influence by tracing gradient descent garima et al neurips20 representer point selection for explaining deep neural networks yeh et al neurips18 i also miss a comparison with counterfactual methods eg goyal et al icml19 language formatting issues hyperparamter were another where the only method that 1 use and 2 do 1 uses and 2 does not misclassifcation section 3 determines end with a full stop was tested on 
cifar10imagenet on cifar10 and on imagenet is the regions found are contingent is that its effect similar to hooker et al 2019 avoid text within citations kenny and keane 2021 add page number instead of tbd wording suggestions critical classification regions classificationcritical regions seriously explored considered directly exntesively studied seriously might imply unserious efforts twinsystems twin systems twinsystems twinsystem use a consistent term eg twinsystems framework what about an abbreviation we found four candidate we identified four candidate the manuscript is heavy on analysis and seems to reflect workinprogress results rather than solid and generalizable findings docsepthe manuscript proposes a method to extend explanationbyexample aka exemplarbased explanations by highlighting the regions that links the test image with examples provided as part of the explanation towards this goal different methods to compute critical classification regions ccr are proposed where image regionspixels linking the input with the explanation examples are highlighted the proposed method is validated on the cub200 and imagenet dataset this is complemented with a userstudy overall the proposed idea is clear i find its simplicity a good strength of the proposed method i find the userstudy presented in section 5 adequate and with a proper level of depth in addition it seems that a code library related with the proposed method will be released after publication of the manuscript this is always welcome from the reproducibility point of view my concerns with the manuscript are the following the novelty of proposed proposed ccrs is reduced and somewhat incremental moreover it is only limited to explanationbyexample methods to a good extent the idea behind ccrs of highlighting the regions that link input images with training examples is similar to that of prototypebased explanations especially the work from chen et al 2019 in this regard proper positioning with respect to that family of methods would strengthen the manuscript it is not clear to me what the value on the yaxis of figure 2 a b refers to is that the activation value of the logit related to the predicted class or the groundtruth class how do you handle the cases when the prediction changes over the evaluation of different segments within the same image perhaps i missed something but in section 4 it seems that 30 versions of the original model are finetuned by gradually considering adding the superpixels of each image in the order of importance predicted by the explanation method each time a segment is added the model is finetuned from this procedure it seems that the evaluation is still more related towards assessing the relevance order in which the regions of the image are ranked by the explanation methods and not really the effect that specific training examples may have on performance also i was wondering if these networks are not trained from scratch then there is no guarantee that information present in the heldout segments is actually ignored since the model may still contain features related to then also in section 4 the manuscript explicitly links observations made in the experiment conducted therein with causality given the fact that the tested model is continuously modified on that experiment none of the 30 finetuned resulting models is the same as the original model i do not see how analyzing these different models can tell us something regarding causal properties of the original model proper validation of a method is not a 
contribution per se but a requirement in place for a method to be acceptedadopted by the community there do not seem to be any novel protocols proposed in this regard if that is indeed the case this should not be claimed as a novelty text and legends present in the plots from figure 2 is not legible i had difficulties reading it in digital version of the document please ensure the text have sufficient size to be readable when printed in standard a4 paper references chen et al this looks like that deep learning for interpretable image recognition neurips 2019 while there is novelty in the proposed methodi find it quite limited moreover at this point i have several doubts regarding the quantitative evaluation ie sections 3 and 4 of the proposed method please see my review perhaps it is a problem of clarity but is something to be addressed if the manuscript is to be published as a solid piece of scientific work docsepthe paper proposes a local modelspecific posthoc explanation method for image classifiers cnn returning as explanations examples suggesting the reasons for the classification together with subparts of the images named critical regions common to the test instance and the examples and responsible for the classification the paper presents a nice advancement of the stateoftheart in xai the proposed method is partially incremental as it is based on the composition of existing approaches for the various steps and their composition happens in a rather simple way however i believe that it brings innovation to the field as it seems effective from the evaluation the weak points are listed in the following first the twinsystem that is at the basis of the proposal is not properly presented and illustrated it is impossible to fully understand the paper without reading the papers presenting the twinsystem second experiment 1 is biased because the way adopted to test the importance of the parts of the image ie by image occlusion is exactly the same used by spccrs thus this method has a clear advantage over the others the authors should adopt a validation fair for both proposals or test them into separate settings third the paper lacks for comparison with a method like the one of been kim or cynthia rudin this statement is true for experiments 1 2 but also for experiments 3 in this last case in fact i suppose that the examples are extracted with the twinsystem what about examples retrieved by the proposals of been kim or cynthia rudin also a comparison against counterfactual explanations see yash goyal would have provided a better picture finally the very interesting case study unfortunately does not bring the expected results i suppose the way the questions are proposed to the users impacted the results as the explanation provided with ccr should be better than those having only examples i suggest the authors repeat the experiments focusing only on images correctly classified minor issues i suggest avoiding using k for both the number of classes and the k of the nn algorithm the number k used in the experiments is not clearly stated as well as which are the different versions of knn tested a very nice idea probably a bit incremental an experimentation not sound in some aspects
### Summary:
|
the paper proposes a method to improve explanationbyexample by identifying important parts of the image when using nearest engihbor explanationsbyexample towards this goal the notion of critical classification regions ccr is proposed the method is tested both computationally and a user study the reviewers felt that the paper had interesting ideas but overall the reviewers agreed that the paper needs more work before being ready for publication this includes improving the soundness of the empirical evaluations and clarifying the contribution of the paper
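The row above concerns explanation-by-example, where a prediction is explained by retrieving the nearest training instance in the classifier's feature space. The following is only a minimal sketch of that retrieval step — it is not the proposed CCR method, and the feature shapes, variable names, and scikit-learn usage are illustrative assumptions:

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors

def nearest_training_examples(train_feats, query_feat, k=3):
    """Indices of the k training items whose features are closest to a single query's features."""
    nn = NearestNeighbors(n_neighbors=k).fit(train_feats)
    _, idx = nn.kneighbors(query_feat.reshape(1, -1))
    return idx[0]

# Stand-in for penultimate-layer features of a CNN classifier and one test image.
rng = np.random.default_rng(1)
train_features = rng.normal(size=(1000, 128))
query = train_features[42] + 0.05 * rng.normal(size=128)
print(nearest_training_examples(train_features, query))
```

Approaches like the critical classification regions discussed in the reviews additionally highlight which image regions are responsible for such a match.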
|
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the authors develop a hierarchical clustering technique for sequential data and apply it to rl agents by optimizing a variational objective the method is tested on simple multiagent environments including a mujoco example the paper is very well written and easy to follow the methods are explained clearly and each experiment and example is well motivated all results are discussed and the figures are clear and well explained the problem is an important and impactful one and progress here would be exciting to many in the marl community however the current work suffers from the following weaknesses too much time spent on the hillclimbing problem while this problem is very intuitive it has almost no distinctly multiagent content interaction between the agents and thus makes it hard to see how this approach will handle more interesting kinds of multiagent behavior for example the authors allude to cooperate vs compete but there is no analysis of sophisticated behavior the evaluation of the method needs significantly more work to raise my score i would like to see the analysis of an existing agent not trained by the authors that demonstrates some kind of sophisticated behavior and to have the model presented here discover some kind of novel insight into what is going on it would be particularly interesting for example to analyze a complete system with its ablations to give finergrained insight into why an ablation doesnt work as well update scores raised in response to new experiments and the answers to my questions yes docsepthe paper focuses on behavior analysis of multiagent reinforcement learning the approach is to discover behaviors at the local level and joint level separately from offline data based on an informationtheoretical objective the authors did an empirical analysis on two simple domains as well as the multiagent mujoco domain the results show some interesting multiagent behaviors visualized in the learned latent space the highlevel idea of analyzing multiagent behaviors is appealing and i think it is an important direction for multiagent rl research the authors propose to learn hierarchical latent embeddings of the agents behaviors by focusing on the local and joint level separately the proposed approach is simple and easy to follow the empirical analysis is thorough by first analyzing the two levels of clusters independently and then their relationships i think one of the weaknesses of the submission is the lack of empirical analysis on more complex domains that require a structured cooperationcompetition strategy like a simulated football game it would be interesting to see how different categories of skills eg dribbling the ball look in latent space im also a little doubtful whether the informationtheoretical training objective will work properly for such domains besides the first domain of empirical analysis does not seem to be a typical multiagent environment with interactions between agents see above docsepthis paper introduces a method for discovery of behavior clusters both individualagentlevel and jointagentlevel in multiagent domains from the given offline trajectories without assumptions about the underlying learning algorithms and world models specifically the method uses a jointlevel latent behavior factor z and individuallevel latent behavior factors zi with the assumption that each agents latentconditioned policy is conditionally independent of z given zi and derives a variational lower bound to learn z and zi by maximizing the probability of the given offline trajectories
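To make the description above concrete, here is a minimal sketch of such a bilevel latent-variable model: a joint latent z and per-agent latents zi are inferred from offline trajectories, and a latent-conditioned policy reconstructs the actions, trained by minimizing a negative evidence lower bound. The GRU encoder, the Gaussian posteriors, the mean-pooling across agents, and all sizes are illustrative assumptions, not the authors' exact architecture or objective.

```python
# Hedged sketch of a bilevel trajectory VAE: joint latent z + per-agent latents z_i,
# trained with a variational lower bound on the likelihood of offline trajectories.
# All architectural choices here are assumptions made for illustration.
import torch
import torch.nn as nn

class BilevelTrajectoryVAE(nn.Module):
    def __init__(self, obs_dim, act_dim, z_dim=8, zi_dim=8, hid=128):
        super().__init__()
        step = obs_dim + act_dim
        self.agent_enc = nn.GRU(step, hid, batch_first=True)   # per-agent trajectory encoder
        self.to_zi = nn.Linear(hid, 2 * zi_dim)                 # mean / logvar of q(z_i | tau_i)
        self.to_z = nn.Linear(hid, 2 * z_dim)                   # mean / logvar of q(z | tau), pooled over agents
        self.policy = nn.Sequential(                            # decoder: pi(a_t | o_t, z, z_i)
            nn.Linear(obs_dim + z_dim + zi_dim, hid), nn.ReLU(), nn.Linear(hid, act_dim))

    @staticmethod
    def reparam(stats):
        mu, logvar = stats.chunk(2, dim=-1)
        return mu + torch.randn_like(mu) * (0.5 * logvar).exp(), mu, logvar

    def forward(self, obs, act):
        # obs: [batch, n_agents, T, obs_dim], act: [batch, n_agents, T, act_dim]
        b, n, t, _ = obs.shape
        steps = torch.cat([obs, act], dim=-1).reshape(b * n, t, -1)
        _, h = self.agent_enc(steps)                            # h: [1, b*n, hid]
        h = h.squeeze(0).reshape(b, n, -1)
        zi, zi_mu, zi_lv = self.reparam(self.to_zi(h))          # per-agent latents
        z, z_mu, z_lv = self.reparam(self.to_z(h.mean(dim=1)))  # joint latent from pooled encoding
        z_rep = z.unsqueeze(1).expand(b, n, z.shape[-1]).unsqueeze(2).expand(b, n, t, -1)
        zi_rep = zi.unsqueeze(2).expand(b, n, t, zi.shape[-1])
        pred_act = self.policy(torch.cat([obs, z_rep, zi_rep], dim=-1))
        recon = ((pred_act - act) ** 2).mean()                  # Gaussian log-likelihood up to a constant
        kl = lambda mu, lv: -0.5 * torch.mean(1 + lv - mu.pow(2) - lv.exp())
        return recon + kl(z_mu, z_lv) + kl(zi_mu, zi_lv)        # negative ELBO to minimise
```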
the authors illustrate the effectiveness of their method for enabling the coupled understanding of behaviors at the joint and local agent level detection of behavior changepoints throughout training discovery of core behavioral concepts eg those that facilitate higher returns and demonstrate the approachs scalability to a highdimensional multiagent mujoco control domain strengths good research point the agent behavior analysis is less studied compared to the agent performance enhancement as far as i know i think the proposed method which is less rewardcentric is very necessary for the multiagent community clear method and good paper writing the latentconditioned trajectory generation process that the authors use to learn multiagent behavior clusters is very clear and easy to understand sufficient experiments including both coupled analysis and independent analysis of both joint and local behavioral clusters emergent behavior analysis and the analysis of scalability to highdimensional domains and i also found that the figures are clear to understand weaknesses i do not know whether the assumption that each agents latentconditioned policy is conditionally independent of z given zi is true for real world scenarios i understand the authors try to simplify the problem formulation but i have some concerns about this assumption for example in the same example used in the paper if z encodes at a high level whether agents were cooperating or competing i think the agents policy should certainly be dependent on z no matter what meanings zi may represent since the paper focuses on behavior analysis ie the analysis of z and zi i think the authors should give a clear description of how to draw the figures of z and zi but some figures do not have such descriptions docsepthis paper studies an interesting question how to model different behaviors from an offline multiagent rl dataset the authors propose a hierarchical modeling approach and present empirical experiments over a 2d particle world navigation scenario and a mujoco halfcheetah control task postrebuttal i think the updated draft is clearly a much stronger one as a solid work i have updated my score accordingly strengths i personally like the topic of this paper and i strongly agree that the analysis of rl applications should be beyond rewards ie considering the behaviors of the learned strategies the use of a hierarchical generative model is intuitive although it isnt surprising i do feel this draft has the potential to become a much stronger paper if the authors can step further in this research direction regarding the writing the method and motivation part is clearly written but i dont think many of the arguments from this paper are well justified weakness the current content looks a bit preliminary there are two major issues 1 the technical content is not well justified 2 the experiments are a bit too toy for an analysis paper please see my concerns below 1 regarding the technical content the use of generative models in modeling agent behaviors isnt new at all lots of techniques such as ganstyle httpsarxivorgabs180206070 vae httpsarxivorgabs201001523 and diffusion models httpsdiffusionplanninggithubio have been widely adopted the authors choose a vaestyle generative modeling approach which is technically standard the interesting part appears to be the use of two latent variables ie one for joint behavior and the other for agents but i dont really get why this hierarchical design is necessary at least for the motivating example the
hill climb game a flattened vae model is clearly sufficient even in the experiment part table 2 i cannot see any critical advantages yes it is true that the bilevel model has a higher prediction accuracy but why do we really care about prediction accuracy from the motivation of the paper the paper focuses on behavior analysis and aims to discover different behavior modes which are well captured by the ictd metric i couldnt see a big difference between the two methods under the ictd metric moreover at least from the current content of the paper i think both the half cheetah and the mpe motivating example can be perfectly analyzed by the flattened model for the purpose of discovering strategy modes i do agree that the bilevel model discovers some agentwise behavior modes but so what if we are really interested in agentwise behavior analysis we can simply run another flattened vae model for each agent over its own trajectories and thats all set to sum up the technical content of the paper looks so arbitrary some further analysis and research would be required for a solid work 2 regarding the experiment section first of all the experiment section is way too toy for an analysis paper it is okay to take the hillclimb environment as a motivating example but i would expect much more complex domains the mujoco control domain is a good example but why is only the simplest scenario with just two agents considered why not consider more scenarios with many more agents the authors talk about the hideandseek project in the introduction section repeatedly so why not run the model over the hideandseek project the learned policies are released you dont even need to train the policies moreover the authors claimed in the introduction section that the proposed method is rewardagnostic and is generic to offline rl settings however the dataset construction doesnt really follow the offline rl convention the dataset is simply constructed by running marl training repeatedly note that the marl algorithms are optimizing the environment reward which implies all the behavior data are leaning towards high rewards what if some suboptimal or even random trajectories are included in the dataset can the model still work we would like to point out that including suboptimal and random trajectories is a common practice in the offline rl literature httpsarxivorgabs200407219 in general i would suggest the authors rethink the emphasis of the paper what is really the main contribution is it the hierarchical modeling technique or the behavior analysis problem itself if the paper focuses on the former one bilevel modeling the motivating example should make people understand why a bilevel model is necessary instead of a flattened model the experiments should also focus more on the bilevel vs flattened part if the authors would like to focus more on a new problem then we would expect much more experiments and analysis hopefully including novel insights or findings beyond cheetah skills there are so many papers showing different emergent motion skills over mujoco domains minor issues 1 a missing line of research i would suggest the authors also take into account the works on rl diversity which focus on learning a collection of policies covering different behavior modes these works typically use a discriminative approach to model behavior modes ie defining a distance metric or a heuristic density rather than a generative model i would like to point out that a distance metric can be perfectly used for clustering although these methods simply
optimize the policies to ensure diversity remark optimizing diverse policies is a much harder problem than simply clustering over a fixed dataset representative metrics include trajectory diversity httpsproceedingsmlrpressv139lupu21ahtml dvd httpsgithubcomjparkerholderdvdes crossentropy httpsopenreviewnetforumidhcqhrhkfn and quality diversity httpswwwfrontiersinorgarticles103389frobt201600040full i would suggest the authors include this line of research and if possible discuss why a generative modeling approach is preferred here eg why cannot we just design a quality diversity function and run a tsne analysis or just run a kernel version of gmm 2 experiments on userdesired characteristics i dont really get the motivation of this part line 239 why does userdesired necessarily mean high rewards also as i mentioned above the dataset is constructed by running marl algorithms so many of the trajectories indeed have high rewards i wouldnt be surprised that highreward concepts can be learned but what if more random data are introduced what if most data are random in addition i couldnt understand why a trainvalidation split makes any sense here why do we really care about testing accuracy the authors state that they are following 13 however i would point out that 13 is in the setting of supervised learning which is different from rl why could all the evaluation settings of sl be simply generalized to rl it is natural for supervised learning methods to focus on the loss function or prediction accuracy but why do we care about the loss in rl note that even the title of the paper claims beyond rewards should we measure something beyond prediction accuracy i would suggest the authors think a bit further to present more convincing experiments na
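For reference, the simpler discriminative baseline the review alludes to, clustering fixed per-trajectory descriptors with a Gaussian mixture and inspecting them with t-SNE, can be sketched as follows. The mean/std descriptor is an arbitrary illustrative choice and is not a claim about what the paper or the cited diversity works actually use.

```python
# Hedged sketch of a non-generative baseline: cluster hand-picked trajectory
# features with a GMM and visualize them with t-SNE. The feature choice is an
# assumption for illustration only.
import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.manifold import TSNE

def trajectory_features(trajectories):
    """trajectories: list of [T, obs_dim + act_dim] arrays -> [N, feat] matrix."""
    return np.stack([np.concatenate([traj.mean(0), traj.std(0)]) for traj in trajectories])

def cluster_behaviors(trajectories, n_modes=5, seed=0):
    feats = trajectory_features(trajectories)
    gmm = GaussianMixture(n_components=n_modes, covariance_type="full", random_state=seed)
    labels = gmm.fit_predict(feats)          # discrete behavior-mode assignments
    # t-SNE needs more trajectories than its perplexity (default 30) to be meaningful.
    embed = TSNE(n_components=2, random_state=seed).fit_transform(feats)
    return labels, embed
```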
### Summary:
|
the reviewers agreed that this paper studies an interesting problem and provides an interesting contribution to the multiagent community we urge the authors to include the added experiments and information eg suggested related work into the main text
|
[
12106,
247,
3426,
985,
342,
697,
490,
77,
569,
281,
1918,
1442,
1326,
11273,
12288,
715,
2139,
271,
28913,
36908,
789,
347,
973,
50275,
11183,
7363,
5439,
275,
2380,
281,
747,
4679,
285,
253,
9172,
281,
619,
3533,
50275,
9820,
5474,
339,
431,
248,
2929,
16633,
327,
3879,
1783,
273,
4471,
12788,
35221,
4715,
253,
2746,
310,
281,
9413,
13576,
387,
253,
1980,
1268,
285,
6036,
1268,
11794,
432,
28841,
941,
1754,
327,
271,
1491,
783,
33977,
8103,
253,
4477,
858,
271,
16774,
1783,
327,
767,
2969,
10625,
347,
973,
347,
253,
4471,
12788,
278,
10441,
16856,
5028,
253,
1543,
921,
690,
4722,
4471,
12788,
13576,
27130,
275,
253,
6311,
21624,
2317,
253,
1029,
5251,
2934,
273,
18918,
4471,
12788,
13576,
310,
23176,
285,
891,
1158,
310,
271,
1774,
3884,
323,
4471,
12788,
391,
77,
2561,
253,
4477,
12661,
281,
3037,
24498,
21624,
46234,
273,
253,
6083,
13576,
407,
13654,
327,
253,
1980,
285,
6036,
1268,
11794,
253,
4081,
2746,
310,
2969,
285,
3477,
281,
956,
253,
16774,
1783,
310,
11080,
407,
806,
18918,
253,
767,
2308,
273,
9959,
10939,
285,
840,
616,
7688,
50276,
74,
1158,
581,
273,
253,
32213,
273,
253,
19529,
310,
3480,
273,
16774,
1783,
327,
625,
2570,
10625,
326,
4419,
18872,
15850,
3118,
2930,
5700,
751,
15524,
5842,
2165,
352,
651,
320,
4722,
281,
923,
849,
1027,
9050,
273,
6936,
24088,
277,
725,
8036,
253,
4023,
1007,
751,
275,
21624,
2317,
516,
671,
247,
1652,
38342,
1880,
253,
1491,
783,
33977,
3733,
8103,
588,
789,
6283,
323,
824,
10625,
16280,
253,
806,
5028,
273,
16774,
1783,
1057,
417,
1646,
281,
320,
247,
6867,
4471,
12788,
3126,
342,
6355,
875,
6083,
923,
1840,
5474,
33032,
2520,
2929,
23970,
247,
1332,
323,
8900,
273,
3879,
9959,
1097,
2060,
12788,
5251,
285,
6036,
12788,
5251,
275,
4471,
12788,
10625,
342,
253,
1677,
28841,
18974,
1293,
9376,
670,
253,
6944,
4715,
11333,
285,
1533,
3210,
5742,
253,
1332,
4648,
6036,
5251,
21624,
3879,
2803,
1182,
285,
2060,
5251,
21624,
3879,
2803,
1182,
74,
342,
253,
9376,
326,
1016,
6083,
21624,
44321,
3646,
310,
1617,
595,
17777,
1182,
1677,
1182,
74,
285,
4446,
247,
39762,
2406,
3033,
281,
3037,
1182,
285,
1182,
74,
407,
46875,
253,
20552,
273,
253,
1677,
28841,
18974,
20552,
253,
4477,
17093,
253,
12510,
273,
616,
1332,
323,
17690,
253,
9904,
4685,
273,
13576,
387,
253,
6036,
285,
1980,
5570,
1268,
5481,
273,
3879,
1818,
10801,
4768,
3733,
8900,
273,
5161,
14613,
12342,
24088,
1110,
326,
12454,
2169,
6548,
285,
7568,
253,
2746,
84,
9171,
1430,
281,
247,
1029,
6967,
4471,
12788,
278,
10441,
16856,
1453,
5028,
50276,
296,
3755,
20556,
50276,
12311,
2561,
1127,
253,
5570,
3879,
1783,
310,
1679,
5421,
2429,
281,
253,
5570,
3045,
14314,
347,
2080,
347,
891,
17022,
88,
891,
1158,
253,
4081,
1332,
534,
310,
1679,
10921,
37382,
310,
1077,
3309,
323,
253,
4471,
12788,
3114,
50276,
8250,
1332,
285,
1175,
2929,
1488,
2835,
253,
21624,
44321,
18974,
5978,
1232,
326,
253,
4477,
897,
281,
3037,
4471,
12788,
3879,
9959,
310,
1077,
2590,
285,
3477,
281,
2096,
50275,
31031,
4679,
1690,
1097,
9904,
1783,
285,
801,
554,
264,
290,
1783,
273,
1097,
6036,
285,
1980,
14613,
9959,
1783,
47006,
3879,
1783,
285,
253,
1783,
273,
9171,
1430,
281,
1029,
6967,
10625,
285,
891,
671,
1119,
326,
253,
8442,
310,
2590,
281,
2096,
50276,
20881,
1255,
265,
50276,
74,
513,
417,
871,
1880,
253,
9376,
326,
1016,
6083,
21624,
44321,
3646,
310,
1617,
595,
17777,
1182,
1677,
1182,
74,
310,
2032,
323,
1524,
1533,
15216,
891,
2096,
253,
4477,
1611,
281,
25636,
253,
1895,
15895,
533,
891,
452,
690,
7350,
323,
436,
9376,
323,
1650,
253,
1072,
1650,
908,
275,
253,
2929,
604,
1182,
31360,
387,
247,
1029,
1268,
1880,
6083,
497,
10239,
839,
390,
11771,
891,
1158,
253,
6083,
3646,
943,
5604,
320,
7976,
327,
1182,
642,
2647,
752,
30460,
1182,
74,
778,
1957,
50275,
17480,
253,
2929,
2770,
327,
3879,
1783,
26332,
253,
1783,
273,
1182,
285,
1182,
74,
891,
1158,
253,
4477,
943,
1918,
2590,
5740,
670,
849,
281,
3812,
253,
8442,
273,
1182,
285,
1182,
74,
533,
690,
8442,
513,
417,
452,
824,
20121,
50276,
7152,
33032,
2520,
2929,
2175,
271,
4722,
1953,
849,
281,
1566,
1027,
13576,
432,
271,
28841,
4471,
12788,
391,
77,
10895,
253,
4477,
12661,
247,
24498,
14053,
2746,
285,
1246,
16774,
4679,
689,
247,
374,
69,
8091,
1533,
15034,
10076,
285,
247,
278,
10441,
16856,
2716,
1962,
292,
1240,
1453,
4836,
50273,
5996,
250,
2858,
22559,
50276,
74,
1158,
253,
9300,
7482,
310,
4518,
247,
1199,
10046,
581,
347,
247,
4891,
789,
891,
452,
9300,
619,
4868,
15672,
50276,
296,
3755,
20556,
891,
11697,
751,
253,
9400,
273,
436,
2929,
285,
891,
7052,
5194,
326,
253,
1783,
273,
391,
77,
4893,
943,
320,
4457,
23267,
26332,
7296,
253,
13576,
273,
253,
6311,
8130,
253,
897,
273,
247,
24498,
1006,
800,
1566,
310,
27350,
3738,
352,
310,
2649,
10084,
891,
513,
1928,
436,
7482,
556,
253,
2442,
281,
2489,
247,
1199,
10046,
2929,
604,
253,
4477,
476,
3213,
2007,
275,
436,
2561,
3884,
50276,
1747,
13218,
253,
4028,
253,
1332,
285,
16038,
629,
310,
4518,
3542,
533,
891,
13414,
1158,
1142,
273,
253,
7125,
432,
436,
2929,
403,
973,
17285,
50274,
20881,
1255,
253,
1655,
2600,
4453,
247,
2372,
12611,
627,
403,
767,
2201,
3374,
337,
253,
7681,
2600,
310,
417,
973,
17285,
374,
253,
4679,
403,
247,
2372,
1512,
20953,
323,
271,
1783,
2929,
4496,
923,
619,
7350,
2708,
50276,
18,
5001,
253,
7681,
2600,
253,
897,
273,
253,
1006,
800,
1566,
275,
14053,
5570,
13576,
310,
2649,
747,
387,
512,
8783,
273,
5609,
824,
347,
36827,
4826,
3614,
39962,
2061,
5375,
1093,
9992,
1549,
1967,
362,
3348,
3614,
39962,
2061,
5375,
1252,
22140,
1508,
285,
12393,
1566,
3614,
13437,
2035,
446,
7526,
7280,
900,
452,
644,
7561,
8671,
253,
4477,
5206,
247,
13460,
12463,
1006,
800,
14053,
2746,
534,
310,
22335,
2629,
253,
4722,
629,
4620,
281,
320,
253,
897,
273,
767,
21624,
4903,
26332,
581,
323,
6036,
3879,
285,
253,
643,
323,
6083,
533,
891,
13414,
1663,
755,
2139,
436,
24498,
2216,
310,
3309,
387,
1878,
323,
253,
15265,
839,
1650,
253,
13599,
12394,
2165,
247,
42394,
362,
3348,
1566,
310,
4518,
4209,
1014,
275,
253,
3368,
629,
2829,
374,
891,
2550,
923,
667,
4619,
11361,
50276,
9820,
352,
310,
2032,
326,
253,
26413,
652,
1566,
556,
247,
2169,
10554,
7200,
533,
2139,
513,
359,
1663,
1557,
670,
10554,
7200,
432,
253,
16038,
273,
253,
2929,
253,
2929,
16633,
327,
3879,
1783,
285,
13698,
281,
9413,
1027,
3879,
10006,
534,
403,
973,
10848,
407,
253,
209,
882,
69,
7982,
891,
812,
2649,
923,
247,
1943,
3064,
875,
253,
767,
3082,
762,
253,
209,
882,
69,
7982,
25761,
387,
1878,
432,
253,
1655,
2600,
273,
253,
2929,
891,
1158,
1097,
253,
2716,
1161,
292,
1240,
285,
253,
278,
365,
15265,
839,
1650,
476,
320,
9670,
5867,
407,
253,
42394,
1566,
323,
253,
4096,
273,
30375,
5700,
10006,
891,
513,
5194,
326,
253,
26413,
652,
1566,
41217,
690,
5570,
3020,
3879,
10006,
533,
594,
752,
604,
359,
403,
1663,
6110,
275,
5570,
3020,
3879,
1783,
359,
476,
3365,
1408,
1529,
42394,
362,
3348,
1566,
323,
1016,
5570,
689,
697,
1211,
24102,
285,
28763,
512,
873,
281,
2020,
598,
253,
7681,
2600,
273,
253,
2929,
4453,
594,
10341,
690,
2007,
1783,
285,
2561,
651,
320,
2424,
323,
247,
4891,
789,
50276,
19,
5001,
253,
3368,
2593,
806,
273,
512,
253,
3368,
2593,
310,
1039,
1512,
20953,
323,
271,
1783,
2929,
352,
310,
8261,
281,
1379,
253,
13599,
498,
6785,
3126,
347,
247,
15265,
839,
1650,
533,
891,
651,
1902,
1199,
625,
2570,
10625,
253,
5497,
75,
16856,
1453,
5028,
310,
247,
1175,
1650,
533,
2139,
760,
253,
22325,
10076,
342,
816,
767,
6083,
310,
2783,
2139,
417,
1908,
625,
15216,
342,
1199,
625,
6083,
253,
4477,
2312,
670,
253,
10507,
395,
32179,
2199,
275,
253,
10199,
2593,
12889,
594,
2139,
417,
1408,
253,
1566,
689,
253,
10507,
395,
32179,
2199,
253,
6311,
7823,
403,
4439,
368,
13414,
1014,
878,
281,
6194,
253,
7823,
25761,
253,
2488,
7558,
275,
253,
10199,
2593,
326,
253,
4081,
1332,
310,
10921,
1530,
6932,
285,
310,
12314,
281,
28841,
391,
77,
7533,
2299,
253,
10895,
5140,
36908,
1663,
956,
253,
28841,
391,
77,
5008,
253,
10895,
310,
3365,
8818,
407,
3515,
2304,
77,
3733,
12889,
3877,
326,
253,
2304,
77,
11333,
403,
39793,
253,
3126,
10921,
534,
8018,
512,
253,
3879,
941,
403,
25661,
4404,
1029,
23267,
752,
604,
690,
749,
29776,
390,
1014,
3632,
24102,
403,
2908,
275,
253,
10895,
476,
253,
1566,
1335,
789,
359,
651,
751,
281,
1127,
562,
326,
1690,
749,
29776,
285,
3632,
24102,
310,
247,
1846,
3946,
275,
28841,
391,
77,
6239,
3614,
39962,
2061,
5375,
1518,
1449,
3547,
746,
50275,
249,
2087,
891,
651,
1804,
253,
4477,
294,
18959,
253,
15075,
273,
253,
2929,
752,
310,
1663,
253,
2022,
7680,
310,
352,
253,
24498,
14053,
5853,
390,
253,
3879,
1783,
1895,
3139,
604,
253,
2929,
16633,
327,
253,
3438,
581,
26413,
652,
14053,
253,
15265,
839,
1650,
943,
1056,
952,
2096,
2139,
247,
26413,
652,
1566,
310,
3309,
3185,
273,
247,
42394,
1566,
253,
4679,
943,
671,
2770,
625,
327,
253,
26413,
652,
4632,
892,
21562,
629,
604,
253,
4477,
651,
751,
281,
2770,
625,
327,
247,
747,
1895,
840,
359,
651,
1902,
1199,
625,
4679,
285,
1783,
5184,
2920,
1690,
4460,
16039,
390,
4342,
4457,
1161,
682,
73,
6936,
627,
403,
594,
1142,
9380,
4645,
1027,
47006,
3200,
6936,
689,
278,
10441,
16856,
10625,
50274,
37585,
3374,
337,
247,
5816,
1386,
273,
2561,
891,
651,
1804,
253,
2488,
671,
1379,
715,
2395,
253,
2987,
327,
391,
77,
9991,
534,
16633,
327,
4715,
247,
4849,
273,
7823,
10985,
1027,
3879,
10006,
841,
2987,
5431,
897,
247,
20741,
800,
2746,
281,
1566,
3879,
10006,
26332,
13947,
247,
4181,
7982,
390,
247,
47641,
4038,
2581,
685,
247,
1006,
800,
1566,
891,
651,
751,
281,
1127,
562,
326,
247,
4181,
7982,
476,
320,
9670,
908,
323,
17524,
3738,
841,
3082,
3365,
22318,
253,
7823,
281,
5416,
9991,
7579,
39793,
11117,
7823,
310,
247,
1199,
12150,
1895,
685,
3365,
17524,
689,
247,
4229,
10895,
8612,
17082,
2486,
13310,
8885,
90,
9991,
3614,
856,
22868,
1686,
83,
7100,
87,
15270,
77,
484,
86,
1797,
66,
2974,
277,
19122,
3614,
7280,
681,
23731,
26599,
11375,
27088,
3229,
2831,
290,
10144,
3614,
5758,
15337,
3024,
39061,
301,
31728,
82,
6285,
44734,
4174,
285,
3290,
9991,
3614,
2700,
6342,
4670,
249,
2061,
13137,
740,
1610,
2511,
35255,
2612,
6961,
933,
1449,
11546,
891,
651,
1804,
253,
4477,
2486,
436,
1386,
273,
2561,
285,
604,
1896,
2319,
2139,
247,
1006,
800,
14053,
2746,
310,
9013,
1060,
24088,
2139,
2550,
359,
816,
2216,
247,
3290,
9991,
1159,
285,
1408,
247,
28669,
570,
1783,
390,
816,
1408,
247,
10295,
4149,
273,
305,
2188,
374,
4679,
327,
2608,
3229,
1250,
5319,
891,
13414,
1663,
755,
253,
16038,
273,
436,
629,
1386,
27862,
2139,
1057,
2608,
3229,
1250,
7933,
1599,
1029,
23267,
671,
347,
891,
5393,
1840,
253,
10895,
310,
8818,
407,
3515,
2304,
77,
11333,
594,
1142,
273,
253,
24102,
403,
6296,
1907,
1029,
23267,
891,
651,
2649,
320,
9861,
326,
1029,
250,
1034,
12342,
476,
320,
6311,
533,
752,
604,
625,
3632,
941,
403,
5611,
752,
604,
954,
941,
403,
3632,
275,
1635,
891,
812,
2649,
2096,
2139,
247,
6194,
29599,
8085,
2789,
667,
3282,
1060,
2139,
513,
359,
1663,
1557,
670,
5175,
7200,
253,
4477,
1375,
326,
597,
403,
1563,
2145,
2299,
891,
651,
1127,
562,
326,
2145,
310,
275,
253,
4758,
273,
22296,
4715,
534,
310,
1027,
432,
391,
77,
2139,
812,
512,
253,
7103,
7533,
273,
1499,
320,
3365,
14923,
281,
391,
77,
352,
310,
3626,
323,
22296,
4715,
3082,
281,
2770,
327,
2957,
1159,
12787,
2474,
7200,
533,
2139,
513,
359,
1557,
670,
253,
2957,
275,
391,
77,
3877,
326,
1014,
253,
4060,
273,
253,
2929,
3916,
4457,
23267,
943,
359,
2557,
1633,
4457,
10554,
7200,
891,
651,
1804,
253,
2488,
1158,
247,
2372,
2007,
281,
1246,
625,
21414,
4679,
50276,
2072,
2490,
187,
4118,
18435,
27,
783,
30628,
5821,
436,
2929,
2175,
271,
4722,
1895,
285,
2085,
271,
4722,
7680,
281,
253,
4471,
12788,
3114,
359,
21434,
253,
4477,
281,
2486,
253,
2879,
4679,
285,
1491,
24088,
5125,
2905,
789,
715,
253,
2022,
2505,
209
] |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the paper provides a theoretical analysis on the ogd based continual learning method the method is in fact proposed by a previous paper farajtabar et al 2019 and the current paper shows a generalization bound for the regression case the result thm 3 compares the generalization bounds between sgd and ogd and shows ogd leads to a tighter bound the theorem is also based on the bound on the rademacher complexity lemma 1 the paper also suggests ogd which stores some data points from past tasks they also present some experimental results on small benchmark datasets and show ogd outperforms sgd and ogd while the paper makes an interesting attempt on theoretical analyses of ogd based continual learning method i feel the result is quite limited only to the ogd scheme also the result is for regression as shown in the loss function in sec 32 but the experiments are on classification so its not clear with the connection with the theory and the experiments the results also seem to be somewhat simple derivations from the known papers like jacot et al 2018 and liu et al 2019 the experimental results are also very limited and weak since it only compares with sgd an obvious weak scheme that suffers from catastrophic forgetting and does not compare with any other continual learning baselines for example the stateoftheart on cifar100 is around 65 and the performance of ogd is very weak even though the paper aims for a theoretical contribution it is very limited only for ogd based scheme which is not strong in practice so i am not sure about the significance of the contribution of the paper but i havent fully read the entire proof of the paper and i may have missed some details regarding the proof i would like to see other reviewers opinion as well what about comparing with more enlarged and various benchmark datasets beyond mnist and cifar100 like cub200 or omniglot as shown in httpsarxivorgpdf200313726pdf how does ogd or ogd compares with other baselines like ewc or masdocsepthe authors use a neural tangent kernel ntk approximation of wide neural nets to establish generalization bounds for continual learning cl using stochastic gradient descent sgd and orthogonal gradient descent ogd in this regime the authors prove that ogd does not suffer from catastrophic forgetting of training data the authors additionally introduce a modification to ogd which causes significant performance improvements in the rotated mnist and permuted mnist problems ogd involves storing feature maps from data points from previous tasks the modified ogd method ogd additionally stores feature maps from the current task the primary contribution of this paper is the theoretical analysis of continual learning given that the cl problem does not have an extensive theoretical foundation the generalization bound in this paper is a notable advance the theory presented also provides a justification for the empirical observations observed by the authors that as overparameterization increases the effect of catastrophic forgetting decreases in a variety of cl task setups the primary drawback of the paper is that the authors do not compare the ogd algorithm to other continual learning algorithms synaptic intelligence elastic weight consolidation etc as a result it is difficult to know how ogd compares to alternatives it is not clear to the reviewer why improving ogd to ogd is itself a contribution given the expense occurred by ogdtype methods in storing ever increasing numbers of directions it would be important to know the comparison of this method 
with others minor comments 1 section 32 f is not defined as of this point in the paper 2 theorem 1 the theorem needs a quantifier of lambda 3 line above remark 1 ktau kappatau 4 theorem 2 the paper should define what is in the memory means when introducing ogd s 5 theorem 3 definition of rt has incorrect dummy index in the summation docsep summary this paper studies the theoretical aspect of a continual learning method called orthogonal gradient descent ogd in this study authors leverage neural tangent kernel and over parameterized neural networks to prove the generalization of ogd reasons for score overall i vote for rejection i like the idea of the paper to analyze an exist method from different aspect and even improving it however my major concern is about the clarity of the paper see cons below pros 1 the paper investigate an important problem in continual learning framework which is the generalization 2 this paper provides some experiments to show the effectiveness of the proposed framework cons 1 unfortunately the paper is not clear and very difficult to follow a for instance ntk is referred without explaining it well first in page 2 authors use cl for referring to continual learning but it has not been defined b there are many typos and capital letters have been used inappropriately c ft has not been defined 2 although the proposed method provides several experiments there are still many other methods and datasets that have been ignored there has been recent studies and frameworks that have outperformed ogd it woud be great if you include them in your baselines for instance sgddroput in httpsarxivorgpdf200411545pdf beats ogd 3 there are many metrics to evaluate continual learning frameworks like backward transferbwt or average accuracy over tasks i would suggest the authors to look at the defined metrics in gem gradient episodic memory httpsarxivorgabs170608840 and compute those values questions during rebuttal period please address and clarify the cons above would you please elaborate more what could be the superior performance of ogd on rotated mnist dataset wrt ogd what is the time complexity of ogd
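for readers unfamiliar with the mechanics being discussed, a minimal sketch of the kind of orthogonal projection step ogd performs is given below; the gram-schmidt routine, function names, and array shapes are illustrative assumptions of this review rather than the authors' implementation, and the sgd baseline in the paper corresponds to skipping the projection loop entirely

```python
# illustrative sketch of an ogd-style update (not the authors' code):
# the current-task gradient is projected onto the orthogonal complement
# of directions stored from earlier tasks, so the step avoids moving
# along directions that mattered for old tasks.
import numpy as np

def orthonormalize(directions, tol=1e-10):
    # gram-schmidt over the stored per-task directions
    basis = []
    for d in directions:
        v = d.astype(float).copy()
        for b in basis:
            v -= np.dot(v, b) * b
        norm = np.linalg.norm(v)
        if norm > tol:
            basis.append(v / norm)
    return basis

def ogd_step(params, grad, basis, lr=1e-2):
    # one sgd step with the gradient projected orthogonally to the memory
    g = grad.copy()
    for b in basis:
        g -= np.dot(g, b) * b
    return params - lr * g

# hypothetical usage: the memory holds flattened directions collected from
# past tasks; the ogd+ variant discussed above would also keep appending
# directions from the current task as training proceeds.
memory = [np.random.randn(1000) for _ in range(5)]
basis = orthonormalize(memory)
params = np.random.randn(1000)
grad = np.random.randn(1000)
params = ogd_step(params, grad, basis)
```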
### Summary:
the reviewers were excited by the papers theoretical contribution to continual learning since that aspect of continual learning is underdeveloped however all reviewers including the most positive reviewer during discussions expressed that the paper would benefit from revisions to improve the clarity and the thoroughness of comparisons in the paper the papers focus on ogd is not necessarily an issue for it to be of use to the community as mentioned as a negative point in one review that other reviewers disagreed with the authors are encouraged to revise this paper incorporating the reviewers suggestions
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper studies the problem of modeling dynamical systems using lowrank rnns the paper motivates a number of domains in which lowrank rnns have given insight into the computations underlying a particular task the paper proposes directly fitting lowrank rnns meaning the connectivity matrix in the rnn is restricted to be lowrank the paper demonstrates how this restriction allows one to learn the correct effective connectivity for a number of synthetic tasks and also applies the method to recordings from macaque prefontal cortex while the animals performed a contextdependent decision making task this method reveals that a rank one network is sufficient to capture the neural activity strengths 1 the paper is well written and does a good job of motivating the problem and summarizing relevant work 2 the papers focus on lowrank structure as an important inductive bias for neuroscience is not particularly new but is nonetheless important for example fig 2f is a powerful reminder of the utility of the appropriate inductive biases for systems neuroscience where we are generally only able to sample a tiny fraction of the relevant neurons participating in a given task 3 the experiments on synthetic tasks as well as the application to neural data are clear and high quality weaknesses 1 the main contribution of the paper is essentially advocating for a type of regularization lowrank structure one that has been discussed plenty in the literature for example by mastroguiseppe and ostojic which is already cited by the paper while important the overall significance and novelty does not seem that high 2 it is not clear how one generalizes the method to rnns beyond the ones described by equation 1 for example what about rnns with multiple weight matrices such as gated architectures grus or lstms these are commonly used in machine learning and neuroscience i am curious if or how the authors would apply their method to those networks 3 the paper studies tasks which can readily be solved by a lowrank vanilla rnn but the space of tasks that we are interested in understanding is much greater what happens if you apply this method to a dynamical system that is solving a more complicated task for example one that does not admit a lowrank solution it would be good to get a sense of what to expect from the method in these scenarios minor suggestions for improvement should explain the effective connectivity in the main text it is used throughout and thus seems important enough to put in the main text currently the explanation is in appendix a4 the entire pdf is low resolution for exmample text which makes it hard to read this is a nitpick but replace jeff with jtexteff that will prevent the eff from being italicized which should only be done for mathematical variables overall i am not so worried about point 1 under weaknesses but look forward to see if the authors can address 2 and 3 in the rebuttal yes docsepthis paper describes results from a method for estimating low rank rnns from observed trajectories the method itself is just gradient descent on the set of weights that specify a lowrank rnn to match observed trajectories the paper demonstrates that the method can recover the effective connectivity of known lowrank rnns does a good job describing the behavior of general rnns trained on a set of four tasks and does a reasonable job recovering neural trajectories from nhps performing one of those tasks strengths given the widespread interest in rnns in machine learning and neuroscience this method may find broad 
applications in both of those fields the lowrank description is more readily interpretable weaknesses i think the paper seems to take as a given that low rank rnns can provide a veridical description of brain dynamics and learn many tasks this is not at all well established the tasks chosen here are welldescribed by lowdimensional rnns but many tasks and natural networks appear not to have that property for instance cueva et al 2020 pnas estimate the dimensionality of neural populations during the delay period of a variety of experiments see especially their figure 7 they find a number on the order of 510 for the dimensionality over a few seconds leaving aside technical questions about whether a particular number is meaningful one can read their result as decelerating but growing without bound eg dimensionality propto log t or as saturating at some time t the first interpretation argues that a lowrank description is not appropriateor at least that rank grows to be much larger than 10 the other possibilitythat dimensionality saturates at some time tsuggests it would be impossible for the network to distinguish times t but that seems inconsistent with the data eg lewis miall 2009 trans royal soc b kind of a minor issue but submitting a set of images instead of a searchable pdf made the job of reviewing this paper more difficult than it needed to be the paper should do a better job at pointing to the computational limitations of lowrank rnns and their possible limitations in describing brain data and cognition docsepin this paper the authors use lowrank rnns for modelling neural data the connectivity matrix is approximated using a lowrank representation which is then inferred by minimising the loss between the predicted neural outputs and the target ones in taskoptimised rnn relevant parameters to are trained based on task targets the method is then applied to various cognitive tasks to validate and extract insights strengths the models used are rather simple which benefits from good interpretability the experiments considered are interesting and elucidate different aspects of the model weaknesses novelty the novelty of the paper in terms of the model is very limited the paper shows that lowrank rnns can help analyse neural data but the contributions of the work beyond these experimental evaluations and applications of the lowrank rnns are unclear minor weaknesses presentation from the main text it is unclear how kappa and v enter the analysis similarly the role of section a2 is rather unclear is it to justify the effective connectivity metric na docsepthis paper introduces lowrank rnns parameterized by the left and right eigenvectors of the connectivity matrix the authors first show on simulated fullrank vanilla rnns trained to perform some cognitive tasks that lowrank rnns can accurately extract the essential dynamical and task structure learned by the fullrank systems they thereby offer a computationally interpretable low dimensional representation of task activity the authors then test their approach on neural recordings from two monkeys performing a contextdependent decision making task again the trained lowrank rnns seem to extract the behaviorally relevant lowdimensional dynamics at the same time producing synaptic weight distributions reminiscent of those observed in the brain strengths generally speaking i think this is a powerful approach that may have important implications in neuroscience building a lowrank structure right away into the models used for inference may profoundly 
ease detection of computational mechanisms from data weaknesses see detailed questions below my major point probably is that i dont think its clear whether the model really extracts the neural dynamics and how an animal is truly solving the task or whether it merely finds a parsimonious solution to the task partly see questions above
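as a concrete picture of the fitting procedure the paper proposes, a minimal sketch of optimizing a rank-r connectivity directly against target trajectories is given below; the discrete-time rate equation, the step size, and all variable names are assumptions made for illustration here, not details taken from the paper

```python
# illustrative sketch (not the authors' code): fit the left/right connectivity
# vectors of a rank-r rnn by gradient descent on the mismatch between the
# simulated rates and target trajectories.
import torch

n_units, rank, n_steps, dt = 100, 2, 200, 0.1
m = torch.nn.Parameter(torch.randn(n_units, rank) / n_units ** 0.5)
n = torch.nn.Parameter(torch.randn(n_units, rank) / n_units ** 0.5)
opt = torch.optim.Adam([m, n], lr=1e-2)

# placeholder targets; in practice these would be trajectories from a trained
# full-rank rnn or trial-averaged neural recordings
targets = torch.randn(n_steps, n_units)

for epoch in range(200):
    J = m @ n.T / n_units                         # low-rank connectivity
    x = torch.zeros(n_units)
    traj = []
    for t in range(n_steps):
        x = x + dt * (-x + J @ torch.tanh(x))     # assumed vanilla rate dynamics
        traj.append(x)
    loss = torch.mean((torch.stack(traj) - targets) ** 2)
    opt.zero_grad()
    loss.backward()
    opt.step()
```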
### Summary:
building on recent theoretical work on the dynamics of lowrank recurrent neural networks the authors present a method called lint for learning lowrank network models directly from data as the reviewers point out from a purely technical perspective the idea is straightforward simply optimize a lowrank parameterization of an rnn similar ideas have been considered under the heading of tensorized neural networks eg 1 see also related works cited therein which considers lowrank parameterizations of weight matrices eg 2 though the technical innovation may be limited the work makes up for it with connections to recent research in theoretical neuroscience and interesting experiments the reviewers raise many important caveats and limitations eg are these tasks really reflective of the complexity of real world tasks in ml and experimental neuroscience overall though the reviewers and i think this paper offers valuable contributions i encourage the authors to revise their manuscript in light of these thorough and constructive reviews 1 novikov a podoprikhin d osokin a and vetrov dp 2015 tensorizing neural networks advances in neural information processing systems 28 2 denil m shakibi b dinh l ranzato ma and de freitas n 2013 predicting parameters in deep learning advances in neural information processing systems 26
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1
] |
[
30003,
310,
1677,
2278,
273,
247,
2561,
2929,
432,
260,
2369,
1793,
6698,
15,
7764,
3630,
247,
6010,
253,
2278,
15,
187,
4118,
8439,
27,
187,
2520,
2929,
2175,
253,
1895,
273,
14053,
18525,
2718,
970,
1698,
14714,
391,
79,
2224,
253,
2929,
15265,
684,
247,
1180,
273,
10625,
275,
534,
1698,
14714,
391,
79,
2224,
452,
1677,
12288,
715,
253,
30745,
6944,
247,
1798,
4836,
253,
2929,
29328,
3587,
13532,
1698,
14714,
391,
79,
2224,
4495,
253,
17769,
4315,
275,
253,
391,
9866,
310,
11096,
281,
320,
1698,
14714,
253,
2929,
14371,
849,
436,
12400,
4483,
581,
281,
3037,
253,
3451,
3576,
17769,
323,
247,
1180,
273,
13506,
8892,
285,
671,
10384,
253,
1332,
281,
19654,
432,
5315,
13734,
638,
4909,
267,
14031,
1223,
253,
5074,
2684,
247,
3634,
6820,
3061,
2403,
4836,
436,
1332,
12957,
326,
247,
5958,
581,
2990,
310,
4209,
281,
9232,
253,
11454,
2425,
20544,
337,
253,
2929,
310,
973,
3542,
285,
1057,
247,
1175,
2628,
273,
15265,
839,
253,
1895,
285,
10405,
3006,
4623,
789,
374,
253,
9380,
2770,
327,
1698,
14714,
2605,
347,
271,
1774,
42115,
8492,
323,
6551,
21559,
310,
417,
3782,
747,
533,
310,
23188,
1774,
323,
1650,
3036,
374,
71,
310,
247,
6422,
24388,
273,
253,
11839,
273,
253,
4569,
42115,
31306,
323,
2718,
6551,
21559,
835,
359,
403,
3839,
760,
2104,
281,
3410,
247,
10058,
6919,
273,
253,
4623,
8512,
15299,
275,
247,
1677,
4836,
495,
253,
4679,
327,
13506,
8892,
347,
973,
347,
253,
2898,
281,
11454,
941,
403,
2590,
285,
1029,
3290,
50276,
20881,
1255,
265,
337,
253,
2022,
7680,
273,
253,
2929,
310,
9093,
43243,
323,
247,
1511,
273,
37820,
1698,
14714,
2605,
581,
326,
556,
644,
5469,
9828,
275,
253,
6239,
323,
1650,
407,
20971,
287,
4297,
261,
554,
365,
285,
258,
11769,
75,
280,
534,
310,
2168,
11106,
407,
253,
2929,
1223,
1774,
253,
4583,
8453,
285,
38135,
1057,
417,
1646,
326,
1029,
374,
352,
310,
417,
2590,
849,
581,
2087,
4219,
253,
1332,
281,
391,
79,
2224,
4457,
253,
4394,
2529,
407,
5150,
337,
323,
1650,
752,
670,
391,
79,
2224,
342,
2709,
2801,
12624,
824,
347,
305,
456,
35615,
650,
316,
390,
298,
296,
983,
841,
403,
7744,
908,
275,
5145,
4715,
285,
6551,
21559,
891,
717,
14338,
604,
390,
849,
253,
4477,
651,
4647,
616,
1332,
281,
1110,
6928,
495,
253,
2929,
2175,
8892,
534,
476,
12450,
320,
14042,
407,
247,
1698,
14714,
26724,
391,
9866,
533,
253,
2317,
273,
8892,
326,
359,
403,
6110,
275,
4685,
310,
1199,
3687,
752,
6569,
604,
368,
4647,
436,
1332,
281,
247,
18525,
985,
326,
310,
16161,
247,
625,
9542,
4836,
323,
1650,
581,
326,
1057,
417,
11476,
247,
1698,
14714,
2900,
352,
651,
320,
1175,
281,
755,
247,
3282,
273,
752,
281,
1902,
432,
253,
1332,
275,
841,
15216,
50276,
37585,
13991,
323,
7756,
50276,
11425,
5513,
253,
3576,
17769,
275,
253,
2022,
2505,
352,
310,
908,
4768,
285,
3021,
3133,
1774,
2217,
281,
1691,
275,
253,
2022,
2505,
4390,
253,
8813,
310,
275,
30762,
247,
21,
50276,
783,
2862,
31697,
310,
1698,
6064,
323,
385,
78,
4636,
2505,
534,
2789,
352,
1892,
281,
1239,
50276,
2520,
310,
247,
12389,
29397,
533,
8171,
5139,
567,
342,
480,
7109,
442,
567,
326,
588,
3657,
253,
3098,
432,
1146,
36037,
280,
1025,
534,
943,
760,
320,
2218,
323,
15965,
4903,
50276,
1189,
455,
891,
717,
417,
594,
11926,
670,
1127,
337,
762,
32213,
533,
1007,
3579,
281,
923,
604,
253,
4477,
476,
2953,
374,
285,
495,
275,
253,
30080,
22559,
50276,
9820,
5474,
33032,
2520,
2929,
8631,
1543,
432,
247,
1332,
323,
26230,
1698,
5958,
391,
79,
2224,
432,
2540,
24102,
50276,
783,
1332,
3139,
310,
816,
11786,
18499,
327,
253,
873,
273,
13461,
326,
13199,
247,
1698,
14714,
391,
9866,
281,
3761,
2540,
24102,
50276,
783,
2929,
14371,
326,
253,
1332,
476,
9295,
253,
3576,
17769,
273,
1929,
1698,
14714,
391,
79,
2224,
1057,
247,
1175,
2628,
12930,
253,
3879,
273,
2087,
391,
79,
2224,
10166,
327,
247,
873,
273,
1740,
8892,
285,
1057,
247,
5272,
2628,
27930,
11454,
24102,
432,
31386,
793,
9591,
581,
273,
1110,
8892,
20544,
50276,
28821,
253,
14414,
1600,
275,
391,
79,
2224,
275,
5145,
4715,
285,
6551,
21559,
436,
1332,
778,
1089,
3862,
4893,
275,
1097,
273,
1110,
4910,
50276,
783,
1698,
14714,
5740,
310,
625,
12450,
4665,
494,
50276,
20881,
1255,
265,
50276,
74,
1158,
253,
2929,
3133,
281,
1379,
347,
247,
1677,
326,
1698,
5958,
391,
79,
2224,
476,
2085,
247,
2336,
301,
474,
5740,
273,
3998,
8062,
285,
3037,
1142,
8892,
50276,
2520,
310,
417,
387,
512,
973,
4232,
50276,
783,
8892,
6777,
1060,
403,
6210,
392,
265,
9397,
407,
1698,
6967,
391,
79,
2224,
533,
1142,
8892,
285,
3626,
6928,
3176,
417,
281,
452,
326,
2867,
50276,
1542,
4227,
30129,
6156,
1162,
355,
9169,
268,
27109,
6642,
253,
7877,
1319,
273,
11454,
7625,
1309,
253,
5778,
2180,
273,
247,
5235,
273,
4679,
923,
3340,
616,
4677,
818,
50276,
9328,
1089,
247,
1180,
327,
253,
1340,
273,
33930,
323,
253,
7877,
1319,
689,
247,
1643,
7253,
50276,
282,
3292,
9255,
7681,
3533,
670,
1880,
247,
1798,
1180,
310,
14282,
581,
476,
1239,
616,
906,
347,
1086,
15766,
839,
533,
5675,
1293,
3033,
24088,
7877,
1319,
354,
22352,
2412,
246,
390,
347,
19004,
839,
387,
690,
673,
246,
50276,
783,
806,
7914,
8219,
326,
247,
1698,
14714,
5740,
310,
417,
4569,
263,
387,
1878,
326,
5958,
17202,
281,
320,
1199,
4067,
685,
884,
50276,
783,
643,
6387,
3529,
7877,
1319,
19004,
684,
387,
690,
673,
28669,
21662,
84,
352,
651,
320,
7479,
323,
253,
2990,
281,
12129,
2069,
50276,
85,
50276,
2858,
326,
3133,
16706,
342,
253,
941,
24088,
458,
88,
261,
50276,
78,
451,
77,
4748,
811,
17292,
9267,
270,
50275,
11258,
273,
247,
5884,
2523,
533,
29315,
247,
873,
273,
3888,
3185,
273,
247,
3186,
494,
31697,
1160,
253,
2628,
273,
16725,
436,
2929,
625,
2834,
685,
352,
3058,
281,
320,
50274,
783,
2929,
943,
513,
247,
1805,
2628,
387,
13458,
281,
253,
15180,
7364,
273,
1698,
14714,
391,
79,
2224,
285,
616,
1896,
7364,
275,
12930,
3998,
941,
285,
31937,
50275,
7152,
339,
9852,
436,
2929,
253,
4477,
897,
1698,
14714,
391,
79,
2224,
323,
26278,
11454,
941,
253,
17769,
4315,
310,
34930,
970,
247,
1698,
14714,
6779,
534,
310,
840,
22245,
407,
7221,
2182,
253,
2957,
875,
253,
8131,
11454,
18012,
285,
253,
2303,
4394,
275,
4836,
32581,
1701,
391,
9866,
4623,
3602,
281,
403,
10166,
1754,
327,
4836,
8571,
253,
1332,
310,
840,
3732,
281,
2710,
9699,
8892,
281,
17813,
285,
4908,
16039,
50274,
296,
3755,
20556,
50276,
783,
3210,
908,
403,
2581,
2969,
534,
5373,
432,
1175,
4665,
1430,
50276,
783,
4679,
2783,
403,
4722,
285,
30955,
1027,
7794,
273,
253,
1566,
50276,
20881,
1255,
265,
38135,
253,
38135,
273,
253,
2929,
275,
2426,
273,
253,
1566,
310,
1077,
3710,
253,
2929,
2722,
326,
1698,
14714,
391,
79,
2224,
476,
1361,
30648,
11454,
941,
533,
253,
9021,
273,
253,
789,
4457,
841,
5661,
27163,
285,
4893,
273,
253,
1698,
14714,
391,
79,
2224,
403,
12744,
50276,
37585,
32213,
9759,
432,
253,
2022,
2505,
352,
310,
12744,
849,
465,
5596,
285,
362,
4901,
253,
1783,
12014,
253,
2554,
273,
2593,
247,
19,
310,
2581,
12744,
310,
352,
281,
15249,
253,
3576,
17769,
7982,
50276,
2072,
5474,
33032,
2520,
2929,
23970,
1698,
14714,
391,
79,
2224,
4764,
1025,
407,
253,
1669,
285,
987,
48670,
273,
253,
17769,
4315,
253,
4477,
806,
921,
327,
15524,
2120,
14714,
26724,
391,
79,
2224,
10166,
281,
1347,
690,
9699,
8892,
326,
1698,
14714,
391,
79,
2224,
476,
13613,
4908,
253,
5667,
18525,
285,
4836,
2605,
6311,
407,
253,
2120,
14714,
2718,
597,
7624,
3959,
247,
43245,
4665,
494,
1698,
15759,
6779,
273,
4836,
2425,
253,
4477,
840,
1071,
616,
2746,
327,
11454,
19654,
432,
767,
30552,
9591,
247,
3634,
6820,
3061,
2403,
4836,
969,
253,
10166,
1698,
14714,
391,
79,
2224,
1646,
281,
4908,
253,
3879,
595,
4623,
1698,
6967,
8062,
387,
253,
1072,
673,
9603,
21066,
2801,
10670,
35036,
273,
1110,
2540,
275,
253,
3998,
20544,
3839,
8288,
891,
1158,
436,
310,
247,
6422,
2746,
326,
778,
452,
1774,
12739,
275,
6551,
21559,
3652,
247,
1698,
14714,
2605,
987,
1977,
715,
253,
3210,
908,
323,
17032,
778,
38245,
11990,
5481,
273,
15180,
6297,
432,
941,
50276,
20881,
1255,
265,
923,
7000,
3533,
2708,
619,
2201,
1127,
3164,
310,
326,
891,
13414,
1158,
697,
2590,
1880,
253,
1566,
1663,
16756,
253,
11454,
8062,
285,
849,
271,
5893,
310,
7777,
16161,
253,
4836,
390,
1880,
352,
7960,
9010,
247,
13328,
15329,
784,
2900,
281,
253,
4836,
13730,
923,
3533,
1840,
2490,
187,
4118,
18435,
27,
22157,
327,
3332,
10527,
789,
327,
253,
8062,
273,
1698,
14714,
18902,
11454,
6928,
253,
4477,
1246,
247,
1332,
1925,
298,
565,
323,
4715,
1698,
14714,
2990,
3210,
3587,
432,
941,
347,
253,
30628,
1127,
562,
432,
247,
15846,
7681,
8668,
253,
2934,
310,
15246,
3365,
22318,
247,
1698,
14714,
4764,
1320,
273,
271,
391,
9866,
2074,
5697,
452,
644,
2783,
762,
253,
13590,
273,
13148,
1025,
11454,
6928,
24088,
337,
923,
671,
2905,
2987,
11106,
15308,
534,
19401,
1698,
14714,
4764,
5904,
273,
2801,
12624,
24088,
374,
50275,
2004,
253,
7681,
15832,
778,
320,
3710,
253,
789,
2789,
598,
323,
352,
342,
10291,
281,
3332,
2561,
275,
10527,
6551,
21559,
285,
4722,
4679,
253,
30628,
7164,
1142,
1774,
15985,
1832,
285,
7364,
24088,
403,
841,
8892,
1663,
29210,
273,
253,
10454,
273,
1524,
1533,
8892,
275,
13361,
285,
5661,
6551,
21559,
4583,
2167,
253,
30628,
285,
891,
1158,
436,
2929,
6131,
9865,
9021,
891,
11907,
253,
4477,
281,
49620,
616,
7714,
275,
1708,
273,
841,
11080,
285,
25799,
10123,
50275,
18,
22458,
34089,
247,
7360,
412,
16409,
23187,
277,
7684,
25980,
247,
285,
26925,
18540,
33234,
4104,
13148,
3006,
11454,
6928,
16424,
275,
11454,
1491,
5162,
2718,
3349,
50276,
19,
1850,
300,
278,
439,
518,
29294,
270,
13223,
73,
298,
6337,
91,
4611,
6429,
285,
372,
4107,
30126,
295,
4072,
21565,
3602,
275,
3676,
4715,
16424,
275,
11454,
1491,
5162,
2718,
3436
] |
Below is a given review of a research paper from a conference journal. Please write a summary of the review.
### Review:
in this paper the authors studied robust losses for learning value functions in reinforcement learning the main contribution of this paper lies in the development of two novel robust loss functions for reinforcement learning and a saddle point reformulation based on the huber and the absolute belleman error and the biconjugates overall i think this paper is interesting and is well motivated my main concern is about its novelty the authors highlighted that their main contribution is the introduction of two novel robust losses which are reformulated based on the least absolute loss and the huber loss this leads to the problem why specifically the two losses it seems to me that the comments and the analysis of this paper may also apply to other convex lipschitz losses it seems to me that it is the lipschitz constant of the loss that change the solution quality i would expect more comments in this regard in the paper in addition as a minor comment the presentation of the paper could be further improved for instance the abbreviations mhbe and mabe were used without being defined beforehand see above docsepthis paper proposed using huber bellman error to robustify the loss function in learning value function and proposed using conjugate function to avoid double sampling it also conducted experiments to justify its algorithm strengths the main contribution of this paper is using huber bellman error instead of mean squared bellman error since using the conjugate function to solve double sampling has been proposed in 1 weakness in the last paragraph of page 8 it said that it applied qrc without a target network would the performance of qrc become better and outperform qrchuber when combined with the target network this paper is novel in the sense that it cooperates huber loss with value evaluation to robustify value evaluation it also conducts experiments to justify this improvement therefore i think this paper is marginally above the acceptance threshold docsepthis paper starts with the premise that squared error minimization despite its wide use might not be the most effective option for learning value functions the authors hypothesize that this might be because of squared errors emphasis on outlier states where the bellman error is large at the expense of accuracy on other states to address this they consider the absolute value and huber errors alongside the squared error objective and propose a saddlepoint reformulation for these objectives that requires learning an auxiliary learned state function which essentially attempts to predict the residual error at each state based on this motivation starting with section 4 the authors then make a connection of the proposed robust loss framework involving a learned auxiliary state function with prior work on algorithms for improving td learning namely gtd2 and tdc empirical analysis covers both prediction tasks as well as control tasks in the prediction experiments evaluations are conducted over carefully designed synthetic problems with linear function approximation in order to highlight particular challenges for each objective these demonstrate that the huber objective can often improve the prediction accuracy over squared error over a wide range of step sizes learning rates for the empirical investigation on nonlinear control problems the authors consider a huberextension of the previously proposed qrc algorithm which itself extends the tdc update to be considered in conjunction with the dqn target definition and demonstrate that qrchuber can 
improve on baselines in certain environments while being competitive elsewhere these algorithms are also validated on a mini version of the atari domain called minatar strengths a nicely executed paper that starts from a simple premise and connects several prior ideas together namely the tdcgtd2 algorithms with the conjugate formulation of bellman error minimization and uses this to derive a more robust versions of the aforementioned algorithms empirical results investigate both a simpler setup with linear prediction experiments as well as a more end to end nonlinear control experiment to validate the practical benefits for policy optimization one of the novel conceptual contributions in this paper appears to be making a connection between the auxiliary model h used in the conjugate formulations of the bellman error with the very differently motivated secondary weights model in gtd2 and tdc algorithms where the additional model h in both cases can be viewed as attempting to predict a residual error corresponding to the primary weights of the value function request for clarifications other comments for improvement for the empirical results on nonlinear control problems figures 4 5 and table 1 it is somewhat unclear how much of a role is played by the auxiliary variable learning mechanism versus simply changing the loss from squared error to huber loss for example it would be interesting to see how a more naive baseline like semigradient update corresponding to the dqn with huber loss metric instead of dqn squared error performs alongside the other three curves in each of the environments the main algorithmic proposal is based on the closely related prior works tdc gtd2 and qrc yet this aspect is somewhat buried until fairly late in the paper for the last term in theorem 32 ptau is applied to the vector mathcalt v v but without specifying any reduction operation looking at the proof it seems to me that there might be a need to add a max operation over the states to this bound could you please confirm the combination of the multiplicative term involving a matrix inverse norm and the max operation over the states above makes this a potentially fairly weak bound in practice another case of using an auxiliary model that learns to predict the td errors has also been proposed for improving learning in actorcritic models eg characterizing the gap between ac and policy gradient wen et al icml 2021 minor comments in equation 4 the notation for state seems inconsistent between t and t1 across s s in the equation for qrchuber updates for the auxiliary variable there is some inconsistent notation across the references thetah thetaht thetah t1 also the variable beta in the secondary variable update needs a definition or reference of some sort this is a nice paper that makes a novel connection between the secondary variable update in gtd2tdc with the conjugate formulation of bellman errors involving an auxiliary state function both of which involve predicting the residual error with a separate model while the technical contributions in section 3 eg theorem 32 are not particularly significant i believe the main value in the paper is the conceptual linking of two different lines of work to derive an improved algorithm over well motivated baselines the empirical evaluation is well motivated and quite thorough even if only for a limited set of benchmarks docsepthe paper proposes a mean huber value error in the tdlearning and the paper demonstrates the robustness under such loss the robustness is important in 
rl learning the mhbe defined in this paper is intuitive and the authors also develop a reformulation to solve it in practice the major contribution is to develop a new type of loss which is not sufficient according to the iclr standards my main concern is that the contribution is not sufficient i think the main contribution of this paper is the introduction of a robust loss the conjugate reformulation is standard and the bound developed in theorem 32 is not surprising the writing is also not clear for example mhbe and mabe are never defined
### Summary:
the paper proposes to use the huber and absolute loss for value function estimation in reinforcement learning and optimizes them by leveraging a recent primaldual formulation by dai et al this is a controversial paper on one hand it is a well motivated idea to apply a robust loss in rl the paper implemented the idea well by leveraging the saddle point formulation and empirically demonstrates its advantages in practice on the other hand the technical novelty of this paper is limited the idea of the huber loss and the standard conjugate formulation are straightforward applications of existing techniques despite being well motivated the authors seem to think that there has been no application of huber loss in rl but existing implementations of rl already use huber loss for example in the openai baselines (https://openai.com/blog/openai-baselines-dqn/) they said the following double check your interpretations of papers in the dqn nature paper the authors write we also found it helpful to clip the error term from the update to be between -1 and 1 there are two ways to interpret this statement clip the objective or clip the multiplicative term when computing the gradient the former seems more natural but it causes the gradient to be zero on transitions with high error which leads to suboptimal performance as found in one dqn implementation the latter is correct and has a simple mathematical interpretation huber loss you can spot bugs like these by checking that the gradients appear as you expect this can be easily done within tensorflow by using compute_gradients the authors discussed the first approach above in the rebuttal but i am not sure if the authors have considered the second method if not it would be worthwhile to discuss and compare with it see also agarwal et al an optimistic perspective on offline reinforcement learning and dabney et al distributional reinforcement learning with quantile regression on the other hand i have not seen the application of the saddle point approach via the primaldual method of dai to the huber loss specifically it seems that the proposed algorithm is in the end equivalent to msbeprimaldual h with softmax output if it is that simple i think it would help the readers to explicitly point this out upfront in the beginning which is an interesting conceptual connection because the primaldual approach needs to approximate h with a neural network the difference of the two methods is vague in the primaldual space a side remark when we say an objective for which we can obtain unbiased sample gradients i think that the gradient estimator of the augmented lagrangian is unbiased while the gradient estimates of mhbe and mabe are still biased overall it is a paper with a well motivated and valuable contribution but limited in terms of technical depth and novelty
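to make the clipping versus huber point quoted above concrete, here is a minimal numpy sketch (an illustration added for clarity, not code from the reviewed paper; the function names and the kappa threshold of 1 are assumptions matching the dqn convention) showing that the gradient of the huber loss with respect to the td error is exactly that error clipped to [-kappa, kappa], whereas clipping the loss value itself would zero the gradient on large-error transitions

```python
import numpy as np

def huber_loss(td_error, kappa=1.0):
    # quadratic for |error| <= kappa, linear beyond it
    abs_err = np.abs(td_error)
    return np.where(abs_err <= kappa,
                    0.5 * td_error ** 2,
                    kappa * (abs_err - 0.5 * kappa))

def huber_grad(td_error, kappa=1.0):
    # derivative of huber_loss w.r.t. the td error:
    # the error term clipped to [-kappa, kappa], never zeroed out
    return np.clip(td_error, -kappa, kappa)

td_errors = np.array([-3.0, -0.4, 0.2, 5.0])
print(huber_loss(td_errors))  # [2.5  0.08 0.02 4.5 ]
print(huber_grad(td_errors))  # [-1.  -0.4  0.2  1. ]
```

with a plain squared error the gradient would instead be the raw td error itself, which is what makes the mean squared bellman error sensitive to outlier transitions, the concern raised repeatedly in the reviews above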
Below is a given review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this manuscript provides a unified asymptotic convergence analysis framework for several centralized stochastic optimization methods such as sgd random reshuffling proximal sgd and stochastic modelbased methods by introducing two sets of general conditions on the abstract convergence measure as well as the sequence generated by stochastic optimization algorithms they derive both the expected and almostsure convergence to the stationary points respectively the authors then apply this result to the abovementioned algorithms to obtain asymptotic convergence guarantees in the expected and almostsure senses which either recover existing convergence results possibly under weaker assumptions or generate new results originality the main novelty of this paper relies on the generality of the proposed unified convergence analysis based on the abstract stationarity measure phi which seems new to the reviewer the major comments are summarized as follows strengths by introducing general conditions on the abstract stationarity measure phi their theoretical analysis enjoys high flexibility to recover or complete the expected and almostsure asymptotic convergence results of existing stochastic optimization methods under several problem settings thus is potential to simplify algorithm design and analysis this paper is well written and easy to follow the convergence analysis for sgd rr proxsgd and smm methods is clear and reveals some insights among these methods weaknesses on technical novelty since the considered stochastic optimization methods in this work have been well studied and the proof techniques used in this paper are standard thus the technical contribution is limited to some extent on the obtained rate results for nonconvex problems the convergence to a stationary point of the objective function is not very informative since the result cannot declare any optimality from this understanding the obtained asymptotic convergence results in this work are even weaker than the existing iteration complexity results which further shows the nonasymptotic convergence rates under certain measures the generality of the proposed framework is not very clear to the reviewer the authors are thus suggested to make clear the scope of stochastic optimization algorithms that can be incorporated into this unified convergence analysis docsepthis paper derives general convergence results for stochastic optimization methods the results include the cases where the cost function is nonconvex and nonsmooth strength the approach is general and can be applicable to several settings weakness the results are only asymptotic and about the gradient norm as such not global optimization results not applicable docsepthis work provides a unified theorem for analyzing several stochastic optimization methods both expected and almost sure convergence are derived for applications the authors recover the convergence results for stochastic gradient descent sgd and random reshuffling rr in addition this paper also obtains the convergence result for the stochastic proximal gradient method proxsgd by using the proposed framework this paper is well written and easy to follow in general the structure of the paper is good and provides some new insights into the convergence results for the stochastic optimization methods the core theorem 21 is very important and its proof makes sense to me furthermore the proxsgd is analyzed via this framework to obtain some convergence results the main weakness is that the proposed method only provides an 
asymptotic convergence result which is less informative compared to the finite-time type of bounds in addition the function value convergence might also be of interest yes docsepthe authors analyze the convergence of sgdlike methods such that the norm of the gradient in the last iterate converges to zero in expectation and almost surely they provide a general framework to analyze the convergence of sgdlike methods including sgd random reshuffling proxsgd and modelbased methods using this framework they show the convergence of these methods i the authors prove the main theorem 21 which generalizes results from eg https://parameterfree.com/2020/10/05/almost-sure-convergence-of-sgd-on-smooth-non-convex-functions/ or gradient convergence in gradient methods with errors bertsekas et al strengths the generalization can be useful and give insights to the community weaknesses yes it can help to analyze a broad family of methods but i think that the paper doesnt have enough examples to be sure that theorem 21 is the unified convergence theorem ii as the authors noted the lastiterate convergence of sgd and rr was analyzed before so qualitatively the papers contribution is the analysis of proxsgd and smm strengths even the fact that it is possible to prove the convergence of proxsgd is interesting weaknesses to prove the convergence the authors use assumption c3 that $\varphi$ is l-lipschitz for instance one of the most popular regularizers $\varphi(x) = \frac{1}{2}\|x\|_2^2$ is not l-lipschitz i think the paper is good it provides proof that proxsgd and smm converge at the same time im not sure that theorem 21 can be considered the unified convergence theorem but it can be helpful for the community na
### Summary:
The authors provide a blanket convergence analysis for several stochastic optimization methods. The techniques are interesting and will be useful. The authors may want to be a bit more careful about the details of some of their convergence results when they make comparisons. For instance, the main difference between [3] and [26] is in the noise assumptions in [26], which allow the use of more aggressive step-size policies; otherwise, the difference in assumptions that the paper alludes to is reflected in the fact that [26] obtains a stronger convergence result, to a component of critical points, whereas [3] leaves open the possibility that the process escapes to infinity (the assumptions in [26] rule out this behavior). The authors also miss the recent work which provides a tighter general characterization: Y.-P. Hsieh, P. Mertikopoulos, and V. Cevher, "The limits of min-max optimization algorithms: convergence to spurious non-critical sets", ICML 2021, Proceedings of the 38th International Conference on Machine Learning, 2021.
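For context on the step-size policies mentioned above: asymptotic analyses of SGD-type methods classically constrain the step sizes α_t by the Robbins-Monro conditions below, and stronger noise assumptions can relax how fast α_t must decay. This is general background stated as an assumption on my part; it is not a restatement of the exact assumptions in [3] or [26].

```latex
\sum_{t=0}^{\infty} \alpha_t = \infty,
\qquad
\sum_{t=0}^{\infty} \alpha_t^{2} < \infty,
\qquad \text{e.g. } \alpha_t = \frac{\alpha_0}{(t+1)^{p}} \text{ with } p \in \left(\tfrac{1}{2},\, 1\right].
```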
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
This paper proposes a black-box attack where, by relying on segmentation priors, the perturbation is applied only in the salient region. This allows one to obtain reduced perceptibility with a limited number of queries and a small reduction in success rate. More specifically, once the salient region has been identified, a refining procedure is carried out to find small areas where the perturbation should be added. Experiments are performed on ImageNet, and results are compared with those of some SOTA methods and their variants that work on saliency regions. This work tackles an important problem, that is, developing an imperceptible black-box attack. However, there are two main issues, concerning the technical approach and the experimental analysis. From a technical point of view, the search algorithm devised to refine the perturbation is exceedingly naive. It is based on an iterative procedure that resembles quadtree analysis in image processing: the first split of the salient region generates four initial blocks, and a perturbation is added to them to find those that maximize the loss function; after finding the best block, it is further split, and the procedure is repeated until the block is reduced to one pixel or no smaller blocks can achieve a better loss; then this same procedure is applied again to the second-best block, and so on. As it is described, this procedure has neither a theoretical nor an experimental justification. In addition, a similar idea, applied in a different context, can be found in Guo et al. (2020), and saliency maps for adding adversarial perturbations are also explored in Sun et al. (2021). Overall, I feel that the contribution is not significant in terms of technical approach. The approach is not compared with other black-box attacks that also have the objective of making the perturbation imperceptible (see references below); in fact, experiments are carried out by modifying some SOTA approaches to include saliency regions and comparing them with the proposed solution (see Table 2). In my opinion, this is too limited to validate the proposal.

Typos: "imperceptiblity", "tunning".

References:
- Guo et al., "Watch out! Motion is blurring the vision of your deep neural networks", NeurIPS 2020.
- Sun et al., "Generating facial expression adversarial examples based on saliency map", Image and Vision Computing, 2021.
- Wang et al., "Perception improvement for free: exploring imperceptible black-box adversarial attacks on image classification", arXiv, 2020.
- Liu et al., "GreedyFool: multi-factor imperceptibility and its application to designing black-box adversarial example attacks", arXiv, 2020.
- Croce and Hein, "Sparse and imperceivable adversarial attacks", ICCV 2019.
- Gragnaniello et al., "Perceptual quality-preserving black-box attack against deep learning image classifiers", Pattern Recognition Letters, 2021.
- Li and Chen, "Toward visual distortion in black-box attacks", IEEE Transactions on Image Processing, 2021.

Post-rebuttal comments: I appreciate that the authors better clarified their contribution and included a new comparison in the experimental section; hence I increase my score. However, I still believe that the paper needs many more comparisons with the state of the art and that the proposal should be better justified from a theoretical point of view. In my opinion, the technical novelty introduced in this paper is too limited, and its experimental validation should also be improved by considering more relevant methods for comparison.

The paper studies how to reduce the perceptibility of the perturbations to the original images produced by black-box adversarial attacks for the ℓ∞ threat model. In particular, it proposes to use a prior based on segmentation techniques to localize the changes on the subject of the image and leave the background unaltered. Moreover, the saliency attack is introduced to further reduce the fraction of the original image which is modified to induce misclassification. Strengths: the proposed method is simple and achieves the goal of improving the imperceptibility of the perturbations; using the segmentation prior is effective for avoiding changes on the background, and the saliency attack attains a better MAD score. Several ablation studies are presented to illustrate the proposed method. Weaknesses: the paper focuses on improving the imperceptibility of ℓ∞ attacks with a fixed budget ε = 0.05, in practice by reducing the number of pixels perturbed, a sort of minimization of the ℓ0 norm. However, I think that how visible the perturbations are is more a property of the threat model (the set of feasible changes) than of the attack used; for example, using a smaller ε for the same attacks would increase their imperceptibility. Moreover, other ℓp norms (e.g., Square Attack has versions for ℓ2 and ℓ1 [a], including ℓ0) or perceptual metrics like LPIPS [b] should be considered, since they produce more localized changes than ℓ∞ attacks. There are a few works which aim at finding which areas of an image the attacker should perturb to obtain invisible changes, both with black- and white-box attacks [c, d]; in particular, [c] shows that small perturbations in the ℓ∞ norm are not necessary, if limited to certain areas of the image, to preserve imperceptibility. The presentation of the method in Algorithms 1 and 2 seems a bit confusing: first, δ is not defined; second, the refine function is called recursively on smaller blocks but also repeatedly with smaller k in Algorithm 2; is this correct? Also, in line 9, k is halved: shouldn't it be restarted from the original value when going over the next iteration of the loop in L4? From the images in Figure 4, it seems that for Parsimonious and Square Attack, perturbations are sampled also on areas which are outside the mask given by the segmentation prior; are those candidates evaluated, spending queries from the total budget, even though they cannot change the current perturbation?

[a] https://arxiv.org/abs/2103.01208
[b] https://arxiv.org/abs/2006.12655
[c] https://arxiv.org/abs/1909.05040
[d] https://arxiv.org/abs/2010.13773

Overall, I think the proposed method is effective, but, especially given that the technical novelty is limited, it should be better positioned. If the main goal is imperceptibility of the perturbations, it should be compared to other kinds of attacks beyond ℓ∞ ones; otherwise, the authors should better motivate why improving ℓ∞ attacks with such a fixed threshold is relevant.

There has been a lot of interest in improving the query efficiency of black-box attacks in the recent past; however, these techniques produce examples that a human in the loop can quickly identify. The authors propose using segmentation priors to improve black-box attacks so that the perturbations are restricted to the salient regions of the image. In addition, they also present a technique that improves the imperceptibility without forgoing the query efficiency. The paper proposes a few optimisations that can improve the imperceptibility of generated adversarial perturbations, and demonstrates that some of the existing black-box adversarial attacks can benefit from these optimisations. The authors also propose a search algorithm that can further narrow down the candidate regions for adversarial perturbations, and demonstrate that adversarial examples generated by their technique have a higher success rate at evading detection mechanisms. In the recent past, a richer class of gradient-free black-box techniques has been proposed in the literature; why does the paper consider only two of those? For instance, why not consider techniques based on Bayesian optimisation? The same applies to methods for detecting adversarial examples: why was feature squeezing chosen as a baseline to evaluate the success rate? In addition to the above, the authors report the results only over a sample of 1000 images from ImageNet; why was only such a small sample considered? I would also encourage the authors to evaluate over multiple samples and report the variance too. In general, I found the paper easy to follow, and I find the direction explored by the paper to be quite promising, but the experiments and the baselines chosen by the authors are lacking. I would encourage the authors to include a diverse set of baselines; for instance, I would also demonstrate how the optimization proposed in the paper improves gradient-free black-box adversarial example generation techniques other than the chosen few, and evaluate the evasion rate of adversarial examples generated by the proposed technique against a diverse set of detection mechanisms.

This paper proposes a method to generate imperceptible attacks in the black-box attack scenario by generating local perturbation blocks in salient regions. It uses salient object segmentation to obtain the salient region, then applies a tree-search method to find the smallest blocks within the salient region that can cause the maximal change in the predicted class logits. Experiments on 1000 ImageNet examples are conducted and compared to several existing baselines, showing that the proposed method can achieve more imperceptible attacks, where imperceptibility is measured by the MAD metric. The topic of imperceptible black-box attacks is very interesting and not well studied, and the idea of generating perturbing blocks in salient regions is interesting. However, this paper can be improved with more sufficient justification of its motivations and experiments; moreover, its technical contribution is not very high. The motivation for generating the perturbed regions in the salient object region is not sufficiently justified: a salient object can still be mostly a smooth, low-frequency area (e.g., smooth human face skin), and hence generating the perturbation in this area may not be reasonable. Moreover, it is not clear whether generating perturbations in the salient object area makes more sense than generating perturbations in non-smooth, high-frequency, or heavily textured areas. The experimental part is not totally convincing: (1) there is only one data set, namely 1000 images from ImageNet; however, what these images look like will affect the results a lot (for example, do most salient objects have heavily textured areas?); (2) there is no comparison to the related method in the approach of Zhang (2020); (3) I cannot find the information on which classification network is used in the experiments; are multiple networks used? This paper uses an existing salient object segmentation approach, so its major technical contribution is the tree-based method to generate perturbation blocks, which is, however, a little straightforward and ad hoc. Overall, I liked the idea of this paper and enjoyed reading it; however, it needs better justification of its motivations as well as improvements to the experiments.
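As a rough illustration of the refinement procedure the first reviewer describes (split the salient region into four blocks, score a perturbation on each with the loss, recurse into the best block, then the second best, and stop at single pixels or when no child improves the loss), here is a minimal greedy sketch. Everything in it is my own reconstruction for illustration: the loss oracle, the random-sign perturbation, the ℓ∞ budget `eps`, and the stopping rule are assumptions, not the authors' Algorithms 1 and 2.

```python
import numpy as np

def refine(loss_fn, x, mask, block, eps, best_loss):
    """Quadtree-style greedy refinement of a perturbed block inside the salient mask.
    `block` is (r0, r1, c0, c1); changes stay within +/- eps and inside `mask`.
    Sibling improvements are not merged in this simplified version."""
    r0, r1, c0, c1 = block
    if r1 - r0 <= 1 and c1 - c0 <= 1:                  # down to a single pixel
        return best_loss, x
    rm, cm = (r0 + r1) // 2, (c0 + c1) // 2
    children = [(r0, rm, c0, cm), (r0, rm, cm, c1),
                (rm, r1, c0, cm), (rm, r1, cm, c1)]
    scored = []
    for (a, b, c, d) in children:
        if a == b or c == d:                           # skip degenerate splits
            continue
        delta = np.zeros_like(x)
        delta[a:b, c:d] = eps * np.sign(np.random.randn(b - a, d - c))
        cand = np.clip(x + delta * mask, 0.0, 1.0)     # segmentation prior: background untouched
        scored.append((loss_fn(cand), (a, b, c, d), cand))
    scored.sort(key=lambda s: s[0], reverse=True)      # higher loss = closer to misclassification
    for child_loss, child, cand in scored:             # best block first, then second best, ...
        if child_loss <= best_loss:
            break
        best_loss, x = refine(loss_fn, cand, mask, child, eps, child_loss)
    return best_loss, x

# hypothetical usage, with loss_fn a black-box query returning a scalar to be maximized:
# best, x_adv = refine(loss_fn, x.copy(), saliency_mask, (0, H, 0, W), eps=0.05, best_loss=loss_fn(x))
```

Note that `eps=0.05` in the usage line is just a placeholder value for the sketch, and `x` is assumed to be a single-channel image in [0, 1] with `mask` of the same shape.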
### Summary:
In this paper, the authors propose to use segmentation priors for black-box attacks, such that the perturbations are limited to the salient region. They also find that state-of-the-art black-box attacks equipped with segmentation priors can achieve much better imperceptibility with little reduction in query efficiency and success rate. Hence, the authors propose the saliency attack, a new gradient-free black-box attack that can further improve the imperceptibility by refining perturbations in the salient region. The reviewers think that the proposed method is simple and important, and the authors have responded properly to some comments. However, the reviewers are still not satisfied with the experimental evaluation and comparisons, as the authors can only try to compare with other ideas and test more models in the future. In summary, I think the manuscript in its current status cannot be accepted.
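The summary's point about equipping existing black-box attacks with segmentation priors can be pictured as a thin wrapper: whatever perturbation an off-the-shelf attack proposes is zeroed outside the salient mask before the model query is spent. The interface below (the `model` callable, the clipping to [0, 1], the ℓ∞ budget) is a hypothetical sketch for illustration, not the paper's implementation.

```python
import numpy as np

def masked_query(model, x, delta, mask, eps):
    """Spend one black-box query on a candidate restricted to the salient region."""
    delta = np.clip(delta, -eps, eps) * mask   # segmentation prior: background stays untouched
    x_adv = np.clip(x + delta, 0.0, 1.0)
    return model(x_adv), x_adv
```

Any query-based attack that proposes candidate perturbations (for example, random rectangular patches) can route them through such a wrapper, which is what makes the prior easy to combine with existing methods.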
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
to enhance the quality of stylecontrolled generation especially in an unsupervised manner and nonparallel setting this paper proposes a style equalization mechanism to prevent the content leakage problem in the style equalization module the style of a sample is transformed to be the same as the style of ground truth the authors assumed that content information is timedependent whereas the style can be timeindependent so that the authors employ timeaverage pooling to learn the global style then the style difference is added to the inputs style features at each time step a content attended feature queries and attends appropriate style equalized feature via the multihead attention module the entire model is optimized to maximize the elbo the proposed method is demonstrated on speech synthesis and handwriting synthesis tasks strong points the problem this paper tackles is crucial and practical in controllable generations in particular the unsupervised learning and nonparallel setting are realistic also the proposed method and the motivation behind it are simple but effective the method is also technically sound overall this paper is neat clearly written and wellorganized the experimental results are much improved in the automated evaluations and human evaluation in particular i was hard to distinguish the synthesized speech examples whether it is groundtruth or generated the low wer of generated speech is very impressive for me the authors provide sufficient experimental details for reproducibility and mentioned several training techniques weak points please clarify that in sec 52 which data are compared in the cossim metric the implementation codes are not submitted although the authors mentioned the details of training publish the source codes would be helpful for the community questions please refer to the weak points above recommendation i vote for accept with the reasons that the proposed method is simple and somewhat novel and also it shows strongly effective results i think the results are able to contribute to the stylecontrolled synthesis communities docsepin this paper the authors argue that the typical training algorithms for controllable sequence generative models suffers from the traininginference mismatch therefore to address such a problem they introduce a style transformation module that is called style equalization such a module is designed to enable training using different content and style samples and thereby mitigate the traininginference mismatch problem to demonstrate the generality of the proposed approach the style equalization is applied to two tasks of tts and texttohandwriting synthesis on three datasets on both tasks the models show good results controllable sequential generative models have been studies for years one of the most fundamental problem is how to effectively capture the content information and style information respectively it is a critical while very challenging research problem because the content and style are entangled in the training samples and ones must carefully design the training objective such that each of these factor can be learned in a controllable way the idea of learning the style equalization is interesting and achieves promising results on tasks in different application scenarios ie tts and texttohandwriting synthesis despite that the paper is easy to follow and the demos show in the project page qualitatively demonstrate the proposed approach strengths controllable sequential generative models have been studies for years one of the 
most fundamental problem is how to effectively capture the content information and style information respectively it is a critical while very challenging research problem because the content and style are entangled in the training samples and ones must carefully design the training objective such that each of these factor can be learned in a controllable way the idea of learning the style equalization is interesting and achieves promising results on tasks in different application scenarios ie tts and texttohandwriting synthesis despite that the paper is easy to follow and the demos show in the project page qualitatively demonstrate the proposed approach weakness 1my main concern is the training of style equalization module based on the design the training is to approximate the posterior pzx c with qzmx phix x c in this such a design the expectation is that the style encoder only capture the style information from the given samples x and x however it might be possible that the network is trained to find a shortcut that convx xcontentxstyle and m phi are just trained to diminish x in this way the training loss is still very small however the conv does not trained to encode pure style information as expected i am wondering is there any penalties or training tricks are used to avoid such shortcuts as i do not find implementation details and either ablations or analysis on the latent space that inferred by the style encoder conv so i am not fully convinced 2 to deal with the traininginference mismatch issue there are another line of works hat encourage disentanglement of content and style for example in 1 the model are also trained with paired and unpaired textspeech by disengaging the content and style information in each encoders while i do not see comparisons and discussions about this line of work i am very curious to see what are the differences when applying disentanglement and style equalization 3 there lack of implementation and experimental details 4 the comparing sota are quite a few on tts the model is only compared with tacotronbased models eg gst and other versions on texthandwriting is only graves 5 there lack of comprehensive ablation studies especially on the style latent space i am curious to see differences of the latent space directly inferred by the style encoder conv and the after by applying m phi 6 no source code is submitted it raises doubles on the reproducible ability 1 s ma d mcduff y song neural tts stylization with adversarial and collaborative games this paper is well motivated the idea of applying style equalization is interesting also the showed experimental results are impressive upon the weakness ive pointed i would suggest the authors 1 demonstrate how m phi can be trained in the expected way which avoids the model collapse to shortcuts 2 add discussions and comparisons with the very related line of work ie contentstyle disentanglement 3 adding ablation studies and comparisons with more sota approaches minors by looking into the paper i am not clear about the training and evaluation datasets for example on tts what is the training dataset and what is the evaluation dataset are they trained on voxceleb and then evaluated on vctk libirtts or they trained on vctk and evaluated on vctk and the same for libritts if the latter are the evaluation performed on seen or unseen speakers docsep proposed an unsupervised style transferring framework based on vrnn conducted experiments on tts and handwriting synthesis to demonstrate the effectiveness of the proposed method 
strengths proposed an interesting method with nice experimental results weaknesses the connection to related works is not well discussed the benefit of the proposed method over other similar methods is not well justified detailed comments 1 the main idea of this paper is based on vrnn chung et al 2015 such connection as well as the connection to other methods based on vrnn eg aksan et al 2018 is not clearly discussed in this paper other major approaches on style transferring are not well discussed either such as vaebased approaches akuzawa et al 2018 henter et al 2018 sun et al 2020 2 this paper claims stateoftheart however the experiments are not compared to the stateoftheart approaches in the field the tts experiments focus on speaker similarity but is compared to the global style tokens gst model which is not good at voice modeling due to its limited dictionary vaebased approach such as hsu et al 2019 are better baselines for the tacotron baseline its also worth to include the result conditioned on speaker id on handwriting synthesis the baseline used in the paper graves 2013 is not a strong baseline either more recent works such as aksan et al 2018 kotani et al 2020 are better baselines 3 the main novelty of this paper is eq 2 the idea is to generate sample x the model is conditioned on both a content c and a style z where z is from a function mx x here x is a sample unrelated to x a natural question is is x really used by the model or is it actually ignored this is neither theoretically nor empirically answered in the paper 4 to address the above question empirically id suggest running two ablation studies replace mx x with mx replace x more precisely f with a learned prior interesting but not required replace x more precisely f with a random noise 5 there are quite some redundancy in the statement of the mismatch problem the key idea of this paper is not presented until paper 5 it would be nice to save some space from them and use it for the connection to related works references chung et al a recurrent latent variable model for sequential data neurips 2015 aksan et al deepwriting making digital ink editable via deep generative modeling 2018 akuzawa et al expressive speech synthesis via modeling expressions with variational autoencoder interspeech 2018 henter et al deep encoderdecoder models for unsupervised learning of controllable speech synthesis 2018 sun et al fullyhierarchical finegrained prosody modeling for interpretable speech synthesis icassp 2020 hsu et al hierarchical generative modeling for controllable speech synthesis iclr 2019 kotani et al generating handwriting via decoupled style descriptors eccv 2020 while this paper proposes an interesting idea for style transferring with nice experimental results the benefit of the proposed idea over reasonable baselines is not clearly exhibited the connection to related works is also not well discussed
### Summary:
|
this work aims to improve style transfer in the unsupervised nonparallel case it does this by proposing a style equalization approach to prevent content leakage and assuming that content information is timedependent whereas style information is timeindependent this is an important problem to solve and lots of prior work in the area exists the work is wellorganised with good experimental results however there are strong claims in the paper and there is insufficient experimental comparison to similar related work such as hsu et al 2019 and ma et al 2018 to back that up if theres no comparison with the current state of the art eg due to a private implementation or dataset then its hard to justify calling a new work a new state of the art even though an implementation may be private it can be worth spending time to reproduce a paper or asking the authors for an implementation finally task and metric selection could be improved to better highlight the performance of the approach the reviewers thank the authors for the rebuttal but it was insufficient to change their decision
|
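The reviews in this row describe the style equalization mechanism only in prose: a global style is obtained by time-average pooling, and the style difference between a reference sample and the input is added to the input's per-timestep style features, which are then attended to by content features. The sketch below is a minimal numpy illustration of that idea under stated assumptions; all names and shapes are made up, the concatenation only stands in for the multi-head attention step the review mentions, and this is not the paper's actual implementation.

```python
import numpy as np

def global_style(style_feats):
    # style_feats: (T, D) per-timestep style features of one utterance.
    # Global style = time-average pooling over the time axis, as the review describes.
    return style_feats.mean(axis=0)

def style_equalize(content_feats, style_src, style_ref):
    """Shift the source's per-timestep style features so their time-average
    matches the reference utterance's global style."""
    delta = global_style(style_ref) - global_style(style_src)   # (D,)
    equalized = style_src + delta          # broadcast the style difference over time
    # The review says content-attended features then query the equalized style
    # features via multi-head attention; plain concatenation is used here only
    # as a stand-in to keep the sketch short.
    return np.concatenate([content_feats, equalized], axis=-1)

# toy usage: the reference may have a different length, only its time-average is used
rng = np.random.default_rng(0)
content   = rng.standard_normal((50, 16))
style_src = rng.standard_normal((50, 16))
style_ref = rng.standard_normal((80, 16))
fused = style_equalize(content, style_src, style_ref)   # shape (50, 32)
```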
[ … input_ids: token-ID list omitted … ] |
[ … attention_mask: list of all 1s omitted … ] |
[ … labels: token-ID list omitted (appears identical to input_ids) … ] |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
summary this paper introduces an additional objective to gan training called difference departure from normality that is meant to encourage correlation between real and generated samples projected into some subspace this is intended to prevent instability during training the technique is applied to audio spectrogram generation reason for score honestly i didnt completely understand this paper so im marking low confidence however i do feel like the paper was quite low on clarity the technique wasnt well motivated or concretely explained and i found it odd that it was applied to spectrogram generation exclusively rather than image generation which is the standard testbed for gans algorithms comments this paper could really use some improved motivation eg why is ddfn a good metric why would it improve training stability some of the terms and symbols need to be defined for clarity eg pg pr section 3 would be improved by the inclusion of intuitive explanations to help the reader and me understand why the ddfn metric makes sense and what it represents i would recommend including the steps of the overall learning algorithm for those who want to better understand the concrete steps involved in implementing the approach and how the ddfn objective is integrating into the standard gan learning algorithm theres some parameter matrices that show up as g and d backticks in section 3 are these latex typos i would recommend testing the approach on common image datasets since that is the typical testbed for generic gan improvements docsepthe paper proposes a novel conditioningbased regularization of gan training while it has been known that tying up the generator and discriminators latent variables can help stabilize gan training the proposed method is based on a different theory thus making the algorithm novel the experimental results also support the authors argument about the improved stability and results in terms of objective metrics one thing i wish the paper could improve is about the reason as to why the proposed method is superior to the existing regularization techniques which is somewhat buried in the mathematical details another more serious issue is that the method is strictly limited to spectrogram reconstruction i am not sure if this is the best practice when it comes to generate audio signals compared against the other generative models that are based on timedomain approaches eg wavenet or melgan more specifically the authors proposed to preserve the original phase information from the real examples and reused it to reconstruct the timedomain signal given the generative nature of the system i wonder if this option is always possible the authors might as well compare to the alternative cases such as estimating the phase from the magnitude spectrograms eg griffinlim or by trying to reconstruct both real and imaginary spectrograms to recover the full complex spectrogram also given that the signals mostly environmental sound used in the experiments are less sensitive to the phase distortion compared to the other kinds such as speech and music i believe that the phase mismatch issue wasnt thoroughly handled in this paper docsepsummary this paper propose a constraint in gan training to improve generated samples fidelity and stabilize training the proposed conditioning is based on limiting the generator from departing normality function of real samples which is computed in the spectral domain of schur decomposition it is claimed that this conditioning will not limit the exploration of all modes of 
real data distribution missing references gan for unsupervised and semi supervised speech classification 1 hosseiniasl e zhou y xiong c socher r augmented cyclic adversarial learning for low resource domain adaptation iclr 2018 2 hosseiniasl e zhou y xiong c socher r a multidiscriminator cyclegan for unsupervised nonparallel speech domain adaptation interspeech 2018 comments 1 what is the relation of batch size to iteration before collapse in table 1 on us8k it shows larger batch using spectral normalization result in faster collapse while this is reverse on esc50 dataset figure 1 is only shown for us8k dataset only 2 if using spectral normalization stabilize training the results in table 1 indicate this is not always true with different batch size and different dataset 3 table 2 there is no evaluation of sagan and biggan on 256 resolution for mcv also no 128 resolution for all models on us8k dataset what is the reason behind selecting some resolutions only is the proposed normalization only better on some selected resolutions 4 1 and 2 in missing references proposed multidiscriminator to provide more informative gradient to generator on spectrogram space it would be also interesting to explore this in combination with proposed normalization 5 what is the impact of the proposed normalization on speech recognition performance it would be very helpful to add classification metric since the proposed approach is evaluated on classconditional gan 6 the evaluation on image generation is also required for the proposed normalization approach docsepthis paper proposes a conditioning method to train the gans the conditioning trick is based on the dfn departure from normality metric computed in the spectrogram domain of schur decomposition to ensure the correlation between real and generated samples experiments were performed on a few public audio datasets and the results suggest that imposing the proposed constraint on the generators not only delays the collapse in training but also yields the better reconstruction performance pros an important issue of stable training of the gans is addressed the proposed conditioning trick is novel and interesting in some sense cons the major concern about this paper is the lack of thorough experimentation to prove the usefulness of the proposed method the experiments and evaluation dont seem to sufficiently support the main arguments in particular the authors put an emphasis on the importance of generating the highfidelity audio spectrograms for higher classification accuracies this was never supported by any experiments the reconstruction of speech samples are not sufficient to back up the arguments furthermore another advantage of the proposed method the authors state was its ability to generate the samples with high diversity this too was not demonstrated with experiments the authors state that the proposed method is generalizable but it should be confirmed through more experiments on the other domains otherwise it is recommended that it must be explicitly stated that the proposed method applies to audio or speech including the title the paper is not easy to follow some terms and notations are not adequately explained so are the descriptions of the figures for example in figure 2 it is difficult to follow the tradeoff between the quality and the diversity according to different alpha epsilon values speech samples in the supplementary material dont seem to match the corresponding dwt spectrograms and waveforms there are some clipping noises and sudden silences 
in audio while the spectrograms and waveforms look almost identical to those of the original audio to summarize the paper presents a novel and interesting idea to tackle the important problem in the field but fail to provide the experimental evidences to support the idea therefore i recommend the paper is not ready to be published in its current form
### Summary:
|
the paper proposes a trick for stabilizing gan training and reports experiment results on spectrogram synthesis all the reviewers rate the paper below the bar citing various concerns including a lack of clarity and unconvincing results several reviewers suggest conducting evaluations in the image domain as most of the gan training techniques are proposed in the image domain after consolidating the reviews and rebuttal the area chair finds the reviewers argument convincing and would not recommend acceptance of the paper
|
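The reviews in this row refer to a "departure from normality" metric computed in the Schur spectral domain without defining it. The sketch below computes Henrici's departure from normality of a square matrix (the Frobenius norm of the strictly upper-triangular part of the complex Schur factor, zero exactly for normal matrices) and a naive difference between a real and a generated spectrogram patch. Only the matrix-theoretic definition is standard; how the paper actually pairs, normalizes, or back-propagates this quantity is not specified in the reviews, so that part is an assumption.

```python
import numpy as np
from scipy.linalg import schur

def departure_from_normality(a):
    """Henrici's departure from normality: the Frobenius norm of the strictly
    upper-triangular part of the complex Schur factor T, where a = Z T Z^H.
    It is zero iff `a` is a normal matrix."""
    t, _ = schur(np.asarray(a, dtype=complex), output="complex")
    strictly_upper = t - np.diag(np.diag(t))
    return np.linalg.norm(strictly_upper, "fro")

def difference_dfn(real_patch, fake_patch):
    # Naive "difference departure from normality" between a real and a generated
    # square spectrogram patch; the exact pairing/penalty used by the paper is
    # not described in the reviews, so this is only an illustrative guess.
    return abs(departure_from_normality(real_patch)
               - departure_from_normality(fake_patch))

# toy usage on two random 64x64 "spectrogram patches"
rng = np.random.default_rng(0)
print(difference_dfn(rng.standard_normal((64, 64)),
                     rng.standard_normal((64, 64))))
```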
[ … input_ids: token-ID list omitted … ] |
[ … array of repeated 1s omitted … ] |
[ … integer token-id array omitted (machine tokenization of the accompanying review text) … ] |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the authors propose a more unified view of disparate methods for training sequence models specifically a multiterm objective l(q, theta) consisting of 1 the standard reward maximization objective of policy gradient e_{p_theta}[r] 2 a weighted (weight alpha) reverse kl divergence of the parametric policy and a nonparametric policy q 3 a weighted (weight beta) entropy term on q is proposed for sequence training see equation 1 l can be iteratively optimized by solving for q given p_theta and then theta given q see eq 2 this framework mathematically generalizes softmax policy gradient spg (alpha=1 beta=0) and rewardaugmented maximum likelihood (alpha=0 beta=temperature) and also standard entropy regularized policy gradient (alpha=0) among other algorithms the paper is well written and the approach sensible however combining spg and raml by introducing their respective regularization terms is a rather straightforward exercise and so seems quite incremental other major concerns are 1 the true utility of the model and 2 the integrity of the experiments wrt 1 while raml was a significant contribution at the time it is now well established that raml generally doesnt perform well at all in practice due to exposure bias not conditioning on its own previous predictions during training moreover spg as the authors point out was supposed to address the need for ml pretraining but required much engineering to work the fact is that reinforcebased policy gradient methods are still more effective than these methods provided they have a good baseline to reduce variance which brings me to point 2 was mixer run with a learned baseline and judiciously optimized table 1 suggests that mixer can outperform ml by only 0.1 bleu points and is outperformed by raml something is wrong with your implementation then moreover there are techniques like selfcritical sequence training scst which far outperform mixer and we havent even discussed actorcritic baselines in summary the contribution over raml and spg in combining them is quite incremental and the practical importance of combining them is questionable as is the integrity of the presented experiments given how poorly mixer is reported to perform and the omission of stronger baselines like scst and ac methods also a paper on essentially the same approach was submitted and rejected from iclr 2018 (https://openreview.net/pdf?id=h1nyf7w0z) although this paper is better written and puts the method more fully in context with existing work i think that several of the concerns with that paper apply here as well look forward to the authors feedback and additional/corrected results will certainly update my score if these concerns are addressed in particular if this generalization can significantly outperform existing methods it generalizes with nondegenerate settings this would overcome the more incremental contribution of combining spg and raml current ratings evaluation 2/5 results are not consistent with previous results eg mixer results stronger baselines such as scst and ac are not considered clarity 5/5 clear paper well written significance 3/5 raml and spg have not been established as important methods in practice so combining them is less interesting originality 2/5 raml and spg are fairly straightforward to combine for experts interested in these methods rating 4/10 okay but not good enough reject confidence 5/5 pros generalizes raml and spg and also standard entropyregularized policy gradient well written paper clean generalization cons raml and spg have not been established as important methods in practice generalization
of raml and spg is straightforward incremental existing baselines in the paper ie mixer do not perform as expected ie barely better than ml worse than raml stronger reinforcebased algorithms like scst as well as actorcritic algorithms have not been compared update after author responses authors thank you for your feedback while it is true that generalizing raml and spg into a common framework is not trivial the presented framework simply augments the dual form of spg ie reps 16 in the spg paper with a raml term furthermore the mle interpretation discussed is contained within the raml paper and the reductions to raml and spg are straightforward by design and so do not really provide much new insight considering this i feel that the importance of the paper largely rests on investigating and establishing the utility of the approach experimentally wrt the experiments i appreciate that the authors took the time to investigate the poor performance of mixer however the unusually poor performance of mixer remains unexplained and falls short even of scheduled sampling ss which suggests a lingering major issue reinforce techniques rely on 1 strong baselines verified by the authors 2 larger batch sizes to reduce variance and 3 pretraining to reduce variance and facilitate efficient exploration if the mle is undertrained or overtrained the latter the more likely issue given the plots then mixer will perform poorly actually it is now standard practice to pretrain with mless before rl training and this is really the also dynamically weighted objective baseline that should be compared against the current reinforce results mixer or otherwise really need to be updated or at the least removed as they are not captured by the framework but the comparison to pg methods is important more generally i feel that the experiments are not yet comprehensive enough while the authors have shown that they can outperform spg and raml with a scheduled objective it is not currently clear how sensitive/robust the results are to the term weight scheduling or even what the most appropriate general weights scheduling approach actually is overall i feel that the paper is still in need of substantial maturation before publication although i have revised my score slightly upward docsepthe authors provide a common mathematical perspective on several learning algorithms for sequence models they also introduce a new algorithm that combines several of the existing ones and achieves significant but small improvements on a machine translation and a text summarization task the paper is clearly written giving a good exposition of the unifying formulation i believe the paper is quite insightful and contributes to the communitys understanding of the learning algorithms however the improvements in their experiments are rather small and could probably be improved with more experimental work they do showcase the usefulness of their new formulation but not very strongly thus my recommendation to accept the paper is mostly based on the theoretical content that opens an interesting new perspective docsepthis paper introduces an interesting unifying perspective on several sequence generation training algorithms exposing both mle raml and spg as special cases of this unified framework this enables insightful new interpretations of standard issues in mle training in terms of exploration for instance based on this new perspective a new algorithm is introduced its performance is analysed on a machine translation and a text summarisation task quality and
clarity the paper is overall wellwritten although it can be improved upon see details below the bibliography for instance does not reference the conferences/journals where the articles were published and lists many 10 published papers as arxiv preprints the ideas are clearly presented which is crucial in a paper trying to unify different approaches and the new perspective on exploration is well motivated originality and significance the unifying framework is interesting and helps shed new light on some standard issues in sequence generation on the other hand the new algorithm and its analysis seem like a slightly rushed attempt at leveraging the unifying framework the experiments in particular present several issues for instance its clear from figure 3 that both mle and raml are overfitting and would benefit from more dropout in the literature 0.3 is commonly used for this type of encoderdecoder architecture having access to these experimental results is important since it would enable the reader to understand whether the benefits of the new approach are subsumed by regularisation or not further the performance of the competing methods seems a bit low mle reports 26.44 bleu which is a bit surprising considering that with beamsearch beam of size 10 not 5 admittedly bahdanau et al 2016 get 27.56 bleu and this is without dropout with dropout 0.3 but without beam search leblond et al 2018 get 27.4 bleu making a strong case for the benefits of the new algorithm requires more thorough experiments overall the first half of the paper is interesting and insightful while the second would benefit from more time pros clarity of the ideas that are presented interesting unifying perspective on sequence generation algorithms insightful new interpretations of existing algorithms in terms of exploration cons the example new algorithm is not very original the associated experiments are incomplete details 1 page 2 dayan hinton 1997 levine 2018 abdolmaleki et al 2018 study in a probabilistic inference perspective is an incomplete sentence 2 at the beginning of section 3.1 policy optimisation is a family of algorithms 3 page 7 in the setup of the experiments we use the adam optimizer for sgd training is incorrect since sgd is not a family but a specific algorithm which is different from adam
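(Editorial note: the following is a minimal LaTeX restatement of the multi-term objective described at the start of the review above, added for readability. It is reconstructed from the reviewer's wording rather than taken from the paper itself; the sign conventions, the direction of the reverse KL term, and the temperature symbol tau are assumptions.)

```latex
% multi-term sequence-training objective as paraphrased by the reviewer (sketch)
\mathcal{L}(q,\theta) \;=\;
    \mathbb{E}_{p_\theta}[\,r\,]                                % (1) reward-maximization term of policy gradient
    \;-\; \alpha\,\mathrm{KL}\!\left(q \,\|\, p_\theta\right)   % (2) reverse KL between q and the parametric policy (direction assumed)
    \;+\; \beta\,\mathbb{H}(q)                                  % (3) entropy of the non-parametric policy q

% alternating optimization, per the review: solve for q with p_theta fixed, then update theta with q fixed

% special cases listed by the reviewer:
%   \alpha = 1,\ \beta = 0        -> softmax policy gradient (SPG)
%   \alpha = 0,\ \beta = \tau     -> reward-augmented maximum likelihood (RAML), \tau a temperature
%   \alpha = 0                    -> standard entropy-regularized policy gradient
```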
### Summary:
|
i enjoyed reading the paper myself and i appreciate the unifying framework connecting raml and spg while i do not put a lot of weight on the experiments i agree with the reviewers that the experimental results are not very strong and i am not convinced that the theoretical contribution meets the bar at iclr in the interpolation algorithm there seems to be an additional annealing parameter and two tuning parameters it is important to describe how the parameters are tuned given the additional hyperparameters one may consider giving all of the algorithms the same budget of hyperparameter tuning i also agree with reviewers that the policy gradient baseline seems to underperform typical results one possible way to strengthen the experiments is to try to replicate the results of spg or raml and discuss the behavior of each algorithm as a function of hyperparameters
|
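(Editorial note: the reviews above argue that REINFORCE-style baselines such as MIXER only work well with a learned baseline that reduces gradient variance. The sketch below is an illustrative, self-contained restatement of that idea only; the function name, the numpy-only setting, and the exact way the baseline enters are assumptions for illustration, not the implementation used in the paper or in MIXER.)

```python
import numpy as np

def reinforce_with_baseline_loss(seq_log_probs, rewards, baseline):
    """Surrogate REINFORCE loss for sequence-level rewards (e.g. BLEU).

    seq_log_probs: shape (batch,), summed log-probabilities of each sampled sequence
    rewards:       shape (batch,), sequence-level reward for each sample
    baseline:      shape (batch,), e.g. predictions from a learned critic

    Subtracting the baseline from the reward lowers the variance of the gradient
    estimate without changing its expectation, which is the point the reviewer
    raises about MIXER needing a well-tuned learned baseline.
    """
    advantages = rewards - baseline  # would be detached from the graph in a real framework
    return float(np.mean(-seq_log_probs * advantages))

# toy usage with made-up numbers
loss = reinforce_with_baseline_loss(
    np.array([-12.3, -9.8]), np.array([0.27, 0.31]), np.array([0.29, 0.29]))
```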
[ … integer token-id array omitted (tokenized duplicate of the review text above) … ] |
[ … array of repeated 1s omitted … ] |
[ … integer token-id array omitted (tokenized duplicate of the review text above) … ] |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this work investigates the estimation of model reusability when the size of target data is small specifically features are extracted for tasks and pretrained models respectively then the metric distance between them is optimized as mre score the theoretical analysis and the empirical study confirm the effectiveness of the proposed method strong 1 with the development of pretraining evaluating the reusability of existing pretrained models appropriately is important for applications 2 the framework is elaborated with the example of classification 3 the empirical study shows that the proposed method can learn better mre score than existing methods weak 1 some notations are not clear for example the loss in line 108 contains a term of vmu h without any explanation about how to obtain it besides it is better to elaborate the feature generator for models ie ch and the validation mre function vvalid with examples to help illustrate 2 for classification the proposed method requires the label space for pretrained models which is unavailable for models learned without labels it may limit the application of the proposed method which is only applicable for supervised pretrained models 3 the current experiments only contain models pretrained on target data it is better to include existing pretrained models as in logme to demonstrate the effectiveness for a diverse model zoo yes docsepthis paper proposes synergistic learning a new metric for model reusability evaluation mre of pretrained models to downstream tasks with small data this taskmodel metric is learned using a set of pretrained models and one or more data providers given a validation mre function synergistic learning establishes a metric between data providers and pretrained models so that this metric distance can be used as an mre score the authors propose four different types of synergistic models 1 basic model 2 task transform model 3 model transform model 4 ensembler model experiments are run on benchmark mnist cifar10 dsprites cifar100 and miniimagenet datasets and show that the proposed method is competitive as an mre metric compared to sota methods strengths the paper tackles the important problem of learning to select pretrained models for downstream tasks with small training data the paper is technically sound and the claims are accompanied by theoretical and experimental analyses the experiments show that the proposed method is successful as an mre metric compared to counterparts weaknesses the main weakness that i see is related to clarity and organization of the paper especially in terms of definitions and assumptions for example it seems that feature extractors are an important part of the proposed method however this assumption is not clearly stated until later in the paper also important assumptions for the small data transfer scenario such as retraining of heads only lines 187-195 are provided later in the paper in general the presence of definitions and assumptions in several parts of the paper make it difficult to read and a big effort is required for understanding the ideas assumptions and limitations of the proposed approach i would suggest that the authors reorganize the paper in such a way that assumptions and definitions of the general approach are provided first then an explanation of the proposed method the theory and the algorithm limitations are clearly stated docsepthis paper focuses on the reusability of pretrained models with many source data in particular this paper aims to predict the transfer learning performance in
advance when the target dataset is small to evaluate model reusability this paper proposes synergistic learning metric distance converted to mre score as one of the metric learning methods the proposed method experimentally shows that it is a more useful method to predict the transfer learning performance than baseline methods strength the proposed method can robustly calculate mre scores even in situations where there is little learning data the proposed method is also a generic method applicable to most problems clarity from the readers point of view the notation of this paper is somewhat confusing because it is different from that of other general transfer learning papers it is also unclear what circumstances squery and mquery assumed since there is a lot of space left in figure 1 it is recommended to explain the situation of squery and mquery in pictures finally some grammatical errors are seen therefore this reviewer recommends that you organize your thesis well this reviewer thinks that the authors have properly stated the limitation of this paper
### Summary:
|
There is a consensus among the expert reviewers that this paper tackles an important problem, is technically sound, and has a sufficient contribution for publication at NeurIPS 2022. Synergistic learning is still in its infancy and, as a result, requires visibility from the community; the proposed methods for calculating model reusability evaluation (MRE) metrics in this paper will serve as an important baseline for subsequent research in this direction. Personally, I also appreciate this research direction, as it will be a crucial part of the democratization of AI. The main reservation is the clarity and presentation of the paper. The authors attempted to address this by providing clarifications during the discussion phase and by revising the manuscript accordingly, which I really appreciate, and the reviewers acknowledged and responded positively to the authors' responses, either maintaining their high scores or increasing them. Hence, I recommend that the authors take special care of this issue in the camera-ready version.
|
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
The paper presents a task calibration method for meta-learning, aiming at better task uncertainty estimation. The method modifies the MAML meta-learning approach by weighting the task-specific loss using class-wise similarity, measured by the cosine similarity of the task features. The authors further propose to use this weighting on metric-based models for few-shot classification. The change in weighting proposed by the paper, although incremental, provides an appreciable performance improvement over existing meta-learning methods. However, some comparisons to existing uncertainty calibration methods are recommended to show a more comprehensive evaluation; in particular, since one claim is its advantages over Bayesian methods, probabilistic MAML [1] and Bayesian MAML [2] could be readily applied as additional comparisons to the authors' proposed method. For metric-based methods, as the authors indicated, it may be worth testing with more recent developments, including TADAM. An additional experiment could also be useful to verify that the class-similarity weighting is performing its intended purpose: for example, one can construct a task where the labels are shuffled, as in Zhang et al. [3]; because the labels are uninformative, the task should not be useful for meta-learning and should receive almost zero weight. The description of the task corruption experiments can be improved: were the reported results based on the total of four tasks for meta-training only, as described in Appendix B1? Also, it may be worth putting in the results from MAML variants in addition to ProtoNet. The authors motivated the method with the discussion of distributional uncertainty; a more explicit characterization relating the posterior predictive distribution to the proposed method would strengthen the paper. There are also several typos in the current submission, and the presentation can be improved for better readability; in Section 3.2, for example, the subscripts for classes in Equation 3 can be changed to be different from the ones for the query inputs and labels, to avoid confusion. [1] Finn et al., Probabilistic Model-Agnostic Meta-Learning. [2] Yoon et al., Bayesian Model-Agnostic Meta-Learning, NeurIPS 2018. [3] Zhang et al., Understanding Deep Learning Requires Rethinking Generalization. Post-rebuttal: I am downgrading my score, as the authors did not address most of my concerns; the paper can be further improved with the suggested experiments as well as a discussion of distributional uncertainty, but the revised version does not appear to contain any of these suggested changes.

Overall, this paper studies uncertainty in few-shot classification. The main challenge of this problem is left unclear: the description of the problems on the first page is somewhat misleading, as the authors describe many problems and it is not clear whether they intend to resolve all of them or just part of them, so the main contribution of this work is not well motivated. My main concern is that the contribution of this work is not strong. The main contribution is imposing the class-wise similarity on the meta-update rule in Equation 4; the meta-update is part of the model-agnostic meta-learning pipeline, as shown in Equation 2 (Finn et al., 2017). Equation 4 seems reasonable; however, the novelty of this work is not clear. For example, is the class-similarity computation first proposed in this work? Is the weighted meta-update never used in the existing literature? My other concern is that the motivation is not clear: the main problem(s) the authors aim to solve are not specified or emphasized. There are many challenges mentioned in this work, such as aleatoric and epistemic uncertainty, few-shot classification, the distributional mismatch between support and query data, and the trade-off between calibration and the performance of a model; it is not clear what the main problem is that the authors aim to solve and how it promotes the contributions of this work. Figure 1 requires narratives; in particular, the description of the dashed line and the solid line with different colors is not clear.

The paper proposes to use class-wise similarity to measure task uncertainty in few-shot learning. In training, if a task has low class-wise similarity, the task is assigned a higher weight; if a task has high class-wise similarity, it is considered ill-defined and assigned a lower weight. Experiments on mini-ImageNet and CUB-200 show that it can improve the classification performance and uncertainty estimation of MAML and ProtoNet. I disagree with the paper's definition of ill-defined tasks: when classes in a training task are very similar, I think it just means that this training task is harder for the model to learn; therefore, the task at least shouldn't get a lower weight and be somewhat ignored during training. If at test time a test task also has very similar classes, close to this training task, then this ill-defined training task could actually help. With the current method, I think that if the dataset gets bigger and more complicated, the classification performance will drop significantly, because the current method would, to some extent, filter out the hard tasks and ignore them during training. As for the experiments, the paper only conducted experiments on relatively small and simple datasets (mini-ImageNet and CUB-200); if the authors could add an experiment on tiny-ImageNet and show some performance improvements there, it would make the experimental part much more solid. The paper says in the introduction that a well-calibrated network often exhibits lower accuracy compared to an overconfident deep network; this is a false claim. There are a lot of post-training methods to measure uncertainty at test time, and they don't affect the model's classification performance since they are post-training methods; moreover, deep ensembles improve the classification performance of a classifier at the same time as they improve the model's uncertainty estimation. Post-rebuttal: I meant tiered-ImageNet (https://github.com/renmengye/few-shot-ssl-public); I apologize for the mistake. I think the rebuttal has not addressed my concerns, especially the one related to the ill-defined tasks. The ill-defined tasks detected by the proposed method could be hard but good tasks that benefit the test; the authors didn't deny this, and the proposed method tries to push the model to learn less from such tasks, so a natural deduction is that it could affect the classification performance. I think that if the authors can take a further step to disentangle uncertainty estimation and classification, this drawback of the proposed method can be removed in the future.

This paper presented a task calibration (TC) method, which introduces the notion of distributional uncertainty for few-shot classification. Two TC extensions of existing methods (MAML and ProtoNet), namely TC-MAML and TC-ProtoNet, are presented and experimented with. The main contributions of this work, as the authors point out, are: (1) it works in a non-Bayesian fashion and thus is computationally efficient; (2) the TC method can be applied to a range of meta-learning models; and (3) the method is more effective for the few-shot meta-learning situation under dataset shift. I think this paper deals with an interesting idea but should be improved further; the contributions are not clearly supported, and the authors should do some more work to demonstrate them. My comments are as below. 1. Regarding contribution (1), the authors only showed that ABML (amortized Bayesian meta-learning) is computationally expensive, while there are very efficient Bayesian approximation methods, e.g., based on MC-dropout, for implementations in a Bayesian fashion. The authors should evaluate and compare the computation time to demonstrate that the proposed method is really computationally efficient compared to Bayesian approaches, and they also need to demonstrate that the proposed method can handle complex and high-dimensional data sufficiently well. 2. Regarding contribution (2), a variety of meta-learning models have been presented in recent years, e.g., Meta-SGD, TAML, and many more; which of them can be combined with the TC method? It would be great if the authors investigated the applicability of TC to other existing methods and checked experimentally whether TC improves their performance as well. 3. Regarding contribution (3), only one dataset-shift case (mini-ImageNet to CUB-200) was experimented with; I strongly recommend that the authors evaluate their method on other dataset-shift cases with additional datasets, e.g., Omniglot. 4. In Table 1 and Table 2, I cannot find a significant improvement of the proposed TC methods over their baselines in terms of classification accuracy, which is the most important metric; in particular, in the 5-way 5-shot case on mini-ImageNet, TC-MAML was worse than MAML and TC-ProtoNet was worse than ProtoNet. How, and in which aspect, could you say the proposed methods are significantly better? 5. The organization of this paper should be more consistent; for example, accuracy is reported in Table 1 but not in Table 2 (although it is provided in Appendix B5), and in Table 3 the experimental results for MAML and its variants are not provided, although the authors mention that MAML is already robust to the corrupted tasks. 6. Aren't the methods sensitive to the architectural and training configurations of the CNN models? 7. I think the current manuscript contains some strong conjectures that need to be supported by theoretical evidence or more experimental results, for example the statement that modeling the epistemic uncertainty is not much helpful for calibration in few-shot classification, and the sentences that describe the main contributions. I have updated my rating after the authors' response.
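To make the weighting scheme debated above concrete, the sketch below derives a per-task weight from the class-wise cosine similarity of support-set class prototypes and scales each task's query loss in a MAML-style outer loop. This is an illustrative reconstruction rather than the paper's code; the prototype-based similarity and the "1 minus mean similarity" rule are assumptions.

```python
import torch
import torch.nn.functional as F

def task_weight_from_class_similarity(support_feats, support_labels, num_classes):
    """Tasks whose class prototypes are highly similar (cosine) get a smaller weight."""
    protos = torch.stack([support_feats[support_labels == c].mean(dim=0)
                          for c in range(num_classes)])             # (C, d) prototypes
    protos = F.normalize(protos, dim=-1)
    sim = protos @ protos.t()                                        # pairwise cosine similarity
    off_diag = sim[~torch.eye(num_classes, dtype=torch.bool)]        # drop self-similarities
    return 1.0 - off_diag.mean().clamp(min=0.0)                      # weight in [0, 1]

# Toy usage: a 5-way task with 2 support examples per class.
feats = torch.randn(10, 8)
labels = torch.tensor([0, 0, 1, 1, 2, 2, 3, 3, 4, 4])
w = task_weight_from_class_similarity(feats, labels, num_classes=5)
# In the outer loop, each task's query loss would be scaled by its weight:
# meta_loss = sum(w_i * query_loss_i for i in tasks) / sum(w_i for i in tasks)
```

Whether down-weighting high-similarity ("ill-defined") tasks helps or hurts classification accuracy is exactly the point of disagreement raised in the reviews above.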
### Summary:
|
This paper tries to address the uncertainty calibration problem in meta-learning by weighting the gradients from different tasks according to class-wise similarity. Many concerns were raised by the reviewers, and most of them are still not properly addressed after the rebuttal period. The main concerns are as follows. The problem the paper tries to address is not clear: the use of weighting in the meta-update is motivated by distributional uncertainty, but it is not clear how that will improve task calibration at meta-testing time. The proposed update runs the risk of focusing on simple tasks and down-weighting hard tasks that could be improved with more learning to obtain more discriminative features, which might hurt the classification performance even if it yields better calibration quality. Novelty is limited: the proposed method is a fairly small modification of the original MAML algorithm. More comprehensive empirical evaluation is required to support the superiority of the proposed method over other baselines. I suggest the authors take all the reviewers' comments seriously and improve their work for a better revision.
|
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this submission shows how an approximate posterior distribution on dnn weights learned on a source task can be beneficial in transferlearning to downstream tasks a 3step method is proposed 1 fit posterior over weights on source task using swag 2 rescale the learned posterior using a single scalar coefficient 3 plug the rescaled prior into a beysian inference algorithm like sgld or sghmc experimental results are provided to support the following claims in terms of performance learned priors sgd transfer learning nonlearned priors sgd with learned priors significantly outperforms standard transfer learning ie learned priors lead to performance gains with or wo bayesian inference learned priors lead to more dataefficient performance on the downstream task when using the same prior bayesian inference often outperforms sgd training though bayesian inference with a nonlearned prior has inferior accuracy to sgd with pretraining strengths s1 the method is wellmotivated presented with clarity and shown to lead to performance gains s2 the insights provided prompt further exploration of losssurface alignment in transfer learning and other settings eg fewshot learning s3 the method builds on prior art and adds a single hyperparameter for this reason it should be easy for others to apply s4 the experimental validation includes classification tasks of different complexities and semantic segmentation thus the method is shown to be beneficial in interesting problems weaknesses w1 the proposed method claims to be a dropin replacement for standard transfer learningbecause the additional computation is not expected to be trivial it would have been beneficial to provide a proper study of the expected increase in computation wrt standard transfer learning the submission includes some discussion about the additional training and inference cost but no discussion about the cost in practice of tuning hyperparameters for example there are a number of hyperparameters for the methods involved in the different steps of the proposed method setting or tuning these hyperparameters for a given transfer would be part of the cost of the method in particular for steps 2 scaling lambda of prior and 3 parameters of sgld and sghmc w2 the main technical novelty in the method is the scaling of the prior steps 1 and 3 are applications of prior art but this was not given much attention in the experiments only fig 2c what is the recommended procedure and expected cost for tuning the scaling parameter lambda w3 plots legends and captions are not synchronized for example in fig 2c there is a purple plot but the legend does not include purple in fig 7 the caption speaks of a red plot but there is no red in the plots the two limitations discussed are the additional computation and the inclusion of only vision applications docsepthis paper presents a bayesian learning method for transfer learning on target task specifically a posterior approximation method swag is used to estimate the posterior distribution of a supervised or selfsupervised pretrained model this distribution is used as the prior for downstream learning experiments and ablation study demonstrate the effectiveness of learning a prior for transfer strengths 1 the problem of learning a prior for transfer with bayesian learning is well motivated 2 the proposed method with three steps is reasonable 3 experimental results of the proposed method on semantic segmentation are quite strong weaknesses 1 this paper employs methods of posterior approximation and bayesian learning and 
incorporates them into bayesian transfer by learning a prior from source tasks these existing methods have been well developed so the overall technical contribution is incremental upon these 2 the method introduced in section 32 looks a bit arbitrary in order to slightly smooth the prior distribution this paper proposes to multiply a single scaling factor to the covariance matrix this does not seem to be a principled way to adapt a prior distribution to a target task are there any other possible methods of adjusting the covariance matrix for transfer the holdout validation method used to select this scaling factor also neglects other data information that is potentially useful for determining the scaling factor 3 the background review on bayesian transfer learning seems not very thorough which makes the overall contribution of the proposed transfer learning with bayesian learning unclear the only relevant review on this direction is about continual learning 4 the proposed method will introduce additional computational burden for transfer learning see weakness and questions docsepthis paper proposes a bayesian perspective of transfer learning in deep neural networks whereby a rescaled bayesian parameter posterior from a source task is used as a pretrained prior for a target task the proposed procedure effectively reshapes the training objective of the target task to more faithfully reflect knowledge learnt from the source task the authors find that modifying the loss surface of target tasks through informative priors significantly improves performance and calibration especially with bayesian inference strengths highly important and relevant topic for the research community clear and organized exposition well motivated arguments and experiments weaknesses proposed approach can incur significant additional computational costs may require expert knowledge to use effectively overall i like this paper i find the arguments and experiments convincing enough and believe it would be valuable for the research community one concern is whether the proposed bayesian inference based transfer learning pipeline would be of limited utility to regular practitioners as expert knowledge may be required to get these systems working stably as intended there is also the incurred computational cost which may deter uptake if the performance gain does not justify it see above
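To make the three-step recipe summarized in the reviews above concrete, the sketch below shows what steps 2 and 3 could look like: a SWAG-style Gaussian summary of the source-task posterior is rescaled by a single scalar and reused as a log-prior for the downstream task, either added to the training objective or driving an SGLD-style update. This is a minimal illustration under simplifying assumptions (diagonal covariance only); the names swag_mean, swag_var, lambda_scale and the toy numbers are hypothetical and are not the authors' implementation.

```python
# minimal sketch, not the paper's code: rescale a swag gaussian posterior by a
# single scalar and reuse it as a log-prior for downstream (bayesian) training.
import numpy as np

def rescaled_log_prior(w, swag_mean, swag_var, lambda_scale):
    """log N(w; swag_mean, lambda_scale * swag_var), summed over all weights."""
    var = lambda_scale * swag_var
    return -0.5 * np.sum((w - swag_mean) ** 2 / var + np.log(2.0 * np.pi * var))

def sgld_step(w, grad_log_posterior, step_size, rng):
    """one stochastic gradient langevin dynamics update (step 3 of the recipe)."""
    return w + 0.5 * step_size * grad_log_posterior + np.sqrt(step_size) * rng.normal(size=w.shape)

# toy usage: the downstream loss would be nll(w) - rescaled_log_prior(...), with
# lambda_scale chosen on held-out data (cf. the reviewers' questions about fig 2c).
rng = np.random.default_rng(0)
swag_mean = rng.normal(size=10)                # posterior mean from the source task
swag_var = np.abs(rng.normal(size=10)) + 1e-3  # diagonal posterior variance
w = swag_mean + 0.1 * rng.normal(size=10)      # candidate downstream weights
print(rescaled_log_prior(w, swag_mean, swag_var, lambda_scale=3.0))

grad_prior = -(w - swag_mean) / (3.0 * swag_var)        # gradient of the rescaled log prior
w = sgld_step(w, grad_prior, step_size=1e-3, rng=rng)   # likelihood gradient omitted in this toy
```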
### Summary:
|
this work presents a bayesian method for transfer learning using swag reviewers agree that this is wellmotivated it is novel and the proposed method is well done and works well there are some concerns about the computational burden but the authors claim that the proposed part adds about 17 total cost i share some of the concerns with one of the reviewers regarding the fact that this method may be limited in usefulness to people who are experts rather than the more general public however given that the method builds on prior art and adds a single hyperparameter i feel it should be relatively easy for someone to actually use this if they are interested in transfer learning
|
[ input_ids: 1,203 token ids (tokenized form of this row's text; numeric dump omitted for readability) ] |
[ attention_mask: 1,203 ones, one per input token ] |
[ labels: 1,203 token ids mirroring input_ids ] |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the authors describe a mapping from likelihood free inference to blackbox sequence optimization then use this mapping to link common algorithms in both fields they go on to describe novel blackbox sequence design algorithms induced by known lfi algorithms empirical results show their methods are competitive on standard datasets the link described in this work is interesting and the novel algorithms proposed contain significant differences to existing sequence optimization techniques empirical results support claims that these novel algorithms are interesting and bear consideration for future design efforts
strengths
both iterative scoring and iterative ratio lean heavily on supervised learning outputting lowdimensional predictions compared to many existing design algorithms training regression models instead of likelihood models on protein sequence space could yield useful empirical advances making this an interesting contribution
the empirical evaluations are on wellknown datasets and baselines appear to be used correctly results align with the conclusions of the paper
drawing a distinction between forward modeling and backward modeling could provide generally useful language for the community and enable communication of ideas
weaknesses
the main weakness is in presentation of the link itself in general the link seems to be correct in the sense that it is meaningful and consistently applied to link algorithms however its exposition does not give clear intuition for what is going on i recognize this is not easy and try to provide some useful feedback below
on the one hand the quantities theta x have clear distinctions as parameters and data on the other hand the quantities e m are a set of sequences and a sequence and do not have such a clear distinction it is quite hard to see how the set e pops out of the mapping t described beneath table 1
the notation p_e(m) = p(m | m in e) is very confusing the function p_e(m) reads like a probability distribution over subsets of sequence space not a distribution over sequence space restricted to the subset e i recommend finding a better notation here this comment is based on equation 3 which i could be misunderstanding
the contribution of algorithms which heavily rely on regression classification to guide sequence design is interesting especially since they are the product of a general mechanism for producing sequence optimization methods the empirical results seem sound the exposition of the model could use work to help readers have a crisper sense of how probabilistic modeling gets linked to a problem setting where the only randomness is in experimental noise
docsepthe paper draws on connections between likelihoodfree inference and blackbox optimization to propose new blackbox optimization methods in general the goal here is to not find the exact optimum of the blackbox objective but to sample from a set of sequences with highquality objective this is akin to the problem of collecting posterior samples in a likelihoodfree inference problem the paper provides a number of proposed methods and compares them on some benchmark sequence optimization problems that have appeared in recent literature can you please comment on the relationship between your work and derivative free optimization via repeated classification httpsarxivorgabs180403761 the paper builds up a variety of optimization methods building on a line of work in the lfi literature this leads to a lot of approaches to compare and there is no clear indication as to what practitioners should use in practice
what do you actually suggest people should use along these lines the paper has far too few details about the actual optimization approaches for example isa and isb seem to be some of your strongest methods and these require mcmc to sample new proposed sequences there is no discussion of how this mcmc is done further there is no discussion of neural network architectures optimization methods etc the paper would be stronger if it just focused on one method and provided sufficient details for practitioners to actually use it i found this sentence in sec 42 very unsatisfying do you have any further insights about performance differences this indicates that composite methods way of using parameterized models to replace computational procedures is not the optimal solution for smallscale tasks i was surprised that the error bars were so small in all of your experiments as i expect that the trajectory of an optimizer has high variance due for example to the initial set of sequences that are sampled what do your error bars correspond to are they standard errors or standard deviations also what sources of randomness are you accounting for when you generate multiple random trials for the methods that combine both a generative and discriminative models there are two sources of approximation error it would be very helpful if you isolated these by performing oracle experiments where you assume that the forward model is exactly correct this can be used to isolate the impact for example of using the amortized sampler qphi in alg 10 just replace the forward model with the ground truth objective function can you run a quick experiment the paper has some interesting methods and the connection to lfi is helpful however it does not have a clear empirical recommendation for what algorithm readers should use going forward and the paper does not provide adequate details to understand how to go about actually using these methods in practice docsepin this paper the authors draw direct parallels between likelihoodfree inference lfi and blackbox sequence design this allows that authors to draw parallels between existing methods from the lfi and blackbox sequence design literatures in a few cases there is no direct analog in the blackbox sequence design literature for a given lfi algorithm and so the authors are able to immediately propose such an algorithm the authors also present a number of composite methods that combine ideas from a number of these approaches i found the paper to be extremely clear and well written providing an excellent review of both the lfi and blackbox sequence design literatures and drawing clear clean parallels between the two fields the proposed algorithms are all sensible and seem to work well on the empirical tasks and the connection between the two fields seems like a fruitful area for further exploration especially using the presented framework i have few comments below typos on p 4 it is stated that to model the general posterior pthetamathbfx which takes arbitrary theta and mathbfx as two inputs and outputs a distribution but these models only take mathbfx as input and output the conditional distribution over theta ie the posterior there are a number of typos in the supplementary materials appendix the meaning was always clear but that document could use some thorough copy editing the paper is clear and wellwritten and provides a very useful conceptual framework tying two subfields together with sufficient empirical evidence to show real gains from this approach docsepthe paper 
relates likelihood free inference to methods for biological sequences design and proposes new sequence design methods based on this insight major comments 1 the paper is missing a related works section with an overview of existing methods for sequence design and likelihood free inference and how they relate to the methods that were introduced in this paper 2 section 31 fbvae and dbas are both estimation of distribution algorithms eda based on a generative model eda is closely related to expectation maximization httpsarxivorgpdf190510474pdf please describe more clearly the differences and similarities of snp eda and em 3 section 31 please describe the differences between fbvae and dbas more clearly both approaches update a vae iteratively by fitting it on the top scoring sequences what does top mean are there differences in the fitting of the vae 4 section 32 is is a discriminative approach for sequence design or blackbox optimization similar to bayesian modelbased optimization mbo please describe the similarities between is and mbo and differences if there are any 5 section 32 please describe what construct qm using fm means what is fm and qm in your experiments how were they trained and which hyperparameters were optimized 6 section 32 what are the differences between isa and isb 7 section 33 ir seems like a minor variation of is that uses a classifier instead of a regressor as surrogate models which is not new are there any other differences 8 section 33 please describe which models you used for ir in your experiments and how sequences were generated 9 section 34 please describe more clearly how ips and ipr relate to and differ from existing design methods that combine generative and discriminative models eg rl gans or optimizing a surrogate model using dbascbas for example 10 section 34 please describe the differences between ipsa and ipsb more clearly including differ in the detailed construction of the distribution qm 11 experiments since the performance of algorithms can be sensitive to the batch size i would like to see experiments with a different batch size than 100 eg small 1 medium 100 and large 500 12 experiments please compare to bayesian optimization and rl if possible using the same surrogate model as used for iprips and tuning hyperparameters in the same way 13 experiments please describe which hyperparameters you tuned and how they were tuned 14 experiments please describe what the boolean feature mathcale is in all or your experiments 15 experiments please describe the evolution baseline more clearly 16 experiments please motivate why you report the average reward of the top10100 sequences instead of just the maximum reward top1 the average can be maximized by reporting identical or very similar sequences more important for practical applications is that the optimizer finds a diverse set of highreward sequences as explained in angermueller et al who used additional diversity metrics to quantify this 17 experiments how did you initialize the optimization i would like to also see experiments that are initialized with a small set of labeled sequence eg one or few parent sequenceshomologs which often exist in practice minor comments section 1 de novo biological sequence design describe which kind of sequence dna rna protein molecules represented as strings section 2 2nd paragraph please cite reviews eg httparxivorgabs210605466 httpwwwnaturecomarticless4159201904966 of existing ml design methods instead of single papers ahn gottipati denoting m as sequence and s fm as the 
oracle function value is confusing since s is the first letter of sequence i strongly suggest to use s to denote a sequence and for example y fs to denote the function value your benchmark problems seem to be similar to the benchmark problems introduced in angermueller et al except for flu if this is the case please describe that you reused the benchmark problems from angermueller et al and describe possible differences by referencing angermueller et al instead of describing each benchmark problem in detail you can also shorten the experimental section experiments flu you hypothesize that backward modeling techniques perform better since sequences are long although sequences are long there may be only a few variable positions while most positions are conserved since generative models can easily fit this conservation better they may perform well in this case since the optimization problem becomes trivial as only few variable positions are mutated the outlined relation between likelihood free inference and sequence design blackbox optimization is interesting i am not aware of any existing papers with this insight however i am not sure about the impact of this insight on how sequences are designed in practice it is not described clearly enough how the proposed methods differ from existing methods for sequence design such as bayesian modelbased optimization gans or rl important details about the proposed methods and performed experiments are missing which makes it hard to understand and assess
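As a concrete anchor for the iterative-scoring style methods discussed in these reviews (fit a supervised surrogate on the sequences scored so far, propose mutants, rank them with the surrogate, and send only a top batch to the oracle), the sketch below gives a minimal, self-contained loop. The toy oracle, the integer sequence encoding, the mutation rate, and the batch sizes are all illustrative assumptions and do not reproduce the paper's actual algorithms or benchmarks.

```python
# minimal sketch, not the paper's code: a surrogate-guided sequence design loop
# in the spirit of the iterative-scoring methods discussed above.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
L, A = 8, 4                                   # toy sequence length and alphabet size

def oracle(seqs):
    """toy stand-in for the expensive blackbox objective: count of symbol 0."""
    return (seqs == 0).sum(axis=1).astype(float)

def mutate(seqs, rate=0.2):
    """randomly resample a fraction of positions in each sequence."""
    mask = rng.random(seqs.shape) < rate
    return np.where(mask, rng.integers(0, A, size=seqs.shape), seqs)

pool = rng.integers(0, A, size=(100, L))      # initial random batch
scores = oracle(pool)
for _ in range(5):                            # design rounds
    surrogate = RandomForestRegressor(n_estimators=50, random_state=0)
    surrogate.fit(pool, scores)               # regression surrogate on data so far
    parents = pool[np.argsort(-scores)[:20]]  # seed mutants from current top scorers
    candidates = mutate(np.repeat(parents, 50, axis=0))
    top = candidates[np.argsort(-surrogate.predict(candidates))[:100]]
    pool = np.vstack([pool, top])             # only the surrogate-ranked top batch is scored
    scores = np.concatenate([scores, oracle(top)])
print("best score found:", scores.max())
```

Swapping the regressor for a classifier over above-threshold labels gives the iterative-ratio flavour that the reviews contrast with this regression-based variant.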
### Summary:
|
the paper investigates various approaches and a unifying framework for sequence design there were a variety of opinions about the paper it was felt after discussion that the paper would benefit from a sharper focus and somewhat suffers from being overwhelmed by various approaches lacking a clear narrative but overall all reviewers had a positive sentiment and the paper makes a nice contribution to the growing body of work on protein design
|
[ input_ids: 2,048 token ids (tokenized form of this row's text; numeric dump omitted for readability) ] |
[ attention_mask: 2,048 ones, one per input token ] |
[ labels: token ids mirroring input_ids; numeric dump omitted (array truncated in the source)
1477,
939,
1177,
681,
45894,
1417,
21,
17220,
9638,
25814,
2526,
273,
5368,
13361,
2216,
3082,
3185,
273,
2014,
9380,
247,
13107,
305,
1519,
532,
8657,
50275,
3354,
5341,
278,
347,
3425,
285,
256,
50276,
22401,
347,
253,
42295,
1159,
1318,
310,
21643,
1580,
256,
310,
253,
806,
4857,
273,
3425,
891,
7052,
1804,
281,
897,
256,
281,
9173,
247,
3425,
285,
323,
1650,
340,
50276,
3671,
281,
9173,
253,
1159,
1318,
50276,
12550,
22791,
3237,
1646,
281,
320,
2074,
281,
253,
22791,
3237,
5611,
275,
2897,
693,
86,
7707,
1162,
355,
3707,
323,
2938,
604,
436,
310,
253,
1083,
4496,
6266,
326,
368,
294,
3197,
253,
22791,
3237,
432,
2897,
693,
86,
7707,
1162,
355,
285,
6266,
1896,
3910,
407,
44978,
2897,
693,
86,
7707,
1162,
355,
3185,
273,
12930,
1016,
22791,
1895,
275,
2508,
368,
476,
671,
48399,
253,
5661,
2593,
50276,
16217,
3825,
2938,
368,
41661,
326,
19265,
14053,
5609,
1347,
1805,
1580,
6430,
403,
1048,
3738,
6430,
403,
1048,
627,
778,
320,
760,
247,
1643,
4778,
6887,
1223,
954,
6887,
403,
14900,
1580,
1006,
800,
3210,
476,
4354,
4944,
436,
14144,
1805,
597,
778,
1347,
973,
275,
436,
1083,
1580,
253,
13757,
1895,
4916,
14916,
347,
760,
1643,
4778,
6887,
403,
31758,
50276,
783,
18627,
5886,
875,
12177,
1959,
17032,
285,
3425,
2216,
2806,
3364,
13757,
310,
4722,
891,
717,
417,
6600,
273,
667,
5368,
9380,
342,
436,
12288,
2299,
891,
717,
417,
2119,
670,
253,
3486,
273,
436,
12288,
327,
849,
6430,
403,
4158,
275,
3946,
50276,
262,
310,
417,
2529,
4518,
2217,
849,
253,
4081,
3082,
9184,
432,
5368,
3082,
323,
3425,
2216,
824,
347,
17699,
16561,
1566,
3169,
13757,
305,
507,
390,
391,
77,
50276,
18108,
4278,
670,
253,
4081,
3082,
285,
2684,
4679,
403,
5816,
534,
2789,
352,
1892,
281,
2096,
285,
2939,
2490,
187,
4118,
18435,
27,
783,
2929,
2340,
684,
2710,
7274,
285,
247,
440,
5411,
7792,
323,
3425,
2216,
627,
497,
247,
5235,
273,
11626,
670,
253,
2929,
352,
369,
3543,
846,
5955,
326,
253,
2929,
651,
5649,
432,
247,
17614,
468,
2770,
285,
8489,
27171,
432,
1146,
29991,
407,
2710,
7274,
14999,
247,
2590,
14511,
533,
4583,
512,
30628,
574,
247,
2762,
21942,
285,
253,
2929,
2789,
247,
5322,
7680,
281,
253,
5675,
2133,
273,
789,
327,
2601,
2216
] |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper proposes a model combining convolutional operation and recurrent operation along with temporal attention for eeg classification strengths 1 the topic of eeg classification is meaningful 2 the modified class activation mapping gradcam is interesting to me weaknesses 1 the representation is not clear to me the challenges motivation solutions or contributions are not clearly expressed for example since the title mentioned shortinterval eeg then whats longinterval eeg whats the difference and how do existing studies treat them differently sec 1 uses two long paragraphs to show the limitations of traditional approaches which are all well known please condense them into one paragraph 2 lack of novelty using standard deep learning models and attention motivation for eeg recognition is ok in 2018 but now we expect more technical novelty in specific the cnn rnn excitation and squeeze and fullyconnected layers are very commonly used and intensively studied components in the last 5 years 3 weak experiments this work only compared with one baseline which is not enough i am also confused that what is exact the baseline mentioned in table 1 i cannot find descriptions about the baselines modelstructurecitation this work claims its effectiveness by outperforming the 1st place winner by 4 in abstract and results however the bci iv competition is in 2008 it is not amazing that the proposed model is better than a winner in 2008 the work lacks lots of studies on dlbased eeg analysis in recent 3 years please find my recent works and compare them as i mentioned in the above weakness this manuscript is not presented well the technical novelty is not significant and the experiments are not extensive i suggest a strong rejection docsepthe paper proposes a shortinterval mi classification system the model is a convolutional rnn with temporal attention it is the opinion of the reviewer that following are the strengths and weaknesses of the paper strengths the paper is generally wellwritten and structured good results are achieved weaknesses the proposed method is quite simple conv rnn att unfortunately with no novelty or innovation similar methods have been widely used in the past even in the field of eeg representation learning eg zhang et al classification of hand movements from eeg using a deep attentionbased lstm network 2020 only one dataset bci iv 2a is used therefore it is very unclear how the method generalizes to other datasets no indepth analysis of the length of the window sizes ie short interval is carried out the model details are not all give and so the work is not reproducible a comprehensive comparison with other methods in the field is missing based on the shortcomings mentioned above unfortunately the paper is very far from the level of iclr docsepthe authors introduce a deep learning approach for shorttime motor imagery classification using eeg data conventional cnn and rnn gru layers are used remarkably a data augmentation strategy and a classactivation mapping approach are presented overall the idea is interesting but the paper presentation the mathematical foundation and the experiments provided are poor therefore more details about the model and other experiments should be carried out to validate the proposal besides the authors claim that short time interval eeg classification is achieved however 08s windows size does not seem to be a short interval compared to other stateoftheart methods pros an interesting data augmentation algorithm for eeg timeseries classification a class 
activation mapping approach for eeg data is presented cons and comments the paper presentation is poor for example the introduction is confusing lacks suitable stateoftheart analysis and does not provide the proposals paragraph the results should include other motor imagery databases the influence of the number of eeg channels and the inter and intrasubject variability must be studied to test the availability of a short interval for example see httpsacademicoupcomgigasciencearticle67gix0343796323 httpsphysionetorgcontenteegmmidb100 the mathematical details should be enhanced for an iclr paper please provide the codes and the experimental details a good idea is presented concerning the data augmentation and the cambased extensions for eeg data nonetheless the short interval claim seems to be ambiguous several details regarding the mathematical background and more experiments should be conducted to test the proposal in addition the paper presentation needs to be enhanced
### Summary:
|
this paper develops a deep convolutional network with rnn layers and a new data augmentation method for eeg motor imagery classification reviewers agreed that the paper was not very clearly written and that without comparisons to other related methods or at least demonstration of the importance of each of the components of the model through for example ablation analyses it was hard to understand the generality of the approach the authors did not respond to the reviews so i am recommending not accepting this paper
|
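The review in the row above describes the model only at a high level (convolutional and recurrent operations with temporal attention for motor-imagery EEG classification). The minimal sketch below illustrates one plausible reading of that description; the layer sizes, the choice of a GRU, and the BCI IV 2a-style input shape are assumptions made for illustration, not details taken from the reviewed paper.

```python
# Illustrative sketch only: dimensions and layer choices are assumptions,
# not the reviewed paper's actual architecture.
import torch
import torch.nn as nn

class ConvRecurrentAttentionEEG(nn.Module):
    def __init__(self, n_channels=22, n_classes=4, conv_dim=32, rnn_dim=64):
        super().__init__()
        # temporal convolution over the raw EEG signal (batch, channels, time)
        self.conv = nn.Sequential(
            nn.Conv1d(n_channels, conv_dim, kernel_size=11, padding=5),
            nn.BatchNorm1d(conv_dim),
            nn.ELU(),
            nn.MaxPool1d(4),
        )
        # recurrent layer over the downsampled time axis
        self.gru = nn.GRU(conv_dim, rnn_dim, batch_first=True)
        # temporal attention: one scalar score per time step
        self.attn = nn.Linear(rnn_dim, 1)
        self.classifier = nn.Linear(rnn_dim, n_classes)

    def forward(self, x):                       # x: (batch, channels, time)
        h = self.conv(x).transpose(1, 2)        # (batch, time', conv_dim)
        h, _ = self.gru(h)                      # (batch, time', rnn_dim)
        w = torch.softmax(self.attn(h), dim=1)  # attention weights over time
        pooled = (w * h).sum(dim=1)             # attention-weighted temporal pooling
        return self.classifier(pooled)          # (batch, n_classes)

# example: BCI IV 2a-style input, 22 channels, 200 samples (~0.8 s at 250 Hz)
logits = ConvRecurrentAttentionEEG()(torch.randn(8, 22, 200))
```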
[ input_ids: token ID list omitted ] |
[ attention_mask: list of 1s omitted ] |
[ labels: token ID list omitted ] |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the paper presents a model to perform audio super resolution the proposed model trains a neural network to produce a highresolution audio sample given a low resolution input it uses three losses sample reconstructon adversarialy loss and feature matching on a representation learned on an unsupervised way from a technical perspective i do not find the proposed approach very novel it uses architectures following closely what has been done for image supreresolution i am not aware of an effective use of gans in the audio processing domain this would be a good point for the paper however the evidence presented does not seem very convincing in my view while this is an audio processing paper it lacks domain insights even the terminology feels borrowed from the image domain again most of the modeling decisions seem to follow what has been done for images the empirical results seem good but the generated audio does not match the quality of the stateoftheart the presentation of the paper is correct it would be good to list or summarize the contributions of this work recent works have shown the amazing power of autoregressive generative models wavenet in producing audio signals this is as far as i know the stateoftheart in audio generation the authors should motivate why the proposed model is better or worth studying in light of those approaches in particular a recent work a has shown very high quality results in the problem of speech conversion which seems harder than bandwidth extension it would seem to me that applying such models to the bandwith extension task should also lead to very high quality results as well what is the advantage of the proposed approach would a wavenet decoder also be improved by including these auxiliary losses while the audio samples seem to be good they are also a bit noisy even compared with the baseline this is not the case in the samples generated by a which is of course a different problem the qualitative results are evaluated using pesq while this is a good proxy it is much better to perform blind tests with listeners that would certainly improve the paper feature spaces are used in super resolution to provide a space in which the an l2 loss is perceptually more relevant there are many such representations for audio signals specifically the magnitude of timefrequency representations like spectrograms or more sophisticated features such as scattering coefficients in my view the paper would be much stronger if these features would be evaluated as alternative to the features provided by the proposed autoencoder one of the motivations for defining the loss in the feature space is the lack or difficulty to train auxiliary classifiers on large amounts of data however speech recognition models using neural networks are quite common it would be good to also test features obtained from an offtheshelf speech recognition system how would this compare to the proposed model the l2 pixel loss seems a bit strange in my view particularly in audio processing the recovered high frequency components can be synthesized with an arbitrary phase this means that imposing an exact match seems like a constraint as the phase cannot be predicted from the low resolution signal which is what a gan loss could achieve the paper should present ablations on the use of the different losses in particular one of the main contributions is the inclusion of the loss measured in the learned feature space the authors mention that not including it leads to audible artifacts i think that more studies should 
be presented including quantitative evaluations and audio samples how where the hyper parameters chosen is there a lot of sensitivity to their values a van den oord aaron and oriol vinyals neural discrete representation learning advances in neural information processing systems 2017 docsepthis paper presents a ganbased method to perform audio superresolution in contrast to previous work this work uses autoencoder to obtain feature losses derived from unlabeled data comments 1 redundant comma filters with very large receptive fields are required to create high quality raw audio 2 there are some stateoftheart nonautoregressive generative models for audio waveform eg parallel wavenet clarinet one may properly discuss them in related work section although gan performs very well for images it hasnt obtained any compelling results for raw audios still its very interesting to explore that any nontrivial insight would be highly appreciated 3 in multiscale convolutional layers it seems only larger filter plays a significant role what if we omit small filter eg 3x1 4 it seems the proposed mugan introduces noticeable noise in the upsampled audios pros interesting idea and fascinating problem cons the results are fair i didnt see big improvement over previous work kuleshov et al 2017 id like to reconsider my rating after the rebuttal docseppros wellwritten nice overall system gan framework for supersampling audio incorporating features from an autoencoder some goodsounding examples cons some confusingweaklypresented parts admittedly covering lots of material in short space i am confused about the evaluation would like additional qualitativeobservational understanding of what works including more on how the results differ from baseline summary the task addressed in this work is given a lowresolution audio signal generate corresponding highquality audio the approach is a generative neural network that operates on raw audio and train within a gan framework working in raw samplespace eg pixels is known to be challenging so a stabilizing solution is to incorporate a feature loss feature loss however usually requires a network trained on a related task and if such a net one does not already exist then building one can have its own possibly significant challenges in this work the authors avoid this auxiliary challenge by using unsupervised feature losses taking advantage of the fact that any audio signal can be downsampled and therefore one has the corresponding upsampled signal as well the training framework is basically that of a gan but where rather than providing the generator with a lowdimensional noise signal input they provide the generator with the subsampled audio signal the architecture includes a generator glofidelityhighfidelity a discriminator dhighfidelity real or by supersampled and an autoencoder phi signal x features of signal x at aes bottleneck comments the generator network appears to be nearly identical to that of kuleshov et al 2017 which becomes the baseline and so the primary contribution differentiating this work is the insertion of that network into a gan framework along with the additional featurebased loss term this is overall a nice problem and a nice approach in that light i believe that there is a new focus in this work on the perceptual quality of the outputs as compared to kuleshov et al 2017 i would therefore ideally like to see a some attempts at perceptually evaluating the resulting output beyond pesq eg with human subjects and with the understanding that eg not all amt 
workers have the same aural discriminative abilities themselves andor b more detailed associated qualitative descriptionsvisualization of the supersampled signal perhaps with a few more samples if that would help that said i understand that there are pagespace limitations more on this next given the similarity of the unet architectures to kuleshov et al 2017 why not move some of those descriptions to the appendix for example i found the description and figure illustrating the superpixel layers to be fairly uninformative i see that the figure shows interleaving and deinterleaving resulting in tradingoff dimensionalitiesranksetc and we are told that this helps with wellknown checkerboard artifacts but i was confused about what the white elements represent and the caption just reiterated that resolution was being increased and decreased overall i didnt really understand exactly the role that this plays in the system i wondered if it either needed a lot more clarification in an appendix or just less space spent on it but keeping the pointers to the relevant references it seems that the subpixel layer was already implemented in kuleshov 2017 with some explanation yet in the present work a large table table 1b is presented showing that there is no difference in quality metrics and the text also mentions that there is no significant perceptual difference in audio if the subpixel layer were explained in detail and with justification then i would potentially be ok with the negative results but in this case its not clear why spend this time on it here its possible that there is something simple about it that i am not understanding im open to being convinced otherwise why not just write following kuleshov et al 2017 we use subpixel layers shi et al instead of to speed up training although we found that they make no significant perceptual effects or something along those lines and leave it at that i did appreciate the descriptions of models sensitivity to sizestructure of the conv filters importance of the res connections etc my biggest confusion was with the evaluation results since the most directly related work was kuleshov 2017 i compared the super resolution unet samples on that website httpskuleshovgithubioaudiosuperres to the samples provided for the present work httpssitesgooglecomviewunsupervisedaudiosrhome and i was a bit confused because the quality of the unet samples in kuleshov 2017 seemed to be perceptually significantly better than the quality of the deep cnn unet baseline in the present work perhaps i am in error about this but as far as i can tell the superresolution in kuleshov et al 2017 is significantly better than the deep cnn examples here is this a result of careful selection of examples i do believe what i hear eg that the mugan8 is clearly better on some examples than the unet8 but then for nonidentical samples how come unet4 actually generally sounds better than unet8 that doesnt make immediate sense either assuming no overfitting etc is the benefit in moving from unet4 to unet8 within a gan context but then stabilizing it with the featurebased loss if so then how does mugan8 compare to unet4 would there be any info for the reader by doing an ablation removing the feature loss from the gan framework etc i guess i would like to get a better understanding of what is actually going on even if qualitative is there any qualitative or anecdotal observation about which types of samples one system works better on than another for example in the provided examples for the present 
paper it seemed to be the case that perhaps the mugan8 was more helpful for supersampling female voices which might have more highfrequency components that seem to get lost when downsampling but maybe im overgeneralizing from the few examples i heard some spectrograms might be helpful since they do after all convey some useful information despite not telling much of the perceptual story for example are there visible but inaudible artifacts are such artifacts systematic were individual audio samples represented as a onehot encoding or as floats i assume floats since there was no mention of sampling from a distribution to select the value a couple of typos descriminator discriminator pg 6 impact of superpixel layers last sentence of 2nd par is actually not a sentence the reduction in convolutional kernels prior to the superpixel operation overall interesting work and i enjoyed reading it if some of my questions around evaluation could be addressed either in a revision or in a rebuttal eg if i completely misunderstood something i would gladly consider revising my rating which is currently somewhere between 6 and 7
### Summary:
|
the paper presents an algorithm for audio superresolution using adversarial models along with additional losses eg using autoencoders and reconstruction losses to improve the generation process strengths proposes audio super resolution based on gans extending some of the techniques proposed for vision image to audio the authors improved the paper during the review process by including results from a user study and ablation analysis weaknesses although the paper presents an interesting application of gans for the audio task overall novelty is limited since the setup closely follows what has been done for vision and related tasks and the baseline system this is also not the first application of gans for audio tasks performance improvement over previously proposed unet models is small it would have been useful to also include unet4 in userstudy as one of the reviewers pointed out since it sounds better in a few cases it is not entirely clear if the method would be an improvement of stateoftheart audio generative models like wavenet reviewers agree that the general direction of this work is interesting but the results are not compelling enough at the moment for the paper to be accepted to iclr given these review comments the recommendation is to reject the paper
|
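The review in the row above names the three training losses (sample-space reconstruction, adversarial, and feature matching on autoencoder bottleneck features) but gives no equations. The sketch below shows one way those terms could be combined into a generator objective; the stand-in networks, the loss weights, and the non-saturating GAN form are assumptions made for illustration, not the reviewed paper's actual choices.

```python
# Illustrative sketch only: G, D and the autoencoder encoder phi are toy stand-ins,
# and the loss weights are assumed values.
import torch
import torch.nn.functional as F

def generator_loss(G, D, phi, x_low, x_high, lambda_adv=0.001, lambda_feat=1.0):
    x_fake = G(x_low)                               # upsampled waveform
    # 1) sample-space reconstruction (L2 on raw audio samples)
    l_rec = F.mse_loss(x_fake, x_high)
    # 2) adversarial term: non-saturating GAN objective for the generator
    logits_fake = D(x_fake)
    l_adv = F.binary_cross_entropy_with_logits(
        logits_fake, torch.ones_like(logits_fake))
    # 3) feature matching in the (frozen) autoencoder bottleneck space
    l_feat = F.mse_loss(phi(x_fake), phi(x_high).detach())
    return l_rec + lambda_adv * l_adv + lambda_feat * l_feat

if __name__ == "__main__":
    # toy stand-ins so the sketch runs end to end (deeper 1-D conv nets in practice)
    G = torch.nn.Sequential(torch.nn.Upsample(scale_factor=4),
                            torch.nn.Conv1d(1, 1, 9, padding=4))
    D = torch.nn.Sequential(torch.nn.Conv1d(1, 1, 9, padding=4),
                            torch.nn.AdaptiveAvgPool1d(1), torch.nn.Flatten())
    phi = torch.nn.Conv1d(1, 8, 9, stride=4, padding=4)   # "encoder" features
    x_low, x_high = torch.randn(2, 1, 1024), torch.randn(2, 1, 4096)
    print(generator_loss(G, D, phi, x_low, x_high))
```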
[ input_ids: token ID list omitted ] |
[ attention_mask: list of 1s omitted ] |
[
310, 591, 916, ... (long integer token-id list omitted for readability: a machine-readable encoding of this row's review/summary text)
] |
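For readability, the long integer lists in this dump are collapsed to placeholders above and below; they are token-level encodings of each row's Input and Output text. As a rough illustration of how such a list maps back to text, here is a minimal decoding sketch. The tokenizer name is an assumption (the dump never states which vocabulary produced these ids), so treat it as a guess rather than the dataset's documented preprocessing.

```python
# Hypothetical helper for reading this dump: decode one row's token-id list back
# into text. The tokenizer below is an assumption (the dump does not say which
# vocabulary produced ids like 30003 or 50276); swap in the correct one if known.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-neox-20b")  # assumed vocab

def decode_row(token_ids):
    """Return the readable text behind a row's input_ids / labels list."""
    return tokenizer.decode(token_ids, skip_special_tokens=True)

# First few ids copied from the input_ids list of the next row; the full list is omitted here.
sample_ids = [30003, 310, 1677, 2278, 273, 247, 2561, 2929]
print(decode_row(sample_ids))
```

If the guessed vocabulary is wrong, the decoded text will come out garbled, but the structure of the call stays the same.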
Below is given review of a research paper from cnoference journal. Please write a summary the review.
### Review:
in this paper the authors study a procedure for pruning sparsifying and binarizing neural networks through a pruning procedure they do this by taking a trained dense network pruning the synapses to get to a sparser network and then doing a stochastic search over connection swaps to further optimize the pruned network reasonable performance is shown on mnist they also show data for a carracing imitation task the details of that task are a bit sparse so i am not sure how impressive their 90 performance figure is for that task i found some other details to be missing discussed below and also have a few conceptual criticisms of this work i like the concept a lot of searching over sparse network configurations to find high performance small networks but this work seems somewhat preliminary criticisms 1 sec 41 could have used more detail a how do you decide which connections to prune in step 2 is it the weakest ones or did you find those for which the gradients of the loss with respect to the weights were smallest in magnitude or do something else b what is the training procedure during step 4 training after binarizing is that a combinatoric search over the connection swaps or was this just done by adjusting the thresholds for individual units or some other thing 2 sec 42 is the search over swaps greedy one connection swap at a time if so that seems likely to miss global optima that require say a bad bit swap to get over to a better region of the space that should be discussed i think even if there is not an immediately effective solution available 3 this work doesnt seem architecture agnostic you are still specifying the number of layers conv vs dense etc it seems more like you have an approach for sparsifying which could still be useful but i am not persuaded that this work solves the architecture search problems in any meaningful way there has been some nice recent progress in this area eg the automl zero work from quoc le et al that might interest the authors if they are curious about genuine progress in architecture agnostic nns automl zero doesnt seem architecture agnosticdocsepthe authors pay attention to architecture agnostic neural networks i think their stochastic search algorithm is a kind of ea method start with a initial architecture and then swap a single bitswap to obtain the child architecture the pruning method in learning rule is the standard magnitudebased pruning thus from the view of technique the contribution is somewhat weak even though the conclusion of this paper is interesting i am confused about the title of section 42 and section 43 is it incomplete moreover i think stochastic search is not suitable for your algorithm since the whole procedure is much similar with evolutionary algorithm the only difference is how to define mutation for architecture agnostic neural networks docsepthe paper proposes a way to perturb a binary sparse nn in order to achieve a higher accuracy first they trained binarized and sparsify a network with a couple of conv and fc layers then they propose to create a swarm of networks by swapping a random nonzero weight with a zero weight some of these networks happen to perform slightly better than the original one based on this the author propose a multistage algorithm that they call a stochastic search and succeed algorithm sss that essentially alternates between training and swapping steps the paper also tries to analyze the manifold of the network weights of different swapped networks by visualizing them using tsne algorithm the evaluation is 
done in mnist and carracing dataset generally im not convinced that the binary networks would perform much better beyond mnist and car examples the paper raises a series of questions how accurately was the network trained did the authors rained the best possible binary sparse mnist network before preceding to the swapping this is not clear from the text it would be great to at least repeat the procedure they describe for training multiple times how is the affinity matrix of tsne is computed i assume that if the input is binary simple euclidean distance wont be a great measure symmetric nature of the plots at fig4 suggest that your perplexity parameter is too high the results on the plots are rather the artifacts of the wrong affinity matrix than the properly of the weight manifold the description of sss lacks rigor how many networks are selected in the first step how do first neighbor are chosen it is an exhaustive search what is the complexity what is the stopping criteria what stochastic about this algorithm nits blue marker is almost impossible to see on fig5 docsepsummary in this paper the authors have explored a brains stochastic synaptic pruning inspired method for architecture agnostic models authors have explored sparse and binary paradigms for neural architecture and architecture manifold using random bitswaps authors have tested their methods on a static and dynamic tasks strengths both sparse binary paradigms for neural architecture and sampling using random bitswaps are priorly less explored tasks with very less literature the accuracies listed in paper motivates us that the authors are taking right steps towards brainlikearchitectures weakness the datasets used are smaller and not diverse the authors did not explain the reasons for architecture choice and epoch choice for sparse binary networks overall the authors should take time to explaincorrect the following things can restricting weights to binary format impact generalization of network how good is the network during transfer learning the authors coul have tried other small datasets why only mnist why the parameters in the architecture are choosen the way ther are presented in generating sparse binary neural networks section abalation study would have helped understand the choice of stages what are the reasons for choosing 4 stages of epochs diagrams are bit unclear colour representations in tsne are not clearly visible on the pdf
### Summary:
|
this paper explores methods for pruning binary neural networks the authors provide algorithms for developing sparse binary networks that perform okay on some basic ml benchmarks they frame this as providing insights into synaptic pruning in the brain and potentially providing a method for more efficient edge computing in the future all four reviews placed the paper below the acceptance threshold the reviewers noted that the paper was hard to follow in several places and were unsure as to the motivations the authors attempted to address these concerns in their replies but the area chair felt that these were insufficient as well the area chair notes that some of the claimed contributions of the paper are questionable specifically 1 the claim that there is anything biologically plausible about the algorithms presented here is very suspect the brain cannot use a search and test system for synaptic pruning like the algorithms proposed here thus it is unclear how this paper provides any insight for neuroscience in fact the authors do not even really try to provide any neuroscience insights in the results or discussion moreover they dont actually appear to use any neuroscience insights to develop their algorithms other than the stochasticity of the pruning though note it is not actually clear in neuroscience data whether pruning is stochastic given the ultimately very poor performance on ml tasks the paper doesnt seem to provide anything particularly useful for application in ml either 2 the claim that the provide the demonstration that network families with common architectural properties share similar accuracies and structural properties is odd surely this is the null hypothesis anyone would have about anns it would be surprising if networks with common connectivity profiles which is what the authors mean by architecture didnt share similar performance 3 the claim that searching in architecture space like this leads to architecture agnostic networks is odd as noted by reviewer 2 the authors are really just specifying algorithms for sparsifying binary neural networks which they frame as being architecture agnosticism according to a rather strained definition there are other ways of approaching the sparsification of neural networks and of doing architecture optimization but the paper is not framed as contributing to this literature altogether given these considerations and the four reviews a reject decision was delivered
|
[
30003, 310, 1677, ... (input_ids column omitted for readability: the tokenized form of the review/summary text above)
] |
[
1, 1, 1, ... (attention_mask column omitted; every entry is 1)
] |
[
30003, 310, 1677, ... (labels column omitted; its visible values repeat the input_ids list above)
] |
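One pattern worth noting across these rows: the attention_mask lists are all 1s and the labels lists repeat the input_ids, which is the usual shape of unpadded causal-LM fine-tuning data. The sketch below shows how such a row is typically built under that assumption; it is an inference from the dump, not a documented description of this dataset's pipeline.

```python
# Assumed construction of one row (inferred from the dump, not stated by the
# dataset authors): concatenate the Input prompt and Output summary, tokenize,
# attend to every position, and train on every position.
def build_row(prompt: str, summary: str, tokenizer):
    ids = tokenizer(prompt + summary)["input_ids"]
    return {
        "input_ids": ids,
        "attention_mask": [1] * len(ids),  # no padding, so the mask is all 1s
        "labels": list(ids),               # causal-LM objective: labels mirror input_ids
    }

# usage sketch: row = build_row(input_text, output_text, tokenizer)
```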
Below is given review of a research paper from cnoference journal. Please write a summary the review.
### Review:
this paper presents a new formulation for a layer in neural networks known as spherization layer when inner product with euclidean norm is calculated often there exist a loss of scalar information due to normalization authors propose spherization layer to form a bijective mapping from cartesian coordinates to hyperspherical surface spherization layer can replace typical neural network layers empirical experiments from simple synthetic data to realworld dataset such as mnist miniimagenet and wikitext show that performance of neural network with spherization layer is scalable and can perform well the formulation presented is clear and wellwritten background on spherical coordinates in n1 dimensions and stepbystep explanation of the spherization process is easy to follow with good choice of notation experiments are also welldocumented the toy example is a good starting point verifying the formulation experiments on both computer vision tasks such as image classification and language tasks such as word analogy are also pluses that show case formulation proposed is versatile for different use cases empirically for some setting proposed formulation can greatly outperform older related works the formulation is original and significant in that it is a flexible formulation and is applicable in many other related works i reserve the weakness and questions together for better coherence authors stated potential limitations of their work in that spherization layer only takes on an effect if its used during training it is mentioned that it will require retraining in order to use the proposed formulation in conventional pretrained models docsepthe paper proposes a spherization layer to represent preactivations in a hypersphere this layer can be plugged into any neural network this representation is useful in cases where cosine similarity is used as the main metric to determine similarity in the feature space strengths 1 the spherization idea is interesting it addresses a niche problem in a neat way 2 the formulation is clean and clear justifications are provided 3 experiments are performed in various settings and significant improvements are observed in the fewshot learning setting weaknesses 1 the improvement is not clear compared to the projection approach table 5 this raises doubts about the motivation for this approach is there really an information loss with the projection approach 2 more experiments where explicit similarity is used in the feature space could be performed examples include selfsupervised learning and metric learning this may show the benefits of the proposed approach in a better way 3 training stability of the network could have been discussed it is not clear how the signal propagation gradient flow would be affected by the spherization layer post rebuttal thanks for the clarifications im in favour of accepting the paper please consider addressing the discussed comments in cameraready or in an extension societal impact is not discussed docsepthis paper proposes spherization layer to represent all information on angular similarity which avoids the information loss when using only angular similarity on representations trained with the inner product it maps the preactivations of input vectors into the specific range of angles converts the angular coordinates of the vectors to cartesian coordinates with an additional dimension and trains decision boundaries from hyperplanes this guarantees that representation learning always occurs on the hyperspherical surface without the loss of any 
information unlike other projectionbased methods experiments have been evaluated on different tasks and show the effectiveness of the proposed methods novelty this paper proposes a novel method for representation learning only using the angle information the subtle construction to covert the angular coordinates of the vectors to cartesian coordinates which avoids the information loss than the previous method writing this paper is of better organized and shows detailed description about the proposed method weakness some related works should be listed and have a discussion and comparison docsepthe authors propose a novel neural network layer that forces preactivations into angles besides the spherization layer converts the angular coordinates to cartesian coordinates and the training is done without bias parameters several different empirical experiments have shown the effectiveness of the proposed method strengths 1 this paper is well written and clear 2 the proposed spherization layer can be easily applied to most existing neural network architectures and downstream tasks weakness 1 there is not enough comparison between the proposed spherization layer and other recent anglebased learning methods such as 1 2 since the authors claim that the proposed method solves the information loss problem that occurs in these projectionbased learning methods these comparisons are important experiments to support the authors claims 1 chen beidi et al angular visual hardness international conference on machine learning pmlr 2020 2 lin rongmei et al regularizing neural networks via minimizing hyperspherical energy proceedings of the ieeecvf conference on computer vision and pattern recognition 2020 the authors have adequately addressed the limitations of this paper
### Summary:
|
this paper proposes spherization layer by first transforming the preactivations into angles then transforming the angles into cartesian coordinates on a sphere and finally training weight parameters without bias the proposed spherization layer is geometrically meaningful and is generally applicable it is demonstrated on a range of experiments reviewer v1re who gave a rating 5 pointed out two related references and asked for more explanation about motivation the authors explained the difference between their paper and the two references and explained the motivation all other reviewers are positive about this paper
|
[
30003, 310, 1677, ... (input_ids column omitted for readability: the tokenized form of the review/summary text above)
] |
[
1, 1, 1, ... (attention_mask column omitted; every entry is 1)
] |
[
30003,
310,
1677,
2278,
273,
247,
2561,
2929,
432,
260,
2369,
1793,
6698,
15,
7764,
3630,
247,
6010,
253,
2278,
15,
187,
4118,
8439,
27,
187,
2520,
2929,
10262,
247,
747,
15895,
323,
247,
3828,
275,
11454,
6928,
1929,
347,
653,
248,
45031,
3828,
672,
6703,
1885,
342,
299,
26365,
5222,
310,
5118,
2223,
627,
2226,
247,
2957,
273,
13434,
1491,
1955,
281,
21539,
4477,
12661,
653,
248,
45031,
3828,
281,
830,
247,
1794,
25667,
10603,
432,
7281,
16561,
11627,
281,
24052,
81,
16635,
2553,
653,
248,
45031,
3828,
476,
8171,
6867,
11454,
2990,
8090,
16774,
4679,
432,
2969,
13506,
941,
281,
1524,
10186,
10895,
824,
347,
278,
79,
382,
12949,
303,
6533,
292,
285,
259,
1479,
614,
633,
921,
326,
3045,
273,
11454,
2990,
342,
653,
248,
45031,
3828,
310,
44755,
285,
476,
1347,
973,
50274,
783,
15895,
3559,
310,
2590,
285,
973,
15720,
4114,
327,
19474,
11627,
275,
295,
18,
10103,
285,
3213,
1615,
10539,
8813,
273,
253,
653,
248,
45031,
1232,
310,
3477,
281,
956,
342,
1175,
4327,
273,
14951,
4679,
403,
671,
6210,
392,
1829,
264,
253,
20953,
1650,
310,
247,
1175,
4983,
1127,
49160,
253,
15895,
4679,
327,
1097,
4382,
8113,
8892,
824,
347,
2460,
9162,
285,
3448,
8892,
824,
347,
3159,
24760,
403,
671,
5043,
265,
326,
921,
1083,
15895,
4081,
310,
30708,
323,
1027,
897,
2219,
45190,
323,
690,
4758,
4081,
15895,
476,
10260,
562,
32231,
5662,
2905,
2987,
253,
15895,
310,
3236,
285,
1534,
275,
326,
352,
310,
247,
12112,
15895,
285,
310,
7763,
275,
1142,
643,
2905,
2987,
50273,
74,
15917,
253,
14855,
285,
3533,
2366,
323,
1805,
25253,
50275,
43355,
4767,
2442,
7364,
273,
616,
789,
275,
326,
653,
248,
45031,
3828,
760,
3936,
327,
271,
1055,
604,
697,
908,
1309,
3733,
352,
310,
5393,
326,
352,
588,
2430,
851,
26208,
275,
1340,
281,
897,
253,
4081,
15895,
275,
6041,
3215,
11273,
3210,
50275,
7152,
339,
431,
248,
2929,
29328,
247,
653,
248,
45031,
3828,
281,
1957,
638,
19452,
569,
275,
247,
24052,
81,
1568,
436,
3828,
476,
320,
43867,
715,
667,
11454,
2990,
436,
6779,
310,
4217,
275,
2219,
835,
7349,
460,
14259,
310,
908,
347,
253,
2022,
7982,
281,
3653,
14259,
275,
253,
4735,
2317,
50276,
296,
3755,
20556,
337,
253,
653,
248,
45031,
2934,
310,
4722,
352,
12453,
247,
25803,
1895,
275,
247,
18176,
1039,
374,
253,
15895,
310,
4076,
285,
2590,
816,
6787,
403,
2530,
495,
4679,
403,
2684,
275,
2710,
7533,
285,
1534,
11701,
403,
2540,
275,
253,
1643,
11860,
4715,
4758,
50275,
20881,
1255,
265,
337,
253,
7756,
310,
417,
2590,
2429,
281,
253,
12378,
2746,
2829,
608,
436,
16540,
24626,
670,
253,
16038,
323,
436,
2746,
310,
627,
1663,
271,
1491,
2957,
342,
253,
12378,
2746,
374,
625,
4679,
835,
6843,
14259,
310,
908,
275,
253,
4735,
2317,
812,
320,
2684,
6667,
2486,
1881,
35421,
4715,
285,
7982,
4715,
436,
778,
921,
253,
5373,
273,
253,
4081,
2746,
275,
247,
1805,
1039,
495,
3733,
7882,
273,
253,
2990,
812,
452,
644,
5469,
352,
310,
417,
2590,
849,
253,
2625,
18634,
11786,
2685,
651,
320,
5876,
407,
253,
653,
248,
45031,
3828,
50275,
5996,
30080,
22559,
6701,
323,
253,
8254,
6787,
516,
275,
9796,
273,
18738,
253,
2929,
4496,
1908,
15974,
253,
5469,
5701,
275,
4049,
254,
609,
5102,
390,
275,
271,
6880,
38058,
3486,
310,
417,
5469,
5474,
33032,
2520,
2929,
29328,
653,
248,
45031,
3828,
281,
1957,
512,
1491,
327,
12336,
14259,
534,
32547,
253,
1491,
2957,
672,
970,
760,
12336,
14259,
327,
14237,
10166,
342,
253,
6703,
1885,
352,
8115,
253,
638,
19452,
569,
273,
3280,
11390,
715,
253,
2173,
2491,
273,
14636,
28472,
253,
12336,
11627,
273,
253,
11390,
281,
7281,
16561,
11627,
342,
271,
3081,
7877,
285,
18784,
3061,
13674,
432,
4373,
32763,
436,
23632,
326,
6779,
4715,
1900,
6634,
327,
253,
24052,
81,
16635,
2553,
1293,
253,
2957,
273,
667,
1491,
12401,
643,
12378,
3169,
3082,
4679,
452,
644,
6760,
327,
1027,
8892,
285,
921,
253,
12510,
273,
253,
4081,
3082,
50275,
2369,
652,
555,
436,
2929,
29328,
247,
4460,
1332,
323,
6779,
4715,
760,
970,
253,
6907,
1491,
253,
16105,
5140,
281,
40894,
253,
12336,
11627,
273,
253,
11390,
281,
7281,
16561,
11627,
534,
32547,
253,
1491,
2957,
685,
253,
2045,
1332,
50276,
17695,
436,
2929,
310,
273,
1805,
10932,
285,
2722,
7000,
5740,
670,
253,
4081,
1332,
50275,
20881,
1255,
690,
2905,
2987,
943,
320,
7117,
285,
452,
247,
5955,
285,
5301,
50276,
7152,
339,
431,
248,
4477,
12661,
247,
4460,
11454,
2990,
3828,
326,
5621,
638,
19452,
569,
715,
14636,
16280,
253,
653,
248,
45031,
3828,
28472,
253,
12336,
11627,
281,
7281,
16561,
11627,
285,
253,
3733,
310,
2218,
1293,
8492,
3602,
2067,
1027,
16774,
4679,
452,
2011,
253,
12510,
273,
253,
4081,
1332,
50276,
296,
3755,
20556,
50275,
18,
436,
2929,
310,
973,
3542,
285,
2590,
50276,
19,
253,
4081,
653,
248,
45031,
3828,
476,
320,
4354,
3732,
281,
954,
5368,
11454,
2990,
35615,
285,
15450,
8892,
50276,
20881,
1255,
50274,
18,
627,
310,
417,
2217,
5301,
875,
253,
4081,
653,
248,
45031,
3828,
285,
643,
3332,
6907,
3169,
4715,
3082,
824,
347,
337,
374,
1580,
253,
4477,
1750,
326,
253,
4081,
1332,
35910,
253,
1491,
2957,
1895,
326,
6634,
275,
841,
12378,
3169,
4715,
3082,
841,
14023,
403,
1774,
4679,
281,
1329,
253,
4477,
3916,
50275,
18,
260,
864,
320,
13535,
1162,
355,
12336,
5304,
38576,
5213,
8059,
327,
5145,
4715,
268,
1686,
83,
9169,
374,
19169,
391,
543,
1405,
74,
1162,
355,
3963,
3006,
11454,
6928,
3066,
28699,
24052,
81,
16635,
2341,
10061,
273,
253,
26332,
70,
886,
39985,
8059,
327,
4382,
8113,
285,
3102,
8981,
9169,
50272,
783,
4477,
452,
18212,
9713,
253,
7364,
273,
436,
2929,
50276,
187,
187,
4118,
18435,
27,
2520,
2929,
29328,
653,
248,
45031,
3828,
407,
806,
27197,
253,
638,
19452,
569,
715,
14636,
840,
27197,
253,
14636,
715,
7281,
16561,
11627,
327,
247,
15269,
285,
4720,
3733,
2801,
3602,
1293,
8492,
253,
4081,
653,
248,
45031,
3828,
310,
22040,
16671,
14282,
285,
310,
3839,
7763,
352,
310,
5183,
327,
247,
2491,
273,
4679,
50275,
15337,
254,
362,
18,
250,
665,
3534,
247,
13716,
608,
8042,
562,
767,
2905,
10414,
285,
2546,
323,
625,
8813,
670,
16038,
253,
4477,
5544,
253,
3064,
875,
616,
2929,
285,
253,
767,
10414,
285,
5544,
253,
16038,
512,
643,
30628,
403,
2762,
670,
436,
2929,
209
] |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
summary in this work authors implement and tune 14 dg algorithms and compare them across 6 datasets with 3 model selection criteria and they find i a careful implementation of erm outperforms the sotas ii no competitor can outperform erm by more than one point iii model selection matters for dg reasons for score i believe the welltuned erm outperforms many sota dg methods may not surprise many researchers in this area but i would very much like to see a paper delivering this message clearly thus i recommend an acceptance pros 1 for quite a while erm when carefully implemented and tuned being the sota in dg is elephant in the room at least we should admit that quite often the improvements claimed by those da methods are gone when switching from a weaker backbone eg resnet18 to a stonger one eg resnet50 a highstandard testing protocol is a very important contribution in dg research 2 this work brings an opensourced software for replicating the existing methods and comparing the newly proposed ones in a consistent and realistic setting cons 1 domainnet as a much larger scale and more challenging dataset could be considered 2 can you elaborate the last sentence of claim 2 ie our advice to dg practitioners is to use coralwith a hyperparameter search distribution that allows ermlike behavior a typo page 6 table 52 shows that using a resnet50 neural network architecture i think it should be table 4docsepin this paper the authors implement a test bed to evaluate domain generalization methods in a unified way the works is important because current methods use different model selection approaches which may not reflect the inherent properties of the dg algorithms model selection is a fundamentally difficult problem in the presence of distribution shift however it was significantly ignored in previous works it is nice to see that the authors provide three kinds of model selection methods from the results it seems that existing dg methods do not have a clear advantage over erm even when testdomain validation test is used does this mean existing methods themselves are not good or the dataset might not be appropriate for dg it seems hard even for human to generalize to new domains when given a small number of domains with many changing factors i have some questions regarding the test bed details 1 did the authors implement the existing methods or use the source codes provided by the authors 2 the authors carefully implemented and tuned erm did the authors also tuned the other methods carefully this may require a significant amount of work because different methods may need different engineering tricks docsepsummary this paper critically reexamines research in domain generalisation dg ie building models that robustly generalise to outofdistribution data it observes that existing methods are hard to compare in particular due to unclear hyperparameter and model selection criteria it introduces a common benchmark suite including a well designed model selection procedure and reevaluates existing methods on this suite the results show that under such controlled evaluation the benefit of existing dg methods over vanilla empirical risk minimisation erm largely disappear this raises the concern that existing dg methods might be overtuned and hard to replicate by releasing the controlled benchmark suite future research progress can be more reliably measured strength the paper highlights an important point comparing existing methods is indeed tricky and complicated by model selection and hyperparameter selection 
issues etc it makes good recommendations for practice such as requiring that any dg method also specifies its model selection method we knew this already but its good to remind people explicitly helpfully it specifies a few reasonable options for model selection criteria which future papers could refer to rather than inventing adhoc approaches a common benchmark with a prespecified strategy for hyperparameterearlystopping could be very helpful for more reliably comparing and measuring research progress in future a significant amount of effort was expended running a large and properly comparable evaluation across all several existing methods weakness 1 strength of claim validation this paper is in part making a very strong negative result claim that a wide range of existing methods dont work when implemented properly this might be true but then there is onus on the paper to make sure that all of the evaluation details are completely watertight are 20 trials enough for random search on all these models it would be good to show some evidence that performance has saturated at this point eg that performance doesnt improve further with an order of magnitude more search it would be good to also show some specific hyperparameters found by the random search so experts in the specific algorithms shown can assess if something sensible was found do the discovered hyperparameters correspond to erm for those methods where erm is a special case are the hyperparameter choices and ranges tab 6 satisfactory for example some algorithms i have expertise on include hyperparameters not documented in tab 6 and its not clear how these are set as another example the dg research in my group has used sgd with momentum not adam due to better stability in our experience its not clear how this change affects things re tab 6 its not clear when optimising the methods do you jointly optimize the resnet hyper parameters such as learning rate batch size with the dgspecific parameters bottom half of table for each individual dg method or do you optimize resnet hyperparameters first for erm and then fix those before optimising the dg hyperparameters overall while the benchmark should be a useful contribution anyway to believe the specific numerical conclusions we have to trust the authors did a really good job reimplementing everything if this research project had been setup instead as a competition where dg developers submit algorithms according to the constraints imposed by the benchmark for independent evaluation and still reached the sae conclusion then it would be more believable 2 insight if we accept the headline conclusion that erm indeed outperforms all the existing methods it would be really nice to have some insight into what went wrong along the way some of these were alluded to but not properly analysed for example do the prior methods have visible benefit over erm on the more commonly evaluated networks like resnet18 and that benefit just fails to transfer to resnet50 we dont know because the comparison is only made on resnet50 do the prior methods have visible benefit over erm in the absence of data augmentation and not otherwise how much of the negative result is due to proper tuning improving erm compared to previous poorly tuned implementations of the baselines versus worsening the previously overtuned dg methods what is the primary source of overtuning in existing dg methods which dominates among model selection hyperparameter selection dataset split insights such as these would make the paper more 
satisfying as well as believability by identifying more precisely the factors behind the negative result without these it feels a bit empty 3 minor i dont understand the final comment in untestable assumptions on pg 8 dg algorithms that selfevaluate and selfadapt at test time seems wrong adapting at testtime perse is against the problem spec of dg and blurs into domain adaptation update thanks to the authors for their feedback i appreciate the efforts on clarification and looseend tying one outstanding thing to clarify to help us understand whether claim 1 can be fully supported afaik the table 1 table 5 that underpin this claim are comparing numbers copied from previous papers with numbers generated from domain bed benchmark however i suspect the splits are not the same for example some previous benchmarks have a fixed split by default while i understood domain bed use multiple random splits if so the numbers are not directly comparable and it still may not be fair to make a strong claim that tuned erm outperforms prior workdocsep summary the paper investigates the success of domain generalization dg approaches developed in recent years in doing so the authors evaluate a large variety of the most successful recent variants under principled model selection schemes trainingdomain and leaveoutout validation and find that standard empirical risk minimization outperforms or is at least comparable to all recently proposed stateoftheart competitors in bringing together a large set of heterogeneous methods the work makes an interesting contribution to the current dg literature at the same time in its current form i found the manuscript to be lacking any sufficiently substantial recommendations on how to improve the currently available zoo of methods strengths the manuscript contains a very rigorous and impressively detailed background section moved to the appendix as well as experimental evaluations of the most important recent methods in domain generalization the methodological introduction is very well written and principled and contrasts domain generalization against other learning setups from generative learning all the way to continual learning domain adaptation etc the experimental section does a convincing job of comparing empirical risk minimization against the various dg competitors that have been developed in recent years diva rsc ddaig etc and in particular makes important recommendations that should find their way into practical research around the theme of dg the proposed domainbed environment code in supplementary materials appears of high quality although it is difficult to gauge whether its easeofuse in terms of extending it to new approaches will convince future researcher to adopt it and incorporate it with their propositions all in all i found the manuscript to be a compelling read that contains an interesting alternative viewpoint on the role and limitations of dg that being said i think more needs to be done to make this paper more inspiring and useful for the wider community in particular with regards to its concreteness see below weaknesses as highlighted above the paper is extremely wellwritten however it often leans in directions that i find too implicit and in my opinion unnecessarily so for example selecting hyperparameters is a learning problem at least as hard as fitting the model inasmuch as we may interpret any model parameter as a hyperparameter page 3 final paragraph while this is a stimulating sentence as a reader i am left wondering how this is crucial to 
motivating the central elements of the paper in addition what is the average dg researcher going to extract from this left untouched i would suspect such a statement might invite some additional criticism from the viewpoint of previous dg research how can one possibly guarantee that the tuning as part of domainbed is optimal for the large heterogeneous variety of dg approaches out there and who is to say that this cant be improved upon while page 8 introduces a number of interesting and compelling open questions i would encourage the authors to provide more guidance in this context potentially in the form of some experimentation or analysis of the large set of results eg which methods have strengths where etc some again too implicit comments are there eg therefore we believe there is great promise in dg algorithms able to selfevaluate and selfadapt during test time more concise propositions as to which ideas the dg community can consider would be extremely helpful here i think the final paragraph of the paper nicely summarizes what i find to be the central weakness of this manuscript in asking what a good benchmark is instead of proposing some sort of alternative or making a recommendation the authors opt for an intellectually compelling quote by proust sure its lovely to read but i would again caution that is going to be of limited practical usefulness in summary i therefore remain somewhat unconvinced that the manuscript at least in its current form that mainly resorts to criticism of other dg approaches while undoubtedly warranted but otherwise steers clear of any concise recommendations for future improvements is an optimal use of content for an 8 pages conference paper minor points although challenging dg is the best approximation to real prediction problems where unforeseen distributional discrepancies between training and testing data are surely expected page 3 third paragraph dg is an interesting problem but id be a bit more cautious in elevating its role for ml predictions problems in the realworld at least in its current form dg approximates realworld problems from a constrained direction and the above statement doesnt yet synergize well will the paragraph that rightfully asks whether dg benchmarks are the right datasets page 8 some clarification around the restrictions of dg early on would be helpful here while an interesting analysis of the differences between learning scenarios and domain generalization has been demarcated clearly from other types of learning problems a very brief mentioning of the difference between multitask and multidomain learning would be beneficial given there is some confusion around these terms in the current literature
### Summary:
|
this paper provides an interesting analysis of the research on domain generalization with its main principles and limitations the authors provide a strong rebuttal to address some comments pointed out by reviewers all the reviews are very positive hence i recommend acceptance
|
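The reviews in this example keep returning to the benchmark's protocol: a fixed budget of random hyperparameter trials per algorithm plus an explicit model-selection criterion (training-domain validation versus an oracle test-domain criterion). The sketch below illustrates that protocol; the function names (`sample_hparams`, `train_and_eval`), the search ranges, and the returned keys are illustrative assumptions, not the benchmark's actual API.

```python
import random

def sample_hparams(seed):
    # Illustrative search distribution; the real benchmark defines its own ranges.
    rng = random.Random(seed)
    return {
        "lr": 10 ** rng.uniform(-5.0, -3.5),
        "weight_decay": 10 ** rng.uniform(-6.0, -2.0),
        "batch_size": int(2 ** rng.uniform(3, 5.5)),
    }

def random_search(train_and_eval, n_trials=20):
    """Run n_trials independent trials and apply two selection criteria.

    train_and_eval(hparams) is assumed to return a dict with
    'train_domain_val_acc' (accuracy on held-out splits of the training
    domains) and 'test_domain_acc' (accuracy on the left-out test domain).
    """
    runs = [train_and_eval(sample_hparams(seed)) for seed in range(n_trials)]
    # Legitimate criterion: select on held-out data from the training domains.
    chosen = max(runs, key=lambda r: r["train_domain_val_acc"])
    # Oracle criterion: select on the test domain itself (upper bound only).
    oracle = max(runs, key=lambda r: r["test_domain_acc"])
    return chosen, oracle
```

Under such a protocol a method is reported with the test-domain accuracy of the run picked by the training-domain criterion, while the oracle number only bounds how much could be gained from better model selection — which is one of the questions the reviewer raises about whether a 20-trial budget is enough.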
[
247,
1355,
1180,
273,
10625,
342,
1142,
6890,
2616,
50275,
74,
452,
690,
3533,
5001,
253,
1071,
3722,
4278,
337,
186,
14958,
253,
4477,
3359,
253,
5368,
3082,
390,
897,
253,
2603,
11646,
2530,
407,
253,
4477,
374,
186,
783,
4477,
9257,
9009,
285,
24251,
209,
693,
858,
253,
4477,
671,
24251,
253,
643,
3082,
9257,
436,
778,
2430,
247,
1534,
2408,
273,
789,
984,
1027,
3082,
778,
878,
1027,
11369,
24866,
50276,
7152,
339,
793,
360,
3454,
436,
2929,
21038,
294,
911,
35753,
2561,
275,
5028,
2087,
5837,
277,
72,
26332,
3652,
3210,
326,
10237,
314,
2087,
885,
281,
562,
1171,
35360,
941,
352,
40687,
326,
5368,
3082,
403,
1892,
281,
7277,
275,
1798,
1955,
281,
12744,
4373,
19484,
285,
1566,
5438,
6866,
352,
23970,
247,
1846,
22791,
18880,
1690,
247,
973,
4158,
1566,
5438,
5199,
285,
294,
15419,
21094,
5368,
3082,
327,
436,
18880,
253,
1543,
921,
326,
762,
824,
6537,
7103,
253,
5649,
273,
5368,
277,
72,
3082,
689,
26724,
16774,
2495,
7221,
5837,
209,
693,
8127,
15529,
436,
16540,
253,
4468,
326,
5368,
277,
72,
3082,
1537,
320,
19486,
37437,
285,
1892,
281,
25464,
407,
20437,
253,
6537,
22791,
18880,
2852,
2561,
4780,
476,
320,
625,
27340,
4080,
50275,
45563,
50275,
783,
2929,
16681,
271,
1774,
1127,
10941,
5368,
3082,
310,
6296,
28190,
285,
9542,
407,
1566,
5438,
285,
4373,
19484,
5438,
3374,
3966,
50276,
262,
2789,
1175,
12645,
323,
3946,
824,
347,
10568,
326,
667,
277,
72,
1332,
671,
28251,
697,
1566,
5438,
1332,
50276,
664,
3260,
436,
2168,
533,
697,
1175,
281,
9287,
952,
11120,
1361,
2920,
352,
28251,
247,
1643,
5272,
4610,
323,
1566,
5438,
6866,
534,
2852,
9380,
812,
3730,
281,
2581,
685,
10242,
272,
519,
37806,
7274,
50275,
66,
1846,
22791,
342,
247,
838,
1553,
1245,
5700,
323,
4373,
3575,
292,
11892,
1285,
11769,
2784,
812,
320,
1077,
9371,
323,
625,
27340,
10941,
285,
10499,
2561,
4780,
275,
2852,
50275,
66,
1534,
2408,
273,
3434,
369,
49976,
3515,
247,
1781,
285,
6283,
10870,
7103,
2439,
512,
2067,
5368,
3082,
50276,
20881,
1255,
50276,
18,
4757,
273,
1750,
50276,
29599,
436,
2929,
310,
275,
629,
2403,
247,
1077,
2266,
4016,
906,
1750,
326,
247,
4618,
2491,
273,
5368,
3082,
13414,
789,
672,
9009,
6283,
436,
1537,
320,
2032,
533,
840,
627,
310,
327,
316,
327,
253,
2929,
281,
1056,
2119,
326,
512,
273,
253,
7103,
4278,
403,
4336,
1824,
33886,
50274,
609,
1384,
7587,
2217,
323,
3632,
3186,
327,
512,
841,
3210,
352,
651,
320,
1175,
281,
921,
690,
1941,
326,
3045,
556,
23543,
387,
436,
1127,
24088,
326,
3045,
36908,
3157,
2007,
342,
271,
1340,
273,
9777,
625,
3186,
352,
651,
320,
1175,
281,
671,
921,
690,
2173,
4373,
22041,
1119,
407,
253,
3632,
3186,
594,
10071,
275,
253,
2173,
11333,
2011,
476,
2939,
604,
1633,
24600,
369,
1119,
513,
253,
6888,
4373,
22041,
2723,
281,
209,
693,
323,
1110,
3082,
835,
209,
693,
310,
247,
2714,
1083,
50276,
609,
253,
4373,
19484,
10165,
285,
13794,
10334,
721,
20297,
323,
1650,
690,
11333,
891,
452,
15040,
327,
2486,
4373,
22041,
417,
14290,
275,
10334,
721,
285,
697,
417,
2590,
849,
841,
403,
873,
347,
1529,
1650,
253,
277,
72,
2561,
275,
619,
1387,
556,
908,
256,
35333,
342,
10254,
417,
38622,
1955,
281,
1805,
7882,
275,
776,
2793,
697,
417,
2590,
849,
436,
1818,
11852,
1841,
50275,
250,
10334,
721,
697,
417,
2590,
672,
5556,
2182,
253,
3082,
513,
368,
26277,
22318,
253,
501,
3024,
4373,
3602,
824,
347,
4715,
2281,
14604,
1979,
342,
253,
277,
72,
6160,
3602,
5004,
2716,
273,
2829,
323,
1016,
2060,
277,
72,
1332,
390,
513,
368,
22318,
501,
3024,
4373,
22041,
806,
323,
209,
693,
285,
840,
4993,
1110,
1078,
5556,
2182,
253,
277,
72,
4373,
22041,
50276,
1189,
455,
1223,
253,
22791,
943,
320,
247,
4217,
7680,
8791,
281,
2868,
253,
2173,
10704,
11815,
359,
452,
281,
4517,
253,
4477,
858,
247,
1663,
1175,
2628,
294,
303,
3018,
272,
3253,
604,
436,
2561,
2199,
574,
644,
9978,
3185,
347,
247,
7324,
835,
277,
72,
12259,
11929,
11333,
2556,
281,
253,
10806,
11295,
407,
253,
22791,
323,
3907,
7103,
285,
1335,
4925,
253,
618,
70,
6452,
840,
352,
651,
320,
625,
1802,
17254,
50275,
19,
12288,
604,
359,
2997,
253,
30062,
6452,
326,
209,
693,
6296,
41731,
13015,
512,
253,
5368,
3082,
352,
651,
320,
1663,
5322,
281,
452,
690,
12288,
715,
752,
2427,
3430,
2112,
253,
1039,
690,
273,
841,
497,
512,
21015,
281,
533,
417,
6283,
15626,
50276,
1542,
1650,
50276,
3088,
253,
2720,
3082,
452,
7985,
5649,
689,
209,
693,
327,
253,
625,
7744,
6760,
6928,
751,
501,
3024,
1093,
285,
326,
5649,
816,
10224,
281,
3700,
281,
501,
3024,
1235,
359,
13414,
871,
984,
253,
5301,
310,
760,
1160,
327,
501,
3024,
1235,
50275,
3088,
253,
2720,
3082,
452,
7985,
5649,
689,
209,
693,
275,
253,
5928,
273,
941,
42072,
285,
417,
5010,
50275,
5430,
1199,
273,
253,
4016,
906,
310,
1955,
281,
1463,
25184,
11138,
209,
693,
2429,
281,
2045,
15225,
24251,
27558,
273,
253,
1666,
25379,
7147,
43685,
253,
3786,
19486,
37437,
277,
72,
3082,
50276,
5371,
310,
253,
3625,
2603,
273,
19486,
25004,
275,
5368,
277,
72,
3082,
534,
36807,
2190,
1566,
5438,
4373,
19484,
5438,
10895,
8085,
50276,
968,
4380,
824,
347,
841,
651,
1056,
253,
2929,
625,
14127,
347,
973,
347,
1802,
87,
1430,
50276,
1615,
12488,
625,
10534,
253,
2616,
3212,
253,
4016,
906,
1293,
841,
352,
9193,
247,
2372,
6325,
50275,
20,
5884,
50275,
74,
13414,
2096,
253,
2457,
4385,
275,
440,
2566,
494,
13260,
327,
23256,
854,
277,
72,
11333,
326,
11329,
453,
1208,
6340,
285,
1881,
26672,
387,
1071,
673,
3133,
3430,
42174,
387,
1071,
2606,
591,
339,
310,
1411,
253,
1895,
946,
273,
277,
72,
285,
787,
2244,
715,
5028,
15644,
50275,
11183,
50275,
35501,
281,
253,
4477,
323,
616,
8680,
891,
11435,
253,
6031,
327,
37699,
285,
13155,
423,
42068,
50276,
531,
16383,
2181,
281,
19148,
281,
1361,
441,
2096,
1880,
1750,
337,
476,
320,
4751,
4516,
50275,
37124,
1479,
253,
2829,
337,
50276,
2420,
608,
326,
762,
9852,
436,
1750,
403,
10941,
3904,
22489,
432,
2045,
9380,
342,
3904,
4561,
432,
5028,
3722,
22791,
2299,
891,
9101,
253,
36509,
403,
417,
253,
1072,
323,
1650,
690,
2045,
49602,
452,
247,
4229,
8085,
407,
4284,
1223,
891,
7192,
5028,
3722,
897,
2709,
3632,
36509,
604,
594,
253,
3904,
403,
417,
3587,
10870,
285,
352,
1335,
778,
417,
320,
4344,
281,
1056,
247,
2266,
1750,
326,
24251,
209,
693,
41731,
13015,
2720,
789,
7152,
33032,
6010,
50276,
783,
2929,
2340,
684,
253,
2323,
273,
5028,
26647,
277,
72,
7274,
3715,
275,
3332,
1107,
275,
2509,
594,
253,
4477,
7472,
247,
1781,
5235,
273,
253,
954,
5547,
3332,
11640,
762,
3505,
74,
6216,
1566,
5438,
15849,
3733,
13517,
285,
3553,
483,
483,
12820,
285,
1089,
326,
2629,
16774,
2495,
41458,
41731,
13015,
50276,
263,
310,
387,
1878,
10870,
50276,
936,
512,
4102,
4081,
1375,
23037,
14387,
21607,
275,
9745,
2366,
247,
1781,
873,
273,
22766,
3082,
253,
789,
2789,
271,
4722,
7680,
281,
253,
1655,
277,
72,
6239,
387,
253,
1072,
673,
275,
697,
1655,
830,
891,
1119,
253,
7714,
281,
320,
14999,
667,
10481,
6832,
12645,
327,
849,
281,
3157,
253,
4390,
2130,
41089,
273,
3082,
50275,
296,
3755,
20556,
50276,
783,
7714,
4428,
247,
1077,
26565,
285,
21097,
1242,
7000,
4114,
2593,
4395,
281,
253,
30762,
347,
973,
347,
5661,
27163,
273,
253,
954,
1774,
3332,
3082,
275,
5028,
26647,
253,
35961,
10199,
310,
1077,
973,
3542,
285,
3505,
74,
6216,
285,
39165,
5028,
26647,
1411,
643,
4715,
873,
8777,
432,
1006,
800,
4715,
512,
253,
1039,
281,
45120,
4715,
5028,
15644,
3966,
50276,
783,
5661,
2593,
1057,
247,
21414,
2628,
273,
10941,
16774,
2495,
41458,
1411,
253,
2710,
277,
72,
21607,
326,
452,
644,
3715,
275,
3332,
1107,
2017,
66,
391,
1026,
277,
1473,
304,
3966,
285,
275,
1798,
2789,
1774,
12645,
326,
943,
1089,
616,
1039,
715,
8542,
2561,
1475,
253,
10014,
273,
277,
72,
253,
4081,
5028,
3026,
3126,
2127,
275,
24864,
4753,
4620,
273,
1029,
3290,
3738,
352,
310,
2834,
281,
11206,
1880,
697,
11990,
1171,
2327,
275,
2426,
273,
13633,
352,
281,
747,
7274,
588,
18578,
2852,
22780,
281,
5283,
352,
285,
19071,
352,
342,
616,
39325,
50276,
455,
275,
512,
891,
1119,
253,
7714,
281,
320,
247,
18511,
1239,
326,
4428,
271,
4722,
5795,
31460,
327,
253,
2554,
285,
7364,
273,
277,
72,
326,
1146,
753,
891,
1158,
625,
3198,
281,
320,
2218,
281,
1056,
436,
2929,
625,
29853,
285,
4217,
323,
253,
14200,
3114,
275,
1798,
342,
17730,
281,
697,
345,
719,
1866,
405,
923,
2708,
50275,
20881,
1255,
265,
50276,
284,
16318,
1840,
253,
2929,
310,
6685,
973,
15720,
2299,
352,
2223,
458,
507,
275,
10746,
326,
891,
1089,
1512,
15424,
285,
50276,
249,
619,
4743,
50276,
328,
37967,
594,
323,
1650,
17221,
4373,
22041,
310,
247,
4715,
1895,
387,
1878,
347,
1892,
347,
13532,
253,
1566,
275,
49067,
347,
359,
778,
4665,
667,
1566,
4764,
347,
247,
4373,
19484,
3239,
495,
2457,
12494,
1223,
436,
310,
247,
28502,
6197,
347,
247,
9414,
891,
717,
1669,
12371,
849,
436,
310,
9560,
281,
15265,
839,
253,
4275,
3603,
273,
253,
2929,
275,
1635,
752,
310,
253,
3388,
277,
72,
22780,
1469,
281,
4908,
432,
436,
1669,
48976,
891,
651,
9101,
824,
247,
3908,
1537,
19864,
690,
3081,
14226,
432,
253,
31460,
273,
2045,
277,
72,
2561,
849,
476,
581,
6830,
12215,
326,
253,
25184,
347,
629,
273,
5028,
3026,
310,
8654,
323,
253,
1781,
22766,
5235,
273,
277,
72,
7274,
562,
627,
285,
665,
310,
281,
1333,
326,
436,
16216,
320,
5520,
2220,
50276,
6050,
3239,
854,
23970,
247,
1180,
273,
4722,
285,
18511,
1527,
3533,
891,
651,
11907,
253,
4477,
281,
2085,
625,
12925,
275,
436,
3634,
7826,
275,
253,
830,
273,
690,
40290,
390,
1783,
273,
253,
1781,
873,
273,
1543,
24088,
534,
3082,
452,
20544,
835,
3966,
690,
969,
1512,
15424,
5701,
403,
627,
50276,
909,
3103,
359,
2868,
627,
310,
1270,
9023,
275,
277,
72,
11333,
2104,
281,
11329,
453,
1208,
6340,
285,
1881,
26672,
1309,
1071,
673,
50276,
3062,
44003,
39325,
347,
281,
534,
5697,
253,
277,
72,
3114,
476,
1908,
651,
320,
6685,
9371,
1060,
50276,
74,
1158,
253,
2457,
12494,
273,
253,
2929,
23395,
37250,
752,
891,
1089,
281,
320,
253,
4275,
14855,
273,
436,
7714,
275,
7004,
752,
247,
1175,
22791,
310,
3185,
273,
36636,
690,
3686,
273,
5795,
390,
2403,
247,
17401,
253,
4477,
1478,
323,
271,
10893,
1230,
18511,
14430,
407,
819,
26202,
2119,
697,
13491,
281,
1239,
533,
891,
651,
969,
17458,
326,
310,
1469,
281,
320,
273,
3710,
8542,
31471,
50276,
249,
6010,
891,
3103,
3464,
8489,
10915,
8498,
758,
326,
253,
7714,
387,
1878,
275,
697,
1655,
830,
326,
7194,
48520,
281,
14226,
273,
643,
277,
72,
7274,
1223,
25369,
26085,
50276,
2858,
5010,
2870,
398,
2590,
273,
667,
44003,
12645,
323,
2852,
11701,
310,
271,
8654,
897,
273,
2600,
323,
271,
854,
7223,
8059,
2929,
50275,
37585,
2792,
50276,
20261,
11132,
277,
72,
310,
253,
1682,
11193,
281,
1524,
10554,
3237,
835,
440,
922,
16564,
3268,
267,
37122,
875,
3733,
285,
5175,
941,
403,
13353,
3264,
3239,
495,
2626,
12494,
277,
72,
310,
271,
4722,
1895,
533,
2654,
320,
247,
2372,
625,
31798,
275,
6478,
839,
697,
2554,
323,
13361,
13650,
3237,
275,
253,
1524,
10186,
387,
1878,
275,
697,
1655,
830,
277,
72,
4020,
684,
1524,
10186,
3237,
432,
247,
20793,
3884,
285,
253,
1840,
3908,
36908,
2568,
26455,
907,
973,
588,
253,
12494,
326,
987,
2920,
12325,
1880,
277,
72,
49602,
403,
50276,
783,
987,
15302,
3239,
854,
690,
37699,
1475,
253,
13133,
273,
277,
72,
2393,
327,
651,
320,
9371,
1060,
50276,
6050,
271,
4722,
1783,
273,
253,
3910,
875,
4715,
15216,
285,
5028,
26647,
556,
644,
1471,
3178,
456,
4518,
432,
643,
3510,
273,
4715,
3237,
247,
1077,
4864,
29570,
273,
253,
3064,
875,
1554,
262,
1945,
285,
23964,
297,
404,
4715,
651,
320,
12912,
1677,
627,
310,
690,
13775,
1475,
841,
2426,
275,
253,
1655,
6239,
187,
187,
4118,
18435,
27,
2520,
2929,
3400,
271,
4722,
1783,
327,
253,
2561,
327,
5028,
26647,
342,
2022,
9241,
285,
7364,
253,
4477,
2085,
247,
2266,
30080,
22559,
281,
2953,
690,
5701,
8042,
407,
30628,
512,
253,
10123,
403,
1077,
2762,
7613,
891,
5583,
14924
] |
[
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1
] |
[
247,
1355,
1180,
273,
10625,
342,
1142,
6890,
2616,
50275,
74,
452,
690,
3533,
5001,
253,
1071,
3722,
4278,
337,
186,
14958,
253,
4477,
3359,
253,
5368,
3082,
390,
897,
253,
2603,
11646,
2530,
407,
253,
4477,
374,
186,
783,
4477,
9257,
9009,
285,
24251,
209,
693,
858,
253,
4477,
671,
24251,
253,
643,
3082,
9257,
436,
778,
2430,
247,
1534,
2408,
273,
789,
984,
1027,
3082,
778,
878,
1027,
11369,
24866,
50276,
7152,
339,
793,
360,
3454,
436,
2929,
21038,
294,
911,
35753,
2561,
275,
5028,
2087,
5837,
277,
72,
26332,
3652,
3210,
326,
10237,
314,
2087,
885,
281,
562,
1171,
35360,
941,
352,
40687,
326,
5368,
3082,
403,
1892,
281,
7277,
275,
1798,
1955,
281,
12744,
4373,
19484,
285,
1566,
5438,
6866,
352,
23970,
247,
1846,
22791,
18880,
1690,
247,
973,
4158,
1566,
5438,
5199,
285,
294,
15419,
21094,
5368,
3082,
327,
436,
18880,
253,
1543,
921,
326,
762,
824,
6537,
7103,
253,
5649,
273,
5368,
277,
72,
3082,
689,
26724,
16774,
2495,
7221,
5837,
209,
693,
8127,
15529,
436,
16540,
253,
4468,
326,
5368,
277,
72,
3082,
1537,
320,
19486,
37437,
285,
1892,
281,
25464,
407,
20437,
253,
6537,
22791,
18880,
2852,
2561,
4780,
476,
320,
625,
27340,
4080,
50275,
45563,
50275,
783,
2929,
16681,
271,
1774,
1127,
10941,
5368,
3082,
310,
6296,
28190,
285,
9542,
407,
1566,
5438,
285,
4373,
19484,
5438,
3374,
3966,
50276,
262,
2789,
1175,
12645,
323,
3946,
824,
347,
10568,
326,
667,
277,
72,
1332,
671,
28251,
697,
1566,
5438,
1332,
50276,
664,
3260,
436,
2168,
533,
697,
1175,
281,
9287,
952,
11120,
1361,
2920,
352,
28251,
247,
1643,
5272,
4610,
323,
1566,
5438,
6866,
534,
2852,
9380,
812,
3730,
281,
2581,
685,
10242,
272,
519,
37806,
7274,
50275,
66,
1846,
22791,
342,
247,
838,
1553,
1245,
5700,
323,
4373,
3575,
292,
11892,
1285,
11769,
2784,
812,
320,
1077,
9371,
323,
625,
27340,
10941,
285,
10499,
2561,
4780,
275,
2852,
50275,
66,
1534,
2408,
273,
3434,
369,
49976,
3515,
247,
1781,
285,
6283,
10870,
7103,
2439,
512,
2067,
5368,
3082,
50276,
20881,
1255,
50276,
18,
4757,
273,
1750,
50276,
29599,
436,
2929,
310,
275,
629,
2403,
247,
1077,
2266,
4016,
906,
1750,
326,
247,
4618,
2491,
273,
5368,
3082,
13414,
789,
672,
9009,
6283,
436,
1537,
320,
2032,
533,
840,
627,
310,
327,
316,
327,
253,
2929,
281,
1056,
2119,
326,
512,
273,
253,
7103,
4278,
403,
4336,
1824,
33886,
50274,
609,
1384,
7587,
2217,
323,
3632,
3186,
327,
512,
841,
3210,
352,
651,
320,
1175,
281,
921,
690,
1941,
326,
3045,
556,
23543,
387,
436,
1127,
24088,
326,
3045,
36908,
3157,
2007,
342,
271,
1340,
273,
9777,
625,
3186,
352,
651,
320,
1175,
281,
671,
921,
690,
2173,
4373,
22041,
1119,
407,
253,
3632,
3186,
594,
10071,
275,
253,
2173,
11333,
2011,
476,
2939,
604,
1633,
24600,
369,
1119,
513,
253,
6888,
4373,
22041,
2723,
281,
209,
693,
323,
1110,
3082,
835,
209,
693,
310,
247,
2714,
1083,
50276,
609,
253,
4373,
19484,
10165,
285,
13794,
10334,
721,
20297,
323,
1650,
690,
11333,
891,
452,
15040,
327,
2486,
4373,
22041,
417,
14290,
275,
10334,
721,
285,
697,
417,
2590,
849,
841,
403,
873,
347,
1529,
1650,
253,
277,
72,
2561,
275,
619,
1387,
556,
908,
256,
35333,
342,
10254,
417,
38622,
1955,
281,
1805,
7882,
275,
776,
2793,
697,
417,
2590,
849,
436,
1818,
11852,
1841,
50275,
250,
10334,
721,
697,
417,
2590,
672,
5556,
2182,
253,
3082,
513,
368,
26277,
22318,
253,
501,
3024,
4373,
3602,
824,
347,
4715,
2281,
14604,
1979,
342,
253,
277,
72,
6160,
3602,
5004,
2716,
273,
2829,
323,
1016,
2060,
277,
72,
1332,
390,
513,
368,
22318,
501,
3024,
4373,
22041,
806,
323,
209,
693,
285,
840,
4993,
1110,
1078,
5556,
2182,
253,
277,
72,
4373,
22041,
50276,
1189,
455,
1223,
253,
22791,
943,
320,
247,
4217,
7680,
8791,
281,
2868,
253,
2173,
10704,
11815,
359,
452,
281,
4517,
253,
4477,
858,
247,
1663,
1175,
2628,
294,
303,
3018,
272,
3253,
604,
436,
2561,
2199,
574,
644,
9978,
3185,
347,
247,
7324,
835,
277,
72,
12259,
11929,
11333,
2556,
281,
253,
10806,
11295,
407,
253,
22791,
323,
3907,
7103,
285,
1335,
4925,
253,
618,
70,
6452,
840,
352,
651,
320,
625,
1802,
17254,
50275,
19,
12288,
604,
359,
2997,
253,
30062,
6452,
326,
209,
693,
6296,
41731,
13015,
512,
253,
5368,
3082,
352,
651,
320,
1663,
5322,
281,
452,
690,
12288,
715,
752,
2427,
3430,
2112,
253,
1039,
690,
273,
841,
497,
512,
21015,
281,
533,
417,
6283,
15626,
50276,
1542,
1650,
50276,
3088,
253,
2720,
3082,
452,
7985,
5649,
689,
209,
693,
327,
253,
625,
7744,
6760,
6928,
751,
501,
3024,
1093,
285,
326,
5649,
816,
10224,
281,
3700,
281,
501,
3024,
1235,
359,
13414,
871,
984,
253,
5301,
310,
760,
1160,
327,
501,
3024,
1235,
50275,
3088,
253,
2720,
3082,
452,
7985,
5649,
689,
209,
693,
275,
253,
5928,
273,
941,
42072,
285,
417,
5010,
50275,
5430,
1199,
273,
253,
4016,
906,
310,
1955,
281,
1463,
25184,
11138,
209,
693,
2429,
281,
2045,
15225,
24251,
27558,
273,
253,
1666,
25379,
7147,
43685,
253,
3786,
19486,
37437,
277,
72,
3082,
50276,
5371,
310,
253,
3625,
2603,
273,
19486,
25004,
275,
5368,
277,
72,
3082,
534,
36807,
2190,
1566,
5438,
4373,
19484,
5438,
10895,
8085,
50276,
968,
4380,
824,
347,
841,
651,
1056,
253,
2929,
625,
14127,
347,
973,
347,
1802,
87,
1430,
50276,
1615,
12488,
625,
10534,
253,
2616,
3212,
253,
4016,
906,
1293,
841,
352,
9193,
247,
2372,
6325,
50275,
20,
5884,
50275,
74,
13414,
2096,
253,
2457,
4385,
275,
440,
2566,
494,
13260,
327,
23256,
854,
277,
72,
11333,
326,
11329,
453,
1208,
6340,
285,
1881,
26672,
387,
1071,
673,
3133,
3430,
42174,
387,
1071,
2606,
591,
339,
310,
1411,
253,
1895,
946,
273,
277,
72,
285,
787,
2244,
715,
5028,
15644,
50275,
11183,
50275,
35501,
281,
253,
4477,
323,
616,
8680,
891,
11435,
253,
6031,
327,
37699,
285,
13155,
423,
42068,
50276,
531,
16383,
2181,
281,
19148,
281,
1361,
441,
2096,
1880,
1750,
337,
476,
320,
4751,
4516,
50275,
37124,
1479,
253,
2829,
337,
50276,
2420,
608,
326,
762,
9852,
436,
1750,
403,
10941,
3904,
22489,
432,
2045,
9380,
342,
3904,
4561,
432,
5028,
3722,
22791,
2299,
891,
9101,
253,
36509,
403,
417,
253,
1072,
323,
1650,
690,
2045,
49602,
452,
247,
4229,
8085,
407,
4284,
1223,
891,
7192,
5028,
3722,
897,
2709,
3632,
36509,
604,
594,
253,
3904,
403,
417,
3587,
10870,
285,
352,
1335,
778,
417,
320,
4344,
281,
1056,
247,
2266,
1750,
326,
24251,
209,
693,
41731,
13015,
2720,
789,
7152,
33032,
6010,
50276,
783,
2929,
2340,
684,
253,
2323,
273,
5028,
26647,
277,
72,
7274,
3715,
275,
3332,
1107,
275,
2509,
594,
253,
4477,
7472,
247,
1781,
5235,
273,
253,
954,
5547,
3332,
11640,
762,
3505,
74,
6216,
1566,
5438,
15849,
3733,
13517,
285,
3553,
483,
483,
12820,
285,
1089,
326,
2629,
16774,
2495,
41458,
41731,
13015,
50276,
263,
310,
387,
1878,
10870,
50276,
936,
512,
4102,
4081,
1375,
23037,
14387,
21607,
275,
9745,
2366,
247,
1781,
873,
273,
22766,
3082,
253,
789,
2789,
271,
4722,
7680,
281,
253,
1655,
277,
72,
6239,
387,
253,
1072,
673,
275,
697,
1655,
830,
891,
1119,
253,
7714,
281,
320,
14999,
667,
10481,
6832,
12645,
327,
849,
281,
3157,
253,
4390,
2130,
41089,
273,
3082,
50275,
296,
3755,
20556,
50276,
783,
7714,
4428,
247,
1077,
26565,
285,
21097,
1242,
7000,
4114,
2593,
4395,
281,
253,
30762,
347,
973,
347,
5661,
27163,
273,
253,
954,
1774,
3332,
3082,
275,
5028,
26647,
253,
35961,
10199,
310,
1077,
973,
3542,
285,
3505,
74,
6216,
285,
39165,
5028,
26647,
1411,
643,
4715,
873,
8777,
432,
1006,
800,
4715,
512,
253,
1039,
281,
45120,
4715,
5028,
15644,
3966,
50276,
783,
5661,
2593,
1057,
247,
21414,
2628,
273,
10941,
16774,
2495,
41458,
1411,
253,
2710,
277,
72,
21607,
326,
452,
644,
3715,
275,
3332,
1107,
2017,
66,
391,
1026,
277,
1473,
304,
3966,
285,
275,
1798,
2789,
1774,
12645,
326,
943,
1089,
616,
1039,
715,
8542,
2561,
1475,
253,
10014,
273,
277,
72,
253,
4081,
5028,
3026,
3126,
2127,
275,
24864,
4753,
4620,
273,
1029,
3290,
3738,
352,
310,
2834,
281,
11206,
1880,
697,
11990,
1171,
2327,
275,
2426,
273,
13633,
352,
281,
747,
7274,
588,
18578,
2852,
22780,
281,
5283,
352,
285,
19071,
352,
342,
616,
39325,
50276,
455,
275,
512,
891,
1119,
253,
7714,
281,
320,
247,
18511,
1239,
326,
4428,
271,
4722,
5795,
31460,
327,
253,
2554,
285,
7364,
273,
277,
72,
326,
1146,
753,
891,
1158,
625,
3198,
281,
320,
2218,
281,
1056,
436,
2929,
625,
29853,
285,
4217,
323,
253,
14200,
3114,
275,
1798,
342,
17730,
281,
697,
345,
719,
1866,
405,
923,
2708,
50275,
20881,
1255,
265,
50276,
284,
16318,
1840,
253,
2929,
310,
6685,
973,
15720,
2299,
352,
2223,
458,
507,
275,
10746,
326,
891,
1089,
1512,
15424,
285,
50276,
249,
619,
4743,
50276,
328,
37967,
594,
323,
1650,
17221,
4373,
22041,
310,
247,
4715,
1895,
387,
1878,
347,
1892,
347,
13532,
253,
1566,
275,
49067,
347,
359,
778,
4665,
667,
1566,
4764,
347,
247,
4373,
19484,
3239,
495,
2457,
12494,
1223,
436,
310,
247,
28502,
6197,
347,
247,
9414,
891,
717,
1669,
12371,
849,
436,
310,
9560,
281,
15265,
839,
253,
4275,
3603,
273,
253,
2929,
275,
1635,
752,
310,
253,
3388,
277,
72,
22780,
1469,
281,
4908,
432,
436,
1669,
48976,
891,
651,
9101,
824,
247,
3908,
1537,
19864,
690,
3081,
14226,
432,
253,
31460,
273,
2045,
277,
72,
2561,
849,
476,
581,
6830,
12215,
326,
253,
25184,
347,
629,
273,
5028,
3026,
310,
8654,
323,
253,
1781,
22766,
5235,
273,
277,
72,
7274,
562,
627,
285,
665,
310,
281,
1333,
326,
436,
16216,
320,
5520,
2220,
50276,
6050,
3239,
854,
23970,
247,
1180,
273,
4722,
285,
18511,
1527,
3533,
891,
651,
11907,
253,
4477,
281,
2085,
625,
12925,
275,
436,
3634,
7826,
275,
253,
830,
273,
690,
40290,
390,
1783,
273,
253,
1781,
873,
273,
1543,
24088,
534,
3082,
452,
20544,
835,
3966,
690,
969,
1512,
15424,
5701,
403,
627,
50276,
909,
3103,
359,
2868,
627,
310,
1270,
9023,
275,
277,
72,
11333,
2104,
281,
11329,
453,
1208,
6340,
285,
1881,
26672,
1309,
1071,
673,
50276,
3062,
44003,
39325,
347,
281,
534,
5697,
253,
277,
72,
3114,
476,
1908,
651,
320,
6685,
9371,
1060,
50276,
74,
1158,
253,
2457,
12494,
273,
253,
2929,
23395,
37250,
752,
891,
1089,
281,
320,
253,
4275,
14855,
273,
436,
7714,
275,
7004,
752,
247,
1175,
22791,
310,
3185,
273,
36636,
690,
3686,
273,
5795,
390,
2403,
247,
17401,
253,
4477,
1478,
323,
271,
10893,
1230,
18511,
14430,
407,
819,
26202,
2119,
697,
13491,
281,
1239,
533,
891,
651,
969,
17458,
326,
310,
1469,
281,
320,
273,
3710,
8542,
31471,
50276,
249,
6010,
891,
3103,
3464,
8489,
10915,
8498,
758,
326,
253,
7714,
387,
1878,
275,
697,
1655,
830,
326,
7194,
48520,
281,
14226,
273,
643,
277,
72,
7274,
1223,
25369,
26085,
50276,
2858,
5010,
2870,
398,
2590,
273,
667,
44003,
12645,
323,
2852,
11701,
310,
271,
8654,
897,
273,
2600,
323,
271,
854,
7223,
8059,
2929,
50275,
37585,
2792,
50276,
20261,
11132,
277,
72,
310,
253,
1682,
11193,
281,
1524,
10554,
3237,
835,
440,
922,
16564,
3268,
267,
37122,
875,
3733,
285,
5175,
941,
403,
13353,
3264,
3239,
495,
2626,
12494,
277,
72,
310,
271,
4722,
1895,
533,
2654,
320,
247,
2372,
625,
31798,
275,
6478,
839,
697,
2554,
323,
13361,
13650,
3237,
275,
253,
1524,
10186,
387,
1878,
275,
697,
1655,
830,
277,
72,
4020,
684,
1524,
10186,
3237,
432,
247,
20793,
3884,
285,
253,
1840,
3908,
36908,
2568,
26455,
907,
973,
588,
253,
12494,
326,
987,
2920,
12325,
1880,
277,
72,
49602,
403,
50276,
783,
987,
15302,
3239,
854,
690,
37699,
1475,
253,
13133,
273,
277,
72,
2393,
327,
651,
320,
9371,
1060,
50276,
6050,
271,
4722,
1783,
273,
253,
3910,
875,
4715,
15216,
285,
5028,
26647,
556,
644,
1471,
3178,
456,
4518,
432,
643,
3510,
273,
4715,
3237,
247,
1077,
4864,
29570,
273,
253,
3064,
875,
1554,
262,
1945,
285,
23964,
297,
404,
4715,
651,
320,
12912,
1677,
627,
310,
690,
13775,
1475,
841,
2426,
275,
253,
1655,
6239,
187,
187,
4118,
18435,
27,
2520,
2929,
3400,
271,
4722,
1783,
327,
253,
2561,
327,
5028,
26647,
342,
2022,
9241,
285,
7364,
253,
4477,
2085,
247,
2266,
30080,
22559,
281,
2953,
690,
5701,
8042,
407,
30628,
512,
253,
10123,
403,
1077,
2762,
7613,
891,
5583,
14924
] |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this work proposes an online variational bayesian vb approach to continual learning the prior over neural network functions is both over the neural network structure and parameter values where the structure is modelled by an indian buffet process ibp and the weights are drawn from a gaussian similarly the approximate posterior is assumed to be ibp and factorised gaussian as well and inference is performed through variational inference and reparametrization of the respective distributions the approach is similar to vcl in that it uses online vb for learning however the prior and approximate posterior is more general in that it also considers the neural network structure as a random variable prior and posterior theory the approach is theoretically sound and well motivated the paper is presented well and easy to follow my main concern is that that the second paragraph of the paper motivates with scenarios where the ability to adapt to dynamically changing environments or evolving data distributions is essential however online vb assumes iid data that is the online algorithm should infer the posterior over all tasks rather than adapting to dynamically changing data distributions inference is sequentially but the ordering of the task should in theory not matter it only matters in practice as we perform approximations see eg 1 iclr 2020 for an approach that explicitly adapts through forgetting the distribution over neural network weights it could be possible to extend this work to similar adaptation mechanisms although for multitask learning such adaptationforgetting may not be desirable i would appreciate a few comments on this and i think it should also be discussed shortly in the paper experimental evaluation the experimental section considers scenarios that are very common in the cl literature unfortunately these are not the most interesting or insightful as these are variants of mnist but since most related work considers these settings as well the choice is justified the results are quite strong i find the results on classification from the latent space of the unsupervised learning approach especially convincing and interesting table 1 related work the relation to 2 needs to be discussed in more detail what exactly is the difference if the ipb is put on the activations rather than weights what are pros and cons i am aware that there are additional experiments in the supplementary material comparing to kessler why do you think your approach outperformed the one of kessler another interesting aspect is that the coreset does not help much which is in contrast to vcl do you think this is because the performance is already high or because the coreset selection algorithms kcenter random are unsuitable 1 continual learning with bayesian neural networks for nonstationary data iclr 2020 2 hierarchical indian buffet neural networks for bayesian continual learning docsepi found it difficult to evaluate this paper because the paper does not say which continual learning problem it is solving in the supervised learning case is it solving the classincremental learning or taskincremental learning problem without knowing that it is hard to make an assessment because the two problems are usually solved in very different ways and their evaluation protocols are different too i tried to guess but get confused the paper writes in the evaluation section to gauge the effectiveness of our model towards preventing catastrophic forgetting we report i the test accuracy of first task after learning each of the 
subsequent tasks and ii the average test accuracy over all previous tasks 1 2 t after learning each task and it also writes earlier omitting the task id t for brevity i am confused with these two statements do you need task id during training and testing if you are solving the classincremental learning task id should not be used in training or testing at least not testing how do you do i do you only use the test instances of the first task do you restrict those instances to be classified into only the classes in the first task or do you allow them to be classified to future classes in future tasks for classincremental learning one should be getting the accuracy of all classes learned so far rather than each task so i am guessing that you are doing taskincremental learning ii also gives me the same impression if that is the case the following systems are expected to be compared uncertaintybased continual learning with adaptive regularization nips2019 and overcoming catastrophic forgetting with hard attention to the task serr et al 2018 ibp is related to the mechanisms in serr et al 2018 and adel et al 2020 it is desirable to have them compared if you are doing classincremental learning more recent baselines should be compared the baselines used in your experiments are old learning a unified classifier incrementally via rebalancing cvpr 2019 overcoming catastrophic forgetting for continual learning via model adaptation iclr 2019 large scale incremental learning cvpr 2019 random path selection for continual learning neurips 2019 continuous learning of contextdependent processing in neural networks nature machine intelligence 2019 itaml an incremental taskagnostic metalearning approach cvpr 2020 continual learning with hypernetworks iclr 2020 in the experiment varying the number of tasks for each dataset is also desired to show the generality of the proposed approach docsepsummary the paper proposes a continual learning framework based on bayesian nonparametric approach the hidden layer is modeled using indian buffet process prior the inference uses a structured meanfield approximation with a gaussian family for the weights and betabernoulli for the taskmasks the variational inference is done with bayesbybackprop on a common elbo setup the experiments show less diminishing accuracy on the increment of tasks on five datasets for the discriminative problem and for generation the methods learn one digit or character at a time on mnist and notmnist datasets pros the paper shows a structured meanfield approximation of the continual learning problem and train a single hidden layer of 200 units on the discriminative and generative settings the paper is easy to follow cons the structure of the network is rather shallow the authors also mentioned the challenge in their conclusion the experiments were done on rather simple datasets the paper introduces a proof of concept instead of showing the capabilities of the proposal in complex scenarios the mixture of the structured meanfield approximation seems straightforward and builds extensively on previous work the novelty may be on making the network train using this approach but the proposal uses a rather simple layer configuration one layer comments in section 51 you mention that you report the mean accuracies on the different tasks fig 2 how many versions of the tasks do you generate and average over adding error bars would help to see the variance of the methods over the tasks minor comments typo p6 par2 tries to adapts typo p7 par 3 comapred overall 
rating the idea of applying bayesian nonparametric for continuous learning is interesting and the authors show a simple implementation on simple datasets the evaluated tasks are extremely related and in a general continuous learning setup this method may not work more extensive and complex experiments ie more complex databases and setups for discrimination as well as for generation may shed some light on the process due to all these issues i rate the paper as a 5docsepthe authors present a new structurelearning approach to continual learning by modelling each hidden layer using a nonparametric bayesian prior a technique inspired by recent work on learning sparse nns using this technique a sparse subset of available weights is used for each task selectively allowing knowledge sharing between subsequent task while reducing catastrophic forgetting on the deactivated connections this makes for an overall very convincing submission reflected by my minor criticism well placed among other recent publications on cl in top venues pros in no particular order the method is principled and follows naturally and elegantly in the vcl framework the presentation follows a clear narrative that is easy to follow the method comes with a task detection mechanism both discriminative and generative modelling are naturally supported in the same framework the appendix discusses all necessary details to a level wellabove standard in the literature analysis covers most of the interesting questions that can naturally be asked about this method presented results are overall strong and cover standard evaluation in the literature i was pleased to see experiments in an application other than supervised image classification here unsupervised learning a simple heuristic for dynamic expansion is introduced a worthwhile direction for future research cons in no particular order this method requires storage of a binary matrix for task and each layer in the network however the authors show that space complexity grows logarithmically with the number of tasks which is likely to make this an acceptable tradeoff author feedback imho the main contribution of this work is its ideas on structure learning in the context of continual learning this is however not reflected in the title the paper would benefit from clearly stating why structure learning for cl is a worthwhile direction to pursue wellwritten papers clearly state why the proposed direction is a worthwhile method of investigation figure 3 is missing axis labels
### Summary:
|
this paper proposes a bayesian nonparametric method for task-incremental continual learning. it is more general than previous work in that it considers the network structure as a random variable, and it works for both supervised and unsupervised settings. experimental results show that the proposed method outperforms prior work on the proposed tasks.

pros: it is well motivated; it is theoretically sound; it can do task inference; it outperforms other methods on the proposed tasks.

cons: the experimental setup was not very challenging, because the dataset (mnist) was simple and the network was shallow; there was no ablation study to analyze the contributions of the algorithm to the performance; there are not enough experiments to support the advantage of task inference; the paper did not compare with the sota task-incremental learning algorithms hat and den.

the main concerns of the reviewers are on the experimental section, as listed in the cons, and on the difference from previous work. the authors explained that their method has the advantage over previous work that it can do task inference; r3 agreed with the advantage and suggested more experiments in this direction should be performed. the authors conducted additional experiments suggested by the reviewers, including a comparison with hat, and they also uploaded a revised version to incorporate the comments from the reviewers. i think the paper is well motivated, and the idea of applying bayesian nonparametrics for continual learning is interesting; it could potentially motivate interesting future work on cl. however, the main advantages/contributions are not well presented and supported by the experiments, so at the present time i believe there is much room for the authors to improve their paper before publication. i hope that the authors will be able to address the feedback they received to make this submission get where it should be.
|
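The model family that the reviews and the meta-review describe, a single hidden layer with mean-field Gaussian weights trained by Bayes-by-backprop and a per-task binary mask on the hidden units drawn from a Beta-Bernoulli / Indian buffet process posterior, can be pictured with a deliberately simplified sketch. The code below is an illustration only (PyTorch assumed, fixed Bernoulli mask probabilities standing in for the IBP machinery, ELBO/KL terms omitted); it is not the paper's actual parameterisation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MaskedBayesLinear(nn.Module):
    """One hidden layer: reparameterised Gaussian weights + per-task binary unit mask."""

    def __init__(self, d_in, d_out, n_tasks):
        super().__init__()
        self.w_mu = nn.Parameter(torch.zeros(d_out, d_in))
        self.w_rho = nn.Parameter(torch.full((d_out, d_in), -5.0))  # softplus(rho) = std
        # per-task mask logits over hidden units (placeholder for the beta-bernoulli posterior)
        self.mask_logits = nn.Parameter(torch.zeros(n_tasks, d_out))

    def forward(self, x, task_id):
        std = F.softplus(self.w_rho)
        w = self.w_mu + std * torch.randn_like(std)          # bayes-by-backprop weight sample
        probs = torch.sigmoid(self.mask_logits[task_id])
        hard = torch.bernoulli(probs).detach()
        m = hard + probs - probs.detach()                     # straight-through mask sample
        return torch.relu(x @ w.t()) * m                      # mask deactivates unused units

layer = MaskedBayesLinear(784, 200, n_tasks=5)
h = layer(torch.randn(8, 784), task_id=0)
print(h.shape)  # torch.Size([8, 200])
```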
[ input_ids column: 2,048 token ids (1097, 689, 253, ..., 943, 320); full per-token list omitted ] |
[ attention_mask column: 2,048 ones (no padding); full list omitted ] |
[ labels column: 2,048 token ids, identical to the input_ids above; full list omitted ] |
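The three numeric columns attached to each row look like a standard causal-language-model tokenization of the concatenated review-and-summary text: a list of token ids, an all-ones attention mask of the same length, and labels mirroring the token ids. A minimal sketch of how such fields are commonly produced is given below; the tokenizer name, maximum length and concatenation scheme are assumptions, not read from this dataset.

```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")  # placeholder model; the real one is unknown

def build_row(input_text, output_text, max_length=2048):
    # concatenate prompt/review and summary, tokenize, and mirror input_ids into labels
    enc = tokenizer(input_text + output_text, truncation=True, max_length=max_length)
    return {
        "input_ids": enc["input_ids"],
        "attention_mask": enc["attention_mask"],  # all 1s when nothing is padded
        "labels": list(enc["input_ids"]),         # identical to input_ids for a plain LM loss
    }

row = build_row("Below is a review of a research paper ...", "the reviewers reached a consensus ...")
print(len(row["input_ids"]), row["attention_mask"][:5])
```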
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the paper proposes to use the variational estimation of the lower bound of the skew renyi divergence for contrastive representation learning. the authors compare the renyi divergence against traditional divergences such as the kl divergence and the skew kl divergence, and argue that the proposed skew renyi divergence is empirically beneficial for contrastive representation learning. while it is nice to have a new divergence and understand some of its properties, i am not quite convinced why the proposed skew renyi divergence is better than the traditional ones, for example the skew kl divergence. besides the empirical results, is there any explanation why the skew renyi divergence is preferred? another thing that bothers me is that the paper entangles the analysis of the skew renyi divergence with data augmentation. i am curious if it is possible to decouple the two: one can do contrastive learning using the estimator of the skew renyi divergence without data augmentation, and coupling the two makes the analysis harder to read. in particular, i dont feel the transition probability a(v|x) is well explained. why do the authors need to use the transition probability to define the positive and negative pairs? can we define them just using the data distribution, without the transition probability?

limitations: n/a

docsep

this paper revisits the derivations of the infonce objective from the original contrastive predictive coding paper, replacing the kl divergence in the mutual information with the renyi divergence. as with the kl divergence, the renyi divergence is known to enjoy a variational representation, which in both cases has a naive estimator that suffers from high variance. they then note a variance reduction idea, which i do not recognize from the contrastive literature, showing that for d in {kl, renyi}, replacing d(p, q) with d(p, alpha*p + (1-alpha)*q) for some small alpha yields an estimator with much improved variance. the paper proposes to use this variance reduction with the renyi divergence as a plugin replacement for infonce as a training objective. the final theoretical component is to show that the resulting objective has gradients that naturally place high importance weights on hard negatives and easy positives; the authors suggest that the easy positives in particular lend themselves to using more aggressive data augmentations in practice. experiments cover visual, tabular and graph-structured data.

congratulations to the authors on a well written, tidy piece of work. there are certainly some interesting ideas in this work and i enjoyed reading this paper. first i outline some of the positive aspects i saw in this work before turning to some weaker points and a few concerns i have, particularly with some of the experimental comparisons.

it is good to see that the thought process behind the mutual information / infonce derivation can be replicated for other divergences; perhaps the same could be done with all sorts of other divergences, leading to a variety of different behaviors in objective functions. i also liked the presentation of the skew-divergence connection; the skew-divergence seems to have useful low-variance, high-bias properties that could be useful in other contrastive learning contexts.

one of the premier benefits of renyicl is touted to be that it can gracefully handle and make use of aggressive positive sampling, thanks to its weighting biasing towards easy positives. as such, the authors use what is said to be harder data augmentations for their method. one of the ingredients constituting the harder augmentations is multicrop sampling a la swav. this raises a major concern of a lack of apples-to-apples vision comparisons, since the multicrop method has, in all cases i have ever seen, always boosted performance and increased the memory and runtime requirements. so the results reported in tables 1-3 all compare a method using multicrop to methods that do not, except for swav, which does and performs similarly to renyicl. it would be much fairer to remove the multicrop sampling from renyicl; table 4 has this number, but it is only for a 200 epoch run, so again it is not really fair to compare. finally, table 5 compares renyicl to other methods using the hard data augmentations. an important bright spot here is that renyicl does see the biggest boost of the three methods, which is great to see; however, renyicl is only compared to cpc and a variant of cpc. it remains unclear, for instance, how mocov3 would fare in comparison using multicrop. in all, it remains hard to say exactly how renyicl compares to other methods, besides to say that it is likely broadly comparable. the inclusion of experiments on tabular and graph data is welcome, and i am glad to see promising results on domains other than vision.

i end with a much more philosophical point: the motivation for why renyi divergence is mostly post hoc. instead of any intrinsic explanation for why renyi may be a better estimator than kl, the authors first go through the process of deriving their low-variance estimator and analyzing its gradients; they then find a number of natural, potentially desirable attributes of the final objective, such as easy-positive / hard-negative sample importance weights. this is again very similar to the process of producing the infonce objective starting from mutual information. it was convincingly argued by tschannen, djolonga et al 2020 that it is the form of the infonce loss itself, rather than any relation to mutual information, that explains its success, and i have to say that i very much expect the same to hold for the renyi-based objective presented in this work. from this point of view it is worth putting to one side, for a moment, all of the scaffolding built on top of the idea of the renyi divergence and simply focusing on the form of the final renyicl objective. as noted by the authors, this objective is of similar form to prior ideas, but with an importance sampling reweighting to emphasize hard negatives and easy positives. the main distinction from prior works, which have considered hard negatives and hard positives using similar importance weightings, is to keep using hard negatives but to reverse the focus to be on easy positives, and to note that this may actually buy some benefits in terms of being able to use more aggressive data augmentations. this is an interesting proposal and may well be at the core of the effectiveness of the proposed method; the renyi divergence may mostly serve to obfuscate this.

to conclude: while i have some gripes with the vision experimental setup, and suspect that the main benefit of renyicl, easy positive sampling, could similarly be implemented in any other contrastive framework using importance weights, i am still broadly supportive of this work. to me, the core recipe of easy positive weights plus aggressive data augmentations, and its justification in table 5, is already an interesting contribution, since how to produce good positive samples is perhaps the single most important question in contrastive learning. on a different note, the paper also offers other ideas: the skew-divergences and their applicability to both the kl and renyi divergences seem interesting and may be of independent interest for producing low-variance contrastive learning objectives.

post rebuttal: increased score by one. yes

docsep

this paper investigates a novel alternative formulation of contrastive learning that leverages the renyi divergence, motivated by the sensitivity of the learned representations to the augmentation scheme. the authors argue that using the renyi divergence can better manage hard augmentations. to overcome the variance issue in the naive formulation, a more stable version based on skewed statistics is advocated, with theoretical justifications. the authors showed that the proposed renyicl can spare the need for additional hacks or extra computations to get decent performance, and strong empirical evidence supports the claims made in the paper.

strengths: (quality, originality) the application of renyi contrastive learning to self-supervised learning is novel; although i have seen other recent works also attempting renyi for contrastive learning, i believe this manuscript is the only one that well balances theoretical and practical aspects. (clarity, significance) this paper is written with exceptional clarity and i enjoyed reading it. it provides extensive coverage of related work, and i am genuinely surprised to see some research threads not well known to the general contrastive learning community; nice work. i added some additional threads below which should make the literature review more comprehensive and will hopefully inspire future research ideas. (quality) strong experimental results; well curated experimental code.

weaknesses: it is not clear to me how the variance in theorem 3.2 turns into the variance in theorem 3.1 in the limiting case alpha -> 0; could the authors elaborate on that? variance is not the only reason causing infonce to perform suboptimally: the discussions in [5-7] have presented diverse perspectives on how contrastive learning can be improved based on numerical or optimization views, also backed up by strong mathematical arguments; these works are complementary to the approach adopted here and should be discussed.

misc: theorem 3.1 is related to the result from [1] on the exponential variance issue for mi estimation, which should be discussed. the objective function for the skewed chi2 divergence presented in tsai et al 2021, which this work is based on, is related to the spectral contrastive learning objective [5], which is simpler and also demonstrated strong performance gains over the vanilla infonce objective. the renyi divergence and chi2 divergence have also been successfully applied in the context of generative modeling [2-4], where the goal of minimizing d_alpha(p(x, y) || p(x) p(y)) or d_alpha(p(x) || q(x)) is shared. potential typos: alpha-cpc is actually proposed by b poole et al (icml 2019), not j song et al (neurips 2020).

[1] mcallester d and stratos k, formal limitations on the measurement of mutual information, aistats 2020
[2] y li et al, renyi divergence variational inference, neurips 2016
[3] l chen et al, variational inference and model selection with generalized evidence bounds, icml 2018
[4] c tao et al, chi-square generative adversarial network, icml 2018
[5] j haochen et al, provable guarantees for self-supervised deep learning with spectral contrastive loss, neurips 2021
[6] q guo et al, tight mutual information estimation with contrastive fenchel-legendre optimization, 2021
[7] j chen et al, simpler faster stronger: breaking the log-k curse on contrastive learners with flatnce, 2021

i dont have any concerns on limitations and potential negative societal impact.

docsep

this paper proposed a new robust contrastive learning scheme, called renyicl. by using the renyi divergence instead of the kl divergence as the objective function, the proposed method becomes robust against harder data augmentations. in order to keep the variance of the objective function small, the paper proposed the use of a variational lower bound for the skewed renyi divergence, which allows for training stabilization. the use of such a skewed divergence is based on the observation that the original contrastive learning corresponds to the variational lower bound of the skewed kl divergence.

strengths: the flow of the text is clear, and the purpose, background and proposed method of this study can be easily understood. the proposed method is simple to implement and can be easily used as a plugin to existing ssl methods. numerical experiments clearly demonstrate the effectiveness of the proposed method: it achieves comparable or slightly better performance than existing ssl methods when using normal data augmentation, and in the situation with strong data augmentation there was significant performance improvement over the existing methods (infomin and clsa) for the two conditions of cub200 and the plant dataset.

weaknesses: the significance of theorem 3.2 is not adequately presented. the paper argues that the skewed divergence allows the variance of the objective function to be bounded from above using some constants; however, this argument is not sufficiently convincing because there is not enough reference to the actual magnitude of the constant (more details are given in the questions section). the discussion about the gradient analysis of renyicl is not fully convincing: the interpretation of the effect of the change in divergence is discussed by comparing the gradient of the objective function of renyicl with the baseline gradient, but the conclusion is not convincing due to its logical development (i will discuss this in detail in the question section). lack of ablation studies on robustness: since the main concern of this paper is robustness to various data augmentations, the paper should directly demonstrate the effectiveness of the proposed method by comparing performance under ablation of the contrastive objective with strong data augmentation.

the authors have adequately addressed the limitations and potential negative societal impact of their work in the conclusion section. in particular, the paper properly mentioned, as a limitation of the proposed method, that it is difficult to identify what kind of data augmentation would be effective. yet, since the introduction of the paper motivates the choice of data augmentation as the key in contrastive learning, there seems to be a slight discrepancy between the research objective and the actual proposed method.
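For reference, the two quantities these reviews keep returning to can be written out. These are the standard textbook definitions, not the paper's exact objective; the skewing weight is written as lambda here to keep it distinct from the Rényi order alpha.

```latex
% renyi divergence of order \alpha (textbook definition)
R_\alpha(p \,\|\, q) \;=\; \frac{1}{\alpha - 1}
  \log \mathbb{E}_{x \sim q}\!\left[\left(\frac{p(x)}{q(x)}\right)^{\alpha}\right],
  \qquad \alpha > 0,\ \alpha \neq 1 .

% skewing: replace the second argument by a mixture of the two distributions
D^{(\lambda)}(p \,\|\, q) \;=\; D\!\left(p \,\middle\|\, \lambda p + (1 - \lambda)\, q\right),
  \qquad 0 < \lambda < 1 ,

% valid for D = KL or D = R_\alpha. Because
%   p(x) \,/\, (\lambda p(x) + (1-\lambda) q(x)) \le 1/\lambda,
% the density ratio inside the estimator stays bounded, which is the informal reason the
% skewed versions admit lower-variance (but biased) monte-carlo estimators.
```

The reviews' remark that InfoNCE itself can be read as a variational lower bound on a skewed KL divergence amounts to instantiating D = KL above.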
### Summary:
|
the reviewers reached a consensus that this paper is a nice addition to neurips. please refer to the reviews and the authors responses for the reviewers opinions on the strengths and weaknesses of the paper.
|
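Reviewer 2's reading of the gradient analysis, that the objective behaves like a contrastive loss carrying extra importance weights on hard negatives and easy positives, can be made concrete with a schematic loss. The code below illustrates only that qualitative idea; the weighting scheme, temperature and function names are invented for illustration, and this is not RenyiCL's actual objective.

```python
import torch
import torch.nn.functional as F

def weighted_contrastive_loss(z1, z2, tau=0.2, beta=1.0):
    """InfoNCE-style loss with explicit weights favouring easy positives and hard negatives."""
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    logits = z1 @ z2.t() / tau                       # (n, n); diagonal entries are the positives
    n = logits.size(0)
    pos = logits.diag()
    neg_mask = ~torch.eye(n, dtype=torch.bool)
    with torch.no_grad():
        # easy positives (high similarity) and hard negatives (high similarity) get larger weights
        w_pos = torch.softmax(beta * pos, dim=0) * n
        w_neg = torch.softmax(beta * logits.masked_fill(~neg_mask, float("-inf")), dim=1)
    neg_term = (w_neg * logits.exp().masked_fill(~neg_mask, 0.0)).sum(dim=1)
    return -(w_pos * (pos - torch.log(pos.exp() + neg_term))).mean()

loss = weighted_contrastive_loss(torch.randn(16, 128), torch.randn(16, 128))
print(float(loss))
```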
[ input_ids column: 2,048 token ids (247, 973, 3542, ..., 2929, 50276); full per-token list omitted ] |
[ … attention_mask: all-1s list, one entry per token … ] |
[ … labels: token-ID list for this row … ] |
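The three list columns above — input_ids, attention_mask, and labels — are the token-level encoding of the readable Input/Output text in each row. Below is a minimal sketch of how such columns are typically produced, assuming a Hugging Face-style tokenizer and a causal-LM setup in which labels simply copy input_ids; the actual tokenizer and labeling scheme behind this dump are not stated, so every concrete name in the sketch is illustrative.

```python
from transformers import AutoTokenizer

# Hypothetical tokenizer choice; the dump does not say which vocabulary produced these IDs.
tokenizer = AutoTokenizer.from_pretrained("gpt2")

def encode_row(review_text: str, summary_text: str) -> dict:
    # Rebuild the prompt format visible in the Input column of each row.
    prompt = (
        "Below is given a review of a research paper from a conference journal. "
        "Please write a summary of the review.\n"
        "### Review:\n" + review_text + "\n### Summary:\n"
    )
    full_text = prompt + summary_text
    enc = tokenizer(full_text)
    return {
        "input_ids": enc["input_ids"],            # list of token IDs, like the column above
        "attention_mask": enc["attention_mask"],  # all 1s when nothing is padded
        "labels": list(enc["input_ids"]),         # causal-LM convention: labels copy input_ids
    }
```

Under that convention the attention_mask is simply a run of 1s with the same length as input_ids, which is consistent with the all-1s column shown above.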
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
in this paper the authors propose a framework to learn the query embeddings for hyperrelational kgs which allows qa over hyperrelational kgs with more complex questions and makes use of the qualifiers they also introduce a new hyperrelational kg qa dataset wd50kqe for the evaluation of their proposed method experiments show that qualifiers help their framework achieve more accurate results in general compared with the baselines that use information from tripleonly graphs strength the paper is wellmotivated and the authors propose a reasonable solution to the hyperrelational kg qe problem experiments show that the proposed problem outperforms most of the previous methods that only use tripleonly graphs the authors prepare a new dataset for the hyperrelational kg qe problem weakness the description of their proposed method is not very clear for example we use xy to indicate the representation of y in x in the qualifier representation paragraph of sec 4 readers may be confused about the meaning of x the authors should explain that x can be e r etc in table 1 compared with reification the performance improvement of starqe is not very obvious questions could you clarify whether starelike in table 2 is the same as starqe in table 1 it is better to use a more consistent name or explain their relation more clearly in table 2 caption docsepthe paper presents an approach for embeddings queries over hyper relation graphs hyperrelational graphs specifically those that have context are usually represented and encoded as regular triples by the existing approaches to do so the work represent the hyperrelational graphs as fol with parameterizing the predicate to include context the model is a combination of three architectures where firstly encodes the query graph using stare then compgcn to aggregate for representing qualifiers and mpqe for learning the query embeddings in order to evaluate this approach the work defines a new dataset on wikidata because of reified triplesqualifiers the paper is well written and is an important step in the direction of handling multiple representations of knowledge graphs most existing approaches are focused on triple based techniques however ignoring complex but more expressive representations such as reification andor qualifiers therefore in terms of novelty of the problem this work clearly seems to be taking the next step in query embeddings the approach is an interesting and valid combination of other approaches to solve the problem the evaluation of the dataset clearly shows that the approach works my main concerns of this work is as follows 1 the aspect of entering a new territory should also be backward compatible with older representations this raises the question of the performance of the system on freebase datasets primarily evaluated by other state of the art approaches such as betae query2box emql etc 2 emql evaluates faithfulness of the overall approach it is also important to measure how well the work performs on complete knowledge graphs and not only incomplete ones have the authors tried to do this 3 the change from query2box emql to beta embeddings was its way of handling negation and disjunction operators how does this work compare in terms of handling the operators of efol 4 why does oracles performance drop for 3p queries and how does the proposed approach handle them better if there are qualitative analysis on this it would be great 5 can you please elaborate on starelike row in table 2 i am not convinced that it is an apples to apples comparison to 
the other rows in the table the paper is well written and addresses an important direction is handling more complex representations of knowledge graphs the work seems solid however lacks some aspects of evaluation it is unable to justify that it is backward compatible to the original datasets which would be of great value even though not being at par with the state of the art approaches in other words it is ok to show drop in performance on freebase datasets but claim generalizability over both traditional representations and kgs with reification docsepthis paper studies the multihop logical reasoning problem with hyperrelational knowledge graphs in hyper relational knowledge graphs edges are associated with some keyvalue pairs used for describing the contextual information the goal of this paper is to incorporate such information in order to provide more finegrained question answering specifically starqe a combination of mpqeand stare is proposed to achieve this goal the authors constructed a new dataset wd50kqe and the experimental results show that starqe can effectively model the qualifiers in the dataset this paper proposed starqe for question answering over hyperrelational knowledge graphs overall the motivation is strong and the method is technically sound pros 1 the task is wellmotivated and interesting it can deal with the questionanswering task with qualifier pairs yet some contextual information is not allowed 2 the proposed model is simple but performs well on certain types of queries and it has a certain generalization capability 3 a new dataset is constructed in order to verify the model efficacy 4 the paper is wellwritten and easy to follow cons 1 the experiments could be more convincing by using more datasets currently only one dataset is used it is not sure that if this method can also be used on other datasets or domains also since the model can handle tripleonly scenarios i suggest the authors also consider reporting the performances on freebase and nell as done in query2box as such we can see whether or not this model can deal with more basic cases 2 the novelty is limited the model is just a combination of two existing approaches 3 it cannot handle queries about numbers text and time while some existing methods abc can handle such information in knowledge graph embeddings those important references are missing and should be used as baselines especially on 1p queries 4 important reasoning operators such as negation disjunctions are not considered making the approach less attractive i guess we can draw some insights from betae for this type of operator 5 i suggest the authors add some baselines in table 2 so that we can know the differences between starqe and baselines in generalization capability question can the model benefit from nonlinear layers transformations a knowledge graph embedding with numeric attributes of entities acl 2018 b incorporating literals into knowledge graph embeddings iswc 2019 c beyond triplets hyperrelational knowledge graph embedding for link prediction www 2020 the current version of this paper is marginally below the acceptance threshold i believe some improvements can be made to strengthen the paper docsepthis paper studies how to embed and answer hyperrelational conjunctive queries based on recent advancements in graph neural networks and query embedding techniques and propose a method to answer such queries and demonstrate in their experiments that qualifiers improve qa on a diverse set of query patterns this paper has a certain novelty 
the challenge of multihop logical reasoning is not clearly stated i suggest that the author should discuss the challenge in detail in the modeldescription section the author should first explain the challenges of solving the problem and how the proposed method is solvedi suggest the author introduce it in detail in the introduction of related work the author did not analyze the advantages and disadvantages and did not discuss the proposed method which could not prove the innovation of the proposed method there is little description of query embedding in this papersuggeest the author to add more space to introduce it the abstract of the experimental results is not specific enough there is only few references in the past 3 years it is recommended to ensure the latestness of the literature when investigating related work otherwise it is difficult to persuade the work to be novel this paper propose a method to answer such queries and demonstrate in their experiments that qualifiers improve qa on a diverse set of query patterns the logic is clear the argument is complete and the grammatical expression is very standard however in the process of demonstration there is a lack of description of the challenge and thinking about how to solve the problem i think the authors needs to improve the writing docsepthis paper proposes a novel problem to embed query graphs with edge qualifiers to query a hyperrelational knowledge graph and a solution extending an existing approach for a nonhyperrelational knowledge graph due to the lack of evaluation datasets they propose to use a hyperrelational knowledge graph extracted from wikidata they also consider three baselines that can show the characteristics of the proposed approach including reification baseline that transforms qualifiers to nodes the overall performance shows the proposed approach shows benefit over the baselines and usefulness when edge qualifiers are available strengths s1 they proposed the first query embedding approach for a graph with qualifiers s2 the proposed approach is compared with three baselines including the reification baseline s3 a new evaluation dataset is proposed weaknesses w1 some of important information had to go to appendix due to space for example i think the best hyperparameters used in the experiment can stay in the main paper w2 the proposed approach performs worse than the reification baseline when the problem is easier and the answer can be narrow down with traditional conjunctions w3 another baseline of embedding the entire edgequalifiers pair as a unit rather than representing the qualifiers as separate nodes is not considered it can be predicted that this baseline is not suitable for a large number of possible qualifier values but can be easily applicable to the suggested example by considering three atomic relations instead of qualifiers educatedatbsc educatedatms educatedatphd i think this is a solid piece of work extending the problem space solved by modern embedding techniques the presentation is mostly solid and easy to read and understand there is some very minor presentation issues that can be improved such as adding the best hyperparameters in the main paper or adding the missing node color in the legend in figure 4 or maybe using oracle instead of just oracle which can be a bit misleading but otherwise this paper is well written with a clear goal and contributions and evaluations since in case of limited set of qualifier values the edgequalifiers can be easily represented and embedded as altogether it 
would be interesting to see the comparison with such a baseline but i dont think this is necessary to show the value of this paper
### Summary:
|
this paper presents a query embedding approach for answering multihop queries over a hyperrelational knowledge graph kg the main contributions are a new dataset wd50kqe for this task and a simple but sensible extension to an existing model for query embeddings to also handle relation qualifiers reviewers wjvm and bute note that the reification and starqe models perform similarly while this is not a negative result as the authors note it does raise the question of the relative pros and cons of the two methods i hope the authors can add a discussion of when one might prefer starqe over the conceptually simpler reification method in the final version the authors addressed reviewer frrts concerns about faithfulness and backwards compatibility though more evidence on purely triplebased tasks would be nice here reviewer gqar also raised some concerns about writing but the other reviewers mostly found the paper to be well written and well motivated and i tend to agree overall while there are some very good suggestions on how the paper can be extended and improved i find the current contributions to be substantial enough to warrant a publication
|
[ … input_ids: token-ID list encoding this row's text … ] |
[ … attention_mask: all-1s list, one entry per token … ] |
[ … labels: token-ID list for this row … ] |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the paper studies proximal graph event models and in particular how to evaluate process independencies such evaluation tests are fundamental for learning causal structure the study of process independencies presents unique challenges on top of usual causal discovery while it is already hard to test different conditional independence requirements among multiple variables studying causal structure learning requires testing conditional independences among different processes to start with each process can be a highdimensional object even if we are just testing the conditional independence structure among three variables the paper makes an ambitious attempt to address this problem for which it proposes conditional process independence tests for causal pgems and deploys them in constraintbased structure discovery algorithms the key innovation is the treatment of multivariate conditioning in independence testing for which it proposes a normalized influence tester and a likelihood ratio tester the paper demonstrates improved empirical performance in empirical studies i think the paper would benefit from a further explanation with intuition of why the proposed algorithm is expected to do well while the proposals make sense a constant question i had while reading the paper is that multivariate conditioning is an intrinsically hard problem so what exactly does the proposal do to solve it why or why not is influence the right metric for such tasks how should we expect the influence tester to compare with the likelihood ratio tester are the two expected to perform similarly or differently in different settings when do we expect each of them to perform well i think answering these questions would make the paper much stronger docsepthe paper introduces the notion of pgems and subsequently it defines their respective notions of separation markov conditions and faithfulness it uses these notions to modify the pc algorithm and introduce an additional algorithm for learning the graph the empirical evaluation is sufficiently extensive for this study and seems encouraging unfortunately this is only a borderline paper due to the lack of clarity in the writing there are several places in which this can be substantially improved here are some pointers 1 insufficient intuitive comparison is made between causal graphs and pgems and how their notions of separation markov conditions and faithfulness are related 2 tying into the previous point there are many definitions introduced without any intuition about what they represent for example it would be useful to explain how definition 3 is related to the causal markov property 3 in figure 1 it is unclear why part b follows from the timeline presented in part a 4 the authors should rigorously define the marked point process 5 the authors should explain in more detail the meaning of a conditional density also emanate may not be the best verb to use when describing wx in the same paragraph 6 immediately before the start of section 321 the meaning of finding optimal windows is unclear 7 in equation 3 the authors have not defined bar y docsepi do not know much about causal discovery or event process analysis so i cannot speak for the accuracy of technical details in this paper but as a reader who has very little background knowledge i found this paper very well written the exposition is crisp precise and concise while still providing the right amount of information for unfamiliar readers to understand the high level messages of this paper the idea appears novel and the execution
looks very solid and complete therefore i recommend acceptance but due to my limited knowledge my assessment may not be particularly accurate
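For readers unfamiliar with likelihood ratio testers in this setting, the sketch below illustrates the generic recipe under the simplifying assumption of piecewise-constant Poisson rates per conditioning-state window; the function names, data layout, and chi-square calibration are assumptions made for illustration, not the tester actually proposed in the paper.

```python
# Illustrative likelihood-ratio check of whether an extra conditioning process adds
# explanatory power for a target event process, assuming homogeneous Poisson rates
# within each conditioning-state window (a simplifying assumption, not the paper's model).
import numpy as np
from scipy.stats import chi2

def segment_loglik(counts, durations):
    # MLE rate per segment is count / duration; constant log(n!) terms cancel in the ratio.
    ll = 0.0
    for n, d in zip(counts, durations):
        if d > 0:
            rate = max(n / d, 1e-12)
            ll += n * np.log(rate) - rate * d
    return ll

def likelihood_ratio_test(reduced_counts, reduced_durations,
                          full_counts, full_durations, extra_rate_params):
    # "reduced" conditions only on the existing parents, "full" also on the candidate parent.
    ll_reduced = segment_loglik(reduced_counts, reduced_durations)
    ll_full = segment_loglik(full_counts, full_durations)
    stat = 2.0 * (ll_full - ll_reduced)
    return stat, chi2.sf(stat, df=extra_rate_params)
```

A large statistic (small p-value) is evidence that the candidate parent changes the fitted rates, i.e. evidence against process independence.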
### Summary:
the paper addresses an important question in analyzing event data and the reviewers agree that the paper makes a solid technical contribution i would encourage the authors to take the reviewer comments to heart and make the presentation a bit more accessible to a community whose core familiarity is with standard dags and iid data acknowledging space constraints otherwise the paper connects well with prior research in this area and is a nice contribution
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the paper proposes a scheme for transitioning to favorable starting states for executing given options in continuous domains two learning processes are carried out simultaneously one learns a proximity function to favorable states from previous trajectories and executions of the option and the other learns the transition policies based on a dense reward provided by the proximity function both parts of the learning algorithm are pretty straightforward but their combination turns out to be quite elegant the experiments suggest that the scheme works and in particular does not get stuck in local minima the experiments involve fairly realistic robotic applications with complex options which lends credibility to the results overall this is a nice contribution to the options literature the scheme itself is quite simple and straightforward but still useful one point that i would like to see elaborated is the choice of the exponentially discounted proximity function wouldnt a linear function of the step count be more natural here the exponent loses sensitivity as the number of steps away increases which may lead to sparser rewards docsepthe paper presents a method for learning policies for transitioning from one task to another with the goal of completing complex tasks at the heart of the method is a state proximity estimator which measures the distance between states in the originator and destination tasks this estimator is used in the reward for the transition policy the method is evaluated on a number of mujoco tasks including locomotion and manipulation strengths well motivated and relevant topic one of the big downsides in the current state of the art is the lack of understanding of how to learn complex tasks this paper tackles that problem the paper is well written and the presentation is clear the method is simple yet original overall an elegant approach that appears to be working well comprehensive evaluations over several tasks and several baselines questions in the metapolicy what ensures consistency ie that it selects the same policy in consecutive steps can the authors comment on the weaknesses and the limits of the method docsepsummary the authors propose a new training scheme with a learned auxiliary reward function to optimise transition policies ie policies that connect the ending state of a previous macro actionoption with good initiation states of the following macro actionoption quality clarity the paper is well written and features an extensive set of experiments originality i am not aware of similar work and believe the idea is novel significance several recent papers have proposed to approach the topic of learning hierarchical policies not by training the hierarchy endtoend but by first learning useful individual behavioural patterns eg skills which then later can be used and sequentially chained together by higherlevel policies i believe the work presented here can be quite helpful to do so as the individual skills are not optimised for smooth composition and are therefore likely to fail when naively used sequentially
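As a rough illustration of the two coupled learners described in the first review, a proximity predictor trained from earlier transition attempts and a transition policy driven by a dense proximity-based reward, here is a minimal sketch; the network shapes, the exponential discount delta, and the reward definition are illustrative assumptions, not the authors' implementation.

```python
# Sketch only: proximity predictor over states plus a dense reward equal to the
# increase in predicted proximity; all interfaces and hyperparameters are assumed.
import torch
import torch.nn as nn

class ProximityPredictor(nn.Module):
    def __init__(self, state_dim, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(state_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, 1), nn.Sigmoid(),
        )

    def forward(self, state):
        return self.net(state).squeeze(-1)

def proximity_targets(num_steps, success, delta=0.95):
    # Exponentially discounted targets: 1 at a good initiation state, delta**k for a
    # state k steps earlier, and 0 everywhere along failed transition attempts.
    if not success:
        return [0.0] * num_steps
    return [delta ** (num_steps - 1 - t) for t in range(num_steps)]

def dense_transition_reward(proximity, state, next_state):
    # Reward the transition policy for moving toward states the predictor rates highly.
    with torch.no_grad():
        return (proximity(next_state) - proximity(state)).item()
```

The linear alternative raised in the first review would simply replace the delta ** k schedule above with a clipped linear ramp.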
### Summary:
strengths the paper tackles a novel wellmotivated problem related to options hrl the problem is that of learning transition policies and the paper proposes a novel and simple solution to that problem using learned proximity predictors and transition policies that can leverage those solid evaluations are done on simulated locomotion and manipulation tasks the paper is well written weaknesses limitations were not originally discussed in any depth there is related work related to subgoal generation in hrl ac the physics of the 2d walker simulations looks to be unrealistic the character seems to move in a lowgravity environment and can lean forwards at extreme angles without falling it would be good to see this explained there is a consensus among reviewers and ac that the paper would make an excellent iclr contribution ac i suggest a poster presentation it could also be considered for oral presentation based on the very positive reception by reviewers
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper studies the statistical limits of deep learning for solving pdes for highdimensional data the authors present a thorough theoretical and empirical evaluation of neural pde solvers the paper presents the work in the context of known theoretical frameworks for evaluating the performance as a function of the input dimension they describe the implications of the results in practice by comparing them with existing neural pde solvers the implications are important for understanding the limits of their use docsepthe authors study the limits of the deep ritz method drm and physics informed neural networks pinn for the schrdinger equation with hypercube boundary conditions and obtain new minmax optimal bounds for pinns and for an adhoc modified version of the drm experimental results back up the claims the authors also provide much detail and proof in the appendix this work can be an important contribution to the scientific machine learning community by rigorously providing new optimal bounds for deep pde solvers typos 1 abstract openreview page schrdinger appears as schrodinger there is also a citation leftover from latex as citepduan2021convergencejiao2021convergence 2 line 34 optimiality 3 line 688 to 673 more than one instance of lemma docsepdeep ritz method drm and physics informed neural networks pinns are deep learning methods to solve pdes the authors took the same case setting as 11 and 25 which is the elliptic schrdinger equation with hypercube boundary condition in this setting they show that drm and pinns can achieve an optimal bound of o(1/n) instead of the previously best bound of o(1/sqrt(n)) their upper bounds match the lower minmax bound drm does not achieve the improved bound however they show how to modify drm to achieve the same minmax bound the authors provide experiments in simple settings to show that their bounds match the experimental results significance of the work given that they derive the optimal minmax bounds of the two major deep learning methods to solve pdes further improvement is unlikely which makes this work very important theoretically the assumptions do not seem too strong however this bound only applies to a very specific scenario the schrdinger equation with hypercube boundary condition their modified version of drm could improve practical results which is very useful for practitioners line 106 section
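For orientation, the snippet below shows a generic PINN-style loss for a static Schrodinger-type equation -laplacian(u) + V u = f on a hypercube, which is the flavour of problem discussed above; the choices of V, f, the collocation sampling, and the boundary penalty are assumptions for illustration and do not reproduce the exact setting analysed in the paper.

```python
# Generic PINN residual loss for -laplacian(u) + V(x) * u = f(x); sketch only, with
# assumed callables V and f returning one value per collocation point.
import torch

def pinn_loss(u_net, x_interior, x_boundary, V, f, boundary_weight=1.0):
    x = x_interior.clone().requires_grad_(True)
    u = u_net(x).squeeze(-1)
    grad_u = torch.autograd.grad(u.sum(), x, create_graph=True)[0]
    laplacian = torch.zeros_like(u)
    for i in range(x.shape[1]):
        # Second derivative in coordinate i via a second autograd pass.
        d2u = torch.autograd.grad(grad_u[:, i].sum(), x, create_graph=True)[0][:, i]
        laplacian = laplacian + d2u
    residual = -laplacian + V(x) * u - f(x)
    interior_term = (residual ** 2).mean()
    boundary_term = (u_net(x_boundary).squeeze(-1) ** 2).mean()  # assumed zero boundary data
    return interior_term + boundary_weight * boundary_term
```

Roughly speaking, the deep Ritz method would instead minimise the variational energy of the same equation rather than this squared strong-form residual, which is where the statistical behaviour of the two solvers can differ.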
### Summary:
the paper develops bounds on the optimality of drms and pinns used to solve a class of pdes the reviewers were positive about this submission and its theoretical contributions
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the main contributions of this paper are 1 a particular architecture that combines a variational autoencoder vae and makes predictions based on the distances between instances and prototypes and 2 a series of visualizations in the latent space that explain the models predictions ie the explanation space in order to verify their assumption that their networks provide faithful and rich explanations the authors conduct user studies on mnist and fashionmnist datasets overall the paper introduces an interesting concept explanation space and an interesting way of constructing networks that incorporate some constraints ie the distance layer into the model as a result part of the internal components are not black boxes to humans any more it also comes with an easier way to visualize the internal activations of the model by the decoder part of a vae however even though i appreciate the potential merits in the papers idea of building better networks in terms of interpretability i have several concerns about the motivation of the technique and inadequate evaluations that may not be sufficient to convince the reader please find my detailed review as follows novelty the idea of combining a distance layer with a vae is novel and the concept of an explanation space is also first proposed in this paper these two techniques make this paper stand out from other vaebased explanations however these techniques are not wellmotivated to me and the particular application of the proposed visualizations explanation space is also missing to clarify i have the following two questions regarding the motivation of the paper 1 what is the concrete problem this paper tries to solve as the authors argue that however examples are powerful but not enough in the introduction of the paper the sentence seems not to be completed because the most important object is missing the examplebased explanations are not enough for what questions i am interested in what kind of questions the authors have in mind that a human user may have but is not able to fully answer by the current explanation approaches for example with a classifier that distinguishes dogs from cats what does the concrete question look like which requires a set of explanations ie an explanation space instead of examples generated by prior work without a clarification like that i am not sure how the evaluations provided in the results of the paper justify the significance and validate the efficiency of the proposed approach 2 the current motivation for using the uncertainty layer seems to be quite weak the authors do not justify what motivates the use of an uncertainty layer on top of the network the only relevant information i am able to find is that by using the uncertainty layer the accuracy of the model is improved by 1 this is not convincing enough to serve as a strong motivation because 1 the improvement of performance is strongly related to the data distribution which does not serve as a motivation if using other datasets for example a 1 increase can be very significant for large datasets ie imagenet but pretty trivial for the datasets used in this paper fashionmnist and mnist 2 the improvement of the accuracy should instead be the result of employing the uncertainty layer instead of the motivation to do so the way the current paper justifies the use of this layer does not convince me of the necessity of it and i am curious to see what the changes in explanations may look like with the 1 decrease of the networks performance a minor point understanding the models internal
behavior by the distances of representations learned by vae or any encoderdecoder networks are not first proposed in this paper some related work 1 seems worth mentioning and discussing technical quality i have three concerns regarding the analysis and the experiments in this paper 1 the empirical findings seem to be the main contributions of this paper however some descriptions seem to convey the message that the proposed explanation enjoys some nice properties but they are either not clearly defined or proved as the authors emphasize in the abstract that but we propose an inherently interpretable model for more faithful explanations i am not able to locate clear definitions in the paper about inherently interpretable and a measurable definition of faithfulness to support the argument that the proposed method is more faithful further what are the baselines for the authors to derive the conclusion that the proposed method is more faithful i will come back to the baseline issue in the next bullet point 2 the empirical evaluations are not strong enough to support the claims that the authors make about the contributions firstly most of the experiments are conducted on mnist and fashionmnist but these two data distributions are sometimes too simple to generalize to higher dimensions ie imagenet it is not new to the community that a lot of algorithms will work perfectly on mnist and fashionmnist but fail dramatically on more complicated colorful images as in practice you may not actually need a deep net to achieve a great performance on mnist for example the knearest neighbors classifier is both explainbale and accurate on mnist i would encourage the authors to expand the empirical evaluations to datasets with higher dimensions in the input features before rushing to make any conclusions 3 baselines are missing in the evaluation section the major part of the evaluation section seems to focus on explore that the proposed method can bring to the user however before rushing into the exploration part of the method there is a missing part of comparing the proposed method with the prior work even though the authors emphasize that the proposed method provides a set of examples instead of just one which does not exist in the previous work however it does not seem to be unfair to compare one or several examples samples from the explanation space with some prior work that provides instancebased explanations 2 3 4 5 6 unless i misunderstand the paper and please help me understand why such comparisons are not useful to show the proposed explanation space provides better explanations these baselines may not provide an appletoapple comparison but are worth discussing and performing some comparisons in the ballpark clarity the writing of the introduction section is a bit confusing to me because i miss the part about what the problem the paper aims to answer and how the results in the evaluation sections help to show that the proposed method solves the problem figure 1 and 2 are clear to help readers to understand the proposed architecture and i appreciate that significance with the aforementioned review the significance of this paper can vary one the one side if the authors can help me understand the what is the subject of the explanation that is the question requiring an answer like the one proposed in the paper and can not be answered by prior work and why the current evaluation is sufficient to support the claim i would recommend this paper because the contributions are significant to the explanation 
community on the other hand the current manuscript does not seem to be able to convince me that the contributions are sound and significant therefore i am on the negative side but will decrease my confidence in the score because i am willing to increase it once my concerns are resolved 1 yang ceyuan et al semantic hierarchy emerges in deep generative representations for scene synthesis int j comput vis 129 2021 14511466 2 pang wei koh and percy liang 2017 understanding blackbox predictions via influence functions in proceedings of the 34th international conference on machine learning volume 70icml17 jmlrorg 18851894 3 kim been et al examples are not enough learn to criticize criticism for interpretability nips2016 4 yeh chihkuan et al representer point selection for explaining deep neural networks neurips2018 5 pruthi garima et al estimating training data influence by tracking gradient descent arxivabs200208484 2020 n pag 6 goyal yash et al counterfactual visual explanations arxiv abs190407451 2019 n pag overall i recommend for a rejection because the current version of the paper is not wellmotivated and the evaluations are not sufficient to support the claims and contributions made by this paper i am open to any discussions and will increase my score if my concerns are resolved docsepthis paper extends existing series of prototypebased dnn for image classification their model uses vae to ensure similar latent representations share similarities in appearance a problem that many existing prototypebased cnn models suffer they also include an uncertainty layer which improves the predictive performance the authors also conducted human evaluations via amazon mechanical turks to validate the interpretability of their model this paper improved existing prototypebased dnn by enhancing interpretability and while also improving the performance i enjoyed reading the paper and i think the proposed solutions will benefit other prototypebased dnn models for image classification i have a couple of questions that i wish the authors could answer 1 in the original protopnet paper the authors li et al designed separation cost and clustering cost in their objective with the goal to push images closer to prototypes of the same label and away from incorrect prototypes i dont see such designs in this paper is there a reason that you dont choose to include these terms 2 in some prototypebased models they also include a emphdiversity term which encourages protoypes to be different to avoid redundancy there are no such designs in the proposed model do prototypes have redundancy issues do some of them look very similar to each other 3 im wondering if you can discuss the connection between your explanations for basic explanations and counterfactual explanations because it seems by changing certain features following your basic explanations and explanations for basic explanations you can guide the model to get a different prediction you did mention that visualized images of our explanations are fundamentally generated through vae making it challenging to generate humanrecognizable images for complex datasets that are difficult for vae to generate the observable image would that be a reason that your model cannot be used to generate counterfactual explanations overall i enjoyed reading the paper i only have a few questions which basically asked the authors to clarify their design choices especially those that are different from existing works docsepthe paper extends prototypesbased explanations by proposing what they 
call explanation spaces describing the relationships between input and prototypes the relationship between prototypes and how prototypes are distributed they construct explanation spaces using vaes and suggest a way to find the optimal number of prototypes the key issue that explanation spaces try to address is that having similar latent representations does not guarantee that images share similarities in humandiscernible ways so the method uses vae to regularize the latent representations this is an interesting idea i have a few questions about the experimental results how was reliability measured that factorvae forms a more reliable explanation space in section 51 it looks like there are two orange clusters in figure 3b the idea is interesting but not a lot of novelty as some existing papers eg lis deep learning for casebased reasoning through prototypes paper already provide notions of distances between inputs and prototypes prototypes and test data points etc docsepthe paper tries to provide an inherently interpretable neural network unlike the previous method of o li aaai 2018 the paper uses vae instead of ae it considers the distance between the distribution of the input and the distribution of prototypes in the latent space of a vae the explanation space the paper also provides a method for determining the number of prototypes using the bayesian information criterion bic for choosing the optimal number of components in the corresponding gmm the continuous nature of the embedding space of a vae makes it more suitable than an ae this is an improvement over the previous work of oscar li et al aaai 2018 training vae may have several issues it may require many training samples to be able to generate explanation images of adequate quality in the paper generated samples are based on toy datasets mnist or fmnist it is not clear how complicated the image generation process would become if the method be applied to more serious datasets my concern is that at some point the challenge will become highquality image generation rather than a classification task also the decoder of the vae may degrade the classification performance c chen neurips 2019 too much regularization may degrade the classification performance too little regularization vae may result in a poor generation of the explanation images do these extra explanations this method is providing offer anything more than a simple prototype in other words if the authors had a comparison study in their mechanical turk experiment with pdl oscar li et al aaai 2018 would there have been any benefit over using their method the paper unfortunately hasnt done an experiment with pdl only comparing to the no explanation scenario i am not a native english speaker but there are too many grammatical issues in the paper for example in pp2 paragraph 2 among our model i relationships meanwhile prototypes are given for explanation we developed the model that progresses or in page 9 it is proper to use i see values in the paper as using a vae instead of ae is a natural improvement over previous work nevertheless i have major concerns as described above in particular the paper is missing a comparison with previous works especially oscar li et al aaai 2018
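The last review above mentions selecting the number of prototypes with the Bayesian information criterion over a Gaussian mixture fitted in the VAE latent space; a minimal sketch of that selection step follows, where the candidate range, covariance type, and the use of component means to seed prototypes are assumptions for illustration rather than the paper's exact procedure.

```python
# Sketch: choose the prototype count as the GMM component number with the lowest BIC
# on the VAE latent codes; details are assumed, not taken from the paper under review.
import numpy as np
from sklearn.mixture import GaussianMixture

def choose_num_prototypes(latents, k_min=2, k_max=20, seed=0):
    """latents: array of shape (n_samples, latent_dim) holding VAE encodings."""
    best_k, best_bic, best_gmm = None, np.inf, None
    for k in range(k_min, k_max + 1):
        gmm = GaussianMixture(n_components=k, covariance_type="full",
                              random_state=seed).fit(latents)
        bic = gmm.bic(latents)
        if bic < best_bic:
            best_k, best_bic, best_gmm = k, bic, gmm
    return best_k, best_gmm.means_  # component means can initialise the prototype vectors
```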
### Summary:
this paper proposes to create an explanation space to describe the relationships between input data and prototypes and also between the prototypes themselves it constructs such a space using vaes and conducts experiments to validate the effectiveness and interpretability of the method strengths the proposed method is interesting and intuitive weakness novelty of the idea is limited missing experiment comparison with some important previous work some claims are not well supported by the empirical results
|
[input_ids column omitted — a machine-generated sequence of tokenizer IDs duplicating the review and summary text above] |
[attention_mask column omitted — a sequence of 1s matching the input_ids length] |
[labels column omitted — a tokenizer ID sequence that appears to mirror the input_ids column] |
Below is given review of a research paper from conference journal. Please write a summary of the review.
### Review:
this work studies policy gradient methods for domain randomization dr in particular it investigates baselines for policy gradient under the dr settings such that the gradient estimate can have a lower variance to ensure better policy updates and learning this paper derives optimal stateenvironmentdependent baseline theoretically gives general recipes for building the baseline and proposes an algorithm called variance reduced domain randomization vrdr vrdr is evaluated on several continuous control tasks and it performs better compared with two baselines strengths 1 this work has a good structure is wellwritten and i feel pleased to read this paper 2 this work is wellmotivated and i believe the topic is highlysignificant and related to the venue 3 this work has a good balance of theoretical contribution and empirical evaluation it is technically novel and the proposed algorithm works well in practice compared to baselines weaknessesquestions 1 this work targets the variance of the gradient estimates so why not directly measure the variance of the gradient estimates eg empirically i think this is the most straightforward way 2 i agree lower variance gradient estimates could converge fast but in general its also easier to converge to local optimal what are your opinions on this 4 for sec52 could you elaborate more on why vrdr generalizes better to unseen environments i dont find it intuitive to understand why lowervariance gradient estimates find a solution that generalizes better as opposed to sgd in supervised learning whats the connection or insight 5 whats mrpo at the end of sec51 probably a typo 6 could you elaborate on the added computational cost compared to the baselines i enjoy reading the paper and i find it theoretically novel and sound however i do not find the experiments very straightforward i believe its necessary to compare the empirical variance of the gradient estimates with the baselines docsepthe paper derived an optimal stateenvironmentdependent baseline and a variance reduced domain randomization approach for policy gradient methods the paper is well written and the mathematical derivation is sound the idea of extending the state values as baselines to the additional parameterization of the environment variations is natural because the variance reduction technique in standard singleenvironment training is wellstudied the extension to the additional parameterization is relatively straightforward the technical details are wellpresented my main concerns are about the experimental results i think the analysis is too weak and not enough baselines are compared with by looking at figure 1 it is quite hard to conclude that the proposed methods really help whereas the standard variance reduction does make a crucial difference in training i hope the authors delve more into the results and in particular elaborate on possible reasons when the benefits are unclear rather than just a simple paragraph on a few environments where the method is marginally better and i would guess there can be some examples where it is worse i do not believe these are all the baselines that should be considered for instance theres no comparison with meta learning or robust rl methods the paper is well written and the mathematical derivation is sound my main concerns are about the experimental results i think the analysis is too weak and not enough baselines are compared with docsepthis paper tackles the high variance problem caused by the randomization of environments for estimating the policy gradients 
the idea is to derive a biasfree and stateenvironmentdependent optimal baseline for domain randomization the authors further develop a vrdr method by dividing the entire environment space into subspaces and estimating the statesubspacedependent baseline this manuscript is a rederivation of the control variate given that the randomness in the environment is partially artificial domain randomization the control variate induces a baseline that depends not only on the state but also on the environment this term unsurprisingly has some traceability issue in practice and then the authors provide a divideandconquer idea to partially address it experiments show that this method is marginally better than vanilla domain randomization pros 1 the paper tackles an interesting problem how to reduce the variance and improve sample efficiency during the training of dr by developing a better baseline 2 the paper is clearly written in general discussion 1 theorem 1 is not useful the term can be both positive and negative i dont think control variates in rl has guarantees in variance reduction so i would suggest removing the theorem 2 the proposed method is based on the premise that the environment parameter is known and used to calculate the clustering prototypes during the training which may not be valid for the real setting where the agent only can observe the environment 3 the convergence improvement in figure 1 is not significant for some tasks like pendulum and pendulum2d yet with the additional cost of the clustering process 4 the paper uses the hierarchical cluster method to partition the environment space does the cluster method influence the experimental results why are other cluster methods like kmeans not considered and choosing a proper clustering interval seems nontrivial with grid search since the values in figure 4 fall into different ranges for different tasks 5 the parameter nc is used without definition in alg 1 6 in section 52 the authors learn 15 policies for each algorithm on testing environments to test the generalization to the unseen environment i guess that the authors wanted to express that they use the trained policies on testing environments directly 7 the paper uses the uniform domain randomization as a baseline which is not the proposed method in mehta et al 2020 cited by the paper why does the paper not compare to the active domain randomization algorithm developed in mehta et al 2020 this is a marginal improvement of domain randomization with marginal improvement in experiments docsepthis paper tackles the variance of policy gradient due to the domain randomization used in rl in simulations the authors prove that the policy gradient variance can be further reduced by learning a statedependent baseline for each environment parameter compared to statedependent baselines the authors then develop a practical algorithm based on the analysis and analyze the properties of the algorithm the algorithm is implemented and tested on six robot control tasks it consistently accelerates policy training strengths its great to see that the proposed algorithm demonstrates good generalizability compared to other methods in figure 2 i have a question about a plot though in figure 2a the most left subplot the drs score is no higher than 1400 however in figure 1 a the mean of the dr curve is around 3000 did i interpret the plots correctly domain randomization is commonly used in practice to train good policies using simulators the problem that the submission tries to address is relevant to the 
community i appreciate the efforts of deriving and analyzing the practical algorithms based on theoretical development although it would be better to analyze the variance reduction of the practical baseline proposed concerns the novelty of the submission is limited as authors noted the baseline in section 32 is a special case of the inputdriven baseline derived by mao et al 2018 and its not discussed how proposed method is different from liu et al s pertask control variate if i understand it correctly the environment parameters can be treated as part of the state of the mdc the combined state original state environment parameterdependent baselines can be learned just as regular statedependent baselines based on function approximation without additional techniques such as the subspace clustering in algorithm 1 clusters are formed based on q values but the nearest neighbors are found based on environment parameter this inconsistency is a bit odd to me there may be a bug in the theoretical analysis i believe the expectation of environment parameters need to be conditioned on s just as the action distribution is conditioned on s policy pi this includes ep in eq 4 corollary 1 and theorem 2 the clarity of the paper can be improved the math notations are sometimes confusing and math arguments are not always precise examples on the fifth line after eq 1 the meaning of equation ep mu pig eg and the meaning of g in eq 9 are not very clear the symbol in probability represents conditioning however it seems to represent parameterized functions for example etapi p bspi bspj in theorem 2 iff the condition eq 11 seems to be sufficient but not necessary can obtain the minimum variance for dr right above section 4 i think corollary 1 shows that stateenvironmentdependent baseline reduces more variance than statedependent baseline but i dont think this implies stateenvironmentdependent baseline can obtain the minimum variance the algorithm requires more explanation such as how the baselines are updated in line 14 how the hierarchical cluster method works and more description about relabel may help other comments multiple symbols are randomly written in italic or nonitalic font p in the second before the last line on page 2 and a in the fifth before the last line in section 2 the meaning of nc and n in line 3 algorithm 1 what does kappa kappa 1 represent right above section 5 pcc1m on the fourth line after algorithm 1 box should it be pcc0h1 instead hve have in the middle of page 9 after rebuttal i really appreciate that the authors incorporated the comments in such a short amount of time although i think the clustering idea and experiments are interesting due to the amount of changes made i am slightly leaning toward suggesting the authors take some more time improving the manuscript and submit it to future conferences for example i think the experiment section can be improved with the new experiments and discussions on the results and more analysis on the proposed algorithm eg how b in the second paragraph on page 5 is related to the algorithm proposed we can not sample from pps and quantify the correlation between ga s and qs a p because my concerns outweigh the strengths now i am leaning towards rejecting the paper
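Editorial aside on the baseline discussion in the reviews above: a state-environment-dependent baseline subtracts a learned value b(s, e) from the return before forming the score-function gradient; because the baseline does not depend on the action, the estimator stays unbiased while its variance can drop relative to a state-only baseline b(s). The snippet below is a minimal, hypothetical PyTorch sketch of that construction only — the network shapes, the categorical policy, and the dummy batch are assumptions for illustration, and it omits the subspace clustering that the paper layers on top.

```python
import torch

# Toy dimensions (assumptions, not taken from the paper).
state_dim, env_dim, act_dim = 4, 2, 2

policy = torch.nn.Linear(state_dim, act_dim)           # logits of a categorical policy pi(a | s)
baseline = torch.nn.Linear(state_dim + env_dim, 1)     # state-environment-dependent baseline b(s, e)

def reinforce_loss(states, env_params, actions, returns):
    """Baseline-subtracted score-function (REINFORCE) objective."""
    logp = torch.distributions.Categorical(logits=policy(states)).log_prob(actions)
    b = baseline(torch.cat([states, env_params], dim=-1)).squeeze(-1)
    # Detach the advantage: the baseline would be fit separately,
    # e.g. by regressing returns on (state, environment parameter).
    advantage = (returns - b).detach()
    return -(logp * advantage).mean()

# Dummy batch, just to show the call signature.
s = torch.randn(8, state_dim)
e = torch.randn(8, env_dim)
a = torch.randint(0, act_dim, (8,))
g = torch.randn(8)
reinforce_loss(s, e, a, g).backward()
```

A direct way to address the reviewers' request would be to log the empirical variance of this gradient estimate with b(s) versus b(s, e) on the same batches.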
### Summary:
|
while the reviewers appreciated the clarity of the work there is a concern about the meaning of the proposed result and method it is known that adding knowledge about an additional variable in this case the environment leads to a lower variance estimate what is not known is the practical impact of using this new baseline or perhaps some other intuition stemming from that use of the baseline for instance the origin of the variance however the results shown are not that compelling a point which was raised by the reviewers making the work below the bar for publication
|
[input_ids column omitted — a machine-generated sequence of tokenizer IDs duplicating the review and summary text of this example]
1543,
891,
1158,
253,
1783,
310,
1512,
5075,
285,
417,
2217,
1666,
25379,
403,
2429,
342,
50276,
7152,
33032,
2520,
2929,
39223,
253,
1029,
11041,
1895,
4269,
407,
253,
46852,
273,
12620,
323,
26230,
253,
3646,
27935,
253,
2934,
310,
281,
15313,
247,
8492,
4924,
285,
1375,
20034,
6820,
8654,
8245,
323,
5028,
46852,
253,
4477,
2007,
1287,
247,
362,
83,
5267,
1332,
407,
23534,
253,
2862,
3126,
2317,
715,
749,
31748,
285,
26230,
253,
3054,
538,
1033,
2575,
2662,
8245,
436,
7714,
310,
247,
294,
491,
7639,
273,
253,
1453,
1459,
366,
1677,
326,
253,
3632,
1255,
275,
253,
3126,
310,
10571,
13345,
5028,
46852,
253,
1453,
1459,
366,
14757,
247,
8245,
326,
7024,
417,
760,
327,
253,
1375,
533,
671,
327,
253,
3126,
436,
1307,
5061,
321,
28761,
556,
690,
10711,
1430,
2523,
275,
3946,
285,
840,
253,
4477,
2085,
247,
10957,
395,
585,
14056,
2934,
281,
10571,
2953,
352,
4679,
921,
326,
436,
1332,
310,
42876,
1805,
685,
26724,
5028,
46852,
50276,
856,
84,
337,
253,
2929,
39223,
271,
4722,
1895,
849,
281,
4796,
253,
11041,
285,
3157,
3410,
6733,
1309,
253,
3733,
273,
1837,
407,
6684,
247,
1805,
8245,
374,
253,
2929,
310,
4518,
3542,
275,
2087,
50276,
49794,
337,
10012,
337,
310,
417,
4217,
253,
1307,
476,
320,
1097,
2762,
285,
4016,
891,
13414,
1158,
1453,
1459,
684,
275,
391,
77,
556,
23632,
275,
11041,
5141,
594,
891,
651,
1804,
11922,
253,
10012,
374,
253,
4081,
1332,
310,
1754,
327,
253,
26536,
326,
253,
3126,
4764,
310,
1929,
285,
908,
281,
10173,
253,
17524,
3861,
9117,
1309,
253,
3733,
534,
778,
417,
320,
3588,
323,
253,
1524,
4758,
835,
253,
5570,
760,
476,
10018,
253,
3126,
495,
253,
14940,
7756,
275,
4677,
337,
310,
417,
1534,
323,
690,
8892,
751,
32752,
15508,
285,
32752,
15508,
19,
69,
2568,
342,
253,
3081,
2105,
273,
253,
17524,
1232,
577,
253,
2929,
4648,
253,
24498,
7368,
1332,
281,
10883,
253,
3126,
2317,
1057,
253,
7368,
1332,
4833,
253,
5661,
1543,
2139,
403,
643,
7368,
3082,
751,
465,
30799,
417,
2783,
285,
13887,
247,
1463,
17524,
7726,
3133,
37825,
342,
9860,
3186,
1580,
253,
2193,
275,
4677,
577,
2965,
715,
1027,
13794,
323,
1027,
8892,
608,
253,
4764,
295,
68,
310,
908,
1293,
5426,
275,
20320,
337,
721,
275,
2593,
8073,
253,
4477,
3037,
1458,
7823,
323,
1016,
5933,
327,
5175,
12620,
281,
1071,
253,
26647,
281,
253,
39709,
3126,
891,
5476,
326,
253,
4477,
3078,
281,
3890,
326,
597,
897,
253,
10166,
7823,
327,
5175,
12620,
3587,
818,
253,
2929,
4648,
253,
6447,
5028,
46852,
347,
247,
8245,
534,
310,
417,
253,
4081,
1332,
275,
479,
45846,
1162,
355,
9169,
11106,
407,
253,
2929,
2139,
1057,
253,
2929,
417,
7277,
281,
253,
3939,
5028,
46852,
5933,
3715,
275,
479,
45846,
1162,
355,
9169,
50276,
2520,
310,
247,
16888,
7756,
273,
5028,
46852,
342,
16888,
7756,
275,
4679,
5474,
33032,
2520,
2929,
39223,
253,
11041,
273,
3646,
11786,
1955,
281,
253,
5028,
46852,
908,
275,
391,
77,
275,
9938,
253,
4477,
5276,
326,
253,
3646,
11786,
11041,
476,
320,
2007,
3777,
407,
4715,
247,
4767,
2662,
8245,
323,
1016,
3126,
4764,
2429,
281,
4767,
2662,
1666,
25379,
253,
4477,
840,
1287,
247,
8542,
5933,
1754,
327,
253,
1783,
285,
12106,
253,
3607,
273,
253,
5933,
253,
5933,
310,
9009,
285,
5762,
327,
2800,
15688,
1453,
8892,
352,
12724,
17308,
684,
3646,
3733,
50275,
296,
3755,
20556,
50275,
953,
1270,
281,
923,
326,
253,
4081,
5933,
14371,
1175,
2087,
50228,
2429,
281,
643,
3082,
275,
4677,
374,
891,
452,
247,
1953,
670,
247,
7484,
2167,
275,
4677,
374,
66,
253,
954,
1669,
749,
14095,
253,
1837,
84,
4868,
310,
642,
2169,
685,
47207,
2299,
275,
4677,
337,
247,
253,
1599,
273,
253,
1837,
6970,
310,
1475,
27295,
858,
891,
4665,
253,
14777,
9113,
50276,
13517,
46852,
310,
7744,
908,
275,
3946,
281,
6194,
1175,
7823,
970,
948,
28457,
253,
1895,
326,
253,
19529,
14177,
281,
2953,
310,
4623,
281,
253,
3114,
50275,
74,
11435,
253,
6031,
273,
44190,
285,
18918,
253,
8542,
11333,
1754,
327,
10527,
2440,
3738,
352,
651,
320,
1805,
281,
12106,
253,
11041,
5141,
273,
253,
8542,
8245,
4081,
50274,
585,
1209,
2224,
50275,
783,
38135,
273,
253,
19529,
310,
3710,
347,
4477,
4879,
253,
8245,
275,
2593,
4567,
310,
247,
2714,
1083,
273,
253,
3280,
17477,
8245,
6012,
407,
6429,
80,
1162,
355,
4765,
285,
697,
417,
5469,
849,
4081,
1332,
310,
1027,
432,
632,
86,
1162,
355,
256,
6925,
1945,
1453,
1459,
366,
50272,
338,
891,
2096,
352,
9113,
253,
3126,
3602,
476,
320,
4127,
347,
629,
273,
253,
1375,
273,
253,
278,
12352,
253,
5678,
1375,
3236,
1375,
50276,
20034,
4764,
6820,
1666,
25379,
476,
320,
6311,
816,
347,
3963,
4767,
2662,
1666,
25379,
1754,
327,
1159,
11193,
1293,
3081,
5609,
824,
347,
253,
24822,
17524,
50273,
249,
5933,
337,
9959,
403,
4447,
1754,
327,
2805,
2193,
533,
253,
5275,
15833,
403,
1119,
1754,
327,
3126,
4764,
436,
43430,
310,
247,
2372,
8909,
281,
479,
50274,
9088,
778,
320,
247,
7505,
275,
253,
10527,
1783,
891,
2868,
253,
15355,
273,
3126,
3602,
878,
281,
320,
27039,
327,
256,
816,
347,
253,
2250,
3268,
310,
27039,
327,
256,
3646,
12580,
436,
3797,
2563,
275,
16186,
577,
40460,
337,
285,
10012,
374,
50275,
783,
19843,
273,
253,
2929,
476,
320,
5520,
50273,
783,
14168,
41818,
403,
4536,
21643,
285,
14168,
7125,
403,
417,
1900,
10799,
6667,
50270,
251,
253,
10720,
1386,
846,
16186,
337,
253,
4495,
273,
5150,
2563,
12910,
8393,
50274,
909,
285,
253,
4495,
273,
305,
275,
16186,
898,
403,
417,
1077,
2590,
50269,
783,
9484,
50276,
249,
5912,
6125,
21839,
2299,
352,
3133,
281,
1957,
4764,
1025,
3470,
323,
1650,
50276,
292,
6682,
268,
270,
37640,
270,
1033,
75,
50271,
249,
10012,
374,
36714,
253,
1617,
16186,
1903,
3133,
281,
320,
4209,
533,
417,
3309,
50271,
5092,
4044,
253,
5927,
11041,
323,
1837,
987,
1840,
2593,
577,
891,
1158,
40460,
337,
2722,
326,
1375,
20034,
6820,
8245,
11355,
625,
11041,
685,
4767,
2662,
8245,
533,
891,
13414,
1158,
436,
8018,
1375,
20034,
6820,
8245,
476,
4044,
253,
5927,
11041,
50273,
783,
5933,
4419,
625,
8813,
824,
347,
849,
253,
1666,
25379,
403,
9300,
275,
1386,
1638,
849,
253,
24498,
7368,
1332,
2987,
285,
625,
5740,
670,
774,
1492,
778,
1361,
50274,
977,
5701,
50275,
34263,
14217,
403,
12421,
3542,
275,
36037,
280,
390,
1327,
1562,
280,
8266,
268,
275,
253,
1273,
1078,
253,
1390,
1386,
327,
3239,
374,
285,
247,
275,
253,
10720,
1078,
253,
1390,
1386,
275,
2593,
374,
253,
4495,
273,
295,
68,
285,
295,
275,
1386,
495,
5933,
337,
50275,
5371,
1057,
465,
5596,
50276,
6165,
50276,
18,
1957,
987,
1840,
2593,
608,
50275,
81,
550,
18,
78,
50276,
251,
253,
7002,
1386,
846,
5933,
337,
3817,
50276,
11425,
352,
320,
268,
550,
17,
73,
18,
3185,
50276,
73,
306,
50276,
9802,
275,
253,
4766,
273,
3239,
898,
50275,
6438,
30080,
22559,
50276,
74,
1663,
11435,
326,
253,
4477,
11217,
253,
5701,
275,
824,
247,
2159,
2408,
273,
673,
3738,
891,
1158,
253,
17524,
2934,
285,
4679,
403,
4722,
1955,
281,
253,
2408,
273,
2544,
1160,
891,
717,
5777,
25661,
2584,
7738,
253,
4477,
1379,
690,
625,
673,
11138,
253,
7714,
285,
11929,
352,
281,
2852,
27691,
323,
1650,
891,
1158,
253,
3368,
2593,
476,
320,
5520,
342,
253,
747,
4679,
285,
11985,
327,
253,
1543,
285,
625,
1783,
327,
253,
4081,
5933,
24088,
849,
270,
275,
253,
1273,
12494,
327,
3239,
608,
310,
2905,
281,
253,
5933,
4081,
359,
476,
417,
3410,
432,
268,
793,
285,
22048,
253,
5921,
875,
23646,
256,
285,
2805,
84,
247,
268,
50275,
12157,
619,
7350,
32180,
798,
253,
20544,
1024,
891,
717,
25661,
4404,
33944,
253,
2929,
50276,
187,
187,
4118,
18435,
27,
6050,
253,
30628,
14109,
253,
19843,
273,
253,
789,
627,
310,
247,
4468,
670,
253,
4495,
273,
253,
4081,
906,
285,
1332,
352,
310,
1929,
326,
6240,
3640,
670,
271,
3081,
4778,
275,
436,
1083,
253,
3126,
5644,
281,
247,
2406,
11041,
6642,
752,
310,
417,
1929,
310,
253,
8542,
3486,
273,
970,
436,
747,
8245,
390,
4931,
690,
643,
30328,
45030,
432,
326,
897,
273,
253,
8245,
323,
4227,
253,
6510,
273,
253,
11041,
2299,
253,
1543,
2011,
403,
417,
326,
18511,
247,
1127,
534,
369,
5439,
407,
253,
30628,
2403,
253,
789,
2708,
253,
2534,
323,
9311
] |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the approach addressed an important challenge in semantic segmentation in medical images partially labeled images specially important in this field histopathology the method is compared with a set of currently published methods in the field the methodological contribution is minimal the method is practically the previously published dodnet just input is 2d instead of 3d as initially proposed the overall performance is not so convincing as stated by the authors it seems the improvement is marginal compared to multi unet it would have been also good to compare the method with a multihead approach docsep well written paper with a clear rationale opensource code and nice illustrative images important problem in the field as partially labelled data is abundantly available the approach is benchmarked against important baseline approaches unet and deeplabv3 the method has a lot of overlap with the dodnet paper so i feel the technical novelty of this paper is limited also the authors do not clearly state the technical novelty of the paper in comparison to the dodnet paper however the application to wsi images is novel afaik the experiments are conducted on a dataset which is not publicly available it would be great to benchmark this approach using public datasets so that future approaches can also be tested against the same benchmark is it possible to publicly release part of the data and set up a challenge around this why is this approach not compared to a multihead design that would be a great comparison and is lacking now this is a pity because the introduction introduces the multihead approach as an alternative strategy so nicely docsepgreat evaluation interesting method the problem of having a multiple class segmentation method from partially labeled datasets is of value to the community no current solution is satisfactory of potential value to the community literature review the authors seem to ignore the multiorgan segmentation literature where training with partial labels is performed see for instance gonzalez18 for a multiclass segmentation method with a single network with partial labelled datasets or a review of the methods cerrolaza19 the paper would benefit from the inclusion of such methods on the review clarity of the method it is unclear what the function phi of eq 2 is also theta subphi is not clearly defined while the authors state that there is only allowed one label to each image path it would be interesting to discuss how the inclusion of a plurality of labels to each of them would be treated in this method in equation 3 the authors state that the dynamic head is just a set of convolutions it seems that there is no activation function post each of the convolutional layers if that is the case the three convolutional layers could be simplified into a single convolution since everything is linear please clarify section 24 there are several references to optimal scales i assume that the images are captured at 40x magnification and split into 256x256 patches the downsample method for different tissues is not clear data management the authors claim that we randomly split the entire dataset into training validation and testing sets this random split is worrisome nearby patches will share similar image characteristics and therefore can be easy to classify the split should always be done in an wsibased randomization not patchbased please clarify if the random split is done in a patchbased or wsi based method even more it should be done on a patientbased manner the authors state that 
binary dice loss and crossentropy loss were used as the loss function is it a combination of both does it depend on the experiment please clarify when generating completely labeled tissue segmentations the authors will pass each patch multiple times through the network with different class encodings on their classaware vector how do the authors deal with the case that a pixel obtains multiple foreground hits there is no information on the paper performance of the method in comparison to others the authors compare their method to multiple networks or multiheads networks but do not compare to multiple class segmentation networks see gonzalez 18 the authors method need a pass of the network for each class at test time while multipleclass segmentation methods require a single pass to all classes gonzalez18 gonzlez germn george r washko and ral san jos estpar multistructure segmentation from partially labeled datasets application to body composition measurements on ct scans image analysis for moving organ breast and thoracic images springer cham 2018 215224 cerrolaza19 cerrolaza juan j et al computational anatomy for multiorgan analysis in medical imaging a review medical image analysis 56 2019 4467
### Summary:
|
while two out of three reviewers pointed out that the method is very similar to a previously published approach dodnet these reviewers still see value in the extensive evaluation presented in this paper and both suggested weak accept the first reviewer also increased the score to borderline after the rebuttal owing to additional experiments for evaluation i also agree with the reviewers that this paper addresses an important problem in the field ie partially labelled data and presents extensive evaluation and benchmarking against important baseline approaches and therefore suggest acceptance of this paper
|
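One of the reviews in the row above argues that the model's dynamic head, a stack of convolutions with no activation functions between them, is equivalent to a single convolution. A minimal numpy sketch of that observation follows; the shapes and the use of 1x1 convolutions are illustrative assumptions, not the paper's actual head.

```python
import numpy as np

# Two 1x1 convolutions with no nonlinearity in between collapse into one,
# since composing linear maps gives another linear map.
rng = np.random.default_rng(0)
c_in, c_mid, c_out, h, w = 8, 4, 2, 5, 5
x = rng.normal(size=(c_in, h, w))      # one feature map, channels first

w1 = rng.normal(size=(c_mid, c_in))    # first 1x1 conv, written as a matrix
w2 = rng.normal(size=(c_out, c_mid))   # second 1x1 conv, written as a matrix

def conv1x1(weights, feat):
    # A 1x1 convolution is a per-pixel matrix multiply over channels.
    return np.einsum("oi,ihw->ohw", weights, feat)

stacked = conv1x1(w2, conv1x1(w1, x))  # two linear layers, no activation
merged = conv1x1(w2 @ w1, x)           # the equivalent single convolution

print(np.allclose(stacked, merged))    # True: the stack is one linear map
```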
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the paper proposes an offpolicy rl called nfqi nested fitted qiteration as an extension of fitted qiteration to estimate groupspecific policies and account for group structure in the data having two predefined groups of observations background and foreground nfqi imposes a structure on the family of function to approximate the q function with guarantees to converge to bellman optimal qvalue function nfqi is trained in two stages the first stage the shared component is trained with all samples in the second stage the foreground component is trained using only the foreground samples the nfqi method is validated on a nested cartpole environment where the background is the original environment and foreground includes a constant force that pushes the cart to the left nfqi is also validated using a realworld clinical data from mimic dataset pros a simple approach to account for a structured and imbalanced data when learning offpolicy rl nfqi works well in the case of extreme imbalance between the foreground and background environments the interpretability experiments using shap values validate that the nfqi finds policies that identify the foreground environment as an important component the model is applied to realworld clinical data mimiciv cons the novelty of nfqi is somewhat limited the proposed training approach is very similar to transfer learning where shared and specific components are trained limited baselines in the experimental analysis nfqi has many realworld applications including the use in healthcare however the novelty of the method is rather limited also comparisons with additional baselines would be required to further validate the approachg minor comments background loss lb is not defined in algorithm 1 table 2 is duplicated so which list of features has been used in the experiments shown in the paper in figure 5a superimposed bars are not clear as mixed color is showing up separate bars would be better for clarity docsepthis paper proposes nested policy fitted qiteration nfqi an adaptation of the wellestablished fitted qiteration fqi algorithm originally proposed by ernst geurts and wehenkel in 2005 1 to the setting where the agent is required to learn policies for two different mdps while sharing information between them at a high level the authors proposed modification involves learning a function f which approximates the qfunction qpist at rt gamma mathbbesvpis by decomposing the qestimate into a shared component and a subgroupspecific component concretely they represent f as fmathbfs mathbfa z gsmathbfs mathbfa mathbb1z1gfmathbfs mathbfa where gs models the qvalue using parameters shared across groups and gf models a groupspecific modification to the estimated qvalue through both simulated experiments cartpole and an observational dataset electrolyte replenishment optimization on mimic the authors demonstrate that their nfqi approach performs better than do alternative approaches which do not take known subjectspecific group information into account 1 ernst damien pierre geurts and louis wehenkel treebased batch mode reinforcement learning journal of machine learning research 6 2005 503556 strengths the challenge of being able to identify shared problem structure between subgroups and leverage that information to improve sample efficiency in reinforcement learning is an important and difficult task this work provides an interesting perspective and intuitive approach for solving it i thought the questions and experiments laid out in section 4 top of page 4 were interesting 
and welldefined overall their experimental designs in the cartpole setting were compelling weaknesses my main critique of the paper is that it takes an unnecessarily complicated approach to an otherwise easy problem and does not compare performance of the proposed algorithm to one of the more obviousintuitive benchmark approaches first ill address the unnecessarily complicated approach comment the proposed approach is to essentially create two separate sets of weights for modeling a groupspecific qvalue estimator one shared set of weights for all groups and one set of weights unique to the foreground group but the authors themselves state that their approach is equivalent to just augmenting the state space with an indicator of onehot encoded group assignment in this light an alternative summary of the papers contributions could be written as we find that fqi performs better when it has access to all of the variables that actually specify the mdp fundamentally when group assignment modifies the mdp but is not included in the state space the resulting stochastic decision process is no longer markovian i suppose you could think of the resulting problem setting as a pomdp the resulting problem is interesting when group assignment is not observable see eg killian and daulton 1 but trivial when group assignment is available just expand the state space to include the state space and the problem is solved in my mind then to the extent that nfqi is not always just a simple state space augmentation see bullet below the best comparison here would just be fqi with an augmented state space i would bet though that any differences in performance between nfqi and stateaugmented fqi when there is actually a difference between them are negligible at best the other baseline to compare against in classimbalanced settings would be to use fqi augmented state space sample weighting to address the class imbalance so upweighting the loss of the smaller class by 1pa random sample is in the smaller class related to the above ill note that the authors comment in appendix 61 about equivalence of their proposed ftextnfqi with stateaugmented fqi is not true for the neural network approach that they describe in appendix 62 this discrepancy is not mentioned but should be more importantly however the authors gloss over the fact that convergence of fqi see section 35 of 2 is heavily dependent on the use of an induced kernel that does not depend on the output values of the training sample this is certainly not the case when using neural networks as function approximators as the authors have proposed indeed none of the instantiations of nfqi proposed by the authors carry such a convergence guarantee in the discussion of the related work the authors claim that their approach differs from metarl and multitask rl by arguing that these approaches deal with different tasks in addition to different mdps but the nfqi approach only handles modeling the same task with different mdps i would argue that multitask rl and meta rl are thus more general than the proposed nfqi approach and in fact subsume it to that extent one relevant comparison here would be to compare multitask rl and metarl approaches to the proposed nfqi approach but honestly this is probably overkill because with known group identity you can likely get most of the potential gains just by augmenting the state space and running fqi i have significant concerns about the quantitative evaluation approach for the mimic datasettask using alignment between clinician behavior 
and the learned rl policy as a proxy for policy quality is fraught in many ways what if clinicians behave suboptimally what if the differences are just due to chance see gottesman et al 3 for some pointers to best practices in offpolicy policy evaluation two central premises of the proposed approach is that there are many important realworld problem settings in which assuming some shared rl taskreward function 1 there are precisely two knownobservable conditions for which the underlying transition dynamics of the associated mdp are similar but slightly different and 2 there are significantly more data available for one condition relative to the other while premise 1 is plausible it is poorly if at all justified the much more likely scenario in my mind is one in which there are many conditionsgroups all of which are unobservable each of which is associated with a slightly altered mdp consider the experiment the authors ran on mimic where they split the cohort into individuals with kidney disease and individuals without kidney disease whos to say that this is the right way to split that cohort into two groups would it be more appropriate to split patients on age diabetes status prior history of an acute kidney injury also does the specific kind of kidney disease matter what about the severity is there some grouping that is unobservedlatent but perhaps correlated with age diabetes status and kidney disease that would be a more accurate description of how patientspecific mdps vary the limitation of the presently proposed framework to exactly two observed groups is quite constraining and limits the significance of the work that being said in the spirit of all models are wrong but some are useful ill put that aside for now and focus on the merits of the approach for improving performance in the proposed problem setting premise 2 is also plausible but also poorly justified the justification for this claim is essentially an intuitive argument in the second sentence of section 414 nfqi is robust to group size imbalance for example it is much easier to collect medical records from healthy background patients than from patients with a specific chronic disease i would actually argue that the opposite is true in the medical setting medical record data for healthy individuals are relatively sparse because healthy patients dont go to the hospital on the other hand individuals with chronic conditions will tend to visit their primary care provider pcp or specialist with much more regularity im not arguing with the premise itself but rather saying it needs more justification and the one given is inadequate and in my view just wrong im not going to get into really minor points here because i think the paper will require a major overhaul before it can be accepted at a conference like iclr that being said i do think there are some interesting directions here for future work specifically in settings where there is more than one group and group identity is unknown theres already a fair bit of existing work on meta rl and multitask rl that is quite relevant another interesting direction on the theory side would be to better understand how close mdps from two or more groups have to be in order for weightsharing approaches like the one proposed provide sample efficiency improvements 1 killian taylor et al robust and efficient transfer learning with hidden parameter markov decision processes proceedings of the 31st international conference on neural information processing systems 2017 2 ernst damien pierre geurts 
and louis wehenkel treebased batch mode reinforcement learning journal of machine learning research 6 2005 503556 3 gottesman omer et al evaluating reinforcement learning algorithms in observational health settings arxiv preprint arxiv180512298 2018 this paper proposes an approach for intelligent weight sharing in offline rl when the dataset available consists of two groups with slightly different mdps and where group identity is knownobservable this problem can be trivially solved by simply including group identity in the state representation the experimental design is good and the research questions are interesting however the problem is simple enough and the solution trivial enough that theres not a whole lot of novelty or significance in this paper i recommend rejection docsepthis paper introduces a learning paradigm to handle related mdps called nested mdps that share the same structure and definition varying only in their dynamics based on fitted qiteration fqi the authors propose an algorithm known as nested fqi nfqi to learn from the shared structure of the nested mdps while also being able to adapt to the specific dynamics of the separate mdps nfqi is developed and analyzed empirically in the simplest setting where there are only two variants of dynamics are present where there is a nominal imbalance between expected and out of distribution observations nfqi is compared to standard fqi and standard transfer learning on an augmented cartpole task as well as medical treatment task derived from retrospective ehr data extensive analysis on the learned policies is performed to establish the anticipated benefits of using nfqi in these settings strengths the paper nicely outlines the setting for nfqi its clear to understand how the nested mdps are modeled and where they may be found in the real world concerns about novelty and the necessity of this formulation are listed below among the weaknesses i found in the paper the paper is well structured and well written for the most part at times i felt that the discussion was a little too high level specifically in sections 2 and 3 related work and methods more in the weaknesses section below but for the most part necessary explanations were sufficiently through and clearly explained a particular strength in the writing was evident in how the core insights or experimental objectives were clearly outlined at the beginning of section 4 experiments i was also impressed at the extent by which the learned policies were evaluated and compared to standard fqi the interpretation of how different features contribute to policy decisions using shap values was nice to see some concerns about clarity are discussed below weaknesses i have several concerns about the necessity of this approach as proposed by the authors and presented in this paper as submitted first by focusing only on two subpopulations of these nested mdps the authors are investigating the simplest base case of what could be termed multitask or meta learning there is very little discussion to justify why taking this stripped down approach is necessary beyond the tendency of the multitask or metalearning literature to focus on large overparametrized networks theres no reason to expect why the methods and algorithms presented in these papers wouldnt be applicable and perform as well as nfqi second there is a large body of work on transfer within rl that has been overlooked and should be cited appropriately the nested mdp setting seems to be a special case of a hiddenparameter mdp hipmdp or 
block mdp references below each of these branches of literature cover the exact setting where families of mdps differentiated only by changes in their dynamics are treated with joint policies or otherwise personalized to individual settings through a latent or contextual parameter a discussion differentiating this paper from this body of literature is warranted third its not surprising that nfqi outperforms fqi given the extra contextual information the algorithm is provided through the contextual group variable z a more fair comparison would be providing this extra feature to fqi and evaluating the policy performance in that setting additionally its severely unclear what is actually done in the transfer learning baseline are policies learned on the background dataset and they applied directly to the foreground dataset is there any finetuning on the foreground at all beyond these points i was disappointed with how highlevel and devoid of detail the discussion in section 3 methods was there is very little concrete development of nfqi where i wasnt entirely sure what was being presented in the experimental results and how much significance to ascribe to the performance gains or differences between fqi and nfqi clearly something exciting is happening by leveraging the contextual information of the nested mdp in equation 1 within the function approximator f but without a clear link or more formal presentation of where eqt 1 fits into fqi to produce nfqi its not easy to immediately follow the results and insights developed in section 4 experiments its also unclear after reading the appendix what the separate losses represent based on eqt 1 and algorithm 1 in the appendix it seems that gb is doubly accounted for so is gb rightarrow pib having double the gradients applied to it if so its unsurprising that its performance is so high relative to fqi on the background dataset on the cartpole task questions from cartpole experiments its not apparent why different dataset sizes were used between the various experiments for cartpole also in section 411 does samples mean episodes the lack of consistency and the policies achieving different levels of performance between experiments is unnecessary and only provides more opportunity for confusion it would possibly be better if the experiment in section 411 had the same total number of episodes as the other experiments in section 41 but still could be used to demonstrate the effects of a small portion of those episodes belonging to the foreground dataset with a more thorough exploration of this effect as is done in section 414 what sparked this question was the diminished performance in figure 4 in comparison to figure 1 the use of shap values to differentiate policies is a really neat idea i think that there are further insights behind figure 3 that probably deserve to be explored as discussed there are qualitative differences in the importance of features between the background and foreground policies what is shown in addition to this is that the feature values that a deemed important are inverted between the datasets high feature values seem to be more important in the foreground dataset this is interesting given the definition of the foreground mdp does this relation shift as you reduce the magnitude of the force applied in the foreground dataset its not described whats varied between runs of the experiments are these random seeds sampling strategies ie what is driving the variance of the performance is it the method itself in section 414 it is 
mentioned that the proportion of data that makes up the foreground dataset is varied to mirror proportions seen in medical datasets this is unecessarily vague what types of datasets what makes up these proportions in the medical datasets what conditions questions from the experiments using the mimic renal cohort the cohort definition was unclear beyond the number of patients the number of features and how theyre derived is relevant information that should be included in the main body of the paper the specific action definition is also not very clearly outlined what goes into repleting electrolytes is there more than one medication for this beyond prescribing potassium after looking through table 2 in the appendix it appears that several of the features are associated with treatment decisions or medications administered to the patients there is likely some overlap with comorbidities and the presence of these medications how was this accounted for were treatmentaction decisions incorporated into the patients state representation the potassium treatment decisions are rightfully binned into a categorical vector none low high which corresponds to three discrete actions as i understood the definition however in figure 6 there seems to be a continuum of treatments that are administered which was it on this point its not clear what the numerical values in figure 6 correspond to in the categorical sense this was confusing because as stated the healthy range for potassium is defined as 3545 mmoll but in figure 6d the nfqi policy appears to recommend high replenishment treatments within that range assuming a value of 2 high that seems counterintuitive now for table 1 if there are only three discrete actions then the results of predicting the clinicians actions arent performing much better than random its also curious that the action prediction performance for the background dataset is so much lower than the foreground dataset perhaps this is due to greater heterogeneity in the background data i would expect the performance to be better on the background dataset due to there being more data in figure 7 its interesting to see how the foreground and background policies differ from each other im curious if theres some causal leakage informing these results however for example one of the top rated features is potassium which is directly correlated to the treatment decisions right is this filtering through as a top rated feature in the foreground dataset because there is a higher frequency of these labstests being ordered is this the same for hgb or the amount of phosphate being administered via iv what is the significance of adding the other features together this imbalances the presentation of figure 7 and my immediate impression was that the other 56 features are more important since their combined shap values are significantly higher than the others presented i would perhaps recommend removing that last row and making a comment in the caption or main body that the top 510 features from nfqi are provided references hipmdps doshivelez f konidaris g 2016 july hidden parameter markov decision processes a semiparametric regression approach for discovering latent task parametrizations in ijcai proceedings of the conference vol 2016 p 1432 nih public access killian t daulton s konidaris g doshivelez f 2017 december robust and efficient transfer learning with hidden parameter markov decision processes in proceedings of the 31st international conference on neural information processing systems pp 62516262 yao j 
killian t konidaris g doshivelez f 2018 direct policy transfer via hidden parameter markov decision processes in llarla workshop faim vol 2018 perez c such f p karaletsos t 2020 april generalized hidden parameter mdps transferable modelbased rl in a handful of trials in proceedings of the aaai conference on artificial intelligence vol 34 no 04 pp 54035411 block mdps misra d henaff m krishnamurthy a langford j 2020 november kinematic state abstraction and provably efficient richobservation reinforcement learning in international conference on machine learning pp 69616971 pmlr zhang a lyle c sodhani s filos a kwiatkowska m pineau j precup d 2020 november invariant causal prediction for block mdps in international conference on machine learning pp 1121411224 pmlr zhang a sodhani s khetarpal k pineau j 2020 learning robust state abstractions for hiddenparameter block mdps arxiv preprint arxiv200707206 zhang a sodhani s khetarpal k pineau j 2020 multitask reinforcement learning as a hiddenparameter block mdp arxiv eprints arxiv2007 there are several concerns about significance of the problem setting and proposed algorithmic approach additionally there are severe gaps in clear exposition outlining how the proposed algorithm is setup and run docsepthis paper considers decision making in nested mdps where there are two distinct groups having partially shared but unknown state dynamics a new algorithm called nfqi nested fittedq iteration is presented and its based on a simple modification to fqi a commonly used offline rl algorithm baseline comparisons and sensitivity analyses were done in a set of simulated experiments and using openai gym cartpole the proposed approach is then applied to a realworld rl task formulated from mimiciv ehr data strengths exposition is clear and easy to follow simulated experiments on carpole are extensive and seek to answer various interesting research questions that justify the usefulness of the proposed approach good use of shap for understanding which features were used by the policys recommendation and showing the difference of foregroundbackground weaknesses this paper only considers two groups background foreground that share state dynamics to some extent extensions to more than two groups could be more generally useful but are only mentioned in future work and not discussed concretely there doesnt seem to be anything special about the proposed approach thats specific to the offline setting one could parameterize the qfunction as fsa gssa 1z gfsa in an online setting as well could you elaborate whether this approach will also provide benefit in the online setting and if not what makes it special for offline rl the baselines were not clearly described in the main text what is the transfer learning baseline first appeared on page 4 re realworld ehr data experiments i dont believe agreement with doctors actions is the right quantitative metric consider using an ope offline policy evaluation method 12 it wasnt clear to me whether there was a trainvaltest split because sec 42 on page 8 describes the total number of patients yet table 1 says predict on test samples detailed comments re a possibly inaccurate claim on page 11 appx 61 in its original formulation fqi was shown to converge to a bellmanoptimal qvalue function using concepts from dynamic programming theory the classical results only guarantees that iterative application of bellman optimality operator without function approximation or equivalently with perfect function approximation leads to q but in practice 
the convergence of fqi 3 depends heavily on i the completenessrealizability of the function class and ii distribution shift in limited data i also hope to see some theoretical support for why fnfqitilde s a is a good choice of function class the proposed setting seems to be a variationspecial case of factored mdp with factored state spaces i would recommend reading and citing the following references 45 plus possibly their followup works and comment on how the proposed approach is related and where it differs presentation of fig 6 can you please clarify what are the values in the heatmap since there are only 3 actions no low high i was expecting only three colors in the heatmap i would also recommend using a discretized color bar with only 3 colors furthermore the yaxis label for creatinine should be flipped so that lower values are at the bottom which is what a typical cartesian axis looks like naming its true that the presented setting has nested dynamics but im not sure if nested is the best way to describe the value function and policy references 1 taylor w killian haoran zhang jayakumar subramanian mehdi fatemi marzyeh ghassemi an empirical study of representation learning for reinforcement learning in healthcare ml4h neurips 2020 httpsarxivorgabs201111235 2 shengpu tang jenna wiens model selection for offline reinforcement learning practical considerations for healthcare settings mlhc 2021 httpsarxivorgabs210711003 3 jinglin chen nan jiang informationtheoretic considerations in batch reinforcement learning icml 2019 httpsarxivorgabs190500360 4 daphne koller ronald parr computing factored value functions for policies in structured mdps ijcai99 httpsdlacmorgdoiabs10555516243121624408 5 carlos guestrin daphne koller ronald parr shobha venkataraman efficient solution algorithms for factored mdps jair 2003 httpsdoiorg101613jair1000 the presented setting is interesting to look at and potentially useful in clinical decision making and the simulation results justified the benefit of the proposed approach to a limited extent however experiments on real ehr data need to be improved and there seems to be a lack of theoretical understanding for when and why this approach works well this will also help generalize the results to more than two groups there needs to be better framing regarding whether the conclusions are limited to offline settings lastly its important to include a discussion that relates to past work on factored state spaces and factored value functions
### Summary:
|
this paper provides a method for offline rl in settings where the environment may exhibit significant similar structure such as one part having nearly the same dynamics as other parts the work is motivated in part by healthcare settings the reviewers appreciated the potential applications to areas like healthcare but also thought there is a strong body of related work eg transfer learning metarl and other related papers and it was unclear how novel the approach was within that related work or how it would compare the authors did not respond to the reviewers reviews we hope their input is useful to the authors in revising their work for the future
|
[ input_ids: 2,048 token IDs encoding this example's input text (full list omitted) ] |
[ attention_mask: 2,048 ones, one per input token (full list omitted) ] |
[ labels: a duplicate of the 2,048 input_ids (full list omitted) ] |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
summary this paper proposes a set of biologically plausible update rules that can be used to compute latent representations the paper presents a number of heuristics to set hyperparameters and train the proposed model the model is compared to rbms autoencoders and another recently proposed biologicallyplausible model pros the method explores a new kind of model motivated by having biologically plausible update rules the experimental results include an interesting comparison of the learned features cons the paper places emphasis on interpreting the update rules as bayesian confidence propagation yet the underlying probabilistic graphical model is not clearly described it would be useful to have a figure in the paper that describes the model including whether the model is directed or undirected which variables are observed or unobserved if the model is directed what is the generative model etc two assumptions are mentioned about the graphical model px1xn prodi pxi and px1xnyj prodi pxiyj unless i misunderstood the first assumption is saying that each dimension in the input data x is independent of the others this is a very strong assumption and makes the model weak the paper does not include an explanation about why these assumptions make sense or how they influence the models inductive bias relative to other probabilistic models the derivation of the update rules is also unclear the approximation that the indicator function ixixdi can be replaced by its expected value pxdi is hard to understand in particular i could not understand how to parse eq 4 which has a sum over xi is this still dependent on the input given that the indicator function has been replaced by its expectation it would be good to have a more clear derivation of the update rules starting from the probabilisitic model the paper makes some very general assertions that do not add to the point being made for example one disadvantage of probabilistic models is that the known methods do not scale well in practice models such as vaes are probabilistic models which scale quite well to highdimensional inputs and large amounts of data in the introduced model each hc seems to have a similar representational capacity as a softmax hidden unit therefore instead of comparing to rbms and aes with sigmoid units comparisons to rbms and aes with softmax hidden units will be more relevant overall the paper can be improved along two directions making the probabilistic interpretation more clear specifying the graphical model clearly deriving update rules and doing experiments with softmax hidden units so that the only thing changing is the update rules and not the model architecture postrebuttal figure 1 is helpful in understanding the model thank you for adding that the additional experiments are also appreciated this seems to indicate that its not just the architecture but the learning rules that make the bcpnn model work well it would also be helpful to visualize the features learned by softmax units in rbms and aes and see if this results in a similar pattern of hcs encoding broad regions and mcs encoding variations within those regions it seems that the rbms and aes were not trained using any sparsity penalty for example page 11 of httpswwwcstorontoeduhintonabspsguidetrpdf and httpswebstanfordeduclasscs294asparseautoencoderpdf having a target sparsity can have a significant impact on the learned features higher sparsity makes the features look more like stroke like and localized and less spread out all over the visual field as they do in 
fig 4 c and d based on the additional experiments i will be increasing my score to 5 however given that the main contribution of the paper is a comparative study the paper can add value by doing a more thorough comparison against variants of aes and rbms that have otherwise similar properties such as keeping the hcmc softmax architecture and sparsity levels the samedocsepsummary the bayesian confidence propagating neural network has recently been extended to the case of unsupervised learning ravichandran et al ijcnn 2020 this paper compares this extension to restricted boltzmann machines autoencoders and a biologically plausible model proposed by krotov hopfield pnas 2019 on the mnist dataset for evaluation the authors consider the learned receptive fields and the classification performance of a linear classifier the paper is very similar to ravichandran et al ijcnn 2020 but with an extended experimental section positives biologically plausible methods for unsupervised learning are an interesting area of research there has been relatively little research on structural plasticity concerns the paper does not introduce anything new but merely compares existing methods the comparison is not an extensive study but limited to one dataset mnist and few alternative proposals of which only the kh model is deemed brainlike there seems to be something off with the experimental results krotov hopfield report a better test accuracy of 9854 in their original paper in spite of using less hidden units bcpnns performance is mediocre it is even outperformed by random shallow networks with fixed localized random random gabor filters in the hidden layer illing et al neural networks 2019 lacking performance could be excused by greater biological plausibility as a neuroscientific contribution which is however not the case here as the authors themselves state their model is abstract page 4 and not a neural implementation but merely uses implicit competition via a local softmax operation the neural implementation of which would be lateral inhibition minor comments pixi in eq 6 is never introduced the line above eq 11 should probably refer to 11 not 6 theres a typo in the sentence after eq 11 the hybrid representation might be interesting is it any more biological than the wellknown distributed and local representations docsepsystematic investigation of biologically inspired algorithms and architectures is a very important research topic the paper investigates bayesian confidence propagating neural networks bcpnn on learning unsupervised representations on mnist dataset it presents a comprehensive comparison of four different commonly used unsupervised methods the strong merit of bcpnn approach is the nice receptive fields of the hypercolumns hc and minicolumns mc learned by the proposed algorithm as the authors point out the advantage of the proposed algorithm is that it is able to produce sparse and highly localized in the image plane receptive fields for the mcs also those filters look much cleaner than the counterparts of the classical algorithms considered in figure 3 additionally the authors demonstrate that their representations stand in line with previously published proposals in terms of the classification accuracy the main weakness of this work is that the proposed method has only been investigated on mnist and in onelayer architectures at the same time given the novelty of the approach i think it deserves attention even in this simplest setting considered in this manuscript small comments 1 section4 3rd 
line should be learned 2 there are some misprints around equation 11 such as the use of py vs py is inconsistent also it seem that the word large is missing following the formula in the line after equation 11 3 i also find panel b in figure 2 confusing the way it is presented makes it look like the authors have combined the outputs of four networks to feed into the classifier while in reality those four networks were evaluated one by one 4 it would be better to designate a new variable for the left hand side of equation 12 since iij is already taken post rebuttal thank you for the response i have read the discussion with other reviewers a small comment while i agree that it is reasonable to keep the classifier the same for all the models softmax with cross entropy for a fair comparison i disagree that the activation function for the first layer should be kept as relu in the kh model in fact kh explain that this is a suboptimal choice in fig 4 of their original paper using powers of relus should increase the kh accuracy overall i think that this is a nice paper and i am inclined to keep my initial score docsepthis paper evaluated four unsupervised learning approaches bcpnn kh rbm ae by training a supervised classification layer on top of the hidden representation specifically the authors qualitatively compared the receptive fields and quantitatively compared the classification performance across four models the authors emphasized the advantages of bcpnn since it applies biologically plausible local learning rules and requires fewer epochs for convergence overall the comparison was fair and solid the description of the bcpnn model in section 3 was clear and comprehensive but the authors did not provide sufficient details of key mechanisms in the other three models especially the kh model which also used brainlike learning rules the detailed introduction of the other three models should be an important component since this is a comparative study the results were clearly stated but the insignificant difference in the classification accuracy comparison table 2 can hardly lead to a reliable conclusion about which unsupervised method is better and it would be better if the authors could provide more interpretations about the hybrid receptive fields of hcs and mcs in bcpnn fig 3a specific comments and questions 1 in the original kh model krovtov hopfield 2019 they also tested the classification accuracy on the mnist dataset and the result reached an error rate of 146 with 2000 hidden states this is better than the reproduced result shown here 9739 accuracy with 3000 hidden states is this accuracy drop caused by a different setting of hyperparameters then is it fair to say that bcpnn outperforms kh 2 could you provide more explanation on eq 11 why this dynamic update of kbeta could be used as a desired bias regulation 3 when the number of total hidden units is fixed what would be the effect of changing the ratio fracnmcnhc in bcpnn 4 the hybrid structure of bcpnn provides interesting receptive field results in fig 3a is this structure generalizable to a model with multiple hidden layers minor page 4 in 31 bias regulation typo eq 6 should be eq 11 and the value of gain kbeta at around 1 when pyj fracpmaxent4 missing gg 0 here
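As a concrete illustration of the hypercolumn/minicolumn structure these reviews keep returning to, here is a minimal sketch of a grouped softmax hidden layer: the hidden units are partitioned into hypercolumns and a softmax is applied within each one, so minicolumns compete locally. This is an assumption for illustration only (the 30 x 100 split below is arbitrary, not taken from the paper); the same layer could be dropped into an RBM or autoencoder to give them softmax hidden units, as the first review suggests, so that only the learning rule differs between the compared models.

```python
import torch
import torch.nn.functional as F

def hypercolumn_softmax(support, n_hc, n_mc):
    """Apply a softmax within each hypercolumn of n_mc minicolumn units.

    support: (batch, n_hc * n_mc) pre-activations (e.g. log-support values).
    Returns activations of the same shape that sum to 1 inside every hypercolumn.
    """
    batch = support.shape[0]
    grouped = support.view(batch, n_hc, n_mc)
    return F.softmax(grouped, dim=-1).view(batch, n_hc * n_mc)

# Example: 30 hypercolumns x 100 minicolumns = 3000 hidden units (sizes chosen
# only for illustration); minicolumns within each hypercolumn compete softly,
# so every row of h sums to n_hc.
h = hypercolumn_softmax(torch.randn(8, 30 * 100), n_hc=30, n_mc=100)
```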
### Summary:
|
this paper conducts a comparison between a small set of models 4 in total for unsupervised learning specifically the authors focus on comparing bayesian confidence propagating neural networks bcpnn restricted boltzmann machines rbm a recent model by krotov hopfield 2019 kh and autoencoders ae the authors compare trained weight distributions receptive field structures and linear classification on mnist using the learned representations the first two comparisons are essentially qualitative comparisons while on classification accuracy the authors report similar accuracy levels across the models this paper received mixed reviews reviewers 4 and 5 felt it did not contribute enough for acceptance while reviewers 2 3 were more positive however as noted by a few of the reviewers this paper does not appear to achieve much and provides very limited analysis and experiments on the models it isnt introducing any new models nor does it make any clear distinctions between the models examined that would help the field to decide which directions to pursue the experiments add little insight into the differences between the models that could be used to inform new work thus the contribution provided here is very limited moreover the motivations in this paper are confused in general it is important for researchers at the intersection of neuroscience and machine learning to decide what their goal is when building and or comparing models specifically is the goal 1 finding a model that may potentially explain how the brain works or 2 finding better machine learning tools if the goal is 1 the performance on benchmarks is less important however clear links to experimental data such that experimental predictions may be possible are very important thats not to say that a model must be perfectly biologically realistic to be worthwhile but it must have sufficient grounding in biology to be informative for neuroscience however in this manuscript as was noted by reviewer 4 the links to biology are tenuous the principal claim for biological relevance for all the models considered seems to be that the update rules are local but this is a loose connection at best there are many more models of unsupervised learning with far more physiological relevance that are not considered here see eg olshausen field 1996 nature zylberberg et al 2011 plos computational biology george et al 2020 biorxiv httpsdoiorg10110120200909290601 it is true that some of these models use nonlocal information but given the emerging evidence that locality is not actually even a strict property in real synaptic plasticity see eg gerstner et al 2018 frontiers in neural circuits williams holtmaat 2018 neuron banerjee et al 2020 nature an obsession with rules that only use pre and postsynaptic activity is not even clearly a desiderata for neuroscience if the goal is 2 then performance on benchmarks and some comparison to the sota is absolutely critical yet this paper does none of this indeed the performance achieved with the four models considered here is as noted by reviewer 4 very poor in contrast there have been numerous advances in unsupervised or selfsupervised learning in ml in recent years eg contrastive predictive coding simclr bootstrap your own latent etc all of which achieve far better results than the four models considered here thus the models being compared here cannot inform machine learning as they do not appear to provide any technical advances of course some models may combine goals 1 2 eg seeking increased physiological relevance while 
also achieving decent benchmark performance see eg sacramento et al 2018 neurips but that is not really the situation faced here as the models considered have little biological plausibility as noted above and achieve poor performance at the same time altogether given these considerations although this paper received mixed reviews it is clearly not appropriate for acceptance at iclr in the area chairs opinion
|
[ input_ids: 2,048 token IDs encoding this example's input text (full list omitted) ] |
[ … (attention_mask column: a long run of 1s omitted) … ] |
[ … (labels column: a long token-ID list omitted) … ] |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper improves on the idea of mixup by selecting samples for mixup via a model that selects suitable pairs found using knn in a batch in order to provide a learning signal to this discrete process the authors apply reinforce using the loss of the downstream regressor not applied to classification tasks they show promising results on a number of regression tasks strengths the idea is simple and appears to be implemented correctly and the experiments appear to be justifiable given mixup is a useful regularization technique for supervised learning and has been shown to be very useful in classification tasks the application of reinforce as a stochastic gradient approximator is well known in other context that involve nns with discrete processes see dlgms eg rezende 2014 the experiments are straightforward and support the value of their work weaknesses i have deep concerns about this paper being marketed as rl for mixup rl comes packaged with a great deal more than just the reinforce algorithm eg the policy gradient and rephrasing a lot of what is a gradient estimator for a discrete process as rl not only may potentially confuses the reader on what rl is but also doesnt connect with an active area of research on doing backprop through discrete processes yes i understand that the cited yoon 2020 paper did this as well but i dont think they should have either one of the most famous examples of using reinforce is with discrete vaes or dlgms rezende 2014 and they are careful not to call it rl but there are a number of other works or baselines that this paper should have compared to and i think in part this is because the related works talk about other rl algorithms for instance a3c or even ppo arent really relatable to the problem being addressed here as straightthrough estimators bengio 2013 or gumbel softmax jang 2017 yoon does a better job at connecting to these types of works for instance it seems possible to backprop through to the selection probabilities using gumbel softmax variance is usually much better than reinforce even with the baseline removal these are important baselines and it is important this paper places itself correctly among related works next there are many claims about why classification doesnt need this sort of mixup but i didnt follow the arguments entirely p1p2 in section 2 i dont see any classification experiments to back up this claim at least in the main text for instance linear assumption in classification turns out to be reasonable because the label difference why is it reasonable to leave the label space in classification mixup with regression i have a better chance of hitting a label in the distribution other notes concerns so the motivation was on some physical systems and were using reinforce to learn how to mix data but if were in the physical systems such as you describe usually we have some sort of closedform solution model that describes the physical process maybe poorly even if this model has error wouldnt this be a good place to design a prior that tells you how to sample eg if the masses are some set distance apart then use else dont i see metalearning as a description of the model but not in the algorithm 1 how do you sample k when you decide how many neighbors to mix is this sampled from a prior rezende 2014 stochastic backpropagation and approximate inference in deep generative models bengio 2013 estimating or propagating gradients through stochastic neurons for conditional computation jang 2017 categorical reparameterization with gumbelsoftmax i 
recommend reject as the papers core story isnt well placed compared to related works the paper is placed as rl so were missing important baselines and connections to other gradient estimators through discrete processes docsepthe authors propose mixrl to improve upon mixup in regression settings mixrl is used to impose a proximity constraint on the inputoutput pairs that are mixed during mixupbased data augmentation by predicting how many nearest neighbors to utilize from a small set of prespecified options based on feedback from evaluating the validation set consistent but small gains over mixup and manifold mixup are realized on several datasets strengths datadependent inputoutput proximity constraints on mixup are learned consistent but small gains over mixup and manifold mixup are realized on several regression datasets limitations the set of nearest neighbors considered for use in mixup while dynamic are highly restricted eg 4 16 64 128 for the no2 dataset in table 3 more generally the lack of a local inputoutput kernel to prioritize more local data in a continuous manner feels like a significant limitation optimized inputoutput kernels that restrict and more generally reweight mixup pairs or construct locally nonconstant regressions to sample from feel like important missing baselines the lack of a locally nonconstant regression model in mixrl beyond mixing data pairs is likely limiting performance significantly the technique while sound is currently quite limited in scope in that important baselines and algorithmic elements eg locally nonconstant models and local kernels that need to be considered in the context of doing high fidelity data augmentation for regression problems have not been adequately investigated or discussed docsepto apply mixup for regression tasks the paper first utilizes the stricter assumption that linearity only holds within specific data or label distances for regression then this paper proposes a data mixing augmentation method called mixrl the goal of mixrl is to identify which examples to mix with which nearest neighbors mixrl employs a metalearning framework that estimates how important mixing a sample is to minimize the model loss on a validation set using policy gradient reinforcement learning mixrl is inspired by the problem of measuring how individual examples contribute to model performance the idea is reasonable for regression tasks the experiments verify its effectiveness by improving regression performance by carefully mixing examples the paper is well written and organized i notice mixrl cannot improve previous mixup methods by a clear margin since i am hesitant about the importance of the improvement i want to hear about other reviewers opinions to make the decision id like to see the other reviewers opinions about the empirical results
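The mixing mechanism the reviews above describe — each training point is only ever mixed with one of its k nearest neighbours, and the regression target is interpolated with the same coefficient as the input — can be written down in a few lines. The sketch below is illustrative only (plain numpy, a fixed k, Beta-distributed mixing weights); it is not the paper's implementation, and the function and parameter names are invented for this example.

```python
# Illustrative sketch (not the paper's code): mixup restricted to k nearest
# neighbours, mixing inputs and regression targets with the same coefficient.
import numpy as np

def knn_mixup_regression(X, y, k=16, alpha=0.2, seed=None):
    """Augment a regression batch by mixing each point with one of its k nearest neighbours."""
    rng = np.random.default_rng(seed)
    X = np.asarray(X, dtype=float)
    y = np.asarray(y, dtype=float)
    n = X.shape[0]
    # Pairwise squared Euclidean distances; fine for small batches, use a KD-tree otherwise.
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    np.fill_diagonal(d2, np.inf)                                  # never mix a point with itself
    nn_idx = np.argsort(d2, axis=1)[:, :k]                        # k nearest neighbours of every point
    partner = nn_idx[np.arange(n), rng.integers(0, k, size=n)]    # one random neighbour per example
    lam = rng.beta(alpha, alpha, size=n)[:, None]                 # mixing coefficients in (0, 1)
    X_mix = lam * X + (1.0 - lam) * X[partner]
    y_mix = lam[:, 0] * y + (1.0 - lam[:, 0]) * y[partner]        # targets mixed with the same lambda
    return X_mix, y_mix

# Toy usage: augment points drawn from y = sin(x).
X = np.random.default_rng(0).uniform(-3, 3, size=(256, 1))
y = np.sin(X[:, 0])
X_aug, y_aug = knn_mixup_regression(X, y, k=16)
```

Keeping the mixing partner inside a small neighbourhood is what makes the linear interpolation of targets plausible, which is the stricter locality assumption the reviews refer to.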
### Summary:
|
this paper generalizes the idea of mixupbased data augmentation for regression compared to classification for which mixup was used the paper argues that in regression the linearity assumption only holds within specific data or label distances the paper thus proposes mixrl to select suitable pairs using knearest neighbors in a batch for mixup the selection policy is trained with metalearning by minimizing the validationset loss the approach provides consistent but small improvement over mixup on several datasets reviewers have also suggested discussion and comparison with more baselines such as respective methods using other lowervariance gradient estimators eg gumbelsoftmax and using local inputoutput kernels for data selection etc
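For the selection policy mentioned in the summary — deciding, per example, how many neighbours to mix based on validation feedback — a minimal score-function (REINFORCE) update looks roughly like the following. The candidate set of k values, the tiny policy network, the constant baseline, and the reward definition are all assumptions made for illustration, not the paper's actual training loop; as the first review notes, a relaxation such as Gumbel-softmax is a lower-variance alternative to this estimator.

```python
# Illustrative sketch (not the paper's training loop): a per-example categorical
# policy over candidate neighbourhood sizes, updated with the score-function
# (REINFORCE) estimator using validation feedback as the reward signal.
import torch
import torch.nn as nn

K_CANDIDATES = torch.tensor([4, 16, 64, 128])             # assumed candidate set

class KPolicy(nn.Module):
    def __init__(self, in_dim, n_options=len(K_CANDIDATES)):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(in_dim, 32), nn.ReLU(),
                                 nn.Linear(32, n_options))

    def forward(self, x):                                  # logits over the candidate k values
        return self.net(x)

def reinforce_step(policy, optimizer, x_batch, reward_fn, baseline=0.0):
    """Sample a k per example, score it (e.g. minus the validation loss of the
    regressor trained with that mixing), and take one policy-gradient step."""
    logits = policy(x_batch)
    dist = torch.distributions.Categorical(logits=logits)
    actions = dist.sample()                                # indices into K_CANDIDATES
    k_values = K_CANDIDATES[actions]
    reward = torch.as_tensor(reward_fn(x_batch, k_values), dtype=torch.float32)
    advantage = reward - baseline                          # a simple baseline reduces variance
    loss = -(advantage.detach() * dist.log_prob(actions)).mean()
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

# A lower-variance alternative raised in the first review would relax the discrete
# choice with torch.nn.functional.gumbel_softmax and backpropagate through it directly.
```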
|
[ … (input_ids column: a long token-ID list omitted) … ] |
[ … (attention_mask column: a long run of 1s omitted) … ] |
[ … (labels column: a long token-ID list omitted) … ] |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this work presents a new method that combines neural odes which uses neural networks to flexibly describe a nonlinear dynamical system and neural processes which uses neural networks to parameterize a class of functions by combining these two approaches neural ode processes can now adapt to incoming datapoints as the latent representation parameterizes a distribution over odes due to the np component of the model the authors use different variants of the model first order ode second order ode linear latent readout to infer dynamics from data in the experiments section i find this work to be an interesting and important extension to the existing literature on neural processes my primary qualms with the manuscript is that i found it difficult to glean some of the details about the models and in particular the details of the inference procedure i assume many of the same details in the original np paper apply here but it is not clear to what extent and exactly how many of these important inference and model details seem to be missing from both the main text and the supplemental material in particular when you first discuss the aggregator some key details are missing you mention that the mapping from r to z could be a nn but you are not clear on whenif the encoder is a neural network in the actual experiments also is it the case that data from multiple contexts are trained in parallel it is important to specify all of the details for the encoder for each of the experimental sections the decoder and the other pieces of the model are clear moreover how exactly is this trained sgd adam is it end to end i assume you are optimizing an elbo and the inference methods are akin to a vae or the original np paper but it is not explicitly said anywhere stepping through or at least explaining the primary details of training the model and the training objective will be useful finally it is unclear how long this inference takes or what kind of computing resources are needed though there are some comparisons of training different versions of the model in the appendix there is no sense of how long an epoch is because there was no code that i could see with the submission this is doubly difficult to glean i think the proofs of secion 32 could be moved to an appendix additionally a lot of space is devoted to the discussion and the conclusion i would rather see more clarity provided to the implementation of ndps and their differences at every stage of the model across the experiments i am excited about the work and it does seem to be a useful extension of existing methods and i think there are details that need to be clarified in order for this to be publishable minor details bottom of page three the the edit i am satisfied with the authors response and given the proposed changes will raise my score to a 7 docsepthis paper proposes a new class of stochastic processes determined by a distribution over neural odes the overall structure of the paper is clear i find the newly defined process interesting and applicable to many real data setsdocsepthis paper proposes a new algorithm that can adapt incoming datapoints by applying neural ordinary differential equations nodes to neural processes nps it combines two algorithms properly and showed better performance than nps through odes in the encoding even with a smaller number of parameters strengths 1 they properly combined nodes and nps to fastadapt few data points from underlying ode over ode distributions 2 they showed their algorithm outperforms nps through ode 
encoding with fewer parameters 3 they analyzed several variations like secondorder neural ode process or latentonly version weaknesses 1 task details are not clearly described i checked the appendix also but they just mentioned with varying gradients and amplitudes and shifts 2 lack of comparison with previous works for instance one of the advantages of this work is good interpolation and extrapolation convolutional conditional np convcnp jonathan gordon et al 2019 also outperformed other nps methods for extrapolation but they didnt compare convcnp as one of the baselines for the rotated mnist experiment sequential neural processes snps singh et al 2018 isnt compared the correctness of their claim and clarity this paper is well written and almost correct but the details about the experimental setting look missed additional feedback thank you for submitting it i enjoyed reading it i think that it is a wellwritten paper and deserved sharing in our community however detailed information eg task details is not clearly described and some comparison results are missed by updating those things it will be more concrete for the rotated mnist experiment evaluating the version applying nodes to snps could be interesting also minor things are on page 5 additional details for every task considered can be found in c additional details for every task considered can be found in appendix c secondly as is seen in a ndps train faster in a faster wall clock time than other variants secondly as is seen in a ndps train faster in a wall clock time than other variants on page 7 we show the mean squared errors mse for the 4th rotated mnist digit in table 2 what is the meaning of the 4th rotated mnist edit i agree that the authors disagreement with my second comment and thank you for the update i change my rate to 7docsepthe proposed ndp has two main advantages 1 it has the capability to adapt the incoming data points in timeseries unlike node without retraining 2 it can provide a measure of uncertainty for the underlying dynamics of the timeseries ndp partitions the global latent context z to a latent position l and subcontext zprime then it lets l follow an ode called latent ode this part is actually the innovation of the paper where by defining a latent ode the authors take advantages of odes to find the underlying hidden dynamics of the timeseries this assumption helps find better dynamics when the generating processes of timeseries meet some odes then the authors define a stochastic process very like the idea from neural processes np paper that is by defining a latent context z which here is a concatenation of l and subcontext zprime with a prior pz and integrating a gaussian distribution of a function of z decoder gltzprime which is a neural network over z overall i liked the idea of the paper and how the authors integrate two important concepts ie node and np into a single framework which could be useful in many realworld timeseries with complex underlying dynamics however i have some questions regarding some points in the paper 1 the paper says that z is split into two parts l and zprime where zprime is kept unchanged over time and only l follows an ode i wonder why is this the case how many dimensions should l have how does the dimension of l affect the results why not let the whole z follow an ode there are no explanations and clarifications for these in the paper 2 there is no mention of how zprime should be learned in general there is no mention on how to train the ndps it is unclear in the paper what 
loss function should be optimized and how the latents should be learned if it is by variational methods how the posteriors of zprime and l should be learned i believe the authors should augment on these in the paper otherwise it is very hard to know how the ndps should be trained 3 what is the dimension of l used for rotating mnist experiments why ndp is able to extrapolate well when there is variable angular velocity and angular shift fig 5 and fails to extrapolate when there is constant angular velocity fig 4 it seems the second is an easier task and i wonder why ndp has a poor performance does it imply that ndp can only work well in a specific conditions 4 typo page 3 the the decoder the decoder edit the authors have addressed all my questions thanks
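Several of the questions above (what loss is optimised, how z' and l are learned, how the decoder uses them) have a natural reading if one assumes the standard NP/VAE recipe: an amortised Gaussian posterior over the latent, a latent ODE for the position part l, and an ELBO combining reconstruction at target times with a KL term. The sketch below spells that reading out; the dimensions, the fixed-step Euler solver, the standard-normal prior (the NP papers condition the prior on the context instead), and all names are assumptions for illustration, not the authors' code.

```python
# Illustrative sketch (not the authors' code) of an NDP-style objective under the
# assumed NP/VAE recipe: amortised Gaussian q over (z', l0), a latent ODE for l(t),
# a decoder g(l(t), z'), and an ELBO with a standard-normal prior.
import torch
import torch.nn as nn

class NDPSketch(nn.Module):
    def __init__(self, z_dim=8, l_dim=4, h=32):
        super().__init__()
        self.z_dim, self.l_dim = z_dim, l_dim
        self.enc = nn.Sequential(nn.Linear(2, h), nn.ReLU(), nn.Linear(h, h))      # per (t, y) pair
        self.to_stats = nn.Linear(h, 2 * (z_dim + l_dim))                           # after aggregation
        self.ode_f = nn.Sequential(nn.Linear(l_dim + z_dim, h), nn.Tanh(), nn.Linear(h, l_dim))
        self.dec = nn.Sequential(nn.Linear(l_dim + z_dim, h), nn.ReLU(), nn.Linear(h, 2))  # mean, log-var

    def posterior(self, t, y):
        """Permutation-invariant encoding of observed (t, y) points into q(z', l0)."""
        r = self.enc(torch.stack([t, y], dim=-1)).mean(dim=0)
        mu, logvar = self.to_stats(r).chunk(2, dim=-1)
        return mu, logvar

    def decode(self, t_targets, z, l0, n_steps=20):
        """Euler-integrate dl/dt = f(l, z') from t = 0 and read out a prediction at each target time."""
        outputs = []
        for t_end in t_targets:
            l, dt = l0, t_end / n_steps
            for _ in range(n_steps):
                l = l + dt * self.ode_f(torch.cat([l, z], dim=-1))
            outputs.append(self.dec(torch.cat([l, z], dim=-1)))
        return torch.stack(outputs)                        # (n_targets, 2): predictive mean and log-var

def elbo(model, t_ctx, y_ctx, t_tgt, y_tgt):
    mu, logvar = model.posterior(t_ctx, y_ctx)
    sample = mu + (0.5 * logvar).exp() * torch.randn_like(mu)    # reparameterised draw
    z, l0 = sample[:model.z_dim], sample[model.z_dim:]           # static part z' and ODE initial state l0
    pred = model.decode(t_tgt, z, l0)
    mean, log_var = pred[:, 0], pred[:, 1]
    log_lik = (-0.5 * (log_var + (y_tgt - mean) ** 2 / log_var.exp())).sum()
    kl = -0.5 * (1 + logvar - mu ** 2 - logvar.exp()).sum()      # KL to N(0, I); NPs use a context-conditioned prior
    return log_lik - kl                                          # maximise (or minimise its negative)
```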
### Summary:
|
this work proposes a stochastic process variant that extends existing work on neural odes the resulting approach is fast and dataadaptive and can be fit to sparser time series settings without retraining the methodology is backed up empirically and after the response period the reviewers concerns are sufficiently addressed and reviewers are in agreement that the contributions are clear and correct
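One of the variants discussed in the reviews above is a second-order neural ODE process. A generic way to implement a second-order latent ODE is to augment the state with a velocity component so that an ordinary first-order solver can be reused; the sketch below shows that reduction with invented names and dimensions, and is not the paper's exact parameterisation.

```python
# Generic illustration (not the paper's parameterisation): a second-order latent ODE
# d^2 l / dt^2 = f(l, dl/dt, z) rewritten as a first-order system over the augmented
# state s = (l, v) with v = dl/dt, so any first-order solver can be reused.
import torch
import torch.nn as nn

class SecondOrderDynamics(nn.Module):
    def __init__(self, l_dim, z_dim, h=32):
        super().__init__()
        self.acc = nn.Sequential(nn.Linear(2 * l_dim + z_dim, h), nn.Tanh(), nn.Linear(h, l_dim))
        self.l_dim = l_dim

    def forward(self, s, z):
        l, v = s[..., :self.l_dim], s[..., self.l_dim:]       # position and velocity halves
        a = self.acc(torch.cat([l, v, z], dim=-1))            # learned acceleration
        return torch.cat([v, a], dim=-1)                      # ds/dt = (dl/dt, dv/dt) = (v, a)

def euler_rollout(f, s0, z, t_grid):
    """Integrate ds/dt = f(s, z) on a fixed time grid with explicit Euler."""
    states, s = [s0], s0
    for t0, t1 in zip(t_grid[:-1], t_grid[1:]):
        s = s + (t1 - t0) * f(s, z)
        states.append(s)
    return torch.stack(states)

# Example with an illustrative 2-D latent position:
f = SecondOrderDynamics(l_dim=2, z_dim=4)
s0 = torch.cat([torch.zeros(2), torch.zeros(2)])              # l(0) = 0, v(0) = 0
traj = euler_rollout(f, s0, torch.randn(4), torch.linspace(0.0, 1.0, 21))
positions = traj[:, :2]                                        # read out only the position part
```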
|
[ … (input_ids column: a long token-ID list omitted) …
26480,
25839,
672,
627,
310,
50276,
22174,
12336,
7602,
3036,
577,
352,
3133,
253,
1273,
310,
271,
6927,
4836,
285,
891,
4282,
2139,
295,
12132,
556,
247,
4105,
3045,
1057,
352,
16084,
326,
295,
12132,
476,
760,
789,
973,
275,
247,
2173,
2515,
50276,
21,
1745,
80,
3239,
495,
253,
253,
29810,
50276,
783,
29810,
50274,
15576,
50275,
783,
4477,
452,
9713,
512,
619,
3533,
6701,
187,
187,
4118,
18435,
27,
2520,
789,
29328,
247,
19191,
1232,
12955,
326,
8725,
5368,
789,
327,
11454,
258,
3229,
253,
4795,
1332,
4483,
323,
247,
3809,
941,
26672,
422,
1332,
326,
476,
789,
973,
4944,
281,
653,
9332,
673,
2962,
7533,
1293,
851,
26208,
253,
16182,
310,
17245,
598,
45190,
285,
846,
253,
2380,
2180,
253,
30628,
7350,
403,
10481,
9713,
285,
30628,
403,
275,
4345,
326,
253,
9021,
403,
2590,
285,
3451
] |
[ 1, 1, 1, …, 1 ] |
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, …, 891, 9644, 4404, 14924 ] |
[ 1, 1, 1, …, 1 ] |
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, …, 9644, 4404, 14924 ] |
Below is given review of a research paper from cnoference journal. Please write a summary the review.
### Review:
i very much like the aim of this work this is a problem of interest to a wide community which as far as im aware hasnt yet had much focus from the deep learning community however perhaps in part because of this the paper reads as naive in places pages 14 are all background saying nothing new but ignoring the effort made on this problem by other communities there has been some work done o this problem within statistics and within the gaussian process community to which no reference is made at all by the paper there are two novelties as far as i can see these may or may not be novel but they were novel to me the first is use of nns to model the system the second is the multiple state restimation mrse on page 5 i struggled to get a feeling about how successful these two aspects of the work are the results section is difficult to follow and doesnt compare the method to existing methods and so there is no baseline to say that this is successful or not thus i find it hard to judge the execution of the idea what i really want to know reading a paper like this is should i use this approach because there is no comparison to existing methods it leaves me unsure other comments is the title correct i dont see how these are pde guided nns youve used data from a pde to train the network and as a test problem a pde guided nn would for me know something about the dynamics compare with recently work in the gp community where kernels are derived that lead to gps that analytically obey simple pdes there is an obvious link to work in the uncertainty quantification community particularly around the use of multifidelity multilevel simulation this paper is likely to be of interest to them and the link could be more explicit page 3 after eq 2 there is notation used here that is undefined ytkt the simplifying assumption on page 3 is very strong and unlikely to hold for many systems but it isnt clear to me whether this is necessary or not presumably if it doesnt hold then we may still get an approximation that could be useful but it is just that we lose any guarantee the method will work i thought the msre idea was interesting it wasnt very well explained or motivated and it was unclear to me whether it works well or not from the results or whether it is novel to this paper or not but id like to have read more about it is the trick in section 82 original to this paper if so it seems a nice idea ive not checked the detail most of section 81 strikes me as unnecessary there are quite a few typos in particular words such as markovian newtonian should be capitalised docsep an interesting idea to learn the hidden state evolution and the stateobservation mapping jointly the experiments on eulers equation are slightly better than resnet for 30 steps ahead forecasting in terms of mse the paper is clearly written and wellexplained the model is not new resnet for state evolution and convdeconv for stateobservation mapping the difference between resnet and the proposed framework is not significant resnet is even better in figure 2 missing an important experiment test whether the model can generalize that is to forecast on different initial conditions than the training dataset how does the model compare with gans y xie e franz and m chu and n thuereyy tempogan a temporally coherent volumetric gan for superresolution fluid flow docsepi feel like i am missing something about this paper so rather than a review this is just mainly a long question making sure i understand things properly ignore the score for now ill change once i 
get a clearer picture of whats happening here the network you propose in this paper is motivated by solving pdes where as in 1 the actual solution as they are computed numerically depends on the current spatial field of the state as well as difference operators over this field eg both the gradients and the laplacian terms so i naturally was assuming that youd be designing a network that actually represented state as a spatial field and used these difference operators in computing the next state but instead it seems like you reverted to the notion of because difference operators can be expressed as convolutions we use a convolutional network and i dont really see anything specific to pdes thereafter just general statements about statespace models am i understanding this correctly why not just actually use the pdebased terms in the dynamics model of an architecture why bother with a generic resnet and i presume youre using a fully convolutional resnet here wouldnt the former work much better and be a significantly more interesting contribution that just applying a resnet and a generic unet as a state estimator im not understanding why the current proposed architecture assuming i understand it correctly could be seen as pde guided in all but the loosest possible sense can you correct me if im misunderstanding some element here
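to make the point about difference operators concrete, a minimal sketch (not from the paper under review, and assuming only a 2d scalar field on a regular grid): the laplacian term in such pde updates is just a fixed 3x3 convolution kernel, so a pde-guided dynamics layer could hard-code the stencil instead of learning a generic resnet filter
```python
# minimal illustrative sketch, not code from the reviewed paper.
# assumes a 2d scalar field u of shape (batch, 1, H, W) and grid spacing dx.
import torch
import torch.nn.functional as F

def laplacian(u: torch.Tensor, dx: float = 1.0) -> torch.Tensor:
    # standard 5-point finite-difference stencil written as a 3x3 conv kernel
    kernel = torch.tensor([[0.0,  1.0, 0.0],
                           [1.0, -4.0, 1.0],
                           [0.0,  1.0, 0.0]], dtype=u.dtype, device=u.device)
    kernel = kernel.view(1, 1, 3, 3) / (dx * dx)
    return F.conv2d(u, kernel, padding=1)

def step(u: torch.Tensor, nu: float = 0.1, dt: float = 0.01) -> torch.Tensor:
    # explicit euler step for a diffusion-like pde u_t = nu * laplace(u);
    # the operator is fixed, only the scalar coefficient nu would be learned.
    return u + dt * nu * laplacian(u)
```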
### Summary:
|
this paper introduces a few training methods to fit the dynamics of a pde based on observations quality not great the authors seem unaware of much related work both in the numerics and deep learning communities the experiments arent very illuminating and the connections between the different methods are never clearly and explicitly laid out in one place clarity poor the intro is long and rambly and the main contributions arent clearly motivated a lot of time is spent mentioning things that could be done without saying when this would be important or useful to do an algorithm box or two would be a big improvement over the many long english explanations of the methods and the diagrams with cycles in them originality not great there has been a lot of work on fitting dynamics models using nns and also attempting to optimize pde solvers which is hardly engaged with significance this work fails to make its own significance clear by not exploring or explaining the scope and limitations of their proposed approach or comparing against more baselines from the large set of related literature
|
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, …, 2905, 6239 ] |
[ 1, 1, 1, …, 1 ] |
[ … labels column: long token-ID array elided …
] |
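The bracketed numeric columns above are the model-ready encodings of each record's text: an input_ids token sequence, an attention_mask made up entirely of 1s, and a labels sequence (which, in the rows visible in this extract, repeats the input_ids). As a minimal sketch of how such a row could be inspected — assuming the dump corresponds to a Hugging Face dataset and that the IDs were produced by a GPT-NeoX-style tokenizer, neither of which is stated anywhere in this extract — the token columns can be decoded back to text:

```python
# Hypothetical inspection helper. The tokenizer checkpoint is an assumption;
# this dump does not say which tokenizer produced the IDs.
from transformers import AutoTokenizer

tok = AutoTokenizer.from_pretrained("EleutherAI/gpt-neox-20b")  # assumed tokenizer family

def decode_record(record: dict) -> tuple[str, bool]:
    """Decode one row's input_ids and confirm the attention_mask is all ones."""
    text = tok.decode(record["input_ids"])
    mask_ok = all(m == 1 for m in record["attention_mask"])
    return text, mask_ok

# Usage with a row shaped like the ones in this dump (illustrative short ID list):
row = {"input_ids": [30003, 310, 1677, 2278], "attention_mask": [1, 1, 1, 1]}
print(decode_record(row))
```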
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the proposed method is novel and wellmotivated the paper is written well the motivation is clear the description of proposed method is easy to follow the reference is thorough and welldescribed maybe provide a visual result to demonstrate the performance of the proposed method explain more details about the proposed method i found figure 2 a hard to follow please provide text description in caption maybe perform statistical significance test for results docsep1 a combination of unet memory network and hough voting from three views sounds novel especially the motivation to solve the longtailed problem in the specific dataset 2 paper is well written and easy to follow 3 the problem aiming to solve is clinically relevant 1 since the authors define the problem to be an instance segmentation problem related work on performing instance segmentation in medical imaging should be mentioned too 2 some details need to be clarified for example when preparing the datasets why putting different weights according to sizes 3 the evaluation metrics in table 1 seem to be altered 4 maybe a bit more experiments are needed docsep introduction section is written in a really good manner with a clear formulation of the problem at hand and clear motivation for the proposed method the proposed use of a memory network in addition to unet is a novel idea and gives better performance compared to only unet novel memory updating mechanics and variancereducing loss function are tailored to the problem at hand results are encouraging and show the potential of the proposed method the paper misses a lot of literature review for instance segmentation in medical imaging ex 12 comparison against any recent instance segmentation method is not given similarly comparison against a simple baseline 3 which directly predicts counting segmentation is also missing no qualitative results for instance segmentation and counting confusion metric in ms counting small lesions also play a significant role a better division of the ms lesions by size and how the proposed method helps in getting better counting of these lesions is missing discussion section of the paper is really rushed with not much great insight into the results 1 graham s chen h gamper j dou q heng pa snead d tsang yw and rajpoot n 2019 mildnet minimal information loss dilated network for gland instance segmentation in colon histology images medical image analysis 52 pp199211 2 zhou y onder of dou q tsougenis e chen h and heng pa 2019 june cianet robust nuclei instance segmentation with contouraware information aggregation in international conference on information processing in medical imaging pp 682693 springer cham 3 eatonrosen z varsavsky t ourselin s and cardoso mj 2019 october as easy as 1 2 4 uncertainty in counting tasks for medical imaging in international conference on medical image computing and computerassisted intervention pp 356364 springer cham docsep1 the paper is wellwritten and easy to follow 2 the proposed method achieves stateoftheart results 3 the method is simple and effective 4 experiments on a largescale crosssectional mr imaging study shows the effectiveness of the proposed method 1 the novelty of the paper is limited the memory updating mechanism and variancereducing loss function are not sufficiently novel either 2 the experimental results are not convincing enough theres no experimental evidence for the reasons why the proposed method outperforms the baselines 3 the proposed method is only compared with a few baselines it would be better to 
compare with more recent methods eg lesiontoads shiee etal 2010 and lst schmidt et al 2012
### Summary:
|
all reviewers found interest and merit in this paper some negative points were satisfactorily addressed by authors the paper should be accepted
|
[ … input_ids column: long token-ID array elided (tokenized form of this record's text above) …
] |
[ … attention_mask column elided (all values are 1) …
] |
[ … labels column: long token-ID array elided …
] |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the finegrained class clustering is more challenging than coarsegrained due to lower sample representation and large scale and color variations between the finegrained classes the main goal of the proposed approach is to learn stronger representations in an unsupervised fashion to this end the paper proposes c3gan which uses the ability of infogan with contrastive learning to learn feature representations that maximize the mutual information between the latent code and its corresponding observation the proposed approach is able to achieve best performance in comparison to the related works the results demonstrated quantitatively and qualitatively on 4 different datasets along with ablation study validate the proposed approach the method is promising since it is unsupervised way of learning cluster centroid for unlabeled data strengths information theory based regularization with contrastive loss learns the features of the cluster centroids the proposed method is well adapted for this purpose using infogan and perturbgan the proposed method effectively avoids mode collapse while training generators and are able to generate with good control various objects with varying background while keeping foreground fixed and varying foreground objects while keeping background fixed better than related methods the paper demonstrates that overclustering is a viable approach to hyperparameter optimization that can achieve better finegrained clustering due to learning dense features weaknesses the approach is heavily inspired by finegan and perturbgan in overclustering it is not clear what would happen if the chosen cluster size is such that it is not a multiple of no of classes we assume that the data does not have labels in unsupervised feature learning therefore it would be interesting to see if the cluster size is not a multiple grammaticaltypographical errors page 1 last line we formulate the auxiliary probability fig 2 caption c3gan page 5 first line in mathematical the paper presents a method that is a hybrid of 2 major previous stateofart methods although heavily inspired the paper proposes solutions to reduce the drawbacks of the previous approaches which is the highlight of the contribution additionally the biggest takeaway is that the method is unsupervised given the breadth of the experiments to validate each of the proposed solutions and substantial ablation experiments to justify each proposal the paper overall is a good contribution docsepthis paper studies the problem of finegrained image clustering similar to recent work such as onegan and finegan the paper proposes to use a gan setup called c3gan where the finegrained image synthesis and clustering are performed in the same endtoend pipeline the main contribution is to use a contrastive loss in maximizing the mutual information of the discriminator encoded features and classcentric features there are also some modifications in the foreground and background synthesis mechanisms compared to prior work such as the finegan experiments are conducted on finegrained datasets such as cub2002011 strengths 1 the method is straightforward and the paper is easy to follow 2 the performance boost compared to sota for clustering seems to be very good table 1 3 the ablation study covers major components in the pipeline so it is easy to understand their contributions weaknesses while the paper is focusing on finegrained image synthesisclustering it is not specifically designed for finegrained classes the whole pipeline is pretty generic this is unlike 
for example the finegan paper where they design separate stages for parent and child classes other than showing in the experiment that the proposed method just works in the finegrained setting id be interested in seeing more analysis on the advantage of the proposed pipeline in finegrained setting vs coarsegrained setting for example the paper could show the performance of parent and child classes in those datasets and compare against sota if the proposed method has a similar performance as others in the coarse setting but a clear advantage in the finegrained setting that will be a very strong signal in addition there are some parts in the paper that are unclear or confusing 1 section 32 second paragraph the discriminator d aims to learn adversarial features r what does adversarial features mean here i checked the architecture it seems to be just the discriminator score a scalar 2 what is the effect of underclustering where y is much smaller 3 the loss defined in eq 5 does not use the notion q or qk defined in eq 4 it is not clear from the text directly what their connection is it should be written clearly l log qk for example 4 table 1 last sentence i think onegan is an unsupervised method other suggestions for editing 1 the first sentence in the abstract has redundant words unsupervisedly and without ground truth annotation are the same thing 2 figure 4 a the same cluster c is very confusing here because those birds are from different classes id suggest simply dropping the symbol c overall this paper proposes a straightforward pipeline to synthesize and cluster images in finegrained classes a contrastive loss is used in place of the regular softmax loss in the infogan framework the performance seems to be very solid compared to sota methods on both synthesis and clustering tasks there are some unclear or confusing parts in the paper but i think given the simplicity and good performance of the method it might be worth being seen by the community to inspire similar works docsepthe authors undertake the more difficult task of data clustering based upon finegrained features they use contrastive learning for this in conjunction with gan losses their representation can be used in the downstream task of image generation where they use their representations strengths to show improved resilience to mode collapse while displaying better intracluster variance strengths the experiments are well executed and clearly communicated the quantitative results for clustering and the qualitative figures in the supplementary material are both impressive and they support the authors claims the method has novel components which are provably attributed to these results weaknesses the paper is well rounded with all aspects detailed to completion there are no obvious weaknesses the paper combines several techniques to achieve unsupervised finegrained clustering of semantic classes their representations high quality is able to drive gan generation without mode collapse thereby achieving great twofold contributions docsep the authors propose c3gan a method that learns clusteringfriendly feature representation for finegrained clustering main goal by learning features of cluster centroids latent codes using contrastive loss on the mutual information of imagelatent code pairs the method should improve the gans performance in terms of image diversity the method is unsupervised and applicable for singleobject images only the method is built upon finegan and infogan idea after adding significant improvements such as 
removing the dependency on boundingbox labels applying the mutual information in the embedding space and directly learning cluster centroids strength the problem is known to be valid and challenging the method idea formulations and figures are clear in general the level of novelty is reasonable the model is fully unsupervised the authors provide an extensive evaluation and the results are impressive weaknesses the part of the contrastive loss is not totally clear the authors should provide a better intuition of why the contrastive loss improves the feature representation for example how are imagelatent pairs defined as positive the method focuses on learning cluster granularity for the object only and not for the background its unclear why the transformation matrix is used other than the fact that its part of perturbgans pipeline a few comments on the text the phrase coarsegrained images is inaccurate the coarsegrained adjective should refer to the clustering and not the images in the intro the authors should share more details about the auxiliary distribution mentioned in the abstract and the intro overall proofreading is required it would be great to add some of the models notations to figure 2 eg dbase psir psih i tend to vote for accepting this paper as i think it proposes a great approach and presents a convincing comparative performance
### Summary:
|
all the reviewers liked the paper the proposed method contains novel ideas of learning feature representation to maximize the mutual information between the latent code and its corresponding observation for finegrained class clustering the model seems to successfully avoid mode collapse while training generators and is able to generate various object foregrounds with varying backgrounds the foreground and background control ability is an outstanding feature of the paper please incorporate the comments of the reviewers in the final version btw the real score of this paper should be 70 as reviewer 5wfe commented that heshe would raise the score from 5 to 6 but at the time of this meta review the score was not raised so the final score of the paper should be 8866
|
[ … input_ids column: long token-ID array elided (tokenized form of this record's text above) …
] |
[ … attention_mask column elided (all values are 1) …
] |
[ … labels column: long token-ID array elided …
] |
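One further structural note: the Review field of each record concatenates several reviewers' comments, joined by the literal marker docsep (visible in the rows above). A small helper for pulling the individual reviews back apart — purely illustrative, and not tied to any field name defined by this dump:

```python
def split_reviews(review_text: str) -> list[str]:
    """Split a concatenated review field on the 'docsep' marker used in this dump."""
    parts = review_text.split("docsep")
    return [p.strip() for p in parts if p.strip()]

# Example with the separator style seen in the records above:
sample = "first reviewer comments docsepsecond reviewer comments docsepthird reviewer comments"
print(split_reviews(sample))  # -> three separate review strings
```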
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the paper describes a general neural network architecture for predicting satisfiability specifically the contributions include an encoding for sat problems and predicting sat using a message passing method where the embeddings for literals and clauses are iteratively changed until convergence the paper seems significant considering that it brings together sat solving and neural network architectures the paper is very clearly written and quite precise about its contributions the analysis especially figures 34 and 7 seems to give a nice intuitive ideas as to what the neural network is trying to do however one weakness is that the problems are run on a specific type of sat problem the authors have created of course the authors make it clear that the objective is not really to create a stateoftheart solver but rather to understand what a neural network trying to do sat solving is capable of doing on this front i think the paper succeeds in doing this one thing that was a little confusing is that why should all the literals turn to sat turn red to prove sat as it is shown in figure 3 is it that the neural network does not realize that it has found a sat solution with a smaller subset of sat literals in other words is it not capable of taking advantage of the problem structure in general though this seemed to be an interesting paper though its practical implications are quite hard to know either in the sat community or in the neural network communitydocsepthis paper presents the neurosat architecture which uses a deep message passing neural net for predicting the satisfiability of cnf instances the architecture is also able to predict a satisfiable assignment in the sat case and the literals involved in some minimal conflicting set of clauses ie core in the unsat case the neurosat architecture is based on a vector space embedding of literals and clauses which exploits with message passing some important symmetries of sat instances permutation invariance and negation invariance this architecture is tested on various classes of random sat instances involving both unstructured rs problems and structured ones eg graph colorings vertex covers dominating sets etc overall the paper is wellmotivated and the experimental results are quite convincing arguably the salient characteristic of neurosat is to iteratively refine the confidence of literals voting for the sat or unsat output using a voting scheme on the last iteration of the literal matrix this is very interesting and neurosat might be used to help existing solvers in choosing variable orderings for tackling hard instances or hard queries eg find a core on the other hand the technical description of the architecture sec 3 is perhaps a little vague for having a clear intuition of how the classification task for sat instances is handled in the neurosat architecture namely a brief description of the initial matrices which encode the literal en clause embeddings would be nice some comments on the role played by the multilayer perceptron units and the normalization units would also be welcome the two update rules in page 3 could be explained in more detail for the sake of clarity i would suggest to provide a figure for depicting a transition from iteration t1 to iteration t in the architecture as a minor comment it would be nice in section 2 to define the main parameters n m and d used in the rest of the paper concerning the experimental part of the paper sections 4 5 are wellexplained but in section 6 the solution decoding method inspired from pca is 
a bit confusing specifically we dont know how a satisfying assignment is extracted from the last layer and this should be explained in detail according to figure 4 and the comments above it seems that a clustering method with two centroids is advocated but this is not clear in table 1 the correlation between the accuracy on sat instances and the percent of sat instances solved is not clear is the ratio of 70 measured on the cnf instances which have been predicted to be satisfiable or is this ratio measured on the whole set of test instances finally for the results established in table 1 how many training instances and test instances have been used in section 7 some important aspects related to experiments are missing in sec 71 for sr200 tasks was neurosat tested on the same conditions as those for sr40 tasks notably what is the input dimension d of the embedding space here i guess that d 128 is too small for such large instances also how many training and test instances have been used to plot the curves in figure 5 for the 4888 satisfiable instances generated in sec 72 which solver have been used to determine the satisfiability of those instances i guess it is minisat but this should be mentioned somewhere in section 8 i found interesting the the ability of neurosat in predicting the literals that participate in an unsat core indeed the problem of finding an unsat core in cnf instances is computationally harder than determining the satisfiability of this instance so neurosat might be used here to help a solver in finding a core but the notion of confidence should be explained in more detail in this section and more generally in the whole paper namely it seems that in the last layer of each iteration literals are voting for sat red colors with some confidence say delta and voting for unsat blue colors with some confidence say delta are delta and delta correlated in the neural architecture and how confidences for unsat votes are updated finally i found that the different benchmarks where relevant but i would also suggest for future work or in the appendix to additionally perform experiments on the wellknown random 3sat instances k is fixed to 3 here it is wellknown that a phase transition on the instances not the solverlearner occurs at 426 for the clausevariable ratio a plot displaying the performance of neurosat accuracy in predicting the label of the instance versus the clausevariable ratio would be very helpful in assessing the robustness of neurosat on the socalled hard instances which are close to 426 by extension there have been a lot of recent work in generating pseudoindustrial random sat instances which incorporate some structure eg communities in order to mimic realworld structured sat instances to this point it would be interesting to analyze the performance of neurosat on such pseudoindustrial instances docsepthis paper trains a neural network to solve the satisfiability problems based on the message passing neural network it presents neurosat and trains it as a classifier to predict satisfiability under a single bit of supervision after training neurosat can solve problems that are larger and more difficult than it ever saw during training furthermore the authors present a way to decode the solutions from the networks activations besides for unsatisfiable problems the paper also presents neurounsat which learns to detect the contradictions in the form of unsat cores relevance this paper is likely to be of interest to a large proportion of the community for several reasons 
firstly satisfiability problems arise from a variety of domains and this paper starts with a new angle to solve the sat problem secondly it uses neural networks on the sat problem and establishes that neural networks can learn to perform a discrete search thirdly the system used in this paper may also help improve existing sat solvers significance i think the results are significant for the decoding satisfying assignments section the twodimensional pca embeddings are very clear and neurosats success rate on larger problems and on different problem types has shown its generalization ability finally the sequences of literal votes in neurounsat have proved its ability to detect unsat cores novelty neurosats approach is novel based on message passing neural networks it trains a neural network to learn to solve the sat problem soundness this paper is technically sound evaluation the experimental section is comprehensive there are a variety of graphs showing the performance and ability of your architecture however the theoretical analysis isnt very sufficient for instance why does the change of the dataset from the original srn to srcnu lead to the change of the behavior of the network from searching for a satisfying assignment indefinitely to detecting the unsatisfiable cores clarity as a whole the paper is clear the definition of the problem the model structure the data generation the training procedure and the evaluation are all well organized however there are still a few points requiring more explanation for instance in figure 3 i am not sure whether a darker value means a larger or a smaller value because the authors only mentioned that white represents zero blue negative and red positive also in figure 7 i am not sure whether those black grids represent higher positive values or lower negative values a few questions whats the initialization of the two vectors the authors use for the tiling operation does the initialization differ for different types of sat problems how do the authors decide the number of iterations necessary for solving a particular sat problem
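to make the iterative literal/clause refinement discussed in the reviews above concrete, here is a minimal sketch of a neurosat-style message passing step; the plain linear-plus-tanh updates, the fixed embedding width and the mean-vote readout are simplifying assumptions for illustration (the actual model uses layer-norm lstm updates and learned initial embeddings), so this is not the paper's exact implementation

```python
import torch
import torch.nn as nn

class MessagePassingSAT(nn.Module):
    """Sketch of NeuroSAT-style message passing over a bipartite literal/clause graph.
    A is an (m x 2n) adjacency matrix with A[c, l] = 1 iff literal l appears in clause c;
    literal i and its negation are assumed to sit at indices i and i + n."""
    def __init__(self, d=128):
        super().__init__()
        self.d = d
        self.lit_msg = nn.Sequential(nn.Linear(d, d), nn.ReLU(), nn.Linear(d, d))
        self.cls_msg = nn.Sequential(nn.Linear(d, d), nn.ReLU(), nn.Linear(d, d))
        self.cls_update = nn.Linear(2 * d, d)   # [clause state, incoming literal messages]
        self.lit_update = nn.Linear(3 * d, d)   # [literal state, incoming clause messages, negation state]
        self.vote = nn.Linear(d, 1)             # per-literal vote, averaged into one sat/unsat logit

    def forward(self, A, n_iters=26):
        m, two_n = A.shape
        L = torch.ones(two_n, self.d)           # literal embeddings (real init: a learned, tiled vector)
        C = torch.ones(m, self.d)               # clause embeddings
        flip = torch.arange(two_n).roll(two_n // 2)  # index map sending each literal to its negation
        for _ in range(n_iters):
            C = torch.tanh(self.cls_update(torch.cat([C, A @ self.lit_msg(L)], dim=-1)))
            L = torch.tanh(self.lit_update(torch.cat([L, A.t() @ self.cls_msg(C), L[flip]], dim=-1)))
        return self.vote(L).mean()              # > 0 is read as "predict satisfiable"
```

decoding a satisfying assignment, as the reviews note, happens outside this forward pass, eg by two-cluster grouping of the final literal embeddings and reading one cluster as true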
### Summary:
|
the submission proposes a machine learning approach to directly train a prediction system for whether a boolean sentence is satisfiable the strengths of the paper seem to be largely in proposing an architecture for sat problems and the analysis of the generalization performance of the resulting classifier on classes of problems not directly seen during training although the resulting system cannot be claimed to be a state of the art system and it does not have a correctness guarantee like dpll based approaches the paper is a nice reintroduction of sat in a machine learning context using deep networks it may be nice to mention eg w ruml adaptive tree search phd thesis harvard university 2002 which applied reinforcement learning techniques to sat problems the empirical validation on variable sized problems etc is a nice contribution showing interesting generalization properties of the proposed approach the reviewers were unanimous in their recommendation that the paper be accepted and the review process attracted a number of additional comments showing the broader interest of the setting
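as a point of reference for the correctness guarantee mentioned above, a dpll-style solver is a complete backtracking search over partial assignments; the toy version below (unit propagation plus naive branching, with no clause learning, watched literals or branching heuristics) is only meant to illustrate what that guarantee looks like and is far from a practical solver

```python
def dpll(clauses, assignment=None):
    """Toy DPLL: `clauses` is a list of iterables of signed ints, e.g. [{1, 2}, {-1, 3}].
    Returns a satisfying (possibly partial) {var: bool} assignment, or None if unsatisfiable."""
    assignment = dict(assignment or {})
    changed = True
    while changed:                      # unit propagation to a fixed point
        changed = False
        remaining_clauses = []
        for clause in clauses:
            if any(assignment.get(abs(lit)) == (lit > 0) for lit in clause):
                continue                # clause already satisfied under the partial assignment
            unassigned = [lit for lit in clause if abs(lit) not in assignment]
            if not unassigned:
                return None             # conflict: every literal in the clause is false
            if len(unassigned) == 1:    # unit clause: its literal is forced
                lit = unassigned[0]
                assignment[abs(lit)] = lit > 0
                changed = True
            remaining_clauses.append(clause)
        clauses = remaining_clauses
    if not clauses:
        return assignment               # every clause satisfied
    var = next(abs(lit) for c in clauses for lit in c if abs(lit) not in assignment)
    for value in (True, False):         # branch and backtrack
        result = dpll(clauses, {**assignment, var: value})
        if result is not None:
            return result
    return None
```

for example dpll([{1, -2}, {2, 3}, {-1, -3}]) returns one satisfying assignment as a dict, while an unsatisfiable input returns none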
|
[
30003,
310,
1677,
2278,
273,
247,
2561,
2929,
432,
260,
2369,
1793,
6698,
15,
7764,
3630,
247,
6010,
253,
2278,
15,
187,
4118,
8439,
27,
187,
783,
2929,
8631,
247,
2087,
11454,
50276,
18428,
10336,
323,
21565,
3449,
74,
1430,
5742,
253,
9021,
2486,
271,
9706,
323,
2206,
3237,
285,
21565,
2206,
970,
247,
3935,
8136,
1332,
835,
253,
46234,
323,
4133,
932,
285,
34510,
403,
10040,
3146,
4391,
1919,
14940,
50276,
783,
2929,
3133,
1534,
7296,
326,
352,
10316,
2366,
2206,
16161,
285,
11454,
2990,
35615,
253,
2929,
310,
1077,
4518,
3542,
285,
3240,
10799,
670,
697,
9021,
253,
1783,
3340,
8442,
5910,
285,
818,
3133,
281,
1918,
247,
5322,
27350,
5697,
347,
281,
752,
253,
11454,
2990,
310,
2820,
281,
513,
2299,
581,
14855,
310,
326,
253,
3237,
403,
1408,
327,
247,
2173,
1511,
273,
2206,
1895,
253,
4477,
452,
3562,
273,
2282,
253,
4477,
1056,
352,
2590,
326,
253,
8103,
310,
417,
1663,
281,
2794,
247,
1375,
23037,
14387,
47037,
533,
2581,
281,
2096,
752,
247,
11454,
2990,
2820,
281,
513,
2206,
16161,
310,
7032,
273,
2509,
327,
436,
2914,
891,
1158,
253,
2929,
44584,
275,
2509,
436,
581,
2181,
326,
369,
247,
1652,
21643,
310,
326,
2139,
943,
512,
253,
4133,
932,
1614,
281,
2206,
1614,
2502,
281,
5276,
2206,
347,
352,
310,
2011,
275,
4677,
495,
310,
352,
326,
253,
11454,
2990,
1057,
417,
8968,
326,
352,
556,
1119,
247,
2206,
2900,
342,
247,
4577,
8578,
273,
2206,
4133,
932,
275,
643,
3000,
310,
352,
417,
7032,
273,
3192,
5750,
273,
253,
1895,
2605,
50276,
249,
2087,
2167,
436,
4455,
281,
320,
271,
4722,
2929,
2167,
697,
8542,
12739,
403,
3240,
1892,
281,
871,
2057,
275,
253,
2206,
3114,
390,
275,
253,
11454,
2990,
3114,
7152,
33032,
2520,
2929,
10262,
253,
6551,
22354,
10336,
534,
4648,
247,
3676,
3935,
8136,
11454,
2036,
323,
21565,
253,
3449,
74,
1430,
273,
260,
35478,
10872,
253,
10336,
310,
671,
2104,
281,
3283,
247,
3449,
6051,
12714,
275,
253,
2206,
1083,
285,
253,
4133,
932,
3206,
275,
690,
8723,
24648,
873,
273,
34510,
26332,
5161,
275,
253,
5061,
255,
1083,
253,
6551,
22354,
10336,
310,
1754,
327,
247,
4972,
2317,
21496,
273,
4133,
932,
285,
34510,
534,
40725,
342,
3935,
8136,
690,
1774,
34902,
273,
2206,
10872,
29391,
31429,
285,
2297,
318,
31429,
436,
10336,
310,
5762,
327,
2710,
5971,
273,
3632,
2206,
10872,
7668,
1097,
440,
34218,
14208,
3237,
285,
18872,
4394,
24088,
4216,
3295,
723,
11302,
10949,
41297,
5239,
3966,
50276,
1189,
455,
253,
2929,
310,
973,
24013,
8550,
285,
253,
5661,
1543,
403,
3240,
21414,
25711,
253,
43066,
8847,
273,
6551,
22354,
310,
281,
10040,
3146,
39494,
253,
7162,
273,
4133,
932,
13423,
323,
253,
2206,
50276,
263,
5061,
255,
50276,
9252,
970,
247,
13423,
6974,
327,
253,
1390,
19502,
273,
253,
22436,
4315,
436,
310,
1077,
4722,
285,
6551,
22354,
1537,
320,
908,
281,
1361,
5368,
1220,
735,
275,
13887,
4778,
1340,
723,
323,
46710,
1892,
10872,
390,
1892,
19241,
24088,
1089,
247,
5161,
50276,
251,
253,
643,
1133,
253,
7681,
5740,
273,
253,
10336,
4706,
495,
310,
4931,
247,
1652,
21248,
323,
1907,
247,
2590,
30328,
273,
849,
253,
9162,
4836,
50276,
1542,
2206,
10872,
50276,
261,
15726,
275,
253,
6551,
22354,
10336,
10775,
247,
4864,
5740,
273,
253,
3302,
12624,
534,
22573,
253,
22436,
546,
13604,
46234,
651,
320,
5322,
690,
5701,
327,
253,
2554,
4546,
407,
253,
33362,
4071,
591,
916,
1406,
5085,
285,
253,
21539,
5085,
651,
671,
320,
10112,
253,
767,
5731,
4803,
275,
3239,
495,
812,
320,
5544,
275,
625,
2508,
323,
253,
13232,
273,
19843,
891,
651,
1804,
281,
2085,
247,
4677,
323,
35668,
247,
5502,
432,
19502,
246,
18,
281,
19502,
246,
275,
253,
10336,
347,
247,
5884,
4385,
352,
651,
320,
5322,
275,
2593,
374,
281,
4853,
253,
2022,
3602,
295,
278,
285,
277,
908,
275,
253,
1551,
273,
253,
2929,
50276,
585,
29340,
253,
5661,
629,
273,
253,
2929,
7118,
577,
50276,
22,
403,
6210,
1591,
446,
1243,
533,
275,
2593,
721,
50276,
783,
2900,
28490,
1332,
11797,
432,
268,
6357,
310,
247,
2372,
21643,
5742,
359,
13414,
871,
849,
247,
14127,
12714,
310,
10375,
432,
253,
1390,
3828,
285,
436,
943,
320,
5544,
275,
2508,
2556,
281,
4677,
577,
285,
253,
5701,
1840,
352,
3133,
326,
247,
17524,
1332,
342,
767,
1399,
287,
2352,
310,
36431,
533,
436,
310,
417,
2590,
275,
2829,
337,
253,
5921,
875,
253,
7200,
327,
2206,
10872,
285,
253,
2558,
273,
2206,
10872,
14042,
310,
417,
2590,
310,
253,
4313,
273,
5571,
4080,
327,
253,
260,
35478,
10872,
534,
452,
644,
8131,
281,
320,
3449,
6051,
390,
310,
436,
4313,
4080,
327,
253,
2644,
873,
273,
1071,
10872,
4720,
323,
253,
1543,
4232,
275,
2829,
337,
849,
1142,
3733,
10872,
285,
1071,
10872,
452,
644,
908,
50276,
249,
2593,
818,
690,
1774,
7794,
2905,
281,
4679,
403,
5816,
275,
4706,
11102,
323,
49975,
1518,
8892,
369,
6551,
22354,
5762,
327,
253,
1072,
2515,
347,
1110,
323,
49975,
1449,
8892,
19836,
752,
310,
253,
3280,
7877,
277,
273,
253,
21496,
2317,
1060,
891,
5476,
326,
277,
50276,
8196,
310,
1512,
1355,
323,
824,
1781,
10872,
671,
849,
1142,
3733,
285,
1071,
10872,
452,
644,
908,
281,
7484,
253,
9191,
275,
4677,
608,
323,
253,
577,
25452,
3449,
6051,
10872,
4561,
275,
4706,
8187,
534,
47037,
452,
644,
908,
281,
3653,
253,
3449,
74,
1430,
273,
1110,
10872,
891,
5476,
352,
310,
1054,
261,
255,
533,
436,
943,
320,
5393,
9366,
50275,
249,
2593,
854,
891,
1119,
4722,
253,
253,
3745,
273,
6551,
22354,
275,
21565,
253,
4133,
932,
326,
10078,
275,
271,
5061,
255,
5161,
6296,
253,
1895,
273,
4560,
271,
5061,
255,
5161,
275,
260,
35478,
10872,
310,
43245,
12150,
685,
8925,
253,
3449,
74,
1430,
273,
436,
4227,
594,
6551,
22354,
1537,
320,
908,
1060,
281,
1361,
247,
47037,
275,
4560,
247,
5161,
533,
253,
10732,
273,
7162,
943,
320,
5544,
275,
625,
2508,
275,
436,
2593,
285,
625,
3839,
275,
253,
2644,
2929,
10775,
352,
3133,
326,
275,
253,
1390,
3828,
273,
1016,
19502,
4133,
932,
403,
13423,
323,
2206,
2502,
9830,
342,
690,
7162,
1333,
18687,
50276,
395,
13423,
323,
5061,
255,
4797,
9830,
342,
690,
7162,
1333,
18687,
403,
18687,
285,
18687,
9578,
275,
253,
11454,
10336,
285,
849,
1461,
39725,
323,
5061,
255,
13008,
403,
9300,
50276,
71,
3341,
891,
1119,
326,
253,
1027,
49602,
835,
4623,
533,
891,
651,
671,
1804,
323,
2852,
789,
390,
275,
253,
30762,
281,
23000,
1347,
4679,
327,
253,
973,
4304,
3632,
495,
22354,
10872,
465,
310,
4229,
281,
495,
1060,
352,
310,
973,
4304,
326,
247,
3408,
5502,
327,
253,
10872,
417,
253,
47037,
282,
47612,
6634,
387,
39802,
323,
253,
13604,
18645,
4313,
247,
7484,
19703,
253,
3045,
273,
6551,
22354,
7200,
275,
21565,
253,
5203,
273,
253,
4227,
7147,
253,
13604,
18645,
4313,
651,
320,
1077,
9371,
275,
18005,
253,
31640,
273,
6551,
22354,
327,
253,
9267,
18859,
1892,
10872,
534,
403,
2810,
281,
39802,
407,
6880,
627,
452,
644,
247,
2257,
273,
3332,
789,
275,
11365,
17927,
49338,
3632,
2206,
10872,
534,
19071,
690,
2605,
24088,
7888,
275,
1340,
281,
25066,
1524,
10186,
18872,
2206,
10872,
281,
436,
1127,
352,
651,
320,
4722,
281,
12106,
253,
3045,
273,
6551,
22354,
327,
824,
17927,
49338,
10872,
5474,
33032,
2520,
2929,
18784,
247,
11454,
2990,
281,
8415,
253,
3449,
74,
1430,
3237,
1754,
327,
253,
3935,
8136,
11454,
2990,
352,
10262,
6551,
22354,
285,
18784,
352,
347,
247,
30410,
281,
3283,
3449,
74,
1430,
762,
247,
2014,
2372,
273,
20446,
846,
3733,
6551,
22354,
476,
8415,
3237,
326,
403,
4067,
285,
625,
2834,
685,
352,
2455,
3047,
1309,
3733,
33810,
253,
4477,
1246,
247,
1039,
281,
30358,
253,
5482,
432,
253,
6928,
1396,
569,
16280,
323,
43288,
6051,
3237,
253,
2929,
671,
10262,
5723,
415,
22354,
534,
33772,
281,
2736,
253,
10435,
11297,
275,
253,
830,
273,
5061,
255,
23018,
50276,
11235,
11828,
436,
2929,
310,
2779,
281,
320,
273,
1600,
281,
247,
1781,
8394,
273,
253,
3114,
323,
2067,
4606,
41005,
3449,
74,
1430,
3237,
12893,
432,
247,
5235,
273,
10625,
436,
2929,
7866,
342,
247,
747,
6907,
281,
8415,
253,
2206,
1895,
1273,
314,
352,
4648,
11454,
6928,
275,
253,
2206,
1895,
285,
25097,
326,
11454,
6928,
476,
3037,
281,
1347,
247,
13358,
3186,
2626,
314,
253,
985,
908,
275,
436,
2929,
778,
671,
1361,
3157,
5368,
2206,
1220,
735,
50276,
9188,
40348,
891,
1158,
253,
1543,
403,
1534,
323,
253,
28490,
14127,
23768,
2593,
253,
2500,
351,
37613,
268,
6357,
46234,
403,
1077,
2590,
285,
253,
6551,
84,
1832,
2323,
2281,
323,
625,
1534,
3237,
285,
1027,
3237,
556,
2011,
697,
26647,
3745,
4720,
253,
6430,
273,
22436,
13008,
275,
5723,
415,
22354,
452,
8058,
697,
3745,
281,
2736,
5061,
33496,
23018,
50276,
2369,
652,
555,
6551,
84,
1832,
2746,
310,
4460,
1754,
327,
3935,
8136,
11454,
6928,
352,
18784,
247,
11454,
2990,
281,
3037,
281,
8415,
253,
2206,
1895,
50275,
27962,
1255,
436,
2929,
310,
22335,
3590,
50275,
15419,
2368,
253,
5661,
2593,
310,
11088,
627,
403,
247,
5235,
273,
14580,
4645,
253,
3045,
285,
3745,
273,
634,
10336,
2299,
253,
10527,
1783,
310,
2649,
1077,
4209,
323,
4227,
2139,
1057,
253,
1818,
273,
253,
10895,
432,
253,
3236,
256,
30930,
281,
6740,
3023,
1421,
281,
253,
1818,
273,
253,
3879,
273,
253,
2990,
432,
12203,
323,
247,
14127,
12714,
39450,
281,
15549,
253,
43288,
6051,
23018,
50276,
498,
15752,
347,
247,
2644,
253,
2929,
310,
2590,
253,
5426,
273,
253,
1895,
253,
1566,
2605,
253,
941,
5978,
253,
3733,
5199,
285,
253,
7103,
403,
512,
973,
10932,
2299,
627,
310,
1335,
247,
1643,
2792,
10568,
625,
8813,
323,
4227,
275,
4677,
495,
891,
717,
417,
2119,
1880,
28170,
1318,
2097,
4067,
1318,
390,
4577,
1318,
984,
253,
4477,
760,
5393,
326,
3168,
6125,
5058,
4797,
4016,
285,
2502,
2762,
671,
275,
4677,
818,
891,
717,
417,
2119,
1880,
1110,
2806,
42590,
1957,
2169,
2762,
2193,
390,
2406,
4016,
2193,
50276,
66,
1643,
3533,
50276,
5371,
84,
253,
31850,
273,
253,
767,
11390,
253,
4477,
897,
323,
246,
4837,
4254,
1057,
253,
31850,
9184,
323,
1027,
3510,
273,
2206,
3237,
50276,
5430,
513,
253,
4477,
7617,
253,
1180,
273,
25142,
3309,
323,
16161,
247,
1798,
2206,
1895,
50276,
187,
187,
4118,
18435,
27,
783,
19529,
29328,
247,
5145,
4715,
2746,
281,
3587,
6194,
247,
10554,
985,
323,
1880,
247,
12419,
6197,
310,
3449,
6051,
50276,
783,
20544,
273,
253,
2929,
1646,
281,
320,
8127,
275,
36636,
271,
10336,
323,
2206,
3237,
285,
253,
1783,
273,
253,
26647,
3045,
273,
253,
4795,
30410,
327,
5971,
273,
3237,
417,
3587,
2326,
1309,
3733,
50276,
20261,
253,
4795,
985,
2550,
320,
7558,
281,
320,
247,
1375,
273,
253,
1445,
985,
285,
352,
1057,
417,
452,
247,
36594,
12215,
751,
277,
50153,
1754,
7274,
253,
2929,
310,
247,
5322,
294,
46089,
273,
2206,
275,
247,
5145,
4715,
3634,
970,
3676,
6928,
50276,
262,
778,
320,
5322,
281,
3748,
24088,
259,
11267,
77,
17825,
5202,
3186,
815,
69,
22857,
4230,
12299,
9835,
6752,
534,
3732,
35221,
4715,
5609,
281,
2206,
3237,
50276,
783,
16774,
12820,
327,
4778,
25180,
3237,
3966,
310,
247,
5322,
7680,
4645,
4722,
26647,
3607,
273,
253,
4081,
2746,
50276,
783,
30628,
497,
42293,
275,
616,
17401,
326,
253,
2929,
320,
7607,
285,
253,
2278,
1232,
17755,
247,
1180,
273,
3081,
5701,
4645,
253,
16055,
1600,
273,
253,
4758
] |
[
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1
] |
[
30003,
310,
1677,
2278,
273,
247,
2561,
2929,
432,
260,
2369,
1793,
6698,
15,
7764,
3630,
247,
6010,
253,
2278,
15,
187,
4118,
8439,
27,
187,
783,
2929,
8631,
247,
2087,
11454,
50276,
18428,
10336,
323,
21565,
3449,
74,
1430,
5742,
253,
9021,
2486,
271,
9706,
323,
2206,
3237,
285,
21565,
2206,
970,
247,
3935,
8136,
1332,
835,
253,
46234,
323,
4133,
932,
285,
34510,
403,
10040,
3146,
4391,
1919,
14940,
50276,
783,
2929,
3133,
1534,
7296,
326,
352,
10316,
2366,
2206,
16161,
285,
11454,
2990,
35615,
253,
2929,
310,
1077,
4518,
3542,
285,
3240,
10799,
670,
697,
9021,
253,
1783,
3340,
8442,
5910,
285,
818,
3133,
281,
1918,
247,
5322,
27350,
5697,
347,
281,
752,
253,
11454,
2990,
310,
2820,
281,
513,
2299,
581,
14855,
310,
326,
253,
3237,
403,
1408,
327,
247,
2173,
1511,
273,
2206,
1895,
253,
4477,
452,
3562,
273,
2282,
253,
4477,
1056,
352,
2590,
326,
253,
8103,
310,
417,
1663,
281,
2794,
247,
1375,
23037,
14387,
47037,
533,
2581,
281,
2096,
752,
247,
11454,
2990,
2820,
281,
513,
2206,
16161,
310,
7032,
273,
2509,
327,
436,
2914,
891,
1158,
253,
2929,
44584,
275,
2509,
436,
581,
2181,
326,
369,
247,
1652,
21643,
310,
326,
2139,
943,
512,
253,
4133,
932,
1614,
281,
2206,
1614,
2502,
281,
5276,
2206,
347,
352,
310,
2011,
275,
4677,
495,
310,
352,
326,
253,
11454,
2990,
1057,
417,
8968,
326,
352,
556,
1119,
247,
2206,
2900,
342,
247,
4577,
8578,
273,
2206,
4133,
932,
275,
643,
3000,
310,
352,
417,
7032,
273,
3192,
5750,
273,
253,
1895,
2605,
50276,
249,
2087,
2167,
436,
4455,
281,
320,
271,
4722,
2929,
2167,
697,
8542,
12739,
403,
3240,
1892,
281,
871,
2057,
275,
253,
2206,
3114,
390,
275,
253,
11454,
2990,
3114,
7152,
33032,
2520,
2929,
10262,
253,
6551,
22354,
10336,
534,
4648,
247,
3676,
3935,
8136,
11454,
2036,
323,
21565,
253,
3449,
74,
1430,
273,
260,
35478,
10872,
253,
10336,
310,
671,
2104,
281,
3283,
247,
3449,
6051,
12714,
275,
253,
2206,
1083,
285,
253,
4133,
932,
3206,
275,
690,
8723,
24648,
873,
273,
34510,
26332,
5161,
275,
253,
5061,
255,
1083,
253,
6551,
22354,
10336,
310,
1754,
327,
247,
4972,
2317,
21496,
273,
4133,
932,
285,
34510,
534,
40725,
342,
3935,
8136,
690,
1774,
34902,
273,
2206,
10872,
29391,
31429,
285,
2297,
318,
31429,
436,
10336,
310,
5762,
327,
2710,
5971,
273,
3632,
2206,
10872,
7668,
1097,
440,
34218,
14208,
3237,
285,
18872,
4394,
24088,
4216,
3295,
723,
11302,
10949,
41297,
5239,
3966,
50276,
1189,
455,
253,
2929,
310,
973,
24013,
8550,
285,
253,
5661,
1543,
403,
3240,
21414,
25711,
253,
43066,
8847,
273,
6551,
22354,
310,
281,
10040,
3146,
39494,
253,
7162,
273,
4133,
932,
13423,
323,
253,
2206,
50276,
263,
5061,
255,
50276,
9252,
970,
247,
13423,
6974,
327,
253,
1390,
19502,
273,
253,
22436,
4315,
436,
310,
1077,
4722,
285,
6551,
22354,
1537,
320,
908,
281,
1361,
5368,
1220,
735,
275,
13887,
4778,
1340,
723,
323,
46710,
1892,
10872,
390,
1892,
19241,
24088,
1089,
247,
5161,
50276,
251,
253,
643,
1133,
253,
7681,
5740,
273,
253,
10336,
4706,
495,
310,
4931,
247,
1652,
21248,
323,
1907,
247,
2590,
30328,
273,
849,
253,
9162,
4836,
50276,
1542,
2206,
10872,
50276,
261,
15726,
275,
253,
6551,
22354,
10336,
10775,
247,
4864,
5740,
273,
253,
3302,
12624,
534,
22573,
253,
22436,
546,
13604,
46234,
651,
320,
5322,
690,
5701,
327,
253,
2554,
4546,
407,
253,
33362,
4071,
591,
916,
1406,
5085,
285,
253,
21539,
5085,
651,
671,
320,
10112,
253,
767,
5731,
4803,
275,
3239,
495,
812,
320,
5544,
275,
625,
2508,
323,
253,
13232,
273,
19843,
891,
651,
1804,
281,
2085,
247,
4677,
323,
35668,
247,
5502,
432,
19502,
246,
18,
281,
19502,
246,
275,
253,
10336,
347,
247,
5884,
4385,
352,
651,
320,
5322,
275,
2593,
374,
281,
4853,
253,
2022,
3602,
295,
278,
285,
277,
908,
275,
253,
1551,
273,
253,
2929,
50276,
585,
29340,
253,
5661,
629,
273,
253,
2929,
7118,
577,
50276,
22,
403,
6210,
1591,
446,
1243,
533,
275,
2593,
721,
50276,
783,
2900,
28490,
1332,
11797,
432,
268,
6357,
310,
247,
2372,
21643,
5742,
359,
13414,
871,
849,
247,
14127,
12714,
310,
10375,
432,
253,
1390,
3828,
285,
436,
943,
320,
5544,
275,
2508,
2556,
281,
4677,
577,
285,
253,
5701,
1840,
352,
3133,
326,
247,
17524,
1332,
342,
767,
1399,
287,
2352,
310,
36431,
533,
436,
310,
417,
2590,
275,
2829,
337,
253,
5921,
875,
253,
7200,
327,
2206,
10872,
285,
253,
2558,
273,
2206,
10872,
14042,
310,
417,
2590,
310,
253,
4313,
273,
5571,
4080,
327,
253,
260,
35478,
10872,
534,
452,
644,
8131,
281,
320,
3449,
6051,
390,
310,
436,
4313,
4080,
327,
253,
2644,
873,
273,
1071,
10872,
4720,
323,
253,
1543,
4232,
275,
2829,
337,
849,
1142,
3733,
10872,
285,
1071,
10872,
452,
644,
908,
50276,
249,
2593,
818,
690,
1774,
7794,
2905,
281,
4679,
403,
5816,
275,
4706,
11102,
323,
49975,
1518,
8892,
369,
6551,
22354,
5762,
327,
253,
1072,
2515,
347,
1110,
323,
49975,
1449,
8892,
19836,
752,
310,
253,
3280,
7877,
277,
273,
253,
21496,
2317,
1060,
891,
5476,
326,
277,
50276,
8196,
310,
1512,
1355,
323,
824,
1781,
10872,
671,
849,
1142,
3733,
285,
1071,
10872,
452,
644,
908,
281,
7484,
253,
9191,
275,
4677,
608,
323,
253,
577,
25452,
3449,
6051,
10872,
4561,
275,
4706,
8187,
534,
47037,
452,
644,
908,
281,
3653,
253,
3449,
74,
1430,
273,
1110,
10872,
891,
5476,
352,
310,
1054,
261,
255,
533,
436,
943,
320,
5393,
9366,
50275,
249,
2593,
854,
891,
1119,
4722,
253,
253,
3745,
273,
6551,
22354,
275,
21565,
253,
4133,
932,
326,
10078,
275,
271,
5061,
255,
5161,
6296,
253,
1895,
273,
4560,
271,
5061,
255,
5161,
275,
260,
35478,
10872,
310,
43245,
12150,
685,
8925,
253,
3449,
74,
1430,
273,
436,
4227,
594,
6551,
22354,
1537,
320,
908,
1060,
281,
1361,
247,
47037,
275,
4560,
247,
5161,
533,
253,
10732,
273,
7162,
943,
320,
5544,
275,
625,
2508,
275,
436,
2593,
285,
625,
3839,
275,
253,
2644,
2929,
10775,
352,
3133,
326,
275,
253,
1390,
3828,
273,
1016,
19502,
4133,
932,
403,
13423,
323,
2206,
2502,
9830,
342,
690,
7162,
1333,
18687,
50276,
395,
13423,
323,
5061,
255,
4797,
9830,
342,
690,
7162,
1333,
18687,
403,
18687,
285,
18687,
9578,
275,
253,
11454,
10336,
285,
849,
1461,
39725,
323,
5061,
255,
13008,
403,
9300,
50276,
71,
3341,
891,
1119,
326,
253,
1027,
49602,
835,
4623,
533,
891,
651,
671,
1804,
323,
2852,
789,
390,
275,
253,
30762,
281,
23000,
1347,
4679,
327,
253,
973,
4304,
3632,
495,
22354,
10872,
465,
310,
4229,
281,
495,
1060,
352,
310,
973,
4304,
326,
247,
3408,
5502,
327,
253,
10872,
417,
253,
47037,
282,
47612,
6634,
387,
39802,
323,
253,
13604,
18645,
4313,
247,
7484,
19703,
253,
3045,
273,
6551,
22354,
7200,
275,
21565,
253,
5203,
273,
253,
4227,
7147,
253,
13604,
18645,
4313,
651,
320,
1077,
9371,
275,
18005,
253,
31640,
273,
6551,
22354,
327,
253,
9267,
18859,
1892,
10872,
534,
403,
2810,
281,
39802,
407,
6880,
627,
452,
644,
247,
2257,
273,
3332,
789,
275,
11365,
17927,
49338,
3632,
2206,
10872,
534,
19071,
690,
2605,
24088,
7888,
275,
1340,
281,
25066,
1524,
10186,
18872,
2206,
10872,
281,
436,
1127,
352,
651,
320,
4722,
281,
12106,
253,
3045,
273,
6551,
22354,
327,
824,
17927,
49338,
10872,
5474,
33032,
2520,
2929,
18784,
247,
11454,
2990,
281,
8415,
253,
3449,
74,
1430,
3237,
1754,
327,
253,
3935,
8136,
11454,
2990,
352,
10262,
6551,
22354,
285,
18784,
352,
347,
247,
30410,
281,
3283,
3449,
74,
1430,
762,
247,
2014,
2372,
273,
20446,
846,
3733,
6551,
22354,
476,
8415,
3237,
326,
403,
4067,
285,
625,
2834,
685,
352,
2455,
3047,
1309,
3733,
33810,
253,
4477,
1246,
247,
1039,
281,
30358,
253,
5482,
432,
253,
6928,
1396,
569,
16280,
323,
43288,
6051,
3237,
253,
2929,
671,
10262,
5723,
415,
22354,
534,
33772,
281,
2736,
253,
10435,
11297,
275,
253,
830,
273,
5061,
255,
23018,
50276,
11235,
11828,
436,
2929,
310,
2779,
281,
320,
273,
1600,
281,
247,
1781,
8394,
273,
253,
3114,
323,
2067,
4606,
41005,
3449,
74,
1430,
3237,
12893,
432,
247,
5235,
273,
10625,
436,
2929,
7866,
342,
247,
747,
6907,
281,
8415,
253,
2206,
1895,
1273,
314,
352,
4648,
11454,
6928,
275,
253,
2206,
1895,
285,
25097,
326,
11454,
6928,
476,
3037,
281,
1347,
247,
13358,
3186,
2626,
314,
253,
985,
908,
275,
436,
2929,
778,
671,
1361,
3157,
5368,
2206,
1220,
735,
50276,
9188,
40348,
891,
1158,
253,
1543,
403,
1534,
323,
253,
28490,
14127,
23768,
2593,
253,
2500,
351,
37613,
268,
6357,
46234,
403,
1077,
2590,
285,
253,
6551,
84,
1832,
2323,
2281,
323,
625,
1534,
3237,
285,
1027,
3237,
556,
2011,
697,
26647,
3745,
4720,
253,
6430,
273,
22436,
13008,
275,
5723,
415,
22354,
452,
8058,
697,
3745,
281,
2736,
5061,
33496,
23018,
50276,
2369,
652,
555,
6551,
84,
1832,
2746,
310,
4460,
1754,
327,
3935,
8136,
11454,
6928,
352,
18784,
247,
11454,
2990,
281,
3037,
281,
8415,
253,
2206,
1895,
50275,
27962,
1255,
436,
2929,
310,
22335,
3590,
50275,
15419,
2368,
253,
5661,
2593,
310,
11088,
627,
403,
247,
5235,
273,
14580,
4645,
253,
3045,
285,
3745,
273,
634,
10336,
2299,
253,
10527,
1783,
310,
2649,
1077,
4209,
323,
4227,
2139,
1057,
253,
1818,
273,
253,
10895,
432,
253,
3236,
256,
30930,
281,
6740,
3023,
1421,
281,
253,
1818,
273,
253,
3879,
273,
253,
2990,
432,
12203,
323,
247,
14127,
12714,
39450,
281,
15549,
253,
43288,
6051,
23018,
50276,
498,
15752,
347,
247,
2644,
253,
2929,
310,
2590,
253,
5426,
273,
253,
1895,
253,
1566,
2605,
253,
941,
5978,
253,
3733,
5199,
285,
253,
7103,
403,
512,
973,
10932,
2299,
627,
310,
1335,
247,
1643,
2792,
10568,
625,
8813,
323,
4227,
275,
4677,
495,
891,
717,
417,
2119,
1880,
28170,
1318,
2097,
4067,
1318,
390,
4577,
1318,
984,
253,
4477,
760,
5393,
326,
3168,
6125,
5058,
4797,
4016,
285,
2502,
2762,
671,
275,
4677,
818,
891,
717,
417,
2119,
1880,
1110,
2806,
42590,
1957,
2169,
2762,
2193,
390,
2406,
4016,
2193,
50276,
66,
1643,
3533,
50276,
5371,
84,
253,
31850,
273,
253,
767,
11390,
253,
4477,
897,
323,
246,
4837,
4254,
1057,
253,
31850,
9184,
323,
1027,
3510,
273,
2206,
3237,
50276,
5430,
513,
253,
4477,
7617,
253,
1180,
273,
25142,
3309,
323,
16161,
247,
1798,
2206,
1895,
50276,
187,
187,
4118,
18435,
27,
783,
19529,
29328,
247,
5145,
4715,
2746,
281,
3587,
6194,
247,
10554,
985,
323,
1880,
247,
12419,
6197,
310,
3449,
6051,
50276,
783,
20544,
273,
253,
2929,
1646,
281,
320,
8127,
275,
36636,
271,
10336,
323,
2206,
3237,
285,
253,
1783,
273,
253,
26647,
3045,
273,
253,
4795,
30410,
327,
5971,
273,
3237,
417,
3587,
2326,
1309,
3733,
50276,
20261,
253,
4795,
985,
2550,
320,
7558,
281,
320,
247,
1375,
273,
253,
1445,
985,
285,
352,
1057,
417,
452,
247,
36594,
12215,
751,
277,
50153,
1754,
7274,
253,
2929,
310,
247,
5322,
294,
46089,
273,
2206,
275,
247,
5145,
4715,
3634,
970,
3676,
6928,
50276,
262,
778,
320,
5322,
281,
3748,
24088,
259,
11267,
77,
17825,
5202,
3186,
815,
69,
22857,
4230,
12299,
9835,
6752,
534,
3732,
35221,
4715,
5609,
281,
2206,
3237,
50276,
783,
16774,
12820,
327,
4778,
25180,
3237,
3966,
310,
247,
5322,
7680,
4645,
4722,
26647,
3607,
273,
253,
4081,
2746,
50276,
783,
30628,
497,
42293,
275,
616,
17401,
326,
253,
2929,
320,
7607,
285,
253,
2278,
1232,
17755,
247,
1180,
273,
3081,
5701,
4645,
253,
16055,
1600,
273,
253,
4758
] |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper addresses the task of quantizing transformers to very low precision 1-2 bits quantizing neural networks can be essential for model size reduction especially for mobile devices but also for the speed of evaluation on hardware without floating point accelerators in addition binary activationscomputations open new possibilities for highperforming lowpower special purpose hardware for evaluating neural networks with drastically improved parallelism and power consumption while these considerations and motivations are not new the paper gives a concise summary and references to prior efforts especially in the context of convolutional networks that is clearly referenced in this work here this work addresses a set of cumulative technical improvements for successfully quantizing transformer networks to very low precision including normalization separate quantization for nonnegative activations and exploring distillation paths over medium precision eg 8 bit quantization using a combination of such methods this paper manages to reduce the gap in bleu score to the fullprecision model by 3x percentage points also the paper gives a thorough ablation analysis for the effect of the employed methods originality moderate the motivation for extreme quantization of neural networks is not new and has been proposed many times over the past few years these include model size reduction and the development of new special purpose hardware that could reduce the power consumption and latency of inference of large deep learning models in fact the feasibility of extreme quantization has been pioneered for convolutional networks with high success but with transformers becoming the main workhorse for most deep learning applications it has become an important question how to transfer the above results over to them while there has been an initial set of work on binarizing bert and other transformerbased models typically those results resulted in a huge drop of languagemodeling quality this work does not propose one single solution to those issues but a collection of relatively commonsense solutions the combination of which has a large effect on the quality of the resulting lowprecision model these ideas include normalizing activations before binarization to ensure that lowprecision activations are maximally informative improved gradient clipping for training specialized quantization for nonnegative and general activations attention quantization intermediatelayer distillation employed in prior work and a multistep distillation path over mediumprecision models quality high the paper has one very clear goal a very welldefined experimental setup thorough experiments and ablation analysis that seem consistent the work presents a clear analysis of the importance of all the employed methods while most of the ideasmethods are not very novel in isolation this work gives a clear welltested recipe for the extreme quantization of transformers the quality of which is significantly beyond whatever was reported earlier while the paper omits speculating about the potential gains from specialpurpose hardware this is not a real issue as the implications are relatively clear and these results give a clear motivation to further work in that direction clarity high the paper is well motivated references to prior work are quite extensive although references to very new concurrent works that have not been peerreviewed yet might be missing this does not affect the final conclusion the paper presents a lot of experimental evidence in concise tables that back up
the intuition behind the decisions and highlight the significance and motivation for all of the decisions made for this work significance high while the employed methods are relatively well known the fact that transformers can be quantized to 1bit precision is extremely important and this work gives a clear welldocumented measurement point and clear welltested recipes this is a valuable baseline for future work and also increases the motivation for building specialpurpose hardware for extremely lowprecision neural networks while every improvement has some potential to increase the risk of powerful technologies it is unclearunlikely to me that this technology would have any adverse effects beyond the generic issues of making highimpact technologies easiercheaper to deploy by bad actors and that approximate inference might become less reliable than highquality models these are general concerns regarding any performance improvements and performance tradeoffs in this sense i dont think that this approach has any nongeneric risks that should have been considered in particular by the authors docsepthis paper proposes several changes to the design of the binary bert model the main proposal consists of three parts 1 use different quantization schemes based on the output distribution of a layer eg activations are quantized to 0 and 1 for multihead self attention layers 2 adopt an elastic binarization function which allows rescale and shift 3 use a multistep distillation scheme jointly with a multistep quantization schedule the resulting bit model demonstrates strong performance on downstream tasks in the glue and squad datasets strengths this paper proposes several simple yet effective changes to improve the quality of the binarized bert model significantly the proposed techniques are sound the writing of the paper is clear and easy to follow the authors also provide enough empirical evaluation of the proposed bit models weaknesses my biggest concern with this paper is that the authors do not clearly discuss the relationship to other recent works in the absence of a clear discussion of the relationship to other related work i am not convinced that the contributions in this work are new and original this paper does cite a large number of relevant works however the relationship between the proposals in this work and related work remains unclear first the proposal to use different quantization schemes depending on the output distribution of the layers is quite similar to biattention proposed in bibert 1 biattention suggests using a bool function equation 11 in the bibert paper rather than a sign function to binarize the output of a multihead selfattention layer second the elastic binarization function looks akin to the activation function proposed in reactnet 2 reactnet and some followup work have already shown that introducing additional scaling and shifting capabilities in the activation function can help improve the quality of the binarized cnn the authors do not point out the similarity with reactnet when discussing the elastic binarization function which could be misleading in addition i think the paper also needs to discuss the new hardware required for the bit model which is different from traditional binary neural networks that require only simple xnor gates to implement the computational engineprocessing element the computational engine for bit models needs to handle computation between -1 and 1 and between 0 and 1 i suggest that the authors should at least compare the required hardware with traditional binary
and ternary neural networks otherwise it would be difficult to justify not using a ternary neural network 1 bibert accurate fully binarized bert iclr 2022 2 reactnet towards precise binary neural network with generalized activation functions eccv 2020 na docsepthe authors propose novel binarisation techniques for transformer models based on knowledge distillation the goal is to achieve competitive performance for an efficient student transformer model the main contributions are i a binarisation technique that improves knowledge distillation in transformer models ii a multistep technique to distil binarised models to improve intermediate performance the study shows that the student model achieves competitive performance compared to a standard bert on the glue benchmark and question answering strengths clear description of background knowledge and related work needed to understand the proposed approach clear description of the proposed approach best practices for model replicability the authors perform a comprehensive comparison with related work on the glue benchmark the findings show that an efficient transformer model has competitive performance compared to previous sota weaknesses it is not clear how the parameter initialization and selection of hyperparameters could affect the model performance the authors have addressed limitations of the proposed approach
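for readers unfamiliar with the rescale-and-shift binarization the reviews above compare to reactnet, the snippet below sketches the generic pattern of a learnable-scale, learnable-threshold sign function trained with a straight-through estimator; the exact parameterization, the per-channel granularity and the {0,1} variant used for nonnegative attention activations are left out, so treat this as an illustration of the idea rather than the authors' implementation

```python
import torch
import torch.nn as nn

class ElasticBinarizer(nn.Module):
    """Generic sketch of elastic binarization: y = alpha * sign(x - beta),
    with gradients passed to x via a clipped straight-through estimator and
    to alpha/beta via the usual detach trick. One (alpha, beta) pair per tensor here."""
    def __init__(self, init_scale=1.0):
        super().__init__()
        self.alpha = nn.Parameter(torch.tensor(float(init_scale)))  # learnable rescale
        self.beta = nn.Parameter(torch.tensor(0.0))                 # learnable shift / threshold

    def forward(self, x):
        shifted = x - self.beta
        hard = torch.sign(shifted)
        hard = torch.where(hard == 0, torch.ones_like(hard), hard)  # break ties at exactly 0
        soft = shifted.clamp(-1.0, 1.0)          # surrogate used only for the backward pass
        binary = soft + (hard - soft).detach()   # forward: hard values, backward: d(soft)/dx
        return self.alpha * binary
```

a {0,1} version for nonnegative activations would replace sign with a 0/1 step (eg (shifted > 0).float()) and keep the same straight-through trick; per-channel alpha and beta are a common refinement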
### Summary:
|
this paper proposes an innovative pipeline for quantizing transformers to extremely low precision 1-2 bits while reducing the gap of previous methods to full precision by 3x this result has important implications for resourcerestricted inference especially if memory is of concern but 1bit quantization has a significant effect on inference latency as well this work reaches these strong results by careful normalization separate quantization for nonnegative activations and a combinatorial optimization over various distillation paths overall the paper demonstrates an important albeit incremental advance in the field and is of general interest to the wider community therefore i propose its acceptance at neurips 2022
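the combinatorial optimization over distillation paths mentioned above can be read as choosing a chain of intermediate precisions, with each stage distilling into the next lower one; the sketch below only illustrates that search pattern — the candidate bit-widths and the train_and_eval callable are placeholders, not the schedule actually used in the paper

```python
from itertools import combinations

def best_distillation_path(train_and_eval, precisions=(32, 8, 4, 2, 1)):
    """Illustrative search over monotone precision chains from full precision down to 1 bit.
    `train_and_eval(teacher_bits, student_bits)` is a user-supplied placeholder that distills
    one stage and returns a validation score for the resulting student."""
    inner = [p for p in precisions if p not in (precisions[0], precisions[-1])]
    candidate_paths = []
    for k in range(len(inner) + 1):
        for mids in combinations(inner, k):      # keeps the descending order of `precisions`
            candidate_paths.append((precisions[0], *mids, precisions[-1]))
    best_path, best_score = None, float("-inf")
    for path in candidate_paths:
        score = float("-inf")
        for teacher_bits, student_bits in zip(path, path[1:]):
            score = train_and_eval(teacher_bits, student_bits)   # score of the final stage wins
        if score > best_score:
            best_path, best_score = path, score
    return best_path, best_score
```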
|
[
30003,
310,
1677,
2278,
273,
247,
2561,
2929,
432,
260,
2369,
1793,
6698,
15,
7764,
3630,
247,
6010,
253,
2278,
15,
187,
4118,
8439,
27,
187,
2520,
2929,
12453,
253,
4836,
273,
2677,
3006,
4979,
398,
281,
1077,
1698,
12320,
1249,
9886,
2677,
3006,
11454,
6928,
476,
320,
5667,
323,
1566,
1979,
5141,
3340,
323,
6109,
4095,
533,
671,
323,
253,
3885,
273,
7103,
273,
10309,
1293,
14974,
1127,
17308,
2392,
275,
1635,
8985,
1396,
569,
16777,
569,
1527,
747,
15018,
323,
1029,
468,
14692,
1698,
9177,
2714,
4096,
10309,
323,
16344,
11454,
6928,
342,
31063,
5520,
7529,
1204,
285,
1612,
8353,
50276,
6050,
841,
15711,
285,
42852,
403,
417,
747,
253,
2929,
4245,
247,
44003,
6010,
285,
10414,
281,
2720,
6031,
3340,
275,
253,
3634,
273,
27311,
267,
6928,
326,
310,
4518,
23378,
275,
436,
789,
50276,
1568,
436,
789,
12453,
247,
873,
273,
18849,
7681,
11701,
323,
8379,
2677,
3006,
39707,
6928,
323,
1077,
1698,
12320,
1690,
21539,
4858,
36643,
323,
46214,
1396,
569,
285,
18216,
940,
21755,
11865,
689,
4646,
12320,
24088,
854,
9886,
36643,
970,
247,
5019,
273,
824,
3082,
436,
2929,
26091,
281,
4796,
8037,
275,
4797,
4868,
281,
253,
2120,
40540,
1566,
407,
495,
89,
7155,
2792,
671,
253,
2929,
4245,
11080,
28913,
1783,
323,
253,
1055,
273,
253,
7091,
3082,
3236,
414,
10290,
253,
16038,
323,
9559,
36643,
273,
11454,
6928,
310,
417,
747,
285,
556,
644,
4081,
1142,
2069,
689,
253,
2469,
1643,
1107,
841,
2486,
1566,
1979,
5141,
285,
253,
2440,
273,
747,
2714,
4096,
10309,
326,
812,
4796,
253,
1612,
8353,
285,
22667,
273,
17032,
273,
1781,
3676,
4715,
3210,
275,
958,
253,
25720,
273,
9559,
36643,
556,
644,
33536,
35842,
323,
27311,
267,
6928,
342,
1029,
2323,
533,
342,
4979,
398,
7552,
253,
2022,
789,
33647,
323,
954,
3676,
4715,
4893,
352,
556,
2489,
271,
1774,
1953,
849,
281,
3700,
253,
1840,
1543,
689,
281,
731,
50276,
6050,
627,
556,
644,
271,
3302,
873,
273,
789,
273,
10269,
274,
3006,
270,
797,
285,
643,
39707,
3169,
3210,
5431,
1110,
1543,
7369,
275,
247,
5699,
5926,
273,
298,
2435,
31646,
351,
8855,
3290,
50275,
2520,
789,
1057,
417,
12661,
581,
2014,
2900,
281,
1110,
3374,
533,
247,
4849,
273,
4942,
764,
49235,
5482,
253,
5019,
273,
534,
556,
247,
1781,
1055,
327,
3290,
273,
253,
4795,
1698,
40540,
1566,
841,
5697,
2486,
50276,
6320,
3006,
1396,
569,
1078,
10269,
274,
1320,
281,
5416,
326,
1698,
40540,
1396,
569,
403,
11903,
595,
27096,
50276,
303,
27369,
11786,
502,
8201,
323,
3733,
50276,
19586,
1025,
36643,
323,
46214,
285,
2087,
1396,
569,
50276,
42959,
36643,
50276,
2388,
2514,
255,
293,
4071,
940,
21755,
7091,
275,
2720,
789,
50276,
9961,
382,
554,
940,
21755,
1854,
689,
4646,
40540,
3210,
50276,
15177,
1029,
253,
2929,
556,
581,
1077,
2590,
4736,
247,
1077,
6210,
392,
37224,
5661,
9978,
11080,
4679,
285,
28913,
1783,
326,
1646,
5185,
253,
789,
10262,
247,
2590,
1783,
273,
253,
6349,
273,
512,
253,
7091,
3082,
1223,
954,
273,
253,
1827,
4542,
1130,
84,
403,
417,
1077,
4460,
275,
12940,
436,
789,
4245,
247,
2590,
973,
41694,
13612,
323,
253,
9559,
36643,
273,
4979,
398,
253,
3290,
273,
534,
310,
3012,
4457,
5913,
369,
2361,
4321,
1223,
253,
2929,
7005,
953,
946,
8287,
670,
253,
2442,
15988,
407,
2714,
27299,
10309,
436,
310,
417,
247,
1524,
2523,
347,
253,
12739,
403,
4942,
2590,
285,
841,
1543,
1918,
247,
2590,
16038,
281,
2007,
789,
275,
326,
3884,
50276,
498,
15752,
1029,
253,
2929,
310,
973,
17194,
10414,
281,
2720,
789,
310,
3240,
9470,
3738,
10414,
281,
1077,
747,
17336,
2987,
1537,
326,
556,
417,
644,
14218,
33349,
2568,
533,
436,
1057,
417,
2818,
253,
2457,
6452,
50276,
783,
2929,
10262,
247,
2257,
273,
5661,
1941,
275,
44003,
7180,
326,
896,
598,
253,
30328,
3212,
253,
3061,
285,
6780,
253,
8453,
285,
16038,
323,
512,
273,
253,
7089,
1160,
323,
436,
789,
50276,
9188,
40348,
1029,
1223,
253,
7091,
3082,
403,
4942,
973,
1929,
253,
958,
326,
4979,
398,
476,
320,
2677,
1025,
281,
337,
2713,
12320,
310,
6685,
1774,
285,
436,
789,
4245,
247,
2590,
6210,
392,
1829,
264,
6814,
1127,
285,
2590,
973,
41694,
20247,
436,
310,
247,
9865,
8245,
323,
2852,
789,
671,
5459,
253,
16038,
323,
3652,
2714,
27299,
10309,
323,
6685,
1698,
40540,
11454,
6928,
50276,
6050,
1046,
7756,
556,
690,
2442,
281,
2572,
253,
2495,
273,
6422,
10296,
352,
310,
12744,
328,
10355,
281,
479,
326,
436,
4302,
651,
452,
667,
10021,
11852,
4457,
253,
12314,
3374,
273,
2403,
1029,
48276,
10296,
6927,
1962,
6653,
281,
8745,
407,
3076,
14142,
285,
326,
16851,
17032,
1537,
2489,
1679,
9630,
685,
1029,
15177,
3210,
841,
403,
247,
2087,
7350,
5001,
667,
3045,
11701,
285,
3045,
5454,
2727,
275,
436,
3282,
891,
13414,
1158,
326,
436,
2746,
556,
667,
295,
543,
4330,
280,
10502,
326,
943,
452,
644,
2783,
275,
1798,
407,
253,
4477,
5474,
33032,
2520,
2929,
29328,
2067,
2544,
281,
253,
2216,
273,
253,
8985,
270,
797,
1566,
253,
2022,
10419,
8414,
273,
1264,
4243,
337,
897,
1027,
36643,
15849,
1754,
327,
253,
3453,
3268,
273,
247,
3828,
24088,
1396,
569,
403,
2677,
1025,
281,
470,
285,
337,
323,
4471,
2522,
1881,
4116,
8090,
374,
5283,
271,
15386,
10269,
274,
1320,
1159,
534,
4483,
9708,
1079,
285,
5333,
495,
897,
247,
1554,
382,
554,
940,
21755,
6974,
6036,
342,
247,
1554,
382,
554,
36643,
10130,
253,
4795,
2372,
1566,
14371,
247,
2266,
3045,
327,
15450,
8892,
275,
28400,
285,
13487,
15302,
20544,
436,
2929,
29328,
2067,
2969,
2568,
3576,
2544,
281,
3157,
253,
3290,
273,
10269,
274,
1025,
270,
797,
1566,
3012,
253,
4081,
5609,
403,
3590,
253,
4028,
273,
253,
2929,
310,
2590,
285,
3477,
281,
956,
253,
4477,
671,
2085,
2217,
16774,
7103,
273,
253,
4081,
2372,
3210,
50276,
20881,
1255,
265,
619,
5962,
4468,
342,
436,
2929,
310,
326,
253,
4477,
513,
417,
4518,
2319,
253,
2954,
281,
643,
3332,
2987,
275,
253,
5928,
273,
247,
2590,
5955,
273,
253,
2954,
281,
643,
2905,
789,
891,
717,
417,
13762,
326,
253,
9021,
275,
436,
789,
403,
747,
285,
3236,
50276,
2520,
2929,
1057,
26542,
247,
1781,
1180,
273,
4623,
2987,
2299,
253,
2954,
875,
253,
18595,
275,
436,
789,
285,
2905,
789,
4558,
12744,
806,
253,
10419,
281,
897,
1027,
36643,
15849,
7293,
327,
253,
3453,
3268,
273,
253,
8090,
310,
3240,
2074,
281,
1794,
42959,
4081,
275,
20314,
797,
337,
1794,
42959,
5936,
970,
247,
7301,
1159,
5150,
1903,
275,
253,
20314,
797,
2929,
2581,
685,
247,
861,
1159,
281,
10269,
274,
907,
253,
3453,
273,
247,
4471,
2522,
1881,
42959,
3828,
1273,
253,
15386,
10269,
274,
1320,
1159,
4453,
33917,
281,
253,
5743,
1159,
4081,
275,
8071,
3024,
374,
8071,
3024,
285,
690,
956,
484,
789,
452,
2168,
2011,
326,
16984,
3081,
13642,
285,
19507,
13789,
275,
253,
5743,
1159,
476,
1361,
3157,
253,
3290,
273,
253,
10269,
274,
1025,
260,
9866,
253,
4477,
513,
417,
1127,
562,
253,
14259,
342,
8071,
3024,
672,
16585,
253,
15386,
10269,
274,
1320,
1159,
534,
812,
320,
24363,
50276,
249,
1635,
891,
1158,
253,
2929,
671,
3198,
281,
2319,
253,
747,
10309,
2424,
323,
253,
2372,
1566,
534,
310,
1027,
432,
5899,
8985,
11454,
6928,
326,
2430,
760,
2969,
1269,
15387,
18488,
281,
3359,
253,
15180,
3948,
21678,
3284,
253,
15180,
3948,
323,
2372,
3210,
3198,
281,
6016,
13782,
875,
337,
285,
337,
285,
470,
285,
337,
891,
1804,
326,
253,
4477,
943,
387,
1878,
7277,
253,
2424,
10309,
342,
5899,
8985,
285,
49688,
552,
11454,
6928,
5010,
352,
651,
320,
2834,
281,
15249,
417,
970,
247,
49688,
552,
11454,
2990,
50276,
18,
20314,
797,
7899,
4751,
10269,
274,
1025,
270,
797,
17857,
32888,
1423,
374,
8071,
3024,
4404,
10799,
8985,
11454,
2990,
342,
14923,
5743,
3470,
23746,
87,
938,
5549,
5474,
339,
431,
248,
4477,
12661,
4460,
10269,
274,
5837,
5609,
323,
39707,
3210,
1754,
327,
3640,
940,
21755,
253,
4736,
310,
281,
5115,
12085,
3045,
323,
271,
5919,
5974,
39707,
1566,
575,
253,
2022,
9021,
403,
891,
10269,
274,
5837,
5853,
326,
19132,
3640,
940,
21755,
275,
39707,
3210,
21255,
1554,
382,
554,
5853,
281,
940,
300,
10269,
274,
1701,
3210,
281,
3157,
10444,
3045,
253,
1263,
2722,
326,
253,
5974,
1566,
33526,
12085,
3045,
2429,
281,
247,
2629,
270,
797,
327,
253,
28400,
22791,
285,
1953,
22291,
20544,
50274,
8250,
5740,
273,
4114,
3640,
285,
2905,
789,
3058,
281,
2096,
253,
4081,
2746,
2519,
50275,
8250,
5740,
273,
253,
4081,
2746,
50275,
14461,
8333,
323,
1566,
7446,
1430,
50275,
783,
4477,
1347,
247,
11088,
5301,
342,
2905,
789,
327,
253,
28400,
22791,
50275,
783,
4342,
921,
326,
271,
5919,
39707,
1566,
556,
12085,
3045,
2429,
281,
2045,
256,
5503,
50275,
20881,
1255,
265,
50274,
262,
310,
417,
2590,
849,
253,
4764,
31850,
575,
285,
5438,
273,
4373,
22041,
812,
2818,
253,
1566,
3045,
253,
4477,
452,
9713,
7364,
273,
253,
4081,
2746,
2490,
187,
4118,
18435,
27,
2520,
2929,
29328,
271,
16694,
15722,
323,
2677,
3006,
4979,
398,
323,
6685,
1698,
12320,
1249,
9886,
1223,
8493,
253,
8037,
273,
2045,
3082,
281,
2120,
12320,
407,
495,
89,
50276,
2520,
906,
556,
1774,
12739,
323,
7741,
44255,
17032,
3340,
604,
3541,
310,
273,
4468,
533,
337,
2713,
36643,
556,
1534,
1055,
327,
17032,
22667,
347,
973,
50276,
2520,
789,
14190,
841,
2266,
1543,
407,
10182,
21539,
4858,
36643,
323,
46214,
1396,
569,
285,
50276,
66,
38183,
13757,
689,
2710,
940,
21755,
11865,
50276,
1189,
455,
253,
2929,
14371,
271,
1774,
23447,
32809,
7170,
275,
253,
1673,
285,
310,
273,
2087,
1600,
281,
253,
14200,
3114,
3103,
891,
12661,
697,
14924,
387,
5723,
2824,
1384,
1423,
50275
] |
[
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1
] |
[ … ] |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
strength this submission provides a novel perspective in understanding the relationship between jointly learned tasks recent works such as standley et al 1 empirically showed that sharedweight training behavior in a pairwise setting could vary substantially depending on the task this submission tells us that it also depends on the familiarity of testing examples figure 3b shows that in the usual supervised training setting in which training and testing examples both come from seen combinations shared branch architecture is better but it is the opposite when evaluated on unfamiliar examples strength novel experiments the finding that cnns generalize better to unseen categoryviewpoint combinations as the training data diversity grows by itself is not a surprising one in my opinion i would expect a nearestneighbor baseline to behave similarly but i think the papers value mainly has to do with the experimental design the symmetry between the two tasks and quantitative analysis of invariance and selectivity are useful in validating findings from prior work covered in part by yang 2019 cited it would not have been immediately apparent to me how to set up fair experiments because it involves two seemingly unrelated tasks the 2d grid classification which uniformly discretizes the task space and joint geometric mean evaluation made sense weakness in figure 3b it appears that getting better at predicting unseen combinations degrades performance on seen combinations this is not a weakness on its own but its easy to miss and i think it should have been discussed more the resnet architecture seems big enough to max out performance on mnist so i would expect the biasedcars dataset to present the limitations better as the authors suggested as the number of seen combinations increases theres less of unseen examples to extrapolate but it also increases the scope of predictions you have to make on seen examples im personally inclined to think there are almost always hidden or obvious tradeoffs in one form or another and some more useful than others in 1 getting better at one task often comes at the cost of sacrificing performance on another task 2 also evaluated on unfamiliar categories and unfamiliar viewpoints and found that those two have somewhat mutually exclusive properties another observation is that a shared branch architecture can be computationally the same as a separate branch architecture if you insert zero weights at the correct channels and had more channels to make up for the sparsity although specialized neurons do naturally emerge when learning many tasks yang et al current supervision methods do not seem to make it easy to learn this it is still unclear to me why exactly the separate branch architecture works better my current rating of the paper is above threshold im willing to change the rating based on author feedback and discussion 1 icml 2020 which tasks should be learned together in multitask learning 2 cvpr 2018 pixels voxels and views a study of shape representations for single view 3d object shape prediction postdiscussion comment id like to keep my rating 6 marginally above acceptance threshold i think r1s concerns regarding similarity between seen and unseen set are valid as the number of seen combinations increases the extrapolation problem becomes an interpolation problem but in my view thats not a weakness as long as the symmetric grid setting appropriately captures the relationship between the two tasks i am leaning towards thinking that it does it eliminates other 
confounding factors such as the relative weights of the loss terms too and i think it is fair overall hopefully future work will provide more insight ultimately i think this submission is above thresholddocsepoverview the authors present an analysis of the capability of cnns to generalize to unseen categoryviewpoint combinations they study various network architectures shared separate and splits in their terminology for the task of joint category and viewpoint classification this is done by setting up trainingtesting data in a specific pattern where some category viewpoint combinations are heldout and never seen in training data the question the authors try to answer is what kind of training data diversity is needed to be able to do well on this heldout test set this done using empirical evidence on the following datasets i mnistposition mnistscale ii ilab2m and iii biasedcars dataset which is a new dataset that they introduce they define and measure selectivity invariance and specialization scores for different models with different training data to comment on when and how cnns generalize to these unseen combinations pros the authors tackle an interesting and relevant problem of generalization of cnns for joint viewpoint and category estimation in a realistic setting we will not have enough images of all viewpoints and all categories of interest and a systematic study to roughly quantify how much is needed to be able to successfully predict any viewpoint and category of interest will be helpful i liked the selectivity invariance and specialization score definitions as quantitative ways of measuring generalizability the paper is wellwritten and the ideasexperiments are presented in an easytoread manner the experiments are welldesigned to validate the authors hypotheses sections 5 and 6 answer the key questions and tie it nicely with experiments and results presented cons the authors correctly identify a few variations of the network architecture that could be used for the joint viewpoint and category estimation task and study how this affects generalization however they are missing a key component of how these networks are trained let us consider the overall network to contain three components i stem ii category head and iii viewpoint head say we train the category estimation task first stem category head the features learned by the stem will be viewpoint invariant because the category estimation task requires viewpoint invariance if we not train the viewpoint head on top of such features it is going to suffer the other two possibilities are a train the viewpoint estimation task first and then category estimation and b train the viewpoint and category estimation task jointly which is what the authors have done the viewpoint and category tasks compete compared to other join tasks like detection categorization and so training protocol might make a significant difference to network performance the authors are missing some references of works that do joint category and viewpoint estimation elhoseny et al a comparative analysis and study of multiview cnn models for joint object categorization and pose estimation icml 2016 linkhttpproceedingsmlrpressv48elhoseiny16pdf afifi et al simultaneous object classification and viewpoint estimation using deep multitask convolutional neural network visigrapp 2018 linkhttpsresearchoutputcsueduauwsportalfilesportal2323223623231703publishedpaperpdf mahendran et al convolutional networks for object category and 3d pose estimation from 2d images eccvw 2018 
linkhttpsopenaccessthecvfcomcontenteccvw2018papers11129mahendranconvolutionalnetworksforobjectcategoryand3dposeestimationfromeccvw2018paperpdf in mahendran et al some more network architectures are considered specifically ones that combine the category and viewpoint estimation tasks at the prediction stage along with the feature stage including it in your analysis as well will be helpful the new proposed dataset of biasedcars is synthetic even if useful why not use a real dataset like uiuc dataset of 3d object categories paperhttpvisionstanfordedudocumentssavaresefeifeiiccv2007pdf linkhttpwwweecsumicheduvisiondata3ddatasetzip demonstration that the conclusions you make on biasedcars carry to actual cars will provide stronger evidence for your claimsfindings for the mnist experiments a rotated mnist linkhttpssitesgooglecomalisairoumontrealcapublicstatictwikivariationsonthemnistdigits would be a better choice compared to mnistposition and mnistscale reason for rating i like the idea and analysis present in the paper for generalization to unseen categoryviewpoint combinations at the same time i think training protocol plays a key role in such multitask networks and is missing from current analysis adding that or at the very least commenting about it and including results on the uiuc 3d object dataset will help me go from borderline to yes update i have read the author feedback and other reviews discussions i have updated my rating from 6 to 7 the authors responded to my comments with updated analysis to further prove their claimsdocsepthis paper poses the following question whether or not cnns that learn to recognize a car from only its frontal view can correctly recognize the same car from a side view that they have never seen before previous studies report negative result i interpret the question as how would cnns learning to recognize various different cars in various different poses generalize to the above case of recognition i think its an intriguing question and it would be beneficial to know how categories and viewpoints are intertwined in their recognition in cnns the main conclusion is summarized as follows i learning with a greater number of categoryviewpoint combinations results in higher prediction accuracy for unseen combinations and ii separately recognizing category and viewpoint yields more accurate prediction for each i dont feel these results necessarily deepen our understanding of cnns regarding the above question the first result seems obvious in the experiments the authors divide categoryviewpoint combinations into a training and a test split without overlap they then observe how prediction accuracy for the test split ie unseen combinations changes as the number of combinations in the training split ie seen combinations increases while its cardinality is kept constant they conclude that increasing the number of seen combinations improves prediction for the test split however i suspect this result can be fully explained because increasing the number of seen combinations results in more samples having closer viewpoints to the test split samples an identical category object tends to have similar appearances when seen from closer viewpoints it will be easier to recognize the object from viewpoints closer if not equal to seen viewpoints as for the second result although the authors state that it is surprising i dont feel so the result implies the employed cnn finds little in common between recognition of category and viewpoint and thus we cannot expect generalization across 
the two it is reasonable that there is little or no synergistic effect in multitask learning of two unrelated tasks from the standpoint of the above question it would have been more interesting if recognition of category and viewpoint were not separated but coupled with each other update i read the authors response they say thus it is unclear that generalization to unseen viewpoints shown in our results is completely explained by the similarity between the seen and unseen set i agree with this comment i dont say we can prove that the similarity fully explains generalization to unseen viewpoints its my conjecture however the authors themselves seem to recognize my conjecture could be right at least it cannot be eliminated in other words a major issue with the current manuscript is that we cannot derive a firm conclusion from the experimental results if my conjecture is true then the results are almost obvious and not interesting i want to lower the score from 5 to 4 as the authors response makes me believe my concern is valid i think the authors tackle quite a hard problem and admire their efforts though docsepthis is an analytical study which investigates how good are cnns in generalizing to new viewpoints they also investigate whether mixing with an extra task of classifying the viewpoint would help to generalize to new viewpoints or not this study is focused on 3 datasets mnist scaleposition ilab2m vehicle toys and their own novel dataset of photo realistic 5 category car dataset one interesting aspect of their experimental setup is to withhold a set of categoryviewpoint pairs for the test set essentially for each category the test viewpoint is not seen in the training set so they are evaluating to what extent can a model generalize from one category to another category as opposed to withholding a viewpoint from all categories for example no frontal view for any of the cars during training there has been several previous papers that referred to viewpoint generalization as generalization to unseen viewpoints rather than generalization to viewpoints unseen for this category mnistaffnist or mnistmnistrot or smallnorb experiments in viewpoint equivarience papers they also investigate whether classifying the viewpoint jointly can improve viewpoint generalization their analysis shows that it hurts the accuracy to add the viewpoint classification task using architectures like resnet and sharing the backbone furthermore this paper studies specialization selectivity and invariance of neurons to categoryviewpoint this paper has an extensive number of experiments on various setups measuring different criterias for the sheer amount of experimental data present in this study it is valuable to community and answers some questions most significantly it shows that adding a viewpoint predication head hurts viewpoint generalization this finding is counter intuitive therefore intriguing and merits further studies in this regard although the accuracy reported in the experiments is a geometric mean of viewpoint and category i would be very interested in just comparing the category accuracy it is valuable to at least have one version that only category accuracy is reported for sharedseparate train with viewpoint and category classification for reporting ignore the viewpoint accuracy to verify whether adding viewpoint classification improves category classification generalization to unseen viewpoints question can you please clarify whether the images in the test set are unseen instances of the categories as 
well as unseen viewpoint ie the mnistscale test split is it mnist test set images scaled to the unseen scale as well as for cars etc can you clarify the choice of unseen viewpoint for a category rather than viewpoint generalization over all categories there has been several works addressing the viewpoint generalizability of cnns including group equivarient convolutional networks and capsule networks adding these architectures to this study would verify to what extent the findings here are limited to resnet style architectures the findings regarding separateshared architecture are interesting but the significance of the findings and framework of study at its current state is short of a conference standards post author response thank you for providing the results and clarifying the setup i am increasing the score to 6
### Summary:
|
this paper received 2 borderline accepts 1 accept and 1 reject in general there is broad agreement that this is solid experimental work and that the differences found between recognition and viewpoint estimation were interesting the main issue brought up by the more negative reviewer is that some of the experiments are subject to interpretation specifically the extrapolation problem could become more of an interpolation problem as the number of training examples increases this issue is acknowledged by the other reviewers who nonetheless see some value in the paper being published and potentially paving the way for additional studies my suggestion for the authors is to prominently discuss this issue in a revision of this paper unfortunately because of space constraints i have to recommend this paper be rejected
|
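The reviews in the record above repeatedly refer to two pieces of experimental machinery: holding out a subset of category–viewpoint combinations at training time, and scoring joint prediction with the geometric mean of category and viewpoint accuracy. The sketch below is only a hypothetical illustration of that setup; the function names, the 5-category by 8-viewpoint grid, and the accuracy values are invented for the example and are not taken from the reviewed paper or from this dataset.

```python
import itertools
import math
import random

def split_combinations(categories, viewpoints, num_seen, seed=0):
    """Mark a random subset of (category, viewpoint) pairs as 'seen' in training;
    the remaining pairs are held out to test generalization to unseen combinations."""
    pairs = list(itertools.product(categories, viewpoints))
    random.Random(seed).shuffle(pairs)
    return set(pairs[:num_seen]), set(pairs[num_seen:])

def geometric_mean_accuracy(category_acc, viewpoint_acc):
    """Joint score when a model must predict both the category and the viewpoint."""
    return math.sqrt(category_acc * viewpoint_acc)

# Hypothetical 5-category x 8-viewpoint grid with 30 combinations seen in training.
seen, unseen = split_combinations(range(5), range(8), num_seen=30)
print(len(seen), "seen /", len(unseen), "unseen combinations")   # 30 seen / 10 unseen
print(round(geometric_mean_accuracy(0.90, 0.80), 3))             # 0.849
```

A reviewer's request to also see category accuracy in isolation would simply mean reporting category_acc on its own alongside the combined score, a trivial change to such an evaluation loop.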
[ … ] |
[ … ] |
[ … ] |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper addresses pointbased methods for classification and semantic segmentation in 3d applications it aims to improve the computational efficiency of this kind of methods to make it better fit the applications of limited resources such as mobile scenarios to achieve this this paper conducts the following works first it develops a mobile attention kernel point convolution makpconv scheme to improve the performance of existing kernel point convolution second it utilizes neural architecture search nas technique to design the makpconvbased network in this part this paper proposes a wide deep neural predictor for the nas process experimental study is conducted on benchmark datasets and tasks to demonstrate that the resulted deep networks can achieve higher computational efficiency and improved performance ablation study is also conducted to illustrate the key components in this work strengths 1 the pointbased 3d application focused by this work is of importance and has recently attracted much attention as a new research area 2 the motivation on computational efficiency makes sense and the experimental result demonstrates that this is well achieved by this work 3 the idea of changing kpconv to a depthwise kernel and then using nkattention to compensate the degraded performance caused by the limited representative power of the latter is sound and its effectiveness is verified by table 1 4 the overall performance of the resulted makpconvbased network is validated in the experimental study 5 the ablation study on the kernel contribution via the nkattention is helpful for illustrating the function of the proposed attention mechanism weaknesses 1 although the ideas of using a depthwise kernel and the attention mechanism work well in this paper they are wellknown techniques in the literature in this sense the contribution of this work in this part is mainly on innovatively using them to improve the computational efficiency of kpconv the theoretical contribution of this work seems to be limited 2 the part on the proposed wide deep neural predictor and the idea of carrying feature engineering on searchable features in nas can be made clearer firstly the key differences and novelties with respect to the existing related work can be further clarified secondly can the proposed feature engineering idea be regarded as a new principal in nas that can be generally applied clarifying these issues will make the contributions in this part easier to be understood minor issues 1 n in figure 1 shall be explicitly defined 2 kenrel kernel in the caption of figure 1 3 my our novel wide deep in conclusion this work is well motivated the proposed idea on reducing computational cost is neat and it works effectively as experimentally demonstrated the proposed wide deep predictors for nas helps to efficiently find the high quality architecture for the proposed network achieving promising performance meanwhile the potential issues are i the idea of makpcon is mainly based on existing wellknown techniques and does not seem to bring significant theoretical insight and ii the novelty and contribution of the wide deep neural predictor part can be better clarified docsepthis paper aims to accelerate the inference of 3d point cloud neural networks the authors first borrow a few designs from efficient 2d neural networks depthwise convolution and inverted residual bottleneck they then propose to reweight the weights of kernel points with attention to boost its representation power finally the authors adopt the predictorbased 
neural architecture search to explore the best model under a resource constraint they have evaluated their proposed solution on the smallscale modelnet40 dataset and the largescale semantickitti dataset and have achieved reasonably good empirical performance the problem studied in this paper is very important 3d point cloud neural networks are the core of many realworld applications arvr and autonomous driving at the same time most existing solutions are still not efficient enough to provide a satisfactory user experience and guarantee user safety the paper is wellwritten and relatively easy to follow furthermore the proposed solution is technically sound and has achieved fairly good empirical performance on smallscale and largescale benchmark datasets my primary concern of this paper is its limited technical novelty this paper is a combination of existing solutions depthwise convolution and inverted residual bottleneck are originally proposed in mobilenetv1v2 the reweighting mechanism adopted in this paper is essentially the same as in senet hu et al cvpr 2018 the neural architecture framework used in this paper is almost identical to neural predictor wen et al eccv 2020 the only thing new to me is the unified dense and sparse neural architecture representations which however is more of an engineering improvement than a technical contribution the authors should highlight their core contributions apart from applying existing techniques to a new modality besides the existing solution spvnas already adopts the oneshot weightsharing nas framework to explore the best neural architecture it is unclear why the authors follow a different approach the authors claim that makpconv is 52 faster than spvnas which is inaccurate having fewer macs does not necessarily translate into measured speedup on the actual hardware the authors should directly compare the latency of makpconv with spvnas and other baselines in table 4 furthermore the techniques proposed in this paper are tailored for macs reduction but not necessarily for latency reduction pointbased neural networks are usually bottlenecked by the knn computation rather than the convolution while most techniques in this paper aim at reducing the computation of convolution it is unclear whether these modifications are also helpful to reduce the latency the authors should include latency numbers in table 1 the authors have only reported the performance on the validation set of semantickitti they should also include some numbers on the final testing set besides it would be better to verify whether the searched neural architecture is transferrable to other datasets such as scannet and nuscenes andor tasks such as panoptic segmentation and object detection references hu et al squeezeandexcitation networks cvpr 2018 wen et al neural predictor for neural architecture search eccv 2020 the paper has relatively weak technical contributions and lacks latency comparisons with stateoftheart solutions therefore i would like to see the authors response before the final decision docsepthe paper builds a memory and computationefficient version of the kpconv model which is a point cloud processing model the paper uses many techniques to achieve this including depthwise kernels attention over kernels and also neural architecture search nas the paper conducts experiments to demonstrate the effectiveness of their techniques strengths usage of a depthwise kernel reduces the model size of kpconv and the computation cost involved the paper can effectively search for a 
makpconv block that is better than a handcrafted one using nas weaknesses it is unclear why the paper decides to start with kpconv architecture which is a particularly parameter inefficient architecture the objective of building a lightweight point cloud processing model would have been better served by starting with some already lightweight architecture like 1 explained later even after elaborate techniques like nas the final architecture is not very impressive in terms of memory and accuracy the paper does not compare to 1 which is a very simple baseline method that already achieves better performance 930 vs 922 than achieved in the paper while using a similar memory footprint 08 vs 056 m params hence the endproduct of the work is not very useful in the reviewers opinion the paper is missing details about the protocol used for the evaluation of the modelnet40 experiments this is particularly important as 1 shows smaller changes in protocols can have a drastic impact on performance the architecture of the proposed model makpconv is very complicated the ablation studies dont appear to justify the utility of the complicated structures used in the model design 1 revisiting point cloud shape classification with a simple and effective baseline icml 2021 overall i appreciate the papers efforts in making lightweight point cloud processing models this could be very useful in many applications however the method is weak in terms of the final performance parameters vs accuracy and is overly complicated some important details are also missing see weakness hence i would recommend rejection for the current version of the paper docsepthis paper aims to address the point cloud analysis task by 1 improving the existing kpconv operation via considering the kernel relationship 2 designing the networks via a predictorbased nas approach the improved kpconv operation ie makpconv can model the local structure efficiently and the searched network has fewer parameters while performing better than the baseline on two datasets including modelnet40 for classification and semantickitti for segmentation strengths 1 this paper is wellwritten in general 2 the motivation behind makpconv is reasonable 3 the networks have fewer parameters and outperform previous works especially on semantickitti 4 the analysis on nas for block structure search is enough weakness i am confused about the introduction to makpconv 1 what is the connection between makpconv and kpconv in kpconv the correlation between each neighbour and each kernel point is calculated which i do not find in makpconv 2 in nk attention the nk distance indicates the distance between the neighbour and the kernel point or between every two neighbours 3 it seems that makpconv is an improved version of kpconv but i cannot associate them in figure 1 for example it seems that the nk attention is similar to the h function in eq 5 in kpconv 1 if so the depthwise kernel in the upperleft part of figure 1 should be eq5 right then which part indicates eq 4 in kpconv1 maybe i misunderstand the figure please clarify this 4 in eq 1 the dimension of f is din while iin1dout which makes me confused does din equal to dout 5 in eq 1 and eq 2 since the points are unordered and we do not know which kernel point corresponds to a certain neighbour how can we have fk cdot wk in addition i also cannot find a similar formulation in kpconv 6 in eq 2 the dimension of f and w is din so how can we get f with dimension dout experimental results 1 currently previous works on point cloud analysis 
usually conduct experiments on several datasets such as s3dis shapenetpart and scannet while this paper only considers two datasets including modelnet40 and semantickitti which cannot demonstrate the effectiveness adequately 2 the oa of kpconv is 929 which is higher than the oa of the proposed method 926 while in this paper the authors do not report that result and instead report the result of their implementation 3 on modelnet40 currently there are many works yielding oa larger than 93 but the authors do not report 1 httpsarxivorgpdf190408889pdf i have two concerns 1 the introduction to the proposed makpconv is unclear 2 the experiments are not enough as a result currently i vote for borderline reject but i might change my score if the authors can address my concern or other reviewers can give more positive reviews
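To ground the operator these reviews keep coming back to, below is a minimal NumPy sketch of a depthwise kernel-point convolution with attention over kernel points. It is an illustration only: the function name `makpconv_like`, the clipped-linear neighbour-to-kernel correlation, the softmax attention derived from the NK distances, and the value of `sigma` are assumptions made for exposition, not the paper's exact MAKPConv or KPConv equations.

```python
import numpy as np

def makpconv_like(neigh_feats, neigh_xyz, kernel_xyz, w_depthwise, sigma=0.3):
    """Illustrative depthwise kernel-point convolution with kernel attention.

    neigh_feats : (N, J, D) features of the J neighbours of each of N centre points
    neigh_xyz   : (N, J, 3) neighbour offsets relative to each centre point
    kernel_xyz  : (K, 3)    positions of the K kernel points
    w_depthwise : (K, D)    one weight vector per kernel point (depthwise: no D_in x D_out matrix)
    """
    # neighbour-to-kernel ("NK") distances: (N, J, K)
    d = np.linalg.norm(neigh_xyz[:, :, None, :] - kernel_xyz[None, None, :, :], axis=-1)
    # clipped-linear correlation between neighbours and kernel points (KPConv-style influence)
    corr = np.maximum(0.0, 1.0 - d / sigma)
    # attention over the kernel points, derived from the same NK distances
    attn = np.exp(-d)
    attn = attn / attn.sum(axis=-1, keepdims=True)
    # depthwise aggregation: out[n, c] = sum_{j,k} corr[n,j,k] * attn[n,j,k] * w[k,c] * f[n,j,c]
    out = np.einsum('njk,kc,njc->nc', corr * attn, w_depthwise, neigh_feats)
    return out


# toy usage with random data (shapes only; not a trained model)
rng = np.random.default_rng(0)
N, J, K, D = 4, 16, 15, 32
out = makpconv_like(rng.normal(size=(N, J, D)),
                    rng.normal(size=(N, J, 3)) * 0.1,
                    rng.normal(size=(K, 3)) * 0.1,
                    rng.normal(size=(K, D)))
print(out.shape)  # (4, 32)
```

The depthwise weight of shape `(K, D)`, rather than a full `(K, D_in, D_out)` tensor per kernel point, is the parameter and MAC reduction the reviews mention; the attention term is the compensating mechanism whose presentation the reviewers found unclear.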
### Summary:
Four reviewers have reviewed this manuscript: two found it borderline, leaning towards acceptance, while the other two scored it below the acceptance threshold. While the authors identify the key challenges and bottlenecks in 3D point cloud models, the most positive two reviewers note that the depthwise kernel, the attention mechanism, and similar tools are well known in the literature, and that this work is more of an engineering improvement than a technical contribution, which erodes the novelty of the proposed idea on that front. While the authors noted some discrepancies in the numbers quoted by Reviewer 3, the model gains are nonetheless modest compared to other models. Overall, the feeling amongst the reviewers was that the presentation of NK attention could be further improved and that the paper uses very heavy machinery to achieve results comparable with the state of the art. On this occasion the manuscript is below the acceptance threshold, with even the borderline-positive reviewers having doubts about clear-cut technical novelty.
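The reviewers' comparison of the kernel reweighting to squeeze-and-excitation can be made concrete in a few lines. The sketch below is a generic SE-style gate over point features under assumed shapes, not the attention actually used in the paper.

```python
import numpy as np

def se_reweight(x, w1, w2):
    """Generic SE-style channel gating for point features.

    x  : (B, N, C)  per-point features for B point sets
    w1 : (C, Cr)    first layer of the gating MLP (Cr = C // reduction)
    w2 : (Cr, C)    second layer of the gating MLP
    """
    s = x.mean(axis=1)                      # squeeze: pool over the N points -> (B, C)
    z = np.maximum(0.0, s @ w1)             # excite: hidden layer with ReLU -> (B, Cr)
    g = 1.0 / (1.0 + np.exp(-(z @ w2)))     # sigmoid gates in (0, 1) -> (B, C)
    return x * g[:, None, :]                # rescale every point's features by its set's gates
```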
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the paper attempts to unify all cl research with a single formalism next they present a software implementation of their framework finally experiments are ran which demonstrate that sequoia can be used to evaluate cl methods pros p1 creating a software package which unifies rl and supervised learning approaches seems like a step in the right direction cons c1 cl hierarchy i think there are missing nodes in the hierarchy for instance it is important to also consider problemincremental learning where the task is the same but the input spaces are different eg the dimensionality of the input changes not just the input distribution c2 related work you mention that your framework doesnt compete with others it is not clear to me why someone else would not just use the other frameworks a more indepth discussion of the previous work is necessary this way it would be easier to distinguish your contributions are c3 experiments the text does not give us information on which methods are implemented by the authors it gave me the impression that the evaluated methods have all been implemented by another software package c4 experiments the results are presented but not discussed the provided hierarchy of cl methods might contain an interesting insight which relates rl and supervised learning approaches to cl however due to the lack of comparison to related work i am not certain of this overall i dont see a significant technological or conceptual contribution docsepin this paper the authors try to establish a unified framework for different continual learning settings they also provide a python library which includes different related methods extensive experimental results are also provided strengths nbsp the continual learning research tree figure 2 is well organized and makes a lot of sense to me nbsp i think the motivation of building a unified framework for continual learning is meaningful many different continual learning papers evaluate their methods in different benchmark protocols it is difficult for the following researchers to compare these related methods nbsp detailed explanations and experimental results are provided in this paper the authors also provide the results on wandb it makes obtaining the detailed results for each setting very convenient for the following researchers nbsp weaknesses nbsp some important continual learning baselines are not included such as icarl rebuffi et al 2017 nbsp the authors only provide experiment results on smallscale datasets such as mnist and cifar most continual learning papers such as icarl rebuffi et al 2017 and bic wu et al 2019 provide the results on largescale datasets eg imagenet1k as the authors trying to establish a new benchmark protocol it is important to provide the results on largescale datasets nbsp this project includes plenty of different methods but the authors dont include detailed information such as the licenses about the related opensource resources they use nbsp the definition of incremental learning in section 21 is ambiguous i think it is better to use classincremental learning or domainincremental learning directly the reasons are as follows 1 incremental learning is often considered as another name of continual learning it is weird to use it to denote a specific setting of continual learning 2 you include classil and domainil in incremental learning but you exclude taskil it is not reasonable nbsp the authors include too many details about the code implementation in the paper i think it is better to move these parts to the appendix 
and include more experimental results and analyses overall i think this project will be a useful tool for the following continual learning researchers i will recommend acceptance if the authors can address my concerns in the weaknesses part nbsp postrebuttal comments i thought the authors aimed to establish a unified software framework that makes running continual learning experiments easy however after reading the rebuttal i think sequoia has the following major issues and the authors failed to address them in the rebuttal sequoia heavily relies on the previous libraries such as avalanche and continuum i dont think this design is very friendly to the researchers in my personal view i prefer a framework that is easy to be understood and includes the most popular baselines i think your framework should be designed for a researcher instead of a software engineer sequoia hasnt been evaluated on largescale datasets eg imagenet1k if i need to use this framework i need the framework can reproduce the results of the previous baselines correctly so i dont think sequoia is very useful to a researcher like me according to my personal experience i tend to reject this paper docsepthe paper proposes a theoretical framework to organize research problems in the continual learning cl domain according to a hierarchy this theoretical framework is used as the basic foundation for sequoia a software library designed to reuse methods ie training algorithms across different research problems settings the goal of the paper is very ambitious there exist many open source libraries implementing cl methods for supervised learning and reinforcement learning settings framework according to the paper each setting is described as a set of assumptions a treeshaped hierarchy emerges from this view this shows a limitation of the theoretical framework defining settings as a set of assumptions does not result in a treeshaped hierarchy it would be a lattice therefore using a treeshaped hierarchy means that some methods will be compatible in principle but not in practice in sequoia because the tree lacks the connection this a strong limitation of the framework that is not discussed implementation the framework heavily relies on several dependencies to implement the heavy lifting datasets rl environments methods this is not a big problem by itself but it is not clear what sequoia is adding compared to the original libraries experiments the experiments show several baselines in the supervised and reinforcement learning settings most methods are not implemented by sequoia and are inherited from avalanche for sl and stablebaselines and continualworld for rl one thing that i expected from this section was how the same method could be easily applied to different settings which is the main claim of the library instead sl and rl settings share zero methods for example the ewc implementation is different in the sl and rl settings does this mean that users have to implement every method twice it seems that rl and sl methods are completely separate which is against the entire spirit of the library at this point what is the advantage that sequoia brings compared to its dependencies continuum gym avalanche stablebaselines continualworld additional comments constraints often relate to memory compute or time allowed to learn a task is sequoia able to check time and memory constraints is it possible for the end users to define new settings or modify the hierarchy it would be interesting to see an example we note that some avalanche methods 
achieve lower than chance accuracy in taskil because they do not use the task label to mask out the classes that lie outside the tested task can you fix this problem otherwise a user needs to know the internals of each library to debug the code this adds a lot of complexity i believe the objective of this paper of merging together continual sl and rl is very important and ambitious however the paper in the present form has several weaknesses it is unclear whether the theoretical framework really achieves the papers objective sequoia the software seems to be still at a very alpha stage in its development cycle i see very little unification in the methods which is the main scope of the paper this strongly limits the methods reuse between the different settings it is unclear what advantages sequoia is bringing compared to using its dependencies directly docsepthis paper introduces a new continual learning framework that aims to boost the research in the field this framework is based on a taxonomy of all possible assumptions that are common to cl methods moreover this taxonomy helps in putting supervised and reinforcement methods in a unified framework in section 2 the authors propose a markovian process that is being ignored in the later parts it is not clear why this hiddenmode markov decision process is useful to the proposed framework while i do appreciate the effort in developing this framework i am lacking the novelty in the paper the work is mainly an engineering effort thats appreciated but might not fit this conference while i do appreciate the effort behind this work i doubt the match between this paper and the scope of iclr
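Several of the points above hinge on whether a method such as EWC can really be shared across supervised and reinforcement learning settings. As a minimal sketch, the setting-agnostic part of EWC is just a quadratic penalty around the previous task's parameters; the function and argument names below are illustrative assumptions and do not correspond to Sequoia's, Avalanche's, or Stable-Baselines' actual APIs.

```python
import numpy as np

def ewc_penalty(params, anchor, fisher_diag, lam=1.0):
    """EWC regulariser: lam/2 * sum_i F_i * (theta_i - theta*_i)^2.

    params      : dict name -> np.ndarray, current parameters
    anchor      : dict name -> np.ndarray, parameters frozen after the previous task
    fisher_diag : dict name -> np.ndarray, diagonal Fisher estimate per parameter
    """
    total = 0.0
    for name, p in params.items():
        total += np.sum(fisher_diag[name] * (p - anchor[name]) ** 2)
    return 0.5 * lam * total
```

Everything setting-specific then lives in how `fisher_diag` is estimated (gradients of a classification log-likelihood versus gradients of policy log-probabilities), which is exactly the kind of split the reviewers are asking the framework to make explicit.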
### Summary:
The manuscript introduces a taxonomy for organizing continual learning research settings and a software framework that realizes this taxonomy. Each continual learning setting is represented as a set of shared assumptions (e.g., are task IDs observed or not) arranged in a hierarchy, and the software is introduced with the hope of unifying continual learning research. The manuscript identifies a clear issue in the field: settings and methods for continual learning have proliferated, so there is little coherence in benchmarks, making progress difficult to judge. Reviewers generally agreed that the motivation of building software to help unify continual learning research was a positive. However, reviewers also pointed to many concerns with the manuscript and with the software package Sequoia that comprises its main contribution. In particular, there is concern that the software is at an early stage of development and makes heavy use of existing libraries to function (e.g., Avalanche and Continuum), which makes it unclear what Sequoia offers over using its dependencies directly. There is also concern that multiple standard benchmark tasks and common methods are missing from the implementation, particularly for large-scale experiments with, e.g., ImageNet-1k. In theory the library allows extension, and these might be implemented by others in the community; however, this would require that the original manuscript and software be strong enough to draw buy-in from other researchers. In sum, the manuscript and software do not yet offer a convincing starting point for researchers looking to begin their continual learning research.
|
[
30003,
310,
1677,
2278,
273,
247,
2561,
2929,
432,
260,
2369,
1793,
6698,
15,
7764,
3630,
247,
6010,
253,
2278,
15,
187,
4118,
8439,
27,
187,
783,
2929,
9437,
281,
440,
1419,
512,
502,
2561,
342,
247,
2014,
30221,
1735,
597,
1246,
247,
3694,
7092,
273,
616,
7792,
4720,
4679,
403,
6337,
534,
7568,
326,
2160,
80,
571,
476,
320,
908,
281,
7472,
502,
3082,
5847,
50276,
81,
18,
6153,
247,
3694,
5522,
534,
440,
7790,
391,
77,
285,
22296,
4715,
7274,
3133,
751,
247,
3213,
275,
253,
987,
3884,
50276,
5040,
50276,
68,
18,
502,
19868,
50276,
74,
1158,
627,
403,
5816,
7632,
275,
253,
19868,
323,
4227,
352,
310,
1774,
281,
671,
1908,
1409,
1222,
719,
30132,
4715,
835,
253,
4836,
310,
253,
1072,
533,
253,
3280,
8470,
403,
1027,
24088,
253,
7877,
1319,
273,
253,
3280,
2544,
417,
816,
253,
3280,
3268,
50276,
68,
19,
2905,
789,
368,
3748,
326,
634,
7792,
36908,
15639,
342,
2571,
352,
310,
417,
2590,
281,
479,
2139,
3095,
2010,
651,
417,
816,
897,
253,
643,
31225,
247,
625,
801,
554,
394,
5955,
273,
253,
2045,
789,
310,
3309,
436,
1039,
352,
651,
320,
6927,
281,
12129,
634,
9021,
403,
50276,
68,
20,
4679,
253,
2505,
1057,
417,
1918,
441,
1491,
327,
534,
3082,
403,
9009,
407,
253,
4477,
352,
3534,
479,
253,
13214,
326,
253,
6760,
3082,
452,
512,
644,
9009,
407,
1529,
3694,
5522,
50276,
68,
21,
4679,
253,
1543,
403,
3559,
533,
417,
5469,
50276,
783,
2530,
19868,
273,
502,
3082,
1537,
3831,
271,
4722,
12288,
534,
7033,
391,
77,
285,
22296,
4715,
7274,
281,
502,
2299,
1955,
281,
253,
3480,
273,
5301,
281,
2905,
789,
891,
717,
417,
2176,
273,
436,
50275,
1189,
455,
891,
13414,
923,
247,
1534,
20417,
390,
20178,
7680,
5474,
339,
9852,
436,
2929,
253,
4477,
1611,
281,
5100,
247,
27998,
7792,
323,
1027,
45120,
4715,
7533,
597,
671,
2085,
247,
15548,
6335,
534,
3797,
1027,
2905,
3082,
9470,
5661,
1543,
403,
671,
2530,
50275,
296,
3755,
20556,
50275,
6732,
50275,
783,
45120,
4715,
2561,
5202,
4677,
374,
310,
973,
10932,
285,
2789,
247,
2257,
273,
3282,
281,
479,
50276,
6732,
50275,
74,
1158,
253,
16038,
273,
3652,
247,
27998,
7792,
323,
45120,
4715,
310,
14282,
1142,
1027,
45120,
4715,
9380,
7472,
616,
3082,
275,
1027,
22791,
14238,
352,
310,
2834,
323,
253,
1563,
8607,
281,
7277,
841,
2905,
3082,
50275,
6732,
50275,
5992,
7193,
22909,
285,
5661,
1543,
403,
2530,
275,
436,
2929,
253,
4477,
671,
2085,
253,
1543,
327,
18021,
67,
352,
2789,
13546,
253,
7000,
1543,
323,
1016,
4758,
1077,
11638,
323,
253,
1563,
8607,
50275,
6732,
50275,
20881,
1255,
265,
50276,
6732,
50275,
8826,
1774,
45120,
4715,
1666,
25379,
403,
417,
2908,
824,
347,
17857,
7694,
6142,
2066,
74,
1162,
355,
4240,
50275,
6732,
50275,
783,
4477,
760,
2085,
3368,
1543,
327,
1355,
7527,
15302,
824,
347,
278,
79,
382,
285,
260,
338,
274,
954,
45120,
4715,
9380,
824,
347,
17857,
7694,
6142,
2066,
74,
1162,
355,
4240,
285,
43022,
259,
86,
1162,
355,
6247,
2085,
253,
1543,
327,
1236,
2510,
25912,
15302,
24088,
4440,
257,
292,
18,
76,
347,
253,
4477,
2820,
281,
5100,
247,
747,
22791,
7241,
352,
310,
1774,
281,
2085,
253,
1543,
327,
1236,
2510,
25912,
15302,
50276,
6732,
50275,
2520,
2199,
3797,
9828,
273,
1027,
3082,
533,
253,
4477,
13414,
2486,
7000,
1491,
824,
347,
253,
23937,
670,
253,
2905,
13279,
1505,
5300,
597,
897,
50275,
6732,
50275,
783,
5426,
273,
32809,
4715,
275,
2593,
3127,
310,
23851,
891,
1158,
352,
310,
1805,
281,
897,
966,
19687,
30132,
4715,
390,
5028,
19687,
30132,
4715,
3587,
253,
4606,
403,
347,
3637,
337,
32809,
4715,
310,
2223,
2783,
347,
1529,
1416,
273,
45120,
4715,
352,
310,
12504,
281,
897,
352,
281,
9173,
247,
2173,
4758,
273,
45120,
4715,
374,
368,
2486,
966,
300,
285,
5028,
300,
275,
32809,
4715,
533,
368,
16670,
4836,
300,
352,
310,
417,
5272,
50276,
6732,
50275,
783,
4477,
2486,
1512,
1142,
4278,
670,
253,
2127,
7092,
275,
253,
2929,
891,
1158,
352,
310,
1805,
281,
2118,
841,
4243,
281,
253,
30762,
285,
2486,
625,
5661,
1543,
285,
6260,
50276,
1189,
455,
891,
1158,
436,
2199,
588,
320,
247,
4217,
4968,
323,
253,
1563,
45120,
4715,
8607,
891,
588,
5583,
14924,
604,
253,
4477,
476,
2953,
619,
7350,
275,
253,
32213,
629,
50275,
6732,
50274,
5996,
250,
2858,
22559,
5701,
50275,
74,
1869,
253,
4477,
11205,
281,
5100,
247,
27998,
3694,
7792,
326,
2789,
3515,
45120,
4715,
4679,
3477,
2299,
846,
4361,
253,
30080,
22559,
891,
1158,
2160,
80,
571,
556,
253,
1563,
2201,
3374,
285,
253,
4477,
4242,
281,
2953,
731,
275,
253,
30080,
22559,
50275,
2346,
80,
571,
11306,
15771,
327,
253,
2045,
13747,
824,
347,
37182,
10024,
285,
19106,
891,
13414,
1158,
436,
2216,
310,
1077,
11453,
281,
253,
8607,
275,
619,
3367,
1859,
891,
4510,
247,
7792,
326,
310,
3477,
281,
320,
7192,
285,
3797,
253,
954,
4633,
1666,
25379,
891,
1158,
634,
7792,
943,
320,
4158,
323,
247,
22780,
3185,
273,
247,
3694,
16518,
50275,
2346,
80,
571,
556,
2649,
644,
6760,
327,
1236,
2510,
25912,
15302,
24088,
4440,
257,
292,
18,
76,
604,
891,
878,
281,
897,
436,
7792,
891,
878,
253,
7792,
476,
18302,
253,
1543,
273,
253,
2045,
1666,
25379,
9113,
50275,
601,
891,
13414,
1158,
2160,
80,
571,
310,
1077,
4217,
281,
247,
22780,
751,
479,
2556,
281,
619,
3367,
2793,
891,
5257,
281,
12009,
436,
2929,
50276,
7152,
339,
431,
248,
2929,
29328,
247,
10527,
7792,
281,
23968,
2561,
3237,
275,
253,
45120,
4715,
502,
5028,
2556,
281,
247,
19868,
436,
10527,
7792,
310,
908,
347,
253,
5044,
12153,
323,
2160,
80,
571,
247,
3694,
6335,
4158,
281,
33150,
3082,
26332,
3733,
11333,
2439,
1027,
2561,
3237,
7533,
253,
4736,
273,
253,
2929,
310,
1077,
24683,
627,
2226,
1142,
1527,
2603,
13747,
16994,
502,
3082,
323,
22296,
4715,
285,
35221,
4715,
7533,
50276,
13149,
2556,
281,
253,
2929,
1016,
4758,
310,
2529,
347,
247,
873,
273,
13260,
247,
7139,
73,
7760,
19868,
32361,
432,
436,
1859,
436,
2722,
247,
12291,
273,
253,
10527,
7792,
13947,
7533,
347,
247,
873,
273,
13260,
1057,
417,
906,
275,
247,
7139,
73,
7760,
19868,
352,
651,
320,
247,
10979,
3103,
970,
247,
7139,
73,
7760,
19868,
2097,
326,
690,
3082,
588,
320,
13333,
275,
8063,
533,
417,
275,
3946,
275,
2160,
80,
571,
984,
253,
5202,
19756,
253,
4602,
436,
247,
2266,
12291,
273,
253,
7792,
326,
310,
417,
5469,
50276,
39595,
253,
7792,
11306,
15771,
327,
2067,
21011,
281,
3359,
253,
5536,
20284,
15302,
391,
77,
12620,
3082,
436,
310,
417,
247,
1943,
1895,
407,
3139,
533,
352,
310,
417,
2590,
752,
2160,
80,
571,
310,
6240,
2429,
281,
253,
3236,
13747,
50276,
16217,
3825,
253,
4679,
921,
2067,
1666,
25379,
275,
253,
22296,
285,
35221,
4715,
7533,
954,
3082,
403,
417,
9009,
407,
2160,
80,
571,
285,
403,
20265,
432,
37182,
10024,
323,
1499,
285,
6474,
10352,
25379,
285,
45120,
10186,
323,
391,
77,
581,
2181,
326,
891,
3264,
432,
436,
2593,
369,
849,
253,
1072,
1332,
812,
320,
4354,
3732,
281,
1027,
7533,
534,
310,
253,
2022,
1750,
273,
253,
6335,
3185,
1499,
285,
391,
77,
7533,
3894,
5058,
3082,
323,
1650,
253,
299,
38212,
7092,
310,
1027,
275,
253,
1499,
285,
391,
77,
7533,
1057,
436,
1599,
326,
4212,
452,
281,
3359,
1046,
1332,
7019,
352,
3133,
326,
391,
77,
285,
1499,
3082,
403,
4336,
4858,
534,
310,
1411,
253,
2862,
5968,
273,
253,
6335,
387,
436,
1127,
752,
310,
253,
5750,
326,
2160,
80,
571,
10316,
2429,
281,
697,
21011,
19106,
17409,
37182,
10024,
6474,
10352,
25379,
45120,
10186,
50276,
38092,
5701,
50276,
3474,
21462,
2223,
14588,
281,
3541,
11897,
390,
673,
4136,
281,
3037,
247,
4836,
50276,
261,
2160,
80,
571,
2104,
281,
2451,
673,
285,
3541,
10806,
50276,
261,
352,
1896,
323,
253,
990,
4212,
281,
4853,
747,
7533,
390,
10007,
253,
19868,
352,
651,
320,
4722,
281,
923,
271,
1650,
50276,
664,
3877,
326,
690,
37182,
10024,
3082,
5115,
2406,
685,
4839,
7200,
275,
4836,
300,
984,
597,
513,
417,
897,
253,
4836,
5203,
281,
8989,
562,
253,
5971,
326,
7027,
3345,
253,
5762,
4836,
50276,
5092,
368,
4993,
436,
1895,
5010,
247,
2608,
3198,
281,
871,
253,
4184,
932,
273,
1016,
6335,
281,
13844,
253,
2127,
436,
11323,
247,
2257,
273,
10454,
50276,
74,
2868,
253,
8103,
273,
436,
2929,
273,
34047,
2366,
45120,
1499,
285,
391,
77,
310,
1077,
1774,
285,
24683,
2299,
253,
2929,
275,
253,
1246,
830,
556,
2067,
32213,
352,
310,
12744,
1880,
253,
10527,
7792,
1663,
33526,
253,
9380,
8103,
2160,
80,
571,
253,
3694,
3133,
281,
320,
1335,
387,
247,
1077,
9765,
3924,
275,
697,
2440,
5880,
891,
923,
1077,
1652,
440,
1877,
275,
253,
3082,
534,
310,
253,
2022,
7990,
273,
253,
2929,
436,
7052,
7787,
253,
3082,
33150,
875,
253,
1027,
7533,
352,
310,
12744,
752,
11361,
2160,
80,
571,
310,
9745,
2429,
281,
970,
697,
21011,
3587,
5474,
33032,
2520,
2929,
23970,
247,
747,
45120,
4715,
7792,
326,
13698,
281,
9510,
253,
2561,
275,
253,
1673,
436,
7792,
310,
1754,
327,
247,
2891,
13646,
273,
512,
1896,
13260,
326,
403,
1846,
281,
502,
3082,
25761,
436,
2891,
13646,
7729,
275,
8133,
22296,
285,
35221,
3082,
275,
247,
27998,
7792,
50275,
249,
2593,
374,
253,
4477,
12661,
247,
1616,
729,
757,
1232,
326,
310,
1146,
12841,
275,
253,
1996,
4243,
352,
310,
417,
2590,
2139,
436,
8763,
9561,
1616,
729,
3061,
1232,
310,
4217,
281,
253,
4081,
7792,
50276,
6050,
891,
513,
11435,
253,
3434,
275,
6684,
436,
7792,
891,
717,
14999,
253,
38135,
275,
253,
2929,
253,
789,
310,
7194,
271,
11369,
3434,
28763,
14109,
533,
1537,
417,
4944,
436,
8059,
50276,
6050,
891,
513,
11435,
253,
3434,
3212,
436,
789,
891,
5545,
253,
3761,
875,
436,
2929,
285,
253,
7990,
273,
17857,
32888,
50276,
187,
187,
4118,
18435,
27,
783,
7714,
23970,
247,
2891,
13646,
323,
26169,
45120,
4715,
2561,
7533,
285,
247,
3694,
7792,
326,
36683,
436,
2891,
13646,
1016,
45120,
4715,
4758,
310,
6607,
407,
347,
247,
873,
273,
6096,
13260,
24088,
403,
4836,
44077,
2540,
390,
417,
6607,
275,
247,
19868,
285,
253,
3694,
310,
5611,
342,
253,
13079,
273,
440,
5411,
45120,
4715,
2561,
50276,
783,
7714,
22649,
247,
2590,
2523,
275,
253,
1673,
7533,
285,
3082,
323,
45120,
4715,
452,
7987,
12072,
594,
326,
627,
310,
1652,
25253,
275,
49602,
2403,
4780,
2834,
281,
5963,
30628,
3839,
5821,
326,
253,
16038,
273,
3652,
3694,
281,
1361,
440,
1419,
45120,
4715,
2561,
369,
247,
2762,
50276,
35529,
30628,
671,
8042,
281,
1142,
7350,
342,
253,
7714,
285,
3694,
5522,
2160,
80,
571,
326,
12093,
697,
2022,
7680,
275,
1798,
627,
310,
4468,
326,
253,
3694,
310,
387,
271,
2393,
3924,
273,
2440,
285,
2789,
5536,
897,
273,
5368,
13747,
281,
1159,
24088,
37182,
10024,
285,
19106,
436,
2789,
352,
12744,
752,
2160,
900,
66,
6131,
689,
970,
697,
21011,
3587,
347,
973,
627,
310,
4468,
326,
2709,
2629,
22791,
8892,
285,
1846,
3082,
403,
5816,
432,
253,
7092,
50276,
35456,
323,
1781,
4311,
4679,
342,
24088,
4440,
257,
292,
18,
76,
275,
3762,
253,
6335,
4483,
6880,
285,
841,
1537,
320,
9009,
407,
2571,
275,
253,
3114,
2299,
436,
651,
2430,
326,
253,
3236,
7714,
33385,
403,
2266,
2217,
281,
3812,
4489,
275,
432,
643,
8607,
50276,
249,
2020,
253,
7714,
33385,
1057,
417,
2568,
3959,
247,
21414,
4983,
1127,
323,
8607,
2819,
323,
247,
4983,
1127,
281,
3135,
616,
45120,
4715,
2561
] |
[ 1, 1, 1, ... (list consisting entirely of 1s, one per line, collapsed) ] |
[ 30003, 310, 1677, ... (integer token-id list, one id per line, collapsed) ] |
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
originality the main contributions of the paper are its presentation of simple elegant proofs of consistency for threevariable lcd and fourvariable ystructure causal discovery algorithms the proof for the threevariable algorithm shows consistency under hidden confounding the proof for the fourvariable algorithms shows consistency under both hidden confounding and selection bias the current paper discusses earlier work on the threevariable lcd algorithm and points to a paper from 1997 that describes it a scan of that paper indicates that it explicitly assumes the absence of selection bias assumption 5 page 207 it seems important that the current paper make clear that the earlier paper was explicit in describing this assumption otherwise the reader of the current paper may be left with the impression that the earlier work was unaware of this assumption and thus so may be the readers of that earlier paper also the earlier lcd paper describes how y structures can be used to exclude the possibility of selection bias page 217 although it does not provide a consistency proof it seems important that the current paper mention that y structures were introduced in the earlier paper as a way to discover causal relationships that have no hidden confounding or selection bias significance the computational tractability of local causal discovery algorithms makes them appealing the relatively few conditional independence tests required to perform causal discovery with threevariable and fourvariable models make them potentially more statistically reliable than global causal discovery algorithms such as fci in which the correctness of the causal relationships in the single graph that is output can be highly dependent on one another however what about the need to control for multiple testing when evaluating thousands or even millions of three and fourvariable models when performing local causal discovery a natural question is this what might be gained by performing local causal discovery using more than four variables it seems worthwhile for the paper to at least touch on this issue about future research in the discussion section technical quality the technical quality of the paper is excellent the proofs appear to be correct the definitions of minimal conditional independence and dependence in section 31 appear to be different than the definitions given in the cited paper by claasen and heskes 2011 the definitions given by claasen and heskes seem to make more sense the authors should check to make sure that their definitions are the same as those given in the claasen and heskes paper the experiments include both simulated and real data and they provide useful insights about the algorithms it would be helpful to have more details about how the random graphs were generated for example directed edges were sampled independently with some fixed probability which is not stated in the paper also graphs that did not meet a predetermined minimum number of collider patterns were discarded but that number is not stated in the paper the paper generated 10000 samples for the random graph experiments it would be interesting to see how the results change over a wide span of sample sizes for the large random graphs it is interesting that the maximum precision over all levels of recall of ystext is lower than lcd and yst why is that it would be useful if the paper analyzed statistically the differences in results using hypothesis testing or confidence intervals for example the discussion and interpretation of the 
experimental results is sparse additional discussion would be useful clarity the paper is well written and easy to read overall there are a few places that would benefit from clarification and there are a few minor typos example 1 is a bit confusing for instance how does the professor observe spurious improvement in the final grades g of those who handed in their exams if only students with good grades actually handedin the class review sheet which is privacysensitive and unobservable to the professor is it the case that the professor does not know the content of the review sheets but does know which students handin the review sheets it would be interesting if the example was worked out numerically outside of the paper and the direction of the final results reported in the paper figure 1 it would be good to mention in the caption what each of the node types represents example 2 in the phrase might destroy some of these genes it seems the word genes should be cells in fact fig 2 represents should be in fact fig 1b represents it would be helpful to have a more expanded explanation of the following sentence on page 2 a procedure for extracting nonancestral relations from this equivalence class when selection bias is a possibility is yet unknown page 3 valid under when valid when page 3 constraintbased method causal discovery methods constraintbased causal discovery methods page 3 between nodes relates to an between nodes to an page 4 it seems the phrase a conditioned variable that is marginalized out should be a variable that is conditioned on page 4 anterior causal relations is mentioned but not defined or described page 4 assumption 0 c and x need to be defined or described page 4 mooij et al 2020 is an ambiguous citation because it matches two of the references page 5 the phrase where w represents selection all other conditioned variables is not clear page 6 shouldnt the phrase z is not ancestral to all of x y and conditioning set w be z is not ancestral to any of x y and conditioning set w page 6 soundness follows of the first two soundness of the first two page 6 shouldnt the original assumption of lcd is slightly stronger only requiring be the original assumption of lcd is slightly weaker only requiring page 8 it would be clarifying if section 33 contained a diagram of a y structure page 8 extended ystructure7 extended ystructure 7 page 9 the section of finite sample scoring is not easy to follow page 10 what is fig 422 page 11 figure 5 it would be helpful if the caption describes the meaning of solid and dotted lines in the plots page 12 the meaning of and reasoning underlying the following sentence is not entirely clear to me in fig 6a the resulting roc curve is shown where the threshold t is chosen in a way that 1 percent of the causal relations is contained in the ground truth why choose 1 percent and what are the likely repercussions for doing so docseplocally discovering causal relations is important in practice assuming the existence of selection biases and hidden confoundings this paper investigates whether the lcd algorithm and the extended ystructure are still valid this paper is wellwritten and the contribution is clear the authors proved that 1 for the threevariable case there is no sound constraintbased approach to discover a purely ancestral causal relation under selection bias 2 for the fourvariable case the extended ystructure is still sound under selection bias however it seems that the above two results are direct consequences of the logical causal inference rules the 
logical causal inference rules or other similar results have been used to prove the correctness of rfci see eg colombo et al 2012 lemma 31 it would be better to compare the proved results to lemma 31 and to how lemma 31 is applied in rfci other comments 1 section 41 it seems that xin ansy should be xin any and that xin y should be xin any 2 in section 41 the authors define a score in eq9 does this score only work for the lineargaussian case if the variables are discrete can we still use this score docsep the paper investigates identifying causal ancestral relationships among two variables locally the impossibility result with threevariable case seems new but the result is rather straightforward due to a small number of possible structures for the fourvariable case extended ystructure the result can be derived by lemma 1 claassen and heskes 2011 one can refine further to x in anyanw given that the mathematical rigor to prove the result is not high the quality of the paper would be mostly determined by the significance of the question relevance and empirical results the question seems important that under which condition can we tell ancestral relationship under selection bias but the question is limited to three and four variable cases how hard would it be to solve 5variable case certainly one can consider a bruteforce approach the paper is clearly written but can be improved by skipping the discussing of mag for example figures 2 and 3 are possible admgs not mags we do not need mag to understand the results introducing jci framework a little bit in assumption 0 you didnt define c and x yet draw ystructure in section 33 empirical validation is interesting but also can be improved the current result using random graphs can be quite confusing since nonsound method associates with higher precision and recall even under the selection bias tests on random graphs only check 1 how many portions of pairs are adequate for ext yst 2 how good the estimator is hence it is not quite appropriate to juxtapose pr curves for different methods they will find orientation for different edges where some might be irrelevant to selection bias can you report what pr value would be fore alpha005 instead of logp value whats max in eq 9 maxx also whats ansy just before the equation overall i like the simple question and solution no one asked before the presentation and experiments can be improved to highlight the importance of ext yst testing on more data sets would be desirable although the soundness of the method is already proven minor amdg just before section 22 in section 422 fig 422 should be fig 5 figure 5 please explicitly mention what are solid s0 and dashed s1 lines in conclusion lcd was found to not be sound may be written in a different way eg lcd is not applicable to a data with selection bias lcd is sound under the condition it was proposed
### Summary:
|
this paper shows that when making several assumptions of the joint causal inference framework introducing exogenous context variables the local causal discovery algorithm is able to reliably discover a causal effect when there are latent variables but no conditional independence based algorithm can discover a causal effect when there are latent variables and selection bias it also shows that when using only 4 variables there is a consistent way to discover a causal effect when there are latent variables selection bias and cycles assuming a simple structural causal model the failure modes of local causal discovery with three nodes under selection bias and the ability of extended ystructures the 4 variable case to discover causal effects under latent variables selection bias and cycles is new the paper is generally well written and easy to follow it is technically correct the results are of interest because although the results are about rather special cases there is some evidence that very simple local algorithms are more reliable than more informative global algorithms however the authors were asked to submit a modified version of their paper during the discussion period they submitted a new version with some new results however in examining it more closely while writing the metareview i found that they had introduced some very minor errors which need to be fixed these include 1 there is a definition of dmgs but not admgs 2 there is no reference to mags prior to assumption 0 either assumption 0 should be changed to refer to dmgs or admgs or mags should be introduced whichever one of these models is assumed the authors should make certain that it is compatible with the revised 4 variable theorem about cyclic models 3 the proof of proposition 4 was changed to allow for the theorem to apply cyclic simple scms this is not clearly stated in the theorem which should make clear that the assumption of a possibly cyclic but simple scm is allowed this should also be made clearer in the abstract and introduction two reviewers scored this paper as a 7 and one as a 6
|
[ 326, 452, 642, ... (integer token-id list, one id per line, collapsed) ] |
[ 1, 1, 1, ... (list consisting entirely of 1s, one per line, collapsed) ] |
[ 326, 452, 642, ... (integer token-id list, one id per line, collapsed; it appears to repeat the preceding token-id list) ] |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper looks at the problem of unsupervised crosslingual transfer termed ucl in the paper through the optics of domain adaptation after empirically analysing and validating that distributional shifts in class priors might cause a huge problem for ucl which wasnt tackled in previous research the authors proceed with an introduction of a new method that aims to mitigate that problem the idea is to get rid of that shift through a approach called importanceweighted domain adaptation iwda which is largely the adaptation of the work from tachet des combes et al neurips 2020 to the ucl problem the results on two tasks in the ucl setup ner and marc classification show slight gains over the standard zeroshot transfer when iwda is applied with more prominent gains reported when a stronger domain shift is observed however such a setup has been created mostly artificially to further demonstrate the benefits of modelling the shift in the model update after the response i would like to thank the authors for the new set of experiments and their very thorough author response the new experiments indeed strengthen the paper and better outline the key contributions of the work some of my main concerns still do remain though i am not fully certain that given its mixed performance in the transfer tasks the proposed framework will be very useful with more powerful transfer methods this preliminary evidence has not fully convinced me on its future impact and i am still not fully convinced by the choice of baselines and whether to understand this preliminary empirical evidence as fertile ground for future enhancements in ucl some other clarifications were provided in the response so i have increased my score but as said i still have reservations when it comes to the papers empirical contributions and consequent impact overall the paper provides some nice insights especially treating the problem of crosslingual transfer as largely a domain transfer problem which allows it to use some domain adaptation da machinery from prior work and apply it to the ucl problem the empirical gains although not huge do demonstrate the alignment between the research hypothesis and empirical scores the main strength of the paper imo is section 2 which delves deeper into analysing factors that affect crosslingual transfer the ideas of having more invariant representations to improve crosslingual transfer are not new and they date back to the work on crosslingual word embeddings eg see the work of dubossarsky et al emnlp 2020 or patra et al iclr 2019 or zheng et al acl 2019 also the idea of treating crosslingual transfer as a da problem is also not novel eg phd thesis of ruder discusses the similarity of the two problems in a nice and detailed way one of the major concerns i have with the current paper and its current presentation is that it is very difficult to discern between novelty of this work and what was done in prior research and simply reapplied to ucl for instance it seems a bit that this work is a largely incremental application of the prior more fundamental work of tachet des combes et al neurips 2020 to a new but highly similar problem for instance the authors should clearly note in section 3 if they bring any methodological contribution here or they just describe the previous method of tachet des combes et al in a similar vein the paper can also be seen as an extension of the work from keung et al adding this mitigation of prior class shift into the mix which yields slight improvements i am also not convinced by the 
results and the entire evaluation protocol detecting several potential problems and weaknesses here evaluation is conducted only on highresource languages from the nlp perspective while the main promise of ucl is to improve on nlp tasks for lowerresource languages as done in plenty of contemporary nlp research i wonder how iwda would behave for such languages and whether it would bring any benefits doing zeroshot transfer from en to de really does not make much sense imo evaluation is conducted on two reasonably simple tasks ner and marc sentiment analysis which is a classification task with a small number of classes a more detailed empirical analysis on other higherlevel semantic tasks eg xnli pawsx is warranted for a clearer picture of the benefits of the proposed approach the gains are typically not very high and are close to none when compared to the simple alignment method of wu et al in the ner task moreover performance on marc sentiment is measured only against the vanilla zeroshot transfer and an older selftraining st baseline what precludes the authors to again compare against wu et al or some other more recent techniques that go beyond the simplest zeroshot transfer protocol the paper does not really put results into the perspective of plenty of related work that has been conducted in this area recently eg check the leaderboards of xtreme and xglue for more sophisticated baselines a true gain would be showing that applying iwda along with these stronger models can yield further benefits with the current set of experiments i believe that the paper will have very limited impact along the same line why is only mbert evaluated all concurrent work also provides evaluations with a stronger xlmr base model and i also wonder whether these gains would remain with a larger and an even stronger xlmr large model the paper also misses some very relevant related work eg it seems very close to this work in its optics and design httpsarxivorgpdf201111499pdf i would like to see a discussion regarding their dissimilarity some other papers that should have been briefly discussed and cited in this paper httpsarxivorgpdf210407908pdf httpsarxivorgpdf200500396pdf analysis of mberts multilinguality the paper presents a dainspired view on unsupervised crosslingual transfer offering some insightful analyses but it seems as an eclectic mostly incremental work with insufficient and lacking empirical validations incomplete baselines and inadequate positioning against previous work in this area which would negatively affect its potential impact and its overall contributions more work and a stronger empirical foundation are needed docsepin this paper the authors provide substantial analyses on the crosslingual transfer performance in the multilingual neural language models and reported that the performance is strongly correlated with representation invariance and negatively affected by distributional shift in class priors between data in the srctgt languages based on these findings the authors propose an unsupervised crosslingual learning method called importanceweighted domain adaptation iwda where it performs feature alignment prior shift estimation and correction the authors experimented on two different nlp tasks such as multilingual ner and multilingual sentiment analysis tasks and experimentally showed the effectiveness besides that they demonstrated that the proposed approach improves performance further when combined with existing semisupervised learning approaches the paper provides extensive analyses 
on crosslingual transfer in the commonly used approaches that follow the pretrainfinetune strategy in this strategy the multilingual model finetuned on english task data is known to have zeroshot capability in the other languages however it is not well studied that in unsupervised crosslingual learning such as multilingual language model what the role of shared representations is considering this the paper gives a good start with substantial analyses that are helpful to understand what are the key factors to successful crosslingual transfer learning since xtreme benchmark provides a variety of multilingual nlp tasks the experiment section could be extended with more results the targeted languages in the current experiment are considered as a highresource language it would be better if the authors could move the remaining results into the main 0 pages and give more discussion on them the paper is mostly well organized and provides extensive analyses and experimental results they will be helpful to the studies in crosslingual transfer learning however some important descriptions on the model settings or experimental results are shown in appendix which makes the paper difficult to read i would suggest to revisit and reorganize the sections for better readability regarding the experiments since mbert that the authors used as a pretrained model serves diverse language representations they could provide deeper discussion by moving table 4 to the main 9 pages i read the responses from authors and appreciate their response and showing more results i am okay with accepting the paper but keep the score 6 because one concern might be that those resultsanalyses are mostly described in appendix the authors would need to reconstruct the manuscript by moving them into the main pages docsepthe paper first demonstrates the importance of feature invariance language invariant representations and classprior invariance across languages on zeroshot crosslingual performance by analyzing the zeroshot performance on different languages on the marc reviews and wikiann ner tasks and comparing against the class conditional distance between the source language english and target language feature representations the authors illustrate the how high feature invariance results in better zeroshot performance by also synthetically modifying the class prior for the target language the authors demonstrate how increasing differences in class priors result in decreasing zeroshot crosslingual transfer performance building on their observations the authors propose an approach that i introduces an adversarial loss term to penalize distortion in average class conditional feature representations between the source and target languages ii adds an importance weighting term to ensure the approach doesnt fail under class prior shifts empirical studies demonstrate that the proposed approach improves significantly over the vanilla zeroshot model on both marc sentiment analysis and ner tasks also improving over selftraining on sentiment analysis comparisons on synthetic datasets subsampled from ner and marc datasets that enhance the class prior shift highlight the robustness of the approach under large class prior shifts where previous approaches fail overall the paper first presents insightful analysis that highlights the role of feature invariance and class prior shifts on the extent of zeroshot crosslingual transfer the insights from the analysis are then adapted to develop importanceweighted domain adaptation for zeroshot crosslingual 
learning resulting in improved performance on marc sentiment analysis and wikiann ner with significantly improved robustness under classprior shifts strengths 1 well written and easy to understand 2 insightful analysis that grounds the presented approach 3 results from analysis with synthetically modified class prior distributions support the hypothesis that iwda is improving robustness to classprior shifts weaknesses 1 the paper evaluates on a limited set of tasks having additional results on a wider range of tasks for eg additional tasks from the xtreme benchmark could significantly strengthen the results suggestions comments questions 1 incorporating featureinvariance for domain adaptation has been studied for several nlp applications including multilingual pretraining and zeroshot neural machine translation discussion on several references is missing in the paper 1 2 3 2 last paragraph on page 3 pratrained pretrained 3 while the analysis in figure 1 suggests that f1score is directly correlated with conditional feature shift is it possible there are other confounds in this analysis for eg the amount of pretraining data used for each of these languages 4 could the relatively weaker results on ner also be caused by challenges with aligning representations at a subword level across languages with different levels of tokenization granularity across different languages references 1 explicit alignment objectives for multilingual bidirectional encoders hu et al 2 the missing ingredient in zeroshot neural machine translation arivazhagan et al 3 improving zeroshot translation with languageindependent constraints pham et al overall the paper first presents insightful analysis that highlights the role of feature invariance and class prior shifts on the extent of zeroshot crosslingual transfer the insights from the analysis are then adapted to develop importanceweighted domain adaptation for zeroshot crosslingual learning resulting in improved performance on marc sentiment analysis and wikiann ner with significantly improved robustness under classprior shifts the paper produces some valuable insights and develops a wellgrounded approach that improves the robustness of zeroshot crosslingual learning however the empirical results are limited to just two relatively small scale tasks having additional results on a wider range of tasks for eg additional tasks from the xtreme benchmark could significantly strengthen the results given the thorough analysis but limited range of tasks i am leaning towards acceptance if authors include additional empirical results i would be willing to update my recommendation to strong accept docsepthis paper investigates a method for improving the crosslingual transfer of pretrained multilingual models the paper first empirically analyzed the influence of representation invariance and distributional class shift then the paper proposed a method to improve the representation invariance and correcting the class shift experiments showed its superiority under large prior shifts strengths the proposed method is well motivated based the empirical analysis experiments show effectiveness of the proposed method weaknesses the experimental results are not convincing enough 1 only two of the tasks are considered 2 only mbert is tested would the same observations and conclusions hold when other widelyused pretrained models such as xlm are used 3 according to table 1 the proposed method is comparable with the existing work wu et al 2020 on average 4 theres no baselines in table 2 5 
seemingly the proposed method is much more time consuming during training than prior works the proposed method is well motivated given that the empirical analysis reveal the influence of representation invariance and class shift experiments show the effectiveness of the proposed method under large class shift however the experimental results are not convincing enough and the paper can be improved by conducting more experiments and analysis
### Summary:
|
the paper presents a domain adaptation approach based on the importance weighting for unsupervised crosslingual learning the paper first analyzes factors that affect crosslingual transfer and finds that the crosslingual transfer performance is strongly correlated with feature representation alignments as well as the distributional shift in class priors between the source and the target then the paper designs an approach based on the observations pros the paper is well written and the proposed approach is well motivated the analysis about which factors affect crosslingual transfer is interesting and provides some great insight cons as the reviewer pointed out the experiments for verifying the proposed approach are relatively weak overall the paper presents nice insights to connect crosslingual transfer with domain adaptation all reviewers lean to accept the paper and i also found the paper is in general interesting
|
[
2509,
1182,
254,
6934,
302,
3700,
432,
546,
281,
372,
1663,
1057,
417,
1056,
1199,
3282,
516,
80,
50276,
15419,
2368,
310,
5196,
327,
767,
12054,
2969,
8892,
38998,
285,
2304,
68,
21942,
1783,
534,
310,
247,
9162,
4836,
342,
247,
1355,
1180,
273,
5971,
50276,
66,
625,
7000,
16774,
1783,
327,
643,
2169,
5251,
24705,
8892,
24088,
1269,
79,
965,
1349,
8819,
89,
310,
26085,
323,
247,
30909,
5406,
273,
253,
5373,
273,
253,
4081,
2746,
50276,
783,
15988,
403,
5431,
417,
1077,
1029,
285,
403,
2810,
281,
5293,
672,
2429,
281,
253,
2969,
12420,
1332,
273,
259,
86,
1162,
355,
275,
253,
38998,
4836,
25761,
3045,
327,
2304,
68,
21942,
310,
4080,
760,
1411,
253,
26724,
1182,
254,
6934,
302,
3700,
285,
271,
5662,
11329,
649,
26208,
331,
8245,
752,
46704,
253,
4477,
281,
969,
7277,
1411,
259,
86,
1162,
355,
390,
690,
643,
625,
3332,
5609,
326,
564,
4457,
253,
22325,
1182,
254,
6934,
302,
3700,
7241,
50276,
783,
2929,
1057,
417,
1663,
1691,
1543,
715,
253,
8668,
273,
9828,
273,
2905,
789,
326,
556,
644,
5196,
275,
436,
2170,
4102,
24088,
2451,
253,
6657,
19184,
273,
209,
633,
4190,
285,
1269,
3129,
489,
323,
625,
18144,
1666,
25379,
247,
2032,
6351,
651,
320,
4645,
326,
9433,
891,
88,
1473,
2112,
342,
841,
10046,
3210,
476,
4917,
2007,
5373,
342,
253,
1655,
873,
273,
4679,
891,
2868,
326,
253,
2929,
588,
452,
1077,
3710,
3486,
50276,
28694,
253,
1072,
1386,
2139,
310,
760,
278,
6291,
6760,
512,
17336,
789,
671,
3400,
27163,
342,
247,
10046,
1269,
20347,
83,
2613,
1566,
285,
891,
671,
4282,
1880,
841,
15988,
651,
3464,
342,
247,
4067,
285,
271,
1014,
10046,
1269,
20347,
83,
1781,
1566,
50275,
783,
2929,
671,
38771,
690,
1077,
4623,
2905,
789,
24088,
352,
3133,
1077,
2810,
281,
436,
789,
275,
697,
35353,
285,
2216,
5987,
39962,
2061,
9275,
1252,
883,
1047,
1525,
9275,
50275,
74,
651,
751,
281,
923,
247,
5955,
5001,
616,
43110,
414,
690,
643,
9380,
326,
943,
452,
644,
13366,
5469,
285,
11106,
275,
436,
2929,
50276,
3614,
39962,
2061,
9275,
16899,
24769,
33648,
9275,
50276,
3614,
39962,
2061,
9275,
1518,
5388,
24698,
9275,
1783,
273,
278,
589,
1641,
1554,
4837,
10982,
253,
2929,
10262,
247,
277,
404,
1033,
1250,
1859,
327,
440,
35421,
2831,
1981,
780,
3700,
9159,
690,
47860,
6260,
533,
352,
3133,
347,
271,
10038,
732,
280,
6571,
32809,
789,
342,
12497,
285,
14999,
16774,
3588,
569,
18464,
1666,
25379,
285,
18766,
19274,
1411,
2045,
789,
275,
436,
2170,
534,
651,
18123,
2818,
697,
2442,
3486,
285,
697,
4583,
9021,
50276,
3062,
789,
285,
247,
10046,
16774,
12153,
403,
3058,
5474,
339,
9852,
436,
2929,
253,
4477,
2085,
6832,
6260,
327,
253,
2831,
1981,
780,
3700,
3045,
275,
253,
1554,
39661,
11454,
3448,
3210,
285,
2361,
326,
253,
3045,
310,
7052,
9578,
342,
6779,
31429,
285,
18123,
5876,
407,
3268,
267,
5333,
275,
966,
2235,
641,
875,
941,
275,
253,
49975,
291,
7332,
11515,
1754,
327,
841,
4342,
253,
4477,
12661,
271,
440,
35421,
2831,
1981,
780,
4715,
1332,
1925,
6349,
24676,
5028,
15644,
891,
88,
1473,
835,
352,
17923,
4735,
12420,
2720,
5333,
13418,
285,
10618,
253,
4477,
3368,
264,
327,
767,
1027,
295,
24343,
8892,
824,
347,
1554,
39661,
38998,
285,
1554,
39661,
21942,
1783,
8892,
285,
21657,
2692,
253,
12510,
16280,
326,
597,
5183,
326,
253,
4081,
2746,
19132,
3045,
2007,
672,
5678,
342,
5368,
49863,
29974,
13337,
4715,
7274,
253,
2929,
3400,
9470,
6260,
327,
2831,
1981,
780,
3700,
275,
253,
7744,
908,
7274,
326,
956,
253,
3215,
1949,
71,
7795,
2517,
5700,
275,
436,
5700,
253,
1554,
39661,
1566,
1442,
292,
37437,
327,
48087,
4836,
941,
310,
1929,
281,
452,
1182,
254,
6934,
302,
14603,
275,
253,
643,
11515,
2299,
352,
310,
417,
973,
5421,
326,
275,
440,
35421,
2831,
1981,
780,
4715,
824,
347,
1554,
39661,
3448,
1566,
752,
253,
2554,
273,
6096,
14237,
310,
7296,
436,
253,
2929,
4245,
247,
1175,
1265,
342,
6832,
6260,
326,
403,
9371,
281,
2096,
752,
403,
253,
2234,
2616,
281,
5547,
2831,
1981,
780,
3700,
4715,
1580,
209,
633,
4190,
22791,
3400,
247,
5235,
273,
1554,
39661,
295,
24343,
8892,
253,
3368,
2593,
812,
320,
6508,
342,
625,
1543,
253,
10522,
11515,
275,
253,
1655,
3368,
403,
2783,
347,
247,
1029,
15024,
3448,
352,
651,
320,
1805,
604,
253,
4477,
812,
2118,
253,
5780,
1543,
715,
253,
2022,
470,
7223,
285,
1918,
625,
5955,
327,
731,
50276,
783,
2929,
310,
6571,
973,
10932,
285,
3400,
9470,
6260,
285,
5661,
1543,
597,
588,
320,
9371,
281,
253,
2175,
275,
2831,
1981,
780,
3700,
4715,
2299,
690,
1774,
20121,
327,
253,
1566,
7533,
390,
5661,
1543,
403,
2011,
275,
30762,
534,
2789,
253,
2929,
2834,
281,
1239,
891,
651,
1804,
281,
45735,
285,
294,
7397,
907,
253,
7118,
323,
1805,
1239,
1430,
5001,
253,
4679,
1580,
278,
6291,
326,
253,
4477,
908,
347,
247,
3215,
11273,
1566,
11029,
11117,
3448,
14237,
597,
812,
2085,
12861,
5955,
407,
4886,
2829,
577,
281,
253,
2022,
898,
7223,
50276,
74,
1239,
253,
6128,
432,
4477,
285,
11435,
616,
2380,
285,
4645,
625,
1543,
891,
717,
8261,
342,
18738,
253,
2929,
533,
1978,
253,
4868,
721,
984,
581,
4468,
1537,
320,
326,
1110,
1543,
47927,
403,
6571,
2529,
275,
30762,
253,
4477,
651,
878,
281,
17029,
253,
7714,
407,
4886,
731,
715,
253,
2022,
7223,
50276,
7152,
339,
431,
248,
2929,
806,
14371,
253,
6349,
273,
4735,
31429,
3448,
13727,
14237,
285,
966,
40844,
31429,
2439,
11515,
327,
1182,
254,
6934,
302,
2831,
1981,
780,
3045,
407,
18918,
253,
1182,
254,
6934,
302,
3045,
327,
1027,
11515,
327,
253,
2304,
68,
10123,
285,
259,
1479,
757,
79,
38998,
8892,
285,
10941,
1411,
253,
966,
17697,
4181,
875,
253,
2603,
3448,
48087,
285,
2303,
3448,
4735,
14237,
253,
4477,
17093,
253,
849,
1029,
4735,
31429,
1543,
275,
1805,
1182,
254,
6934,
302,
3045,
407,
671,
5132,
85,
1037,
26264,
253,
966,
2720,
323,
253,
2303,
3448,
253,
4477,
7568,
849,
3629,
3910,
275,
966,
2235,
641,
906,
275,
11052,
1182,
254,
6934,
302,
2831,
1981,
780,
3700,
3045,
50276,
22157,
327,
616,
7313,
253,
4477,
12661,
271,
2746,
326,
891,
23970,
271,
48960,
2957,
1307,
281,
29697,
907,
22841,
275,
3388,
966,
17697,
4735,
14237,
875,
253,
2603,
285,
2303,
11515,
21255,
11323,
271,
6349,
42428,
1307,
281,
5416,
253,
2746,
36908,
1891,
762,
966,
2720,
15036,
50276,
358,
5378,
474,
2175,
7568,
326,
253,
4081,
2746,
19132,
3012,
689,
253,
26724,
1182,
254,
6934,
302,
1566,
327,
1097,
2304,
68,
21942,
1783,
285,
38998,
8892,
671,
11138,
689,
11329,
649,
26208,
327,
21942,
1783,
14023,
327,
13506,
15302,
8790,
312,
6216,
432,
38998,
285,
2304,
68,
15302,
326,
7278,
253,
966,
2720,
5333,
6780,
253,
31640,
273,
253,
2746,
762,
1781,
966,
2720,
15036,
835,
2045,
7274,
1891,
4583,
253,
2929,
806,
10262,
47860,
1783,
326,
16681,
253,
2554,
273,
4735,
31429,
285,
966,
2720,
15036,
327,
253,
6070,
273,
1182,
254,
6934,
302,
2831,
1981,
780,
3700,
253,
16039,
432,
253,
1783,
403,
840,
12956,
281,
1287,
6349,
24676,
5028,
15644,
323,
1182,
254,
6934,
302,
2831,
1981,
780,
4715,
4795,
275,
5520,
3045,
327,
2304,
68,
21942,
1783,
285,
259,
1479,
757,
79,
38998,
342,
3012,
5520,
31640,
762,
966,
40844,
15036,
50275,
296,
3755,
20556,
337,
973,
3542,
285,
3477,
281,
2096,
374,
47860,
1783,
326,
9905,
253,
3559,
2746,
50276,
20,
1543,
432,
1783,
342,
5132,
85,
1037,
7321,
966,
2720,
10670,
1329,
253,
9079,
326,
891,
88,
1473,
310,
11138,
31640,
281,
966,
40844,
15036,
50276,
20881,
1255,
265,
337,
253,
2929,
44995,
327,
247,
3710,
873,
273,
8892,
1907,
3081,
1543,
327,
247,
14200,
2491,
273,
8892,
323,
24088,
3081,
8892,
432,
253,
209,
633,
4190,
22791,
812,
3012,
17084,
253,
1543,
50276,
35640,
621,
50276,
26122,
50276,
34974,
337,
24049,
4735,
7821,
14417,
323,
5028,
15644,
556,
644,
5421,
323,
2067,
295,
24343,
4893,
1690,
1554,
39661,
3215,
26208,
285,
1182,
254,
6934,
302,
11454,
5145,
10234,
5955,
327,
2067,
10414,
310,
5816,
275,
253,
2929,
337,
374,
495,
374,
1390,
12494,
327,
3239,
495,
819,
255,
11273,
50276,
4025,
11273,
495,
1223,
253,
1783,
275,
4677,
337,
5936,
326,
269,
18,
18891,
310,
3587,
9578,
342,
17697,
4735,
5333,
310,
352,
1896,
627,
403,
643,
1461,
2261,
275,
436,
1783,
323,
24088,
253,
2408,
273,
3215,
26208,
941,
908,
323,
1016,
273,
841,
11515,
577,
812,
253,
4942,
21076,
1543,
327,
38998,
671,
320,
4269,
407,
7881,
342,
8495,
272,
14237,
387,
247,
749,
3418,
1268,
2439,
11515,
342,
1027,
2308,
273,
10669,
1320,
32449,
414,
2439,
1027,
11515,
50276,
250,
3065,
337,
6843,
12420,
16566,
323,
1554,
39661,
12246,
30869,
2349,
351,
398,
30287,
1162,
355,
374,
253,
5816,
24405,
275,
1182,
254,
6934,
302,
11454,
5145,
10234,
247,
1069,
1370,
73,
12043,
1162,
355,
495,
11138,
1182,
254,
6934,
302,
10234,
342,
3448,
17777,
10806,
815,
312,
1162,
355,
4583,
253,
2929,
806,
10262,
47860,
1783,
326,
16681,
253,
2554,
273,
4735,
31429,
285,
966,
2720,
15036,
327,
253,
6070,
273,
1182,
254,
6934,
302,
2831,
1981,
780,
3700,
253,
16039,
432,
253,
1783,
403,
840,
12956,
281,
1287,
6349,
24676,
5028,
15644,
323,
1182,
254,
6934,
302,
2831,
1981,
780,
4715,
4795,
275,
5520,
3045,
327,
2304,
68,
21942,
1783,
285,
259,
1479,
757,
79,
38998,
342,
3012,
5520,
31640,
762,
966,
40844,
15036,
50275,
783,
2929,
11330,
690,
9865,
16039,
285,
24357,
247,
973,
2595,
264,
2746,
326,
19132,
253,
31640,
273,
1182,
254,
6934,
302,
2831,
1981,
780,
4715,
2299,
253,
16774,
1543,
403,
3710,
281,
816,
767,
4942,
1355,
4311,
8892,
1907,
3081,
1543,
327,
247,
14200,
2491,
273,
8892,
323,
24088,
3081,
8892,
432,
253,
209,
633,
4190,
22791,
812,
3012,
17084,
253,
1543,
50276,
28821,
253,
11080,
1783,
533,
3710,
2491,
273,
8892,
891,
717,
25661,
4404,
14924,
604,
4477,
2486,
3081,
16774,
1543,
891,
651,
320,
7378,
281,
5731,
619,
17401,
281,
2266,
2997,
50274,
7152,
33032,
2520,
2929,
2340,
684,
247,
1332,
323,
11138,
253,
2831,
1981,
780,
3700,
273,
3215,
11273,
1554,
39661,
3210,
253,
2929,
806,
45190,
5867,
253,
4833,
273,
6779,
31429,
285,
3268,
267,
966,
5333,
840,
253,
2929,
4081,
247,
1332,
281,
3157,
253,
6779,
31429,
285,
35827,
253,
966,
5333,
4679,
2692,
697,
34385,
762,
1781,
2720,
15036,
20544,
50276,
783,
4081,
1332,
310,
973,
17194,
1754,
253,
16774,
1783,
50275,
16217,
3825,
921,
12510,
273,
253,
4081,
1332,
50275,
20881,
1255,
265,
50276,
783,
5661,
1543,
403,
417,
21414,
2217,
337,
760,
767,
273,
253,
8892,
403,
2783,
374,
760,
278,
6291,
310,
5762,
651,
253,
1072,
7313,
285,
11815,
2186,
672,
643,
7561,
3197,
3215,
11273,
3210,
824,
347,
1269,
20347,
403,
908,
495,
2556,
281,
2829,
337,
253,
4081,
1332,
310,
10870,
342,
253,
5368,
789,
259,
86,
1162,
355,
9169,
327,
3388,
577,
253,
373,
642,
1666,
25379,
275,
2829,
374,
608,
16907,
253,
4081,
1332,
310,
1199,
625,
673,
21337,
1309,
3733,
685,
2720,
2987,
50275,
783,
4081,
1332,
310,
973,
17194,
1677,
326,
253,
16774,
1783,
10313,
253,
4833,
273,
6779,
31429,
285,
966,
5333,
4679,
921,
253,
12510,
273,
253,
4081,
1332,
762,
1781,
966,
5333,
2299,
253,
5661,
1543,
403,
417,
21414,
2217,
285,
253,
2929,
476,
320,
5520,
407,
16472,
625,
4679,
285,
1783,
2490,
187,
4118,
18435,
27,
783,
2929,
10262,
247,
5028,
15644,
2746,
1754,
327,
253,
6349,
42428,
323,
440,
35421,
2831,
1981,
780,
4715,
253,
2929,
806,
3537,
13505,
2616,
326,
2818,
2831,
1981,
780,
3700,
285,
9010,
326,
253,
2831,
1981,
780,
3700,
3045,
310,
7052,
9578,
342,
4735,
6779,
43097,
347,
973,
347,
253,
3268,
267,
5333,
275,
966,
2235,
641,
875,
253,
2603,
285,
253,
2303,
840,
253,
2929,
11809,
271,
2746,
1754,
327,
253,
7313,
50276,
856,
84,
50275,
783,
2929,
310,
973,
3542,
285,
253,
4081,
2746,
310,
973,
17194,
50275,
783,
1783,
670,
534,
2616,
2818,
2831,
1981,
780,
3700,
310,
4722,
285,
3400,
690,
1270,
12288,
50275,
5040,
50275,
284,
253,
37317,
8042,
562,
253,
4679,
323,
49160,
253,
4081,
2746,
403,
4942,
5075,
50276,
1189,
455,
253,
2929,
10262,
5322,
16039,
281,
4684,
2831,
1981,
780,
3700,
342,
5028,
15644,
512,
30628,
9644,
281,
2997,
253,
2929,
285,
891,
671,
1119,
253,
2929,
310,
275,
2087,
4722
] |
[
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1
] |
[
2509,
1182,
254,
6934,
302,
3700,
432,
546,
281,
372,
1663,
1057,
417,
1056,
1199,
3282,
516,
80,
50276,
15419,
2368,
310,
5196,
327,
767,
12054,
2969,
8892,
38998,
285,
2304,
68,
21942,
1783,
534,
310,
247,
9162,
4836,
342,
247,
1355,
1180,
273,
5971,
50276,
66,
625,
7000,
16774,
1783,
327,
643,
2169,
5251,
24705,
8892,
24088,
1269,
79,
965,
1349,
8819,
89,
310,
26085,
323,
247,
30909,
5406,
273,
253,
5373,
273,
253,
4081,
2746,
50276,
783,
15988,
403,
5431,
417,
1077,
1029,
285,
403,
2810,
281,
5293,
672,
2429,
281,
253,
2969,
12420,
1332,
273,
259,
86,
1162,
355,
275,
253,
38998,
4836,
25761,
3045,
327,
2304,
68,
21942,
310,
4080,
760,
1411,
253,
26724,
1182,
254,
6934,
302,
3700,
285,
271,
5662,
11329,
649,
26208,
331,
8245,
752,
46704,
253,
4477,
281,
969,
7277,
1411,
259,
86,
1162,
355,
390,
690,
643,
625,
3332,
5609,
326,
564,
4457,
253,
22325,
1182,
254,
6934,
302,
3700,
7241,
50276,
783,
2929,
1057,
417,
1663,
1691,
1543,
715,
253,
8668,
273,
9828,
273,
2905,
789,
326,
556,
644,
5196,
275,
436,
2170,
4102,
24088,
2451,
253,
6657,
19184,
273,
209,
633,
4190,
285,
1269,
3129,
489,
323,
625,
18144,
1666,
25379,
247,
2032,
6351,
651,
320,
4645,
326,
9433,
891,
88,
1473,
2112,
342,
841,
10046,
3210,
476,
4917,
2007,
5373,
342,
253,
1655,
873,
273,
4679,
891,
2868,
326,
253,
2929,
588,
452,
1077,
3710,
3486,
50276,
28694,
253,
1072,
1386,
2139,
310,
760,
278,
6291,
6760,
512,
17336,
789,
671,
3400,
27163,
342,
247,
10046,
1269,
20347,
83,
2613,
1566,
285,
891,
671,
4282,
1880,
841,
15988,
651,
3464,
342,
247,
4067,
285,
271,
1014,
10046,
1269,
20347,
83,
1781,
1566,
50275,
783,
2929,
671,
38771,
690,
1077,
4623,
2905,
789,
24088,
352,
3133,
1077,
2810,
281,
436,
789,
275,
697,
35353,
285,
2216,
5987,
39962,
2061,
9275,
1252,
883,
1047,
1525,
9275,
50275,
74,
651,
751,
281,
923,
247,
5955,
5001,
616,
43110,
414,
690,
643,
9380,
326,
943,
452,
644,
13366,
5469,
285,
11106,
275,
436,
2929,
50276,
3614,
39962,
2061,
9275,
16899,
24769,
33648,
9275,
50276,
3614,
39962,
2061,
9275,
1518,
5388,
24698,
9275,
1783,
273,
278,
589,
1641,
1554,
4837,
10982,
253,
2929,
10262,
247,
277,
404,
1033,
1250,
1859,
327,
440,
35421,
2831,
1981,
780,
3700,
9159,
690,
47860,
6260,
533,
352,
3133,
347,
271,
10038,
732,
280,
6571,
32809,
789,
342,
12497,
285,
14999,
16774,
3588,
569,
18464,
1666,
25379,
285,
18766,
19274,
1411,
2045,
789,
275,
436,
2170,
534,
651,
18123,
2818,
697,
2442,
3486,
285,
697,
4583,
9021,
50276,
3062,
789,
285,
247,
10046,
16774,
12153,
403,
3058,
5474,
339,
9852,
436,
2929,
253,
4477,
2085,
6832,
6260,
327,
253,
2831,
1981,
780,
3700,
3045,
275,
253,
1554,
39661,
11454,
3448,
3210,
285,
2361,
326,
253,
3045,
310,
7052,
9578,
342,
6779,
31429,
285,
18123,
5876,
407,
3268,
267,
5333,
275,
966,
2235,
641,
875,
941,
275,
253,
49975,
291,
7332,
11515,
1754,
327,
841,
4342,
253,
4477,
12661,
271,
440,
35421,
2831,
1981,
780,
4715,
1332,
1925,
6349,
24676,
5028,
15644,
891,
88,
1473,
835,
352,
17923,
4735,
12420,
2720,
5333,
13418,
285,
10618,
253,
4477,
3368,
264,
327,
767,
1027,
295,
24343,
8892,
824,
347,
1554,
39661,
38998,
285,
1554,
39661,
21942,
1783,
8892,
285,
21657,
2692,
253,
12510,
16280,
326,
597,
5183,
326,
253,
4081,
2746,
19132,
3045,
2007,
672,
5678,
342,
5368,
49863,
29974,
13337,
4715,
7274,
253,
2929,
3400,
9470,
6260,
327,
2831,
1981,
780,
3700,
275,
253,
7744,
908,
7274,
326,
956,
253,
3215,
1949,
71,
7795,
2517,
5700,
275,
436,
5700,
253,
1554,
39661,
1566,
1442,
292,
37437,
327,
48087,
4836,
941,
310,
1929,
281,
452,
1182,
254,
6934,
302,
14603,
275,
253,
643,
11515,
2299,
352,
310,
417,
973,
5421,
326,
275,
440,
35421,
2831,
1981,
780,
4715,
824,
347,
1554,
39661,
3448,
1566,
752,
253,
2554,
273,
6096,
14237,
310,
7296,
436,
253,
2929,
4245,
247,
1175,
1265,
342,
6832,
6260,
326,
403,
9371,
281,
2096,
752,
403,
253,
2234,
2616,
281,
5547,
2831,
1981,
780,
3700,
4715,
1580,
209,
633,
4190,
22791,
3400,
247,
5235,
273,
1554,
39661,
295,
24343,
8892,
253,
3368,
2593,
812,
320,
6508,
342,
625,
1543,
253,
10522,
11515,
275,
253,
1655,
3368,
403,
2783,
347,
247,
1029,
15024,
3448,
352,
651,
320,
1805,
604,
253,
4477,
812,
2118,
253,
5780,
1543,
715,
253,
2022,
470,
7223,
285,
1918,
625,
5955,
327,
731,
50276,
783,
2929,
310,
6571,
973,
10932,
285,
3400,
9470,
6260,
285,
5661,
1543,
597,
588,
320,
9371,
281,
253,
2175,
275,
2831,
1981,
780,
3700,
4715,
2299,
690,
1774,
20121,
327,
253,
1566,
7533,
390,
5661,
1543,
403,
2011,
275,
30762,
534,
2789,
253,
2929,
2834,
281,
1239,
891,
651,
1804,
281,
45735,
285,
294,
7397,
907,
253,
7118,
323,
1805,
1239,
1430,
5001,
253,
4679,
1580,
278,
6291,
326,
253,
4477,
908,
347,
247,
3215,
11273,
1566,
11029,
11117,
3448,
14237,
597,
812,
2085,
12861,
5955,
407,
4886,
2829,
577,
281,
253,
2022,
898,
7223,
50276,
74,
1239,
253,
6128,
432,
4477,
285,
11435,
616,
2380,
285,
4645,
625,
1543,
891,
717,
8261,
342,
18738,
253,
2929,
533,
1978,
253,
4868,
721,
984,
581,
4468,
1537,
320,
326,
1110,
1543,
47927,
403,
6571,
2529,
275,
30762,
253,
4477,
651,
878,
281,
17029,
253,
7714,
407,
4886,
731,
715,
253,
2022,
7223,
50276,
7152,
339,
431,
248,
2929,
806,
14371,
253,
6349,
273,
4735,
31429,
3448,
13727,
14237,
285,
966,
40844,
31429,
2439,
11515,
327,
1182,
254,
6934,
302,
2831,
1981,
780,
3045,
407,
18918,
253,
1182,
254,
6934,
302,
3045,
327,
1027,
11515,
327,
253,
2304,
68,
10123,
285,
259,
1479,
757,
79,
38998,
8892,
285,
10941,
1411,
253,
966,
17697,
4181,
875,
253,
2603,
3448,
48087,
285,
2303,
3448,
4735,
14237,
253,
4477,
17093,
253,
849,
1029,
4735,
31429,
1543,
275,
1805,
1182,
254,
6934,
302,
3045,
407,
671,
5132,
85,
1037,
26264,
253,
966,
2720,
323,
253,
2303,
3448,
253,
4477,
7568,
849,
3629,
3910,
275,
966,
2235,
641,
906,
275,
11052,
1182,
254,
6934,
302,
2831,
1981,
780,
3700,
3045,
50276,
22157,
327,
616,
7313,
253,
4477,
12661,
271,
2746,
326,
891,
23970,
271,
48960,
2957,
1307,
281,
29697,
907,
22841,
275,
3388,
966,
17697,
4735,
14237,
875,
253,
2603,
285,
2303,
11515,
21255,
11323,
271,
6349,
42428,
1307,
281,
5416,
253,
2746,
36908,
1891,
762,
966,
2720,
15036,
50276,
358,
5378,
474,
2175,
7568,
326,
253,
4081,
2746,
19132,
3012,
689,
253,
26724,
1182,
254,
6934,
302,
1566,
327,
1097,
2304,
68,
21942,
1783,
285,
38998,
8892,
671,
11138,
689,
11329,
649,
26208,
327,
21942,
1783,
14023,
327,
13506,
15302,
8790,
312,
6216,
432,
38998,
285,
2304,
68,
15302,
326,
7278,
253,
966,
2720,
5333,
6780,
253,
31640,
273,
253,
2746,
762,
1781,
966,
2720,
15036,
835,
2045,
7274,
1891,
4583,
253,
2929,
806,
10262,
47860,
1783,
326,
16681,
253,
2554,
273,
4735,
31429,
285,
966,
2720,
15036,
327,
253,
6070,
273,
1182,
254,
6934,
302,
2831,
1981,
780,
3700,
253,
16039,
432,
253,
1783,
403,
840,
12956,
281,
1287,
6349,
24676,
5028,
15644,
323,
1182,
254,
6934,
302,
2831,
1981,
780,
4715,
4795,
275,
5520,
3045,
327,
2304,
68,
21942,
1783,
285,
259,
1479,
757,
79,
38998,
342,
3012,
5520,
31640,
762,
966,
40844,
15036,
50275,
296,
3755,
20556,
337,
973,
3542,
285,
3477,
281,
2096,
374,
47860,
1783,
326,
9905,
253,
3559,
2746,
50276,
20,
1543,
432,
1783,
342,
5132,
85,
1037,
7321,
966,
2720,
10670,
1329,
253,
9079,
326,
891,
88,
1473,
310,
11138,
31640,
281,
966,
40844,
15036,
50276,
20881,
1255,
265,
337,
253,
2929,
44995,
327,
247,
3710,
873,
273,
8892,
1907,
3081,
1543,
327,
247,
14200,
2491,
273,
8892,
323,
24088,
3081,
8892,
432,
253,
209,
633,
4190,
22791,
812,
3012,
17084,
253,
1543,
50276,
35640,
621,
50276,
26122,
50276,
34974,
337,
24049,
4735,
7821,
14417,
323,
5028,
15644,
556,
644,
5421,
323,
2067,
295,
24343,
4893,
1690,
1554,
39661,
3215,
26208,
285,
1182,
254,
6934,
302,
11454,
5145,
10234,
5955,
327,
2067,
10414,
310,
5816,
275,
253,
2929,
337,
374,
495,
374,
1390,
12494,
327,
3239,
495,
819,
255,
11273,
50276,
4025,
11273,
495,
1223,
253,
1783,
275,
4677,
337,
5936,
326,
269,
18,
18891,
310,
3587,
9578,
342,
17697,
4735,
5333,
310,
352,
1896,
627,
403,
643,
1461,
2261,
275,
436,
1783,
323,
24088,
253,
2408,
273,
3215,
26208,
941,
908,
323,
1016,
273,
841,
11515,
577,
812,
253,
4942,
21076,
1543,
327,
38998,
671,
320,
4269,
407,
7881,
342,
8495,
272,
14237,
387,
247,
749,
3418,
1268,
2439,
11515,
342,
1027,
2308,
273,
10669,
1320,
32449,
414,
2439,
1027,
11515,
50276,
250,
3065,
337,
6843,
12420,
16566,
323,
1554,
39661,
12246,
30869,
2349,
351,
398,
30287,
1162,
355,
374,
253,
5816,
24405,
275,
1182,
254,
6934,
302,
11454,
5145,
10234,
247,
1069,
1370,
73,
12043,
1162,
355,
495,
11138,
1182,
254,
6934,
302,
10234,
342,
3448,
17777,
10806,
815,
312,
1162,
355,
4583,
253,
2929,
806,
10262,
47860,
1783,
326,
16681,
253,
2554,
273,
4735,
31429,
285,
966,
2720,
15036,
327,
253,
6070,
273,
1182,
254,
6934,
302,
2831,
1981,
780,
3700,
253,
16039,
432,
253,
1783,
403,
840,
12956,
281,
1287,
6349,
24676,
5028,
15644,
323,
1182,
254,
6934,
302,
2831,
1981,
780,
4715,
4795,
275,
5520,
3045,
327,
2304,
68,
21942,
1783,
285,
259,
1479,
757,
79,
38998,
342,
3012,
5520,
31640,
762,
966,
40844,
15036,
50275,
783,
2929,
11330,
690,
9865,
16039,
285,
24357,
247,
973,
2595,
264,
2746,
326,
19132,
253,
31640,
273,
1182,
254,
6934,
302,
2831,
1981,
780,
4715,
2299,
253,
16774,
1543,
403,
3710,
281,
816,
767,
4942,
1355,
4311,
8892,
1907,
3081,
1543,
327,
247,
14200,
2491,
273,
8892,
323,
24088,
3081,
8892,
432,
253,
209,
633,
4190,
22791,
812,
3012,
17084,
253,
1543,
50276,
28821,
253,
11080,
1783,
533,
3710,
2491,
273,
8892,
891,
717,
25661,
4404,
14924,
604,
4477,
2486,
3081,
16774,
1543,
891,
651,
320,
7378,
281,
5731,
619,
17401,
281,
2266,
2997,
50274,
7152,
33032,
2520,
2929,
2340,
684,
247,
1332,
323,
11138,
253,
2831,
1981,
780,
3700,
273,
3215,
11273,
1554,
39661,
3210,
253,
2929,
806,
45190,
5867,
253,
4833,
273,
6779,
31429,
285,
3268,
267,
966,
5333,
840,
253,
2929,
4081,
247,
1332,
281,
3157,
253,
6779,
31429,
285,
35827,
253,
966,
5333,
4679,
2692,
697,
34385,
762,
1781,
2720,
15036,
20544,
50276,
783,
4081,
1332,
310,
973,
17194,
1754,
253,
16774,
1783,
50275,
16217,
3825,
921,
12510,
273,
253,
4081,
1332,
50275,
20881,
1255,
265,
50276,
783,
5661,
1543,
403,
417,
21414,
2217,
337,
760,
767,
273,
253,
8892,
403,
2783,
374,
760,
278,
6291,
310,
5762,
651,
253,
1072,
7313,
285,
11815,
2186,
672,
643,
7561,
3197,
3215,
11273,
3210,
824,
347,
1269,
20347,
403,
908,
495,
2556,
281,
2829,
337,
253,
4081,
1332,
310,
10870,
342,
253,
5368,
789,
259,
86,
1162,
355,
9169,
327,
3388,
577,
253,
373,
642,
1666,
25379,
275,
2829,
374,
608,
16907,
253,
4081,
1332,
310,
1199,
625,
673,
21337,
1309,
3733,
685,
2720,
2987,
50275,
783,
4081,
1332,
310,
973,
17194,
1677,
326,
253,
16774,
1783,
10313,
253,
4833,
273,
6779,
31429,
285,
966,
5333,
4679,
921,
253,
12510,
273,
253,
4081,
1332,
762,
1781,
966,
5333,
2299,
253,
5661,
1543,
403,
417,
21414,
2217,
285,
253,
2929,
476,
320,
5520,
407,
16472,
625,
4679,
285,
1783,
2490,
187,
4118,
18435,
27,
783,
2929,
10262,
247,
5028,
15644,
2746,
1754,
327,
253,
6349,
42428,
323,
440,
35421,
2831,
1981,
780,
4715,
253,
2929,
806,
3537,
13505,
2616,
326,
2818,
2831,
1981,
780,
3700,
285,
9010,
326,
253,
2831,
1981,
780,
3700,
3045,
310,
7052,
9578,
342,
4735,
6779,
43097,
347,
973,
347,
253,
3268,
267,
5333,
275,
966,
2235,
641,
875,
253,
2603,
285,
253,
2303,
840,
253,
2929,
11809,
271,
2746,
1754,
327,
253,
7313,
50276,
856,
84,
50275,
783,
2929,
310,
973,
3542,
285,
253,
4081,
2746,
310,
973,
17194,
50275,
783,
1783,
670,
534,
2616,
2818,
2831,
1981,
780,
3700,
310,
4722,
285,
3400,
690,
1270,
12288,
50275,
5040,
50275,
284,
253,
37317,
8042,
562,
253,
4679,
323,
49160,
253,
4081,
2746,
403,
4942,
5075,
50276,
1189,
455,
253,
2929,
10262,
5322,
16039,
281,
4684,
2831,
1981,
780,
3700,
342,
5028,
15644,
512,
30628,
9644,
281,
2997,
253,
2929,
285,
891,
671,
1119,
253,
2929,
310,
275,
2087,
4722
] |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
offline rl has been receiving enormous attention recently due to its practical application however as the offline data generation method could be noisy identifying a robust solution by accounting uncertain model parameters has great impact the paper proposes a robust fitted qlearning rfqi algorithm that used a dual formulation to work with nominal training model the dual problem is reformulated with function optimization and solved with risk minimization techniques which is then used to approximate robust bellman update theoretical upper bounds are derived for the rfqi algorithm experimental results on simple toy environments demonstrate that rfqi outperforms nonrobust algorithms robust offline rl is a practically important and challenging problem rfqi reformulated the challenging estimation problem to a function optimization model and used it to update the robust qfunction i appreciate the theoretical results and insights provided in theorem 1 experiments are performed on simple gym environments which are not exhaustive enough furthermore only nonrobust benchmark algorithms are used to validate the performance comparing against something like rlsvi zhang et al 2022 would have been great references 1 zhang xuezhou yiding chen xiaojin zhu and wen sun corruptionrobust offline reinforcement learning in international conference on artificial intelligence and statistics pp 57575773 pmlr 2022 the paper missed citing some recent work on offline robust rl eg zhang et al 2022 experimental results are not sufficient to empirically validate the claims validating the performance against stateoftheart robust offline rl methods on a set of realworld simulated environments will strengthen the paper docsepthis paper suggests a method to solve robust mdp using offline data a fitted qlearning is proposed and the authors developed its convergence and sample complexity s this work considered the rl problems under the offline setting which is more challenging a global optimality is also proposed w 1the main challenge in robust rl is to find the worst case however in this paper it is solved by assuming a failstate this assumption greatly reduces the difficulties of the problem hence im afread it is too strong although it can be justified in practice it is still too strong for me this paper is wellwritten but some of the assumptions seem to be a little strong to me and hence limit the contribution of the paper docsepthe paper deals with robust offline rl using function approximators robustness is a broad term here it means a special robustness defined in the rmdp framework an algorithm is outlined and numerous proofs of the properties of the algorithm are proved under strong assumptions the algorithm is evaluated on two deterministic benchmarks cart pole and hopper robustness to changes in system dynamics is investigated and promising performance is achieved on the two selected benchmarks relative to the selected comparison algorithms strengths the paper makes extensive theoretical contributions the proposed practical algorithm could be promising weaknesses the paper makes exaggerated claims in this work we presented a novel robust rl algorithm called robust fitted qiteration algorithm with provably optimal performance for an rmdp with arbitrarily large state space using only offline data with function approximation optimal provably optimal the limitations and assumptions should be named the nonoptimality evident in figure 3 should be explained we also demonstrated the superior performance of the proposed 
algorithm on standard benchmark problems based on figures 1 2 and 3 there is too little evidence that the performance is superior furthermore the statement can only be made for the examined alternative algorithms furthermore there are only two benchmarks the empirical evidence of the usefulness of the method is weak only two benchmarks moreover both contain only deterministic dynamics due to the lack of investigations on benchmarks with stochastic dynamics it seems questionable to me whether the experiments are at all suitable to test the claim of the proposed method the paper seems incomplete without the appendix algotithm 2 more experiments there are in my opinion a few limitations that should still be addressed in the text to what extent the conditions for the validity of the proofs are realized in practice should be discussed as a possible limitation furthermore the limitation that the benchmarks used have deterministic dynamics throughout should be mentioned docsepthis paper endeavors to design a robust algorithm in the offline rl setting with function approximation the proposed algorithm is called robust fitted qiteration hopefully addressing challenges in this specific setting the detailed derivation of the proposal is given under certain assumptions followed by a nearoptimal guarantee or sample complexity result experiments are conducted on three typical environments strength 1related works and preliminaries are thorough and comprehensive 2the paper is well organized and easy to follow weakness 1in the design process of robust fitted qiteration the sequentially added assumptions eg assumptions 3 and 4 lack demonstration empirically in practice we are not fully sure whether these assumptions can be satisfied or not and how they impact on empirical performance 2emprical demonstration is insufficient as authors only compare their algorithm with a limited number of baselines eg srdqn on only two or three mujoco games researchers tend to be concerned about whether the empirical superior performance of the proposed algorithm can be observed as well on other games 3the proof sketch might be wordy and hard to understand 1the paper designs the robust algorithm only based on total variation which might be limited in terms of practice i am not sure why the authors chose this distance rather than kl or others is tv the most typical choice in robust rl literature an explanation is expected in the future 2the empirical improvement over other baselines seems not to be significant more experiments are needed including on more games also a detailed comparison of the computational cost is also suggested in summary i may not be familiar with some technical details but as far as i can tell they are basically sound in particular the dual reformulation as well as the following assumptions serves as the key to the proposed algorithm which is the critical contribution of this paper therefore i think it is a borderline paper as the algorithmic contribution is notable while the practicability of the proposed algorithms might be questionable and needs to be further demonstrated
### Summary:
|
the reviewers are in general positive about the paper the main concerns were about the assumptions used in the analysis the ac is satisfied by the response from authors and also thinks the assumptions are reasonable standard in offline rl literature the ac also thinks the setting studied in this paper is important
|
[
30003,
310,
1677,
2278,
273,
247,
2561,
2929,
432,
260,
2369,
1793,
6698,
15,
7764,
3630,
247,
6010,
253,
2278,
15,
187,
4118,
8439,
27,
187,
2727,
1282,
391,
77,
556,
644,
6883,
14779,
4116,
4102,
1955,
281,
697,
8542,
2898,
2299,
347,
253,
28841,
941,
5978,
1332,
812,
320,
27620,
12488,
247,
10237,
2900,
407,
15890,
8767,
1566,
3602,
556,
1270,
3486,
253,
2929,
29328,
247,
10237,
14662,
2805,
28269,
391,
71,
33980,
5933,
326,
908,
247,
8746,
15895,
281,
789,
342,
25662,
3733,
1566,
253,
8746,
... (remainder of input_ids token array omitted) ] |
[ (attention_mask column: all values 1; full list omitted) ] |
[ (labels token array omitted) ] |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this work considers the problem of learning representations of highdimensional pixel observations for rl highdimensional pixel observations often include many taskirrelevant details and ideally an effective representation of such observations should only encode the taskrelevant details to implement this intuition this work proposes to learn a twopart representation based off of a bisimulation metric two states that have similar first parts of the representation should exhibit similar reward structure and two states that have similar second parts of the representation should exhibit similar dynamics structure this work evaluates this learned representation on several continuous control tasks with distracting backgrounds and on carla and shows that the proposed learned representation improves over baselines strengths overall the approach proposed in this work seems technically sound and wellmotivated learning a state representation based on a bisimulation metric seems quite reasonable and doing so appears to yield strong empirical results the experimental evaluation is quite thorough and illustrates several interesting phenomena i appreciate that this work both considers environments where most of the state observation includes only distracting information in the dm control tasks as well as environments where a large part of the state observation is potentially useful as in the carla tasks the ablation studies and results on transferring to different reward functions are also interesting and present fairly promising results the paper is generally fairly clear weaknesses there are no critical or significant weaknesses that i am aware of though potentially slightly incremental on top of dbc there are numerous minor typos and small errors throughout the paper for example the domain and range of the function fpi in equation 3 appear to be some undefined met space at the end of the preliminaries partial observations is used where i believe partial observability is meant the equations 4 and 5 are defined as least squares error where i believe mean squared error is meant or this could refer to a form of least squares regression i believe that these are generally easy to address but it would be useful to clean this up additional questions comments in equations 6 and 7 shouldnt this be restricted to samples where ai aj if the two actions dont match i dont think you can necessarily expect the rewards and dynamics to be similar even in similar states eg even in exactly the same state this seems potentially problematic to me why does the actor condition only on the cnn layers of phi in equation 11 and not on the full learned state representation like the critic do the representations phir and phis also receive gradients through updating the critic in jq or are they learned exclusively via the elltheta loss postrebuttal comments i appreciate the rebuttal which has alleviated the concerns ive raised i still find the proposed approach technically sound and wellmotivated and find that this work provides a reasonable empirical contribution several of the other reviewers also raised concerns about whether the approach is empirically justified i find these concerns to be reasonable although they are partially alleviated via the rebuttal and i believe they are outweighed by the empirical contribution of the work therefore i continue to recommend acceptance though i do not feel strongly enough to raise my score to the next possible option which is an 8 overall this work offers a principled and empirically strong 
approach there are some weaknesses but they seem relatively easy to address and the contributions of the work outweigh the cons there were also some points of confusion for me in the paper that seem potentially serious which i included above i initially recommend a score of 6 but am willing to raise my score if the points of confusion are clarified docsepthis paper leverages some recent work on bisimulation metrics to develop a pair of metalearners to capture the two parts of a bisimulation metric reward similarities and dynamics similarities the authors evaluate their method on the environments used by the recent dbc zhang et al 2021 paper and compare against this and other related methods while an interesting extension of dbc from zhang et al 2021 i have 3 main concerns with this paper 1 theoretical issues my main concern with this paper is in terms of the theoretical justification for learning c the c term in equation 3 which originally came from ferns et al 2004 is directly tied to gamma indeed this is how the authors of that paper were able to prove the valuefunction bounds castro 2020 only uses the second c term and sets it equal to gamma which still yields the valuefunction bounds by making the c a learnable parameter it seems like the connections to the theoretical properties of bisimulation are completely lost furthermore by having a varying c term it is not at all clear that the underlying metrics even converge to a fixed point postrebuttal note the authors have mostly addressed this in their rebuttal there are some minor issues with their proofs and ive provided some suggestions 1 implementation design choices why is fr even necessary rewards are typically fully observed so its not clear why this needs to be learned at all the use of v1 in the last term of equation 1 is a strange design decision and im not sure i follow the justification what is meant by make a consensus prediction postrebuttal note the authors have addressed these concerns in their rebuttal 1 statistical significance of results 3 runs is on the lowerside of what should be used for these types of experiments further the authors do not specify what the shaded areas represent in their figures postrebuttal note the authors have promised to run more seeds i have provided some suggestions in my highlevel comment some questionscomments for the authors 1 in the third paragraph of page 2 the authors say their behavioral distance to the other state representations what other state representations are being referred to it is not clear 1 in the third paragraph of page 2 the authors say more side information can be preserved what does side information mean and how is it being preserved 1 in the third paragraph of page 2 it says observe that a smaller loss can be obtained what loss a smaller loss with respect to what 1 in the third paragraph of page 2 it says the approximation precision issue which issue are you referring to specifically 1 can you clarify what you mean by least fixed point below equation 3 1 in figure 1 are all the cs in the figure there are three of them all the same 1 below figure 1 the authors say where phi is the state encoder but no state encoder has been introduced 1 in section 41 do phir and phid share parameters figure 1 suggests they do 1 in page 5 it says lead to large regression losses what regression loss is this specifically 1 in page 5 it says which destabilze the representation learning in what way is it destabilized 1 in page 5 it says which however may losssic part of useful information what 
useful information is being referred to specifically 1 in the second paragraph of page 5 it says able to preserve more taskrelevant information what do the authors mean by this specifically what taskrelevant information 1 the sentence immediately above equation 11 ends with comprises of encoder is what is meant by this 1 in the line below equation 11 what is meant by a convention form of qfunction 1 below equation 11 if q depends on c shouldnt c also be a parameter of the q function 1 in the learning curves what do the shaded areas represent 1 in section 52 it says ambs drq is dbc with data augmentation do the authors mean ambs with data augmentation 1 in section 52 the authors tried fixing c to 05 but a more natural choice would have been gamma given the main point made above further neither dbc nor the original pibisimulation from castro 2020 use the 1c term in front of the reward differences fixing cgamma and removing the 1c term would have yielded a more direct comparison to dbc some minor comments 1 in the first paragraph of the introduction consider replacing based on some rl algorithms with based on various rl algorithms 1 in the first line of page 2 should be of a markov decision process 1 a paper from early in the year is quite relevant and should be included in the related work section for representation learning agarwal et al contrastive behavioral similarity embeddings for generalization in reinforcement learning iclr 2021httpsarxivorgabs210105265 1 in the second paragraph of page 2 consider including two recent papers that have been accepted to neurips castro et al mico improved representations via samplingbased state similarity for markov decision processeshttpsarxivorgabs210608229 kemertas and aumentadoarmstrong towards robust bisimulation metric learninghttpsarxivorgabs211014096 1 in the second paragraph in page 2 should be and potentially lose parts of the state features 1 in the third paragraph in page 2 they say pair of metalearners that learn similarities please specify which similarities are being referred to 1 in the third paragraph in page 2 it should say handcrafted form 1 right above section 3 use citet for pitis et al 1 in section 3 specify the range of gamma eg gammain 0 1 1 right below equation 2 should say where d quantifies the behavioral 1 in the same paragraph better to say defines a metric with respect to a policy pi 1 in the caption of figure 1 should say which is jointly learned 1 below figure 1 caption gamma is referred to the hyperparameter but i think the authors mean discount factor 1 below figure 1 caption should say combined with the reinforcement learning 1 in the first paragraph of section 4 remove the the before figure 1 ie is demonstrated in figure 1 1 in the first paragraph of section 4 should say two learned similarities in a specific 1 in section 41 the network architecture is mentioned but has not been introduced id suggest referencing the appendix where it is introduced 1 in the second paragraph of page 5 it should say and therefore the metalearner 1 in the second paragraph of page 5 it should say the process of udpating the state encoder 1 in the second paragraph of page 5 it should say besides f is a nonlinear transformation 1 in equation 5 specify that you are using the closed form of the w2 metric as in dbc this is mentioned in the appendix but should be clarified in the main paper as well 1 above equation 6 you mention the learned parameteric dynamics model please add a reference to the appendix where it is defined 1 above equation 6 
should say two transitions sampled from the replay buffer 1 in the first paragraph of 42 you should specify that cin 0 1 is enforced by a softmax this is specified in the appendix but should be clarified in the main paper as well 1 in the paragraph above equation 11 should say we aim to make fr and fd symmetric 1 in the line above 51 should say which is a commonly used offpolicy 1 in the first sentence in 51 should say is an environment that provides high dimensional pixel observations for rl tasks 1 in the first sentence of 53 it should say but have different reward functions 1 in section 55 should say learning of rl agents may suffer from 1 in section 55 should say dbc to create an autonomous driving see main concerns above docsepthe paper studies the problem of learning invariant visual representations for reinforcement learning specifically it builds on prior work deep bisimulation for control dbc zhang et al which learns an image representation predictive of rewards and dynamics and uses it with sac this paper makes a few modifications on top of dbc specifically 1 instead of a single state embedding it learns two separate ones one for rewards and one for dynamics 2 instead of using l1 distance in the embedding space to measure distance it instead uses a learned mlp which predicts distance 3 using gradients from sac it learns to dynamically adjust the balance between the dynamics weight and reward weight and 4 adds image augmentation to the training experiments indicate on distracting control that on 24 tasks it matches drq and dbc and on the other 2 tasks outperforms them ablations suggest that all components are important particularly 1 and 2 strengths the paper studies an important problem in learning visual representations for rl robust to distractions the experimental results are pretty strong outperforming the relevant baselines dbc drq and including ablations which show all components of the proposed approach are useful some components of the method like combining bisimulation with data augmentation are well motivated weaknesses despite the ablation which shows that each component of the method is important to performance i still have some concernsquestions on the correctness of the proposed method first the idea of learning 2 separate state embeddings one for reward and one for dynamics the ablation suggests this is very important for good performance however it seems to produce a metric that is no longer reflective of bisimulation the bisim metric should capture states which are functionally similar hence similar in both immediate reward and future state distribution but by splitting the two up into separate embeddings and training one only with the reward and one only with dynamics it seems this functional similarity is no longer captured the reward representation will only capture immediate reward while its not clear what the dynamics representation will capture since its only objective is future state similarity in the same dynamics representation space is it not possible that the dynamics model would collapse to simple predicting 0 everywhere to minimize its loss second the intuition given for using a learned distance instead of l1 or l2 is unclear while it may be easier to optimize the point of using l1 or l2 is to explicitly structure the embedding space to make rl easier when instead using an mlp its not clear how the embedding space itself will be shaped or why it would be shaped in a way that is good for rl also this is an important part of the method and should be 
clearly described in the main text instead of in the appendix also the framing of this learned distance as a metalearner seems incorrect from what i can understand it is simply using an mlp taking in the two embeddings and predicting distance but not doing any meta learning the confusing description the fact that all implementation details about this are only in the appendix makes this component of the method difficult to understand i hope the authors can clarify the above besides these aspects of the method which seem incorrect the results appear strong and if the authors can address the above concerns id be open to raising my score the paper studies an important problem and has some solid results however in their current form the major components of the method are not clearly motivated and there are some questions i have about the correctnessjustification of the approach docsep this paper proposes a representation learning method for reinforcement learning via the bisimulation metric its largely an extension of the work in deep bisimulation for control dbc zhang et al 2020 with some additional improvements learned distance metrics metalearners instead of l1 distance on the learned representations separate feature encodings for state and dynamics data augmentation from drq yarats et al 2021 learned c tradeoff between reward and dynamics strengths the papers improvements over prior work are generally pretty reasonable and backed by convincing experimental results the experimental results are generally pretty thorough with relevant comparisons to prior works and ablations the writing is clear and exposition of the literature is thorough weaknesses this paper generally feels like a dbc with more bells and whistles that enable it to work better i think the general novelty is on the weaker side the authors can consider a more thorough investigation into where existing representation learning methods fall short and address them more fundamentally the experiments with natural video backgrounds generally feel pretty contrived granted this is whats in the literature the only realistic setting is carla and i would have liked to see more environments of that nature or more detailed experimentation there detailed questions if reward and dynamics have different encoders to what extent is this still learning a bisimulation or mapping to an equivalent state space what do the experimental evaluations look like with more environment steps are the asymptotic performances of all methods similar or is ambs just more sample efficient in the beginning eg dbc uses 1e6 steps on carla nit its not really clear to me the naming of metalearner what is the meta traintest tasks here i think something like learned distance metric would be clearer overall this papers contributions are generally reasonable and empirically backed however its novelty and experimental evaluation leave much to be desired to this end i am leaning towards acceptance but would not be upset if this paper is rejected
### Summary:
|
the paper proposes to learn a staterepresentation using bisimulation in an rl setting the approach is thoroughly evaluated on several benchmarks in its current form the paper is mainly an empirical contribution with now some theoretical contributions tucked away in the appendices nevertheless an interesting approach with promising results the reviewers appreciated the revised paper and the discussion the replies and discussions successfully addressed all serious concerns of the reviewers please also clarify the discussed points in the next iteration of the paper and run the experiments with more seeds as promised
|
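The reviews and summary in this record center on bisimulation-metric state representations in the style of DBC. Purely as an illustrative, self-contained sketch (not the method of any paper reviewed here; all class, function, and variable names are invented for illustration, and the Wasserstein term over latent dynamics is simplified to an L1 distance between encoded successor observations), the general shape of such a pairwise loss looks like this:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class Encoder(nn.Module):
    """Toy state encoder mapping observation vectors to latent features."""
    def __init__(self, obs_dim: int, latent_dim: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(obs_dim, 128), nn.ReLU(),
            nn.Linear(128, latent_dim),
        )

    def forward(self, obs: torch.Tensor) -> torch.Tensor:
        return self.net(obs)

def bisimulation_loss(phi_i, phi_j, r_i, r_j, next_phi_i, next_phi_j, gamma=0.99):
    """Pairwise bisimulation-style objective (simplified sketch).

    Target distance:  |r_i - r_j| + gamma * ||next_phi_i - next_phi_j||_1
    Loss: push the current latent distance ||phi_i - phi_j||_1 toward that
    target; the target is treated as a constant via detach().
    Note: the L1 term over encoded successors stands in for the Wasserstein
    distance over latent dynamics used in the papers under review.
    """
    z_dist = torch.sum(torch.abs(phi_i - phi_j), dim=-1)
    r_dist = torch.abs(r_i - r_j)
    next_dist = torch.sum(torch.abs(next_phi_i - next_phi_j), dim=-1)
    target = (r_dist + gamma * next_dist).detach()
    return F.mse_loss(z_dist, target)

if __name__ == "__main__":
    torch.manual_seed(0)
    enc = Encoder(obs_dim=8, latent_dim=4)
    obs = torch.randn(16, 8)        # a batch of observations
    next_obs = torch.randn(16, 8)   # their successor observations
    rew = torch.randn(16)           # rewards
    perm = torch.randperm(16)       # pair each sample with a shuffled partner
    phi, next_phi = enc(obs), enc(next_obs)
    loss = bisimulation_loss(phi, phi[perm], rew, rew[perm],
                             next_phi, next_phi[perm])
    loss.backward()
    print(float(loss))
```

In the learned-metric variants discussed in these reviews, the fixed L1 latent distance above is replaced by a trainable distance network, and the weighting between the reward and dynamics terms may itself be learned rather than fixed.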
[ (input_ids token array omitted) ] |
[ (attention_mask column: all values 1; full list omitted) ] |
[
352,
2296,
534,
37355,
2721,
253,
6779,
4715,
275,
752,
1039,
310,
352,
37355,
1025,
337,
275,
3239,
608,
352,
2296,
534,
2299,
778,
3897,
859,
280,
629,
273,
4217,
1491,
752,
4217,
1491,
310,
1146,
6289,
281,
5742,
337,
275,
253,
1273,
12494,
273,
3239,
608,
352,
2296,
2104,
281,
14003,
625,
4836,
15477,
1491,
752,
513,
253,
4477,
1599,
407,
436,
5742,
752,
4836,
15477,
1491,
337,
253,
6197,
4745,
1840,
5150,
1903,
7637,
342,
12093,
273,
32049,
310,
752,
310,
5486,
407,
436,
337,
275,
253,
1386,
2708,
5150,
1903,
752,
310,
5486,
407,
247,
5008,
830,
273,
2805,
3701,
337,
2708,
5150,
1903,
604,
2805,
7024,
327,
260,
943,
2649,
260,
671,
320,
247,
4764,
273,
253,
2805,
1159,
337,
275,
253,
4715,
9191,
752,
513,
253,
37042,
3672,
1957,
337,
275,
2593,
8073,
352,
2296,
717,
1768,
50276,
5267,
82,
310,
277,
12847,
342,
941,
42072,
513,
253,
4477,
1599,
717,
1768,
342,
941,
42072,
337,
275,
2593,
8073,
253,
4477,
3597,
18505,
260,
281,
16987,
533,
247,
625,
3626,
4327,
651,
452,
644,
17356,
1677,
253,
2022,
1127,
1160,
1840,
2007,
6747,
277,
12847,
4543,
253,
3236,
268,
487,
261,
303,
1427,
432,
5248,
287,
9169,
897,
253,
337,
68,
1307,
275,
2914,
273,
253,
10921,
3910,
18505,
260,
2733,
285,
11922,
253,
337,
68,
1307,
651,
452,
20714,
247,
625,
1480,
5301,
281,
277,
12847,
50274,
8826,
5884,
5701,
337,
50276,
249,
253,
806,
12494,
273,
253,
10199,
1908,
15706,
1754,
327,
690,
391,
77,
11333,
342,
1754,
327,
2710,
391,
77,
11333,
337,
50276,
249,
253,
806,
1386,
273,
3239,
374,
943,
320,
273,
247,
1616,
729,
3061,
1232,
337,
50276,
66,
2929,
432,
2393,
275,
253,
807,
310,
3240,
4623,
285,
943,
320,
2908,
275,
253,
2905,
789,
2593,
323,
6779,
4715,
50272,
28923,
18758,
1162,
355,
4499,
422,
14613,
14259,
46234,
323,
26647,
275,
35221,
4715,
17857,
32888,
43425,
3614,
39962,
2061,
5375,
1797,
520,
1762,
21317,
337,
50276,
249,
253,
1273,
12494,
273,
3239,
374,
1908,
1690,
767,
3332,
9380,
326,
452,
644,
7607,
281,
5723,
2824,
50270,
4008,
287,
1162,
355,
278,
4173,
5520,
14237,
3066,
10491,
3169,
1375,
14259,
323,
1616,
729,
3061,
4870,
3614,
39962,
2061,
5375,
16899,
25805,
17107,
50270,
76,
358,
797,
284,
285,
46850,
3377,
1513,
9072,
4404,
10237,
17542,
303,
1427,
7982,
4715,
3614,
39962,
2061,
5375,
17605,
520,
1449,
4196,
337,
50276,
249,
253,
1273,
12494,
275,
3239,
374,
943,
320,
285,
7826,
7168,
4243,
273,
253,
1375,
3386,
337,
50276,
249,
253,
2626,
12494,
275,
3239,
374,
597,
1333,
4667,
273,
5148,
613,
6118,
326,
3037,
22620,
4496,
13199,
534,
22620,
403,
1146,
6289,
281,
337,
50276,
249,
253,
2626,
12494,
275,
3239,
374,
352,
943,
1333,
1133,
12517,
264,
830,
337,
50276,
918,
1840,
2593,
495,
897,
4851,
292,
323,
268,
5895,
1162,
355,
337,
275,
2593,
495,
13199,
253,
2491,
273,
17356,
24088,
305,
3681,
404,
470,
337,
337,
987,
2708,
5150,
374,
943,
1333,
835,
277,
2677,
7790,
253,
14613,
337,
50276,
249,
253,
1072,
12494,
1805,
281,
1333,
13067,
247,
7982,
342,
1675,
281,
247,
3646,
12580,
337,
275,
253,
11743,
273,
4677,
337,
943,
1333,
534,
310,
26277,
6311,
337,
2708,
4677,
337,
11743,
17356,
310,
6289,
281,
253,
4373,
19484,
533,
891,
1158,
253,
4477,
1599,
13630,
2803,
337,
2708,
4677,
337,
11743,
943,
1333,
5678,
342,
253,
35221,
4715,
337,
275,
253,
806,
12494,
273,
2593,
577,
5386,
253,
253,
1078,
4677,
337,
26332,
310,
5183,
275,
4677,
337,
337,
275,
253,
806,
12494,
273,
2593,
577,
943,
1333,
767,
6311,
22620,
275,
247,
2173,
337,
275,
2593,
7609,
253,
2990,
10336,
310,
5393,
533,
556,
417,
644,
5611,
2654,
1804,
44978,
253,
30762,
835,
352,
310,
5611,
337,
275,
253,
1273,
12494,
273,
3239,
608,
352,
943,
1333,
285,
3103,
253,
5148,
613,
1216,
337,
275,
253,
1273,
12494,
273,
3239,
608,
352,
943,
1333,
253,
1232,
273,
18198,
81,
839,
253,
1375,
32049,
337,
275,
253,
1273,
12494,
273,
3239,
608,
352,
943,
1333,
16280,
269,
310,
247,
14561,
9261,
337,
275,
5150,
608,
13199,
326,
368,
403,
970,
253,
4581,
830,
273,
253,
259,
19,
7982,
347,
275,
277,
12847,
436,
310,
5393,
275,
253,
30762,
533,
943,
320,
31637,
275,
253,
2022,
2929,
347,
973,
337,
1840,
5150,
721,
368,
3748,
253,
6311,
4764,
280,
8062,
1566,
4496,
823,
247,
3806,
281,
253,
30762,
835,
352,
310,
2931,
337,
1840,
5150,
721,
943,
1333,
767,
16307,
19958,
432,
253,
44864,
6391,
337,
275,
253,
806,
12494,
273,
5976,
368,
943,
13199,
326,
15573,
470,
337,
310,
27810,
407,
247,
2602,
4090,
436,
310,
7616,
275,
253,
30762,
533,
943,
320,
31637,
275,
253,
2022,
2929,
347,
973,
337,
275,
253,
12494,
1840,
5150,
1903,
943,
1333,
359,
4388,
281,
1056,
1315,
285,
29439,
13123,
337,
275,
253,
1386,
1840,
8319,
943,
1333,
534,
310,
247,
7744,
908,
745,
22872,
337,
275,
253,
806,
6197,
275,
8319,
943,
1333,
310,
271,
3126,
326,
3400,
1029,
15759,
12275,
7313,
323,
391,
77,
8892,
337,
275,
253,
806,
6197,
273,
8676,
352,
943,
1333,
533,
452,
1027,
10921,
3470,
337,
275,
2593,
7288,
943,
1333,
4715,
273,
391,
77,
6083,
778,
11089,
432,
337,
275,
2593,
7288,
943,
1333,
277,
12847,
281,
2794,
271,
26279,
6276,
923,
2022,
7350,
1840,
5474,
339,
431,
248,
2929,
2175,
253,
1895,
273,
4715,
13727,
5304,
14237,
323,
35221,
4715,
5742,
352,
21168,
327,
2720,
789,
3676,
17542,
303,
1427,
323,
1453,
277,
12847,
1182,
12109,
1162,
355,
534,
33772,
271,
2460,
6779,
15970,
273,
23267,
285,
8062,
285,
4648,
352,
342,
7044,
436,
2929,
2789,
247,
1643,
14586,
327,
1755,
273,
277,
12847,
5742,
337,
3185,
273,
247,
2014,
1375,
21496,
352,
33772,
767,
4858,
4394,
581,
323,
23267,
285,
581,
323,
8062,
374,
3185,
273,
970,
298,
18,
4181,
275,
253,
21496,
2317,
281,
2557,
4181,
352,
3185,
4648,
247,
6311,
13361,
81,
534,
26295,
4181,
495,
970,
27935,
432,
7044,
352,
33772,
281,
23043,
4575,
253,
6654,
875,
253,
8062,
2801,
285,
10921,
2801,
285,
577,
11323,
2460,
42072,
281,
253,
3733,
4679,
5224,
327,
940,
25031,
1453,
326,
327,
2164,
8892,
352,
10129,
1837,
82,
285,
277,
12847,
285,
327,
253,
643,
374,
8892,
41731,
13015,
731,
490,
77,
569,
1804,
326,
512,
4295,
403,
1774,
3782,
337,
285,
374,
20544,
50274,
783,
2929,
2175,
271,
1774,
1895,
275,
4715,
5304,
14237,
323,
391,
77,
10237,
281,
940,
21680,
50275,
783,
5661,
1543,
403,
3965,
2266,
41731,
14692,
253,
4623,
1666,
25379,
277,
12847,
1837,
82,
285,
1690,
490,
77,
569,
534,
921,
512,
4295,
273,
253,
4081,
2746,
403,
4217,
50276,
8826,
4295,
273,
253,
1332,
751,
16248,
17542,
303,
1427,
342,
941,
42072,
403,
973,
17194,
50276,
20881,
1255,
265,
50276,
3229,
3784,
253,
28913,
534,
2722,
326,
1016,
4445,
273,
253,
1332,
310,
1774,
281,
3045,
891,
1335,
452,
690,
7350,
34974,
327,
253,
36594,
273,
253,
4081,
1332,
50276,
7053,
253,
2934,
273,
4715,
374,
4858,
1375,
46234,
581,
323,
10921,
285,
581,
323,
8062,
253,
28913,
5936,
436,
310,
1077,
1774,
323,
1175,
3045,
2299,
352,
3133,
281,
4711,
247,
7982,
326,
310,
642,
3356,
29210,
273,
17542,
303,
1427,
253,
17542,
303,
7982,
943,
9232,
3054,
534,
403,
30333,
2074,
7613,
2074,
275,
1097,
8993,
10921,
285,
2852,
1375,
3268,
533,
407,
19860,
253,
767,
598,
715,
4858,
46234,
285,
3733,
581,
760,
342,
253,
10921,
285,
581,
760,
342,
8062,
352,
3133,
436,
5164,
14259,
310,
642,
3356,
10848,
253,
10921,
6779,
588,
760,
9232,
8993,
10921,
1223,
697,
417,
2590,
752,
253,
8062,
6779,
588,
9232,
1580,
697,
760,
8103,
310,
2852,
1375,
14259,
275,
253,
1072,
8062,
6779,
2317,
310,
352,
417,
1896,
326,
253,
8062,
1566,
651,
13551,
281,
2969,
21565,
470,
11678,
281,
15338,
697,
2957,
50276,
9815,
253,
30328,
1677,
323,
970,
247,
6311,
4181,
3185,
273,
298,
18,
390,
298,
19,
310,
12744,
1223,
352,
778,
320,
6927,
281,
22318,
253,
1127,
273,
970,
298,
18,
390,
298,
19,
310,
281,
11120,
2605,
253,
21496,
2317,
281,
1056,
391,
77,
6927,
672,
3185,
970,
271,
13361,
81,
697,
417,
2590,
849,
253,
21496,
2317,
3139,
588,
320,
16745,
390,
2139,
352,
651,
320,
16745,
275,
247,
1039,
326,
310,
1175,
323,
391,
77,
671,
436,
310,
271,
1774,
629,
273,
253,
1332,
285,
943,
320,
4518,
2529,
275,
253,
2022,
2505,
3185,
273,
275,
253,
30762,
671,
253,
39926,
273,
436,
6311,
4181,
347,
247,
5148,
613,
1216,
3133,
13583,
432,
752,
891,
476,
2096,
352,
310,
3365,
970,
271,
13361,
81,
3192,
275,
253,
767,
46234,
285,
21565,
4181,
533,
417,
2509,
667,
11419,
4715,
253,
21643,
5740,
50276,
783,
958,
326,
512,
7092,
4278,
670,
436,
403,
760,
275,
253,
30762,
2789,
436,
4445,
273,
253,
1332,
2834,
281,
2096,
50276,
74,
3524,
253,
4477,
476,
19148,
253,
1840,
16280,
841,
7794,
273,
253,
1332,
534,
1646,
13583,
253,
1543,
3176,
2266,
285,
604,
253,
4477,
476,
2953,
253,
1840,
7350,
2654,
320,
1527,
281,
12976,
619,
4868,
50276,
783,
2929,
2175,
271,
1774,
1895,
285,
556,
690,
4891,
1543,
2299,
275,
616,
1655,
830,
253,
2201,
4295,
273,
253,
1332,
403,
417,
4518,
17194,
285,
627,
403,
690,
3533,
891,
452,
670,
253,
36594,
6309,
1877,
273,
253,
2746,
50274,
7152,
33032,
436,
2929,
29328,
247,
6779,
4715,
1332,
323,
35221,
4715,
3066,
253,
17542,
303,
1427,
7982,
697,
8127,
271,
6880,
273,
253,
789,
275,
3676,
17542,
303,
1427,
323,
1453,
277,
12847,
1182,
12109,
1162,
355,
9169,
342,
690,
3081,
11701,
50276,
29343,
264,
4181,
17082,
5148,
613,
6118,
3185,
273,
298,
18,
4181,
327,
253,
6311,
14237,
50276,
16806,
366,
4735,
2349,
351,
723,
323,
1375,
285,
8062,
50276,
2203,
42072,
432,
1837,
82,
340,
274,
1832,
1162,
355,
43425,
50275,
29343,
264,
260,
5454,
2727,
875,
10921,
285,
8062,
50274,
296,
3755,
20556,
50275,
783,
9380,
11701,
689,
2720,
789,
403,
3839,
3965,
5272,
285,
17245,
407,
21414,
5661,
1543,
50276,
783,
5661,
1543,
403,
3839,
3965,
11080,
342,
4623,
14023,
281,
2720,
2987,
285,
490,
77,
569,
50276,
783,
4028,
310,
2590,
285,
47284,
273,
253,
6239,
310,
11080,
50275,
20881,
1255,
265,
50276,
2520,
2929,
3839,
9193,
751,
247,
277,
12847,
342,
625,
36161,
285,
37031,
868,
326,
8046,
352,
281,
789,
1805,
891,
1158,
253,
2087,
38135,
310,
327,
253,
21076,
1930,
253,
4477,
476,
1908,
247,
625,
11080,
5839,
715,
835,
5368,
6779,
4715,
3082,
2965,
2159,
285,
2953,
731,
625,
26401,
50276,
783,
4679,
342,
3626,
3492,
24550,
3839,
1928,
3965,
523,
30487,
7169,
436,
310,
47515,
275,
253,
6239,
253,
760,
15958,
4758,
310,
1113,
4123,
285,
891,
651,
452,
10490,
281,
923,
625,
12620,
273,
326,
3753,
390,
625,
7000,
40290,
627,
50276,
5992,
7193,
3533,
50276,
338,
10921,
285,
8062,
452,
1027,
2349,
351,
398,
281,
752,
6070,
310,
436,
1335,
4715,
247,
17542,
303,
1427,
390,
10603,
281,
271,
6425,
1375,
2317,
50276,
5371,
513,
253,
5661,
27163,
1007,
751,
342,
625,
3126,
5018,
403,
253,
20185,
16226,
273,
512,
3082,
2074,
390,
310,
717,
1768,
816,
625,
3410,
5919,
275,
253,
5068,
24088,
277,
12847,
4648,
337,
70,
23,
5018,
327,
1113,
4123,
50276,
32202,
697,
417,
1663,
2590,
281,
479,
253,
26086,
273,
5148,
613,
1216,
752,
310,
253,
11419,
1140,
565,
383,
8892,
1060,
891,
1158,
1633,
751,
6311,
4181,
7982,
651,
320,
30909,
50275,
1189,
455,
436,
9380,
9021,
403,
3839,
5272,
285,
45190,
17245,
2299,
697,
38135,
285,
5661,
7103,
3553,
1199,
281,
320,
6799,
281,
436,
990,
891,
717,
25661,
4404,
14924,
533,
651,
417,
320,
14004,
604,
436,
2929,
310,
10945,
2490,
187,
4118,
18435,
27,
783,
2929,
29328,
281,
3037,
247,
1375,
37626,
970,
17542,
303,
1427,
275,
271,
391,
77,
4758,
253,
2746,
310,
16575,
6760,
327,
2067,
49602,
275,
697,
1655,
830,
253,
2929,
310,
7194,
271,
16774,
7680,
342,
1024,
690,
10527,
9021,
32420,
1977,
275,
253,
14801,
1271,
17837,
271,
4722,
2746,
342,
12532,
1543,
50276,
783,
30628,
14109,
253,
17265,
2929,
285,
253,
5955,
253,
32114,
285,
11985,
8379,
9713,
512,
4092,
7350,
273,
253,
30628,
4496,
671,
19148,
253,
5469,
2792,
275,
253,
1735,
19502,
273,
253,
2929,
285,
1408,
253,
4679,
342,
625,
12922,
347,
12316
] |
Below is given review of a research paper from cnoference journal. Please write a summary the review.
### Review:
the paper introduces an algorithm ttn for nonlinear online and onpolicy value function approximation the main novelty of the paper is to view nonlinear value estimation as two separate components one of representation learning from a nonlinear mapping and one of linear value function estimation the soundness of the approach stems from the rate at which each component is updated the authors argue that if the nonlinear component is updated at a slower rate than the linear component the former can be viewed as fixed in the limit and what remains is a linear value function estimation problem for which several sound algorithms exist ttn is evaluated on 4 domains and compared to several other value estimation methods as well as dqn on a control problem with two variations on the tasks state space ill start off the review by stating that i find the idea and theoretical justification of separating the nonlinear and linear parts of value function estimation to be quite interesting potentially impacting rl at large indeed this view promises to reconcile latest developments in deep rl with the longlasting work on rl with linear function approximators however there are a few unclear aspects that do not allow one to be fully convinced that this paper lives up to the aforementioned promise for the theoretical contribution the authors claim that the main challenge was to deal with the potentially dependent features outputted by the neural network it is dealt with by using a projection that projects the linear parameters of the value function to a compact subset of the parameter space bar the appendix there is no mention of this projection in the paper on how this compact subset that must include the optimal parameter is defined and if this projection is merely a theoretical tool or if it was necessary to implement it in practice there is a projection for the neural net weights too but i can see how for these it might not be necessary to use in practice however for the linear weights as their computation potentially involves inverting illconditioned matrices they can indeed blowup relatively fast i found the experimental validation to be quite rich but not done in a systematic enough manner for instance the experiment utility of optimizing the mspbe demonstrates quite nicely the importance of each component but is only performed on a single task as the theoretical analysis does not say anything about the improvements the representation learning can have on the linear value estimation nor if the loss used for learning the representation effectively yields better features for the mspbe minimization this experiment is rather important and should have been performed on more than a single domain secondly i do not find the chosen baselines to be sufficiently competitive the authors state in sec 2 that nonlineargtd has not seen widespread use but having this algorithm as the main competitor does not provide strong evidence that ttn will know a better fate in the abstract it is implied that outside of nonlineargtd value function approximation methods are not sound in approximate policy iteration algorithms such as ddpg or trpo there is a need in performing value estimation it is done by essentially a fittedq iteration procedure which is sound why wasnt ttn compared to these methods if it is because they are not online why being online in the experiments of the paper is important showing that ttn is competitive with currently widespread methods for value estimated would have been more convincing than the 
comparison with nonlineargtd thirdly for the sake of reproducibility as lstd seems to be the method of choice for learning the linear part it would have been adequate to provide an algorithm box for this version as is done for gtd2tdc lstd is essentially a batch algorithm and there could be many ways to turn it into an online algorithm with which algorithm were the results in the experimental section obtained finally on the control task the authors add several modifications to their algorithm which results in an algorithm that is very close to that of levine et al 2017 why was not the latter a baseline for this experiment especially since it was included in other experimentsdocsepthis paper proposes twotimescale networks ttns a reinforcement learning algorithm where feature representations are learned by a neural network trained on a surrogate loss function ie value and a value function is learned on top of the feature representation using a fast leastsquares algorithm the authors prove the convergence of this method using methods from two timescale stochastic approximation convergent and stable nonlinear algorithms is an important problem in reinforcement learning and this paper offers an interesting approach for addressing this issue the idea of using a fast linear learner on top of a slowly changing representation is not new in rl levine et al 2017 but the authors somewhat motivate this approach by showing that it results in a stable and convergent algorithm thus i view the convergence proof as the main contribution of the paper the paper is written clearly but could benefit from more efficient use of space in the main paper for example i feel that the introduction and discussion in section 3 on surrogate objectives could be considerably shortened and a formal proof statement could be included from the appendix in section 4 with the full proof in the appendix the experimental evaluation is detailed and ablation tests show the value of different choices of surrogate loss for value function training linear value function learning methods and comparisons against other nonlinear algorithms such as dqn and nonlinear gtdtdvariants a minor criticism is that it is difficult to position this work against the simpler but not sound deep rl methods as the authors only compare to dqn on a nonstandard benchmark task as additional related work sbeed dai et al icml 2018 also shows convergence for a nonlinear reinforcement learning algorithm in the control setting and quantifies the convergence rate while accounting for finite sample error it would be good to include discussion of this work although the proposed method and proofs are derived very differentlydocsepthe paper proposes a twotimescale framework for learning the value function and a state representation altogether with nonlinear approximators the authors provide proof of convergence and a good empirical evaluation the topic is very interesting and relevant to iclr however i think that the paper is not ready for a publication first although the paper is well written the writing can be improved for instance i found already the abstract a bit confusing there the authors state that they provide a twotimescale network ttn architecture that enables linear methods to be used to learn values the approach facilitates use of algorithms developed for the linear setting we prove convergence for ttns with particular care given to ensure convergence of the fast linear component yet the title says nonlinear and in the remainder of the paper they use neural 
networks the major problem of the paper is however its organization the novelty of the paper the proof of convergence is relegated to the appendix and too much is spent in the introduction when actually the idea of having the vfunction depending on a slowly changing network is also not novel in rl for instance the authors say that v depends on theta and w and that theta changes at slower pace compared to w this recalls the use of target networks in the td error for many actorcritic algorithms it is not the same thing but there is a strong connection furthermore in the introduction the authors say that eligibility traces have been used only with linear function approximators but gae by schulman et al uses the same principle their advantage is actually the tdlambda error to learn an advantage function estimator and it became sota for learning the value function i am also a bit skeptical about the use of msbe in the experiment first in eq 4 and 5 the authors state that using the mstde is easier than msbe then in the experiments they evaluate both however the msbe error involves the square of an expectation which should be biased how do you compute it furthermore you should spend a couple of sentences to explain the problem of this square and the doublesampling problem of bellman residual algorithms for someone unfamiliar with the problem this issue could be unclear i appreciate the extensive evaluation but its organization can also be improved considering that some important information are again in the appendix furthermore results on control experiment are not significative and should be removed at the current stage at least in the nonimage version there is a lot of variance in your runs one blue curve is really bad while for the image version all runs are very unstable going always up and down in conclusion there is a lot of interesting material in this paper even though the novelty is not great the proofs analysis and evaluation make it a solid paper however because there is so much do discuss i would suggest to reorganize the paper and submit directly to a journal track the paper is already 29 pages including the appendixdocsepsummary this paper presents a twotimescale network ttn that enables linear methods to be used to learn values on the slow timescale nonlinear features are learned using a surrogate loss on the fast timescale a value function is estimated as a linear function of those features it appears to be a single network where one head drives the representation and the second head learns the values they investigate multiple surrogate losses and end up using the mstde for its simplicity even though it provides worse value estimates than mspbe as detailed in their experiments they provide convergence results regular twotimescale stochastic approximation results from borkar for the twotimescale procedure and provide empirical evidence for the benefits of this method compared to other nonlinear value function approximation methods clarity and quality the paper is well written in general the mathematics seems to be sound and the experimental results appear to be thorough originality using two different heads one to drive the representation and the second to learn the values appears to be an architectural detail the surrogate loss to learn the features coupled with a linear policy evaluation algorithm appear to be novel but does not warrant in my opinion the novelty necessary for publication at iclr the theoretical results appear to be a straightforward application of borkars 
twotimescale stochastic approximation algorithm to this architecture to get convergence this therefore does not appear to be a novel contribution you state after equaltion 3 that nonlinear function classes do not have a closed form solution however it seems that the paper convergent temporaldifference learning with arbitrary smooth function approximation does indeed have a closed form solution for nonlinear function approximators when minimizing the mspbe albeit making a linearity assumption which is something your work seems to make as well the work done in the control setting appears to be very similar to the experiments performed in the paper shallow updates for deep reinforcement learning significance overall i think that the paper is well written and the experimental evaluation is thorough however the novelty is lacking as it appears to be training using a multiheaded approach which exists and the convergence results appear to be a straightforward application of borkars twotimescale proof the novelty therefore appears to be using a surrogate loss function for training the features which does not possess the sufficient novelty in my opinion for iclr i would suggest the authors detail why their twotimescale approach is different from that of borkars or additionally add some performance guarantee to the convergence results to extend the theory this would make for a much stronger paper
### Summary:
|
the paper proposes a new method to approximate the nonlinear value function by estimating it as a sum of linear and nonlinear terms the nonlinear term is updated much slower than the linear term and the paper proposes to use a fast leastsquare algorithm to update the linear term convergence results are also discussed and empirical evidence is provided as reviewers have pointed out the novelty of the paper is limited but the ideas are interesting and could be useful for the community i strongly recommend taking reviewers comments into account for the camera ready and also add a discussion on the relationship with the existing work overall i think this paper is interesting and i recommend acceptance
|
[
10527,
7680,
253,
4477,
1750,
326,
253,
2022,
5691,
369,
281,
2968,
342,
253,
7826,
7976,
3386,
3453,
8659,
407,
253,
11454,
2990,
352,
310,
18445,
342,
407,
970,
247,
12378,
326,
6493,
253,
4872,
3602,
273,
253,
1318,
1159,
281,
247,
8566,
8578,
273,
253,
4764,
2317,
2534,
253,
30762,
627,
310,
642,
3748,
273,
436,
12378,
275,
253,
2929,
327,
849,
436,
8566,
8578,
326,
1364,
2486,
253,
8654,
4764,
310,
2931,
285,
604,
436,
12378,
310,
7960,
247,
10527,
4968,
390,
604,
352,
369,
3309,
281,
3359,
352,
275,
3946,
627,
310,
247,
12378,
323,
253,
11454,
2036,
13461,
1512,
533,
891,
476,
923,
849,
323,
841,
352,
1537,
417,
320,
3309,
281,
897,
275,
3946,
2299,
323,
253,
4872,
13461,
347,
616,
13782,
7826,
8687,
275,
31324,
2853,
44321,
12624,
597,
476,
6296,
8230,
484,
4942,
3809,
50275,
74,
1119,
253,
5661,
12820,
281,
320,
3240,
6793,
533,
417,
2218,
275,
247,
12082,
2217,
5133,
323,
4227,
253,
3368,
11839,
273,
39793,
253,
278,
1033,
1257,
14371,
3240,
23395,
253,
6349,
273,
1016,
4445,
533,
310,
760,
2684,
327,
247,
2014,
4836,
347,
253,
10527,
1783,
1057,
417,
1333,
2712,
670,
253,
11701,
253,
6779,
4715,
476,
452,
327,
253,
4872,
1318,
13418,
4543,
604,
253,
2957,
908,
323,
4715,
253,
6779,
8069,
11026,
1805,
3386,
323,
253,
278,
1033,
1257,
41458,
436,
3368,
310,
2581,
1774,
285,
943,
452,
644,
2684,
327,
625,
685,
247,
2014,
5028,
50276,
9815,
314,
891,
513,
417,
1089,
253,
6777,
1666,
25379,
281,
320,
10481,
12085,
253,
4477,
1375,
275,
4706,
374,
326,
1327,
1282,
1662,
2851,
556,
417,
2326,
14414,
897,
533,
1907,
436,
5933,
347,
253,
2022,
32048,
1057,
417,
2085,
2266,
1941,
326,
246,
14543,
588,
871,
247,
1805,
13742,
275,
253,
12002,
352,
310,
10466,
326,
3345,
273,
1327,
1282,
1662,
2851,
1318,
1159,
11193,
3082,
403,
417,
3590,
275,
16851,
3646,
19502,
11333,
824,
347,
32765,
8159,
390,
492,
5367,
627,
310,
247,
878,
275,
9591,
1318,
13418,
352,
310,
2218,
407,
9093,
247,
14662,
82,
19502,
5199,
534,
310,
3590,
2139,
369,
2649,
246,
14543,
2429,
281,
841,
3082,
604,
352,
310,
984,
597,
403,
417,
3909,
2139,
1146,
3909,
275,
253,
4679,
273,
253,
2929,
310,
1774,
4645,
326,
246,
14543,
310,
12085,
342,
4390,
14414,
3082,
323,
1318,
5998,
651,
452,
644,
625,
21414,
685,
253,
5301,
342,
1327,
1282,
1662,
2851,
50276,
19016,
314,
323,
253,
13232,
273,
38041,
347,
298,
8400,
3133,
281,
320,
253,
1332,
273,
4327,
323,
4715,
253,
4872,
629,
352,
651,
452,
644,
10599,
281,
2085,
271,
5933,
3817,
323,
436,
2715,
347,
310,
2218,
323,
305,
2851,
19,
2851,
68,
298,
8400,
310,
9093,
247,
14604,
5933,
285,
627,
812,
320,
1142,
4088,
281,
1614,
352,
715,
271,
3909,
5933,
342,
534,
5933,
497,
253,
1543,
275,
253,
5661,
2593,
2797,
50276,
71,
3341,
327,
253,
1453,
4836,
253,
4477,
823,
2067,
14586,
281,
616,
5933,
534,
1543,
275,
271,
5933,
326,
310,
1077,
2810,
281,
326,
273,
20978,
460,
1162,
355,
4240,
2139,
369,
417,
253,
6158,
247,
8245,
323,
436,
3368,
3340,
1580,
352,
369,
2908,
275,
643,
4679,
7152,
33032,
2520,
2929,
29328,
2500,
5786,
25912,
6928,
42085,
2224,
247,
35221,
4715,
5933,
835,
4735,
14237,
403,
6311,
407,
247,
11454,
2990,
10166,
327,
247,
35701,
2957,
1159,
26332,
1318,
285,
247,
1318,
1159,
310,
6311,
327,
1755,
273,
253,
4735,
6779,
970,
247,
3809,
1878,
23600,
4420,
5933,
253,
4477,
5276,
253,
14940,
273,
436,
1332,
970,
3082,
432,
767,
43936,
19191,
11193,
50275,
585,
332,
7322,
285,
6474,
14561,
11333,
310,
271,
1774,
1895,
275,
35221,
4715,
285,
436,
2929,
6131,
271,
4722,
2746,
323,
15974,
436,
2523,
253,
2934,
273,
970,
247,
3809,
4872,
458,
47612,
327,
1755,
273,
247,
7808,
6890,
6779,
310,
417,
747,
275,
391,
77,
20978,
460,
1162,
355,
4240,
533,
253,
4477,
8489,
41509,
436,
2746,
407,
4645,
326,
352,
1543,
275,
247,
6474,
285,
41886,
5933,
3021,
891,
1859,
253,
14940,
4737,
347,
253,
2022,
7680,
273,
253,
2929,
50276,
783,
2929,
310,
3542,
4518,
533,
812,
5649,
432,
625,
5919,
897,
273,
2317,
275,
253,
2022,
2929,
323,
1650,
891,
1928,
326,
253,
10199,
285,
5955,
275,
2593,
495,
327,
35701,
16566,
812,
320,
15455,
36439,
285,
247,
7473,
4737,
3908,
812,
320,
2908,
432,
253,
30762,
275,
2593,
577,
342,
253,
2120,
4737,
275,
253,
30762,
50276,
783,
5661,
7103,
310,
7000,
285,
28913,
5216,
921,
253,
1318,
273,
1027,
10165,
273,
35701,
2957,
323,
1318,
1159,
3733,
4872,
1318,
1159,
4715,
3082,
285,
14023,
1411,
643,
14561,
11333,
824,
347,
277,
47051,
285,
14561,
305,
2851,
2851,
20617,
1103,
247,
5884,
14226,
310,
326,
352,
310,
2834,
281,
1899,
436,
789,
1411,
253,
19554,
533,
417,
3590,
3676,
391,
77,
3082,
347,
253,
4477,
760,
7277,
281,
277,
47051,
327,
247,
1327,
15291,
22791,
4836,
50276,
284,
3081,
2905,
789,
256,
1257,
264,
277,
2284,
1162,
355,
17857,
1686,
4765,
671,
2722,
14940,
323,
247,
14561,
35221,
4715,
5933,
275,
253,
1453,
4758,
285,
2677,
7790,
253,
14940,
2281,
1223,
15890,
323,
6486,
3410,
2228,
352,
651,
320,
1175,
281,
2486,
5955,
273,
436,
789,
3738,
253,
4081,
1332,
285,
27947,
403,
6012,
1077,
13359,
7152,
339,
431,
248,
2929,
29328,
247,
2500,
5786,
25912,
7792,
323,
4715,
253,
1318,
1159,
285,
247,
1375,
6779,
17965,
342,
14561,
4020,
2392,
253,
4477,
2085,
4737,
273,
14940,
285,
247,
1175,
16774,
7103,
50276,
783,
9400,
310,
1077,
4722,
285,
4623,
281,
17857,
32888,
2299,
891,
1158,
326,
253,
2929,
310,
417,
4704,
323,
247,
9311,
806,
3738,
253,
2929,
310,
973,
3542,
253,
4028,
476,
320,
5520,
323,
4227,
891,
1119,
2168,
253,
12002,
247,
2372,
21643,
627,
253,
4477,
1375,
326,
597,
2085,
247,
2500,
5786,
25912,
2990,
246,
14543,
10336,
326,
13276,
4872,
3082,
281,
320,
908,
281,
3037,
2193,
50276,
783,
2746,
29499,
897,
273,
11333,
3715,
323,
253,
4872,
4758,
50276,
664,
5276,
14940,
323,
42085,
2224,
342,
1798,
1557,
1677,
281,
5416,
14940,
273,
253,
3809,
4872,
4445,
2568,
253,
4060,
2296,
14561,
285,
275,
253,
6414,
273,
253,
2929,
597,
897,
11454,
6928,
50275,
783,
2201,
1895,
273,
253,
2929,
310,
2299,
697,
6003,
253,
38135,
273,
253,
2929,
253,
4737,
273,
14940,
310,
50217,
281,
253,
30762,
285,
1512,
1199,
310,
5262,
275,
253,
10199,
672,
2686,
253,
2934,
273,
1907,
253,
362,
3701,
7293,
327,
247,
7808,
6890,
2990,
310,
671,
417,
4460,
275,
391,
77,
323,
4227,
253,
4477,
1333,
326,
362,
7024,
327,
39116,
285,
259,
285,
326,
39116,
2544,
387,
17357,
13870,
2429,
281,
259,
436,
31562,
253,
897,
273,
2303,
6928,
275,
253,
32989,
2228,
323,
1142,
12353,
68,
17425,
11333,
352,
310,
417,
253,
1072,
2181,
533,
627,
310,
247,
2266,
4602,
33810,
275,
253,
10199,
253,
4477,
1333,
326,
26281,
20274,
452,
644,
908,
760,
342,
4872,
1159,
4020,
2392,
533,
305,
3348,
407,
5807,
335,
1342,
1162,
355,
4648,
253,
1072,
8063,
616,
5750,
310,
2686,
253,
32989,
2260,
2228,
281,
3037,
271,
5750,
1159,
29107,
285,
352,
3395,
256,
5503,
323,
4715,
253,
1318,
1159,
50276,
74,
717,
671,
247,
2372,
33872,
670,
253,
897,
273,
13818,
1257,
275,
253,
3368,
806,
275,
16186,
577,
285,
608,
253,
4477,
1375,
326,
970,
253,
278,
296,
615,
310,
6927,
685,
13818,
1257,
840,
275,
253,
4679,
597,
7472,
1097,
2299,
253,
13818,
1257,
2228,
8687,
253,
6278,
273,
271,
15355,
534,
943,
320,
23539,
849,
513,
368,
11897,
352,
50276,
44295,
3062,
368,
943,
6947,
247,
4564,
273,
14683,
281,
5513,
253,
1895,
273,
436,
6278,
285,
253,
33478,
312,
4906,
1895,
273,
17487,
1342,
12541,
11333,
323,
3095,
32139,
342,
253,
1895,
436,
2523,
812,
320,
12744,
50276,
74,
11435,
253,
9470,
7103,
533,
697,
6003,
476,
671,
320,
5520,
7296,
326,
690,
1774,
1491,
403,
969,
275,
253,
30762,
33810,
1543,
327,
1453,
3368,
403,
417,
1415,
800,
285,
943,
320,
5176,
387,
253,
1655,
3924,
387,
1878,
275,
253,
1327,
5695,
2715,
627,
310,
247,
2257,
273,
11041,
275,
634,
6613,
581,
4797,
6970,
310,
1663,
3076,
1223,
323,
253,
2460,
2715,
512,
6613,
403,
1077,
17631,
1469,
1900,
598,
285,
1066,
50275,
249,
6452,
627,
310,
247,
2257,
273,
4722,
2144,
275,
436,
2929,
1014,
2167,
253,
38135,
310,
417,
1270,
253,
27947,
1783,
285,
7103,
1056,
352,
247,
4891,
2929,
2299,
984,
627,
310,
594,
1199,
513,
2319,
891,
651,
1804,
281,
294,
7397,
907,
253,
2929,
285,
11929,
3587,
281,
247,
6698,
3540,
253,
2929,
310,
2168,
3285,
7223,
1690,
253,
30762,
7152,
339,
793,
360,
3454,
436,
2929,
10262,
247,
2500,
5786,
25912,
2990,
246,
14543,
326,
13276,
4872,
3082,
281,
320,
908,
281,
3037,
2193,
327,
253,
3468,
43936,
14561,
3386,
403,
6311,
970,
247,
35701,
2957,
327,
253,
3809,
43936,
247,
1318,
1159,
310,
5998,
347,
247,
4872,
1159,
273,
1110,
3386,
352,
4620,
281,
320,
247,
2014,
2990,
835,
581,
1481,
14137,
253,
6779,
285,
253,
1273,
1481,
33772,
253,
2193,
50276,
9328,
7409,
2709,
35701,
11655,
285,
990,
598,
970,
253,
278,
296,
615,
323,
697,
17647,
1014,
2167,
352,
3400,
7197,
1318,
8197,
685,
278,
1033,
1257,
347,
7000,
275,
616,
4679,
50276,
9328,
2085,
14940,
1543,
50276,
12846,
2500,
5786,
25912,
19191,
11193,
1543,
432,
270,
1064,
274,
323,
253,
2500,
5786,
25912,
5199,
285,
2085,
16774,
1941,
323,
253,
5373,
273,
436,
1332,
2429,
281,
643,
14561,
1318,
1159,
11193,
3082,
50276,
498,
15752,
285,
3290,
253,
2929,
310,
973,
3542,
275,
2087,
253,
23065,
3133,
281,
320,
3590,
285,
253,
5661,
1543,
3176,
281,
320,
11080,
50275,
19164,
414,
970,
767,
1027,
9851,
581,
281,
4446,
253,
6779,
285,
253,
1273,
281,
3037,
253,
2193,
4620,
281,
320,
271,
27934,
2508,
253,
35701,
2957,
281,
3037,
253,
3386,
9904,
342,
247,
4872,
3646,
7103,
5933,
3176,
281,
320,
4460,
533,
1057,
417,
7501,
275,
619,
4743,
253,
38135,
3309,
323,
9311,
387,
17857,
32888,
50275,
783,
10527,
1543,
3176,
281,
320,
247,
15246,
2898,
273,
270,
1064,
1032,
2500,
5786,
25912,
19191,
11193,
5933,
281,
436,
10336,
281,
755,
14940,
436,
3103,
1057,
417,
3176,
281,
320,
247,
4460,
7680,
50276,
5658,
1375,
846,
1298,
2711,
279,
495,
326,
14561,
1159,
5971,
513,
417,
452,
247,
4581,
830,
2900,
2299,
352,
3133,
326,
253,
2929,
41886,
5897,
8950,
17693,
4715,
342,
10341,
6032,
1159,
11193,
1057,
6296,
452,
247,
4581,
830,
2900,
323,
14561,
1159,
4020,
2392,
672,
28699,
253,
278,
1033,
1257,
23447,
2403,
247,
50137,
9376,
534,
310,
1633,
634,
789,
3133,
281,
1056,
347,
973,
50275,
783,
789,
2218,
275,
253,
1453,
4758,
4620,
281,
320,
1077,
2074,
281,
253,
4679,
2684,
275,
253,
2929,
20126,
11269,
323,
3676,
35221,
4715,
50276,
9188,
40348,
4583,
891,
1158,
326,
253,
2929,
310,
973,
3542,
285,
253,
5661,
7103,
310,
11080,
2299,
253,
38135,
310,
14999,
347,
352,
4620,
281,
320,
3733,
970,
247,
4471,
24818,
2746,
534,
4961,
285,
253,
14940,
1543,
3176,
281,
320,
247,
15246,
2898,
273,
270,
1064,
1032,
2500,
5786,
25912,
4737,
253,
38135,
3103,
4620,
281,
320,
970,
247,
35701,
2957,
1159,
323,
3733,
253,
3386,
534,
1057,
417,
7081,
253,
4209,
38135,
275,
619,
4743,
323,
17857,
32888,
50275,
74,
651,
1804,
253,
4477,
2508,
2139,
616,
2500,
5786,
25912,
2746,
310,
1027,
432,
326,
273,
270,
1064,
1032,
390,
23000,
823,
690,
3045,
12215,
281,
253,
14940,
1543,
281,
9017,
253,
3762,
436,
651,
1056,
323,
247,
1199,
10046,
2929,
187,
187,
4118,
18435,
27,
783,
2929,
29328,
247,
747,
1332,
281,
16851,
253,
14561,
1318,
1159,
407,
26230,
352,
347,
247,
2020,
273,
4872,
285,
14561,
2426,
253,
14561,
1307,
310,
9300,
1199,
17357,
685,
253,
4872,
1307,
285,
253,
2929,
29328,
281,
897,
247,
50276,
7957,
1878,
15044,
5933,
281,
5731,
253,
4872,
1307,
14940,
1543,
403,
671,
5469,
285,
16774,
1941,
310,
2530,
50275,
284,
30628,
452,
8042,
562,
253,
38135,
273,
253,
2929,
310,
3710,
533,
253,
5697,
403,
4722,
285,
812,
320,
4217,
323,
253,
3114,
891,
7052,
5583,
3192,
30628,
5701,
715,
2395,
323,
253,
6568,
4704,
285,
671,
823,
247,
5955,
327,
253,
2954,
342,
253,
5368,
789,
50275,
1189,
455,
891,
1158,
436,
2929,
310,
4722,
285,
891,
5583,
14924,
209
] |
[ … (attention_mask, a list of 1s, omitted) … ] |
[ … (labels token list omitted; raw tokenizer output) … ] |
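The three bracketed lists collapsed above hold the tokenized form of this row's prompt, review, and summary text. As a rough sketch of how such input_ids, attention_mask, and labels columns are commonly produced (assuming a Hugging Face tokenizer; the actual tokenizer, prompt handling, and maximum sequence length used for this dataset are not stated here and are assumptions):

```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")  # placeholder model choice

def build_row(review_text: str, summary_text: str, max_length: int = 2048) -> dict:
    prompt = (
        "Below is a review of a research paper from a conference journal. "
        "Please write a summary of the review.\n"
        "### Review:\n" + review_text + "\n### Summary:\n" + summary_text
    )
    enc = tokenizer(prompt, truncation=True, max_length=max_length)
    # For causal-LM fine-tuning, labels are usually a copy of input_ids
    # (prompt positions are often masked to -100; omitted in this sketch).
    return {
        "input_ids": enc["input_ids"],
        "attention_mask": enc["attention_mask"],
        "labels": list(enc["input_ids"]),
    }

row = build_row("the paper introduces an algorithm ttn ...", "the paper proposes ...")
```

With this construction the attention_mask is all 1s whenever no padding is applied, which matches the collapsed mask column above.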
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper experimentally examine a number of generator and discriminator network choices for point cloud gan it is shown that the best generator choice would be a pointnet with a mixture between maxpooling and average pooling and that an attentionbased pointnet framework performs the best in terms of discriminators multiple metrics were evaluated and a comprehensive experiment was done on the metrics capabiliities wrt the sampling procedure positive comprehensive experiments multiple metrics network structures tested on both the generator and the discriminator side negative i dont get why the authors would always stick to existing generators it has been shown that convolutional approaches such as kpconv and pointconv perform much better than pointnet in discriminative settings hence it also makes sense that they could perform well in a generative setting the tested graphcnn generator is not necessarily as powerful as kpconvpointconv from an intuitive point of view convolutional architectures should be helpful in the generative models as well besides using a convolutional discriminator may not be able to generate gradients good enough for a nonconvolutional generator but if the generator and discriminator match in terms of architecture presumably the performance could be better the other concern i had is wrt detailed settings of the no generator experiment in table 4 is wgangptype costterm and penalty used in this experiment its wellknown that discriminators that are very confident would not give proper gradient to generative models eg from the wgan papers i also remember reading somewhere this paper a good gan requires a bad discriminator although somehow i couldnt find this reference anymore unless they are heavily constrained eg by wgangp kind terms hence this detail can be important in deciding whether the conclusions from the paper would be credible the main reason i want to nitpick on these seemingly small items is that the conclusion of the paper might change significantly from those details the paper tends to get to a strong conclusion that one needs a bad discriminator for a good gan although thats generally a reasonable assumption its conclusions on inability to generate anything with a kpconvpointconv discriminator could have farfetching implications and hence i just want to apply a bit of additional scrutiny here minor for completeness its worth writing down what is the input to the point cloud gan this is very unclear in the current paper i assume that the input is a random point cloud as in achlioptas et al 2018 but its worth stating that clearly early on eg sec 21 docsepsummary of the paper this studies how different discriminator architectures are sensitive to sampling strategy of point clouds the paper proposes three benchmarks that measures how sensitive the architecture to the sampling strategy of point clouds which provides insights to future research about how to choose architectures for both discriminator and for feature extraction empirically the authors find out that using a discriminator architecture thats just reasonably sensitive to sampling could create better generative results regardless of the generator finally the paper also provides a novel experiment setting ie nogenerator experiment that could compare gradient quality of discriminators strength this work is the first one that im aware of that studies how the point sampling scheme affects the results and evaluation of the generation the paper also tried to verify such results with a wide ranges of 
architectures and the results seemed rather selfconsistent weakness 1 discreteness of the sampling spectrum while the idea to study how sensitive the network is to the sampling schema is certainly good but the way to characterize the spectrum in this paper is too empirical and discrete this might not be too useful when the sampling method of interested in certain application doesnt fall into the category the author mention it would be nice if the author could provide a way to quantify the difference between different sampling methods eg expectation of emd on the same surface area and verify how different architecture falls into such continuous spectrum 2 disentangling the effects of geometries vs the sampling i agree with the authors that the point clouds quality can be thoughts as outputs of the geometries and the sampling and those two factors goes handinhand but this paper seemed to emphasize solely on the architectures capacity to detach different sampling techniques and tries to correlate this factor with the generation quality my concern here is that could the networks capacity toward geometries a confounding factors in the experiment results for example could it be the case that the network thats oversensitive to the sampling patterns actually fails to tell apart geometric difference which leads to bad generative results in that case could it be the case that the networks ability to discriminate geometries differences the main factor regardless of whether the network is oversensitive or undersensitive to the sampling following this line of thoughts if the authors claim holds then there might be some fundamental difficulty for the network to be both capable of telling apart geometries and being overundersensitive to the sampling pattern which doesnt seem to be established in the paper concretely with theory or empirical results maybe the authors could point us to evidence that show that the networks used in the paper all have about the same capacity in telling apart geometries which is an alternative to address this concern justifications the paper is the first in my knowledge that studies how sampling strategy affects point cloud generation and evaluation in this case the paper has its value and novelty in the literature but the results and insights are organized under the framework of sampling spectrum which might still need improvement with more theories or results with that i will vote for borderline for now and will be happy to hear from the authors docsepthis paper studies an interesting problem ie point sampling in 3d point cloud generation through gan it shows that the currently used discriminator in this framework is sampling insensitive thus the learned generator is prone to sampling artifacts on the contrary the sampling sensitive pointcloud cnns are not suitable for acting as a discriminator according to the authors experiments that all of them fail to generate reasonable point clouds observing this the authors proposed samplingaware discriminator and achieved better results especially along with the proposed samplingrelated metrics the proposed method shows advantages over existing methods strength 1 the finding that sampling plays an important role in gan based point cloud generation is helpful to the community 2 the proposed method by simply concatenating an average pooled feature to the maxpooled pointnet feature is shown to be effective in the proposed metrics weakness overall there are many unclear issues need further explanation the rationality of the proposed 
method is also unconvincing 1 intuitively as the sampling insensitive discriminator leads to generated point clouds unaware of the sampling artifacts the advanced sampling sensitive discriminator could address this problem since it focuses more on this aspect however the results did not support this guess and the paper does not give strong reasonanalysis to explain such failures 2 the proposed sampling aware discriminator is a concatenation of max pooling and avg pooling considering pointnetmix and pointnetmax only differs by an avg does it mean that avg is samplingaware then why not directly using avg unfortunately the results shown in the paper eg table 4 demonstrate that using avg also does not perform well therefore it is quite hard to understand the underlying rationality for the success of pointnetmix 3 to show the proposed mix strategy is general enough it is required to show with other baselines for example how about using the mix strategy in aggregating weighted feature in pnattention does it still work 4 pnattention improves pn by incorporating communication between points however why other networks such as pn pointconv that also uses points interactions can not improve on pn in gan this is worthy to analysis and give a reasonable explanation in this paper 5 about the proposed samplingrelated metrics the core is f distance based on various feature extractors since f distance is based on features while the used features are depended on extractors so this observation in p5 the frecht distance metrics share the same sampling sensitivity of their corresponding discriminators is obvious in other words it actually heavily relies on the used extractor ie with a sampling insensitive extractor the metric is tend to be sampling insensitive therefore such metrics may be meaningless to reflect the property of different methodsgenerators other comments it is hard to understand what is learnable point clouds in no generator experiments more details about the setup and network architectures about this experiment is helpful for reader to understanddocsepthis paper considers the task of 3d shape generation more precisely it embraces the point cloud gan strategy for generation and offers an extensive comparative analysis of the existing architectures of point cloud generators and discriminators with a focus on the variations of the latter component a series of synthetic experiments considering several point cloud sampling schemes is proposed and used to define a spectrum of sensitivity to sampling artifacts various deviations from uniform sampling for a range of considered discriminators the same is performed with the existing metrics evaluating performance for point cloud generation presented results suggest that existing pointnet based discriminators with maxpooling point feature aggregation mechanism are insensitive to severe artifacts in point density which leads to nonuniform samples from the generators on the other hand existing improvements to pointnet which perform better in discriminative tasks are oversensitive to sampling and manage to capture even the smallest deviations from the uniform sampling which leads to poor training signals in terms of gradients from those discriminators as alternatives for both extremes the authors propose to use samplingaware pointnet based discriminators but either use a concatenation of the maxpooled and averagepooled point features or selfattention mechanism as an aggregation function in the experimental section the authors compare numerous combinations 
of considered discriminators with several existing generators to confirm that the proposed samplingaware discriminators yield superior performance across all the generators pros 1 the paper is wellwritten and is easy to follow 2 overall rigorousness of the experiments in the paper is impressive and the conclusions drawn are logical and insightful 3 additional inputs on the sampling sensitivity spectrum for evaluation metrics are also valuable cons 1 the main drawback of the paper is the selfcontained nature of the presented material in its current form it misses any validation by comparison to any external generative results thus it is not clear where the obtained improvements for point cloud gans put them on the performance quality list considering all the generative models for point clouds overall this is an impressive work which i think may be accepted even as it is however its value can be extended outside point cloud gan community to general 3d shape generation task audience with a proper comparison of the best results with current state of the art there are at least two recent works based on likelihood training 1 2 which can be considered for comparison or at least should be mentioned given that my score is only moderate additional comments section 1 paragraph 2 improper citation caused artifacts in the text section 51 both cd and emd also possibly fpd are sensitive to the scale of the input point clouds thus it is important to indicate the scale of the input point clouds how they were normalized for example for now i am not sure that it is possible to directly compare your results from table 3 with the results of 1 2 since your values of mmdcdemd and covcdemd significantly differ from the values reported there this might be due to the different input data scales section 52 in my experience lgans from 3 gave very unstable performance with respect to even small variations in the model architectures eg a change in the dimensionality of the latent space to obtain reasonable performance i needed to tune hyperparameters related to training objective and optimization separately for different configurations and could not use a single set of hyperparameters for all the configurations in your case you alter different network parts completely could you verify somehow that the relative performance of the considered configurations can not be attributed to the choice of the hyperparameters which are best for one configuration but not optimal for others in other worlds have you tried to tune the hyperparameters separately for different configurations i appreciated the pointnetmax2048 experiment in the supplementary materials since i had the same question and found the answer there section 53 no generator experiments to supervised the generator training to supervise figure 2 is reported to contain random selected randomly selected samples i do not understand how they can be randomly selected and at the same time all present approximately the same object in each row i do not think you can claim random selection rather than cherry picking in that case which is not a problem since you wanted to demonstrate the differences in generations of approximately the same object table 4 caption min mix 1 yang g huang x hao z liu my belongie s hariharan b pointflow 3d point cloud generation with continuous normalizing flows in iccv19 2 klokov r boyer e verbeek j discrete point flow networks for efficient point cloud generation in eccv20 3 achlioptas p diamanti o mitliagkas i guibas l learning representations and 
generative models for 3d point clouds in icml18 after reading the other reviews the authors comments and checking the revised manuscript i decided to slightly improve my rating for two reasons firstly my concerns were answered during the discussions secondly i do not agree that the concerns raised by other reviewers could justify a rejection i believe this is an exemplary empirical study presenting novel information and insights about the sampling sensitivity of point cloud encoders and point cloud evaluation metricsdocsepthe paper conducts experiments to examine the effects of point sampling patterns in point cloud gans and the experimental results reveal the reason why current strong discriminators pointnet etc fail to train a reasonable point cloud generator by proposing a sampling spectrum the authors introduce a middle point samplingaware baseline discriminator pointnetmix the paper proposes a new improvement direction for point cloud gans which might have a strong impact in the community the paper is written concisely and the illustrations are clear however there are still some concerns that need to be addressed 1 one of the major contributions claimed in the paper is the term sampling spectrum however the paper does not provide an exact definition of the sampling spectrum the paper mainly divides the spectrum into three regimes sampling insensitive aware and oversensitive which is oversimplified and too broad the authors should give more justification and background intuition on why the sampling spectrum should be in this form 2 another concern about this paper is the no generator experiments in figure 3 the authors show some visualizations of the point clouds generated by different discriminators it is reasonable to generate some weird chair shapes from pointnet and dgcnn however the generated point clouds from kpconv and pointconv are quite surprising to make the experiment results convincing the authors should provide more details on the experiments with kpconv and pointconv and more explanation of why kpconv and pointconv give almost random point clouds because pointnet dgcnn kpconv and pointconv all use relative point positions or edge information according to the paper there must be more reasons why the results look very different 3 more technical details on the no generator experiments should be introduced for example what is the loss function used in both the no generator experiments and the rest of the gan experiments is it the improved wgan
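
The pooling argument that runs through these reviews (max pooling ignores point density, while adding an average-pooled branch makes the global feature react to it) can be made concrete with a small numerical sketch. The snippet below is a hypothetical illustration, not code from the paper under review: the random two-layer MLP, the unit-sphere toy shape, and the clustered resampling scheme are stand-ins chosen only to show the qualitative effect.

```python
# Hypothetical illustration (not the paper's code): how a max-pooled versus a
# max||avg ("mix") pooled PointNet-style global feature reacts when the same
# shape is re-sampled with a strongly non-uniform point density.
import numpy as np

rng = np.random.default_rng(0)

def sample_sphere(n):
    """Roughly uniform samples on the unit sphere (a stand-in shape)."""
    x = rng.normal(size=(n, 3))
    return x / np.linalg.norm(x, axis=1, keepdims=True)

def resample_nonuniform(pts, n_centers=8, sharpness=12.0):
    """Resample the same point set with density concentrated near a few random centers."""
    centers = pts[rng.choice(len(pts), size=n_centers, replace=False)]
    d2 = ((pts[:, None, :] - centers[None, :, :]) ** 2).sum(-1).min(axis=1)
    w = np.exp(-sharpness * d2)
    idx = rng.choice(len(pts), size=len(pts), replace=True, p=w / w.sum())
    return pts[idx]

# A fixed random two-layer MLP stands in for the shared per-point feature network.
W1, b1 = rng.normal(size=(3, 64)), rng.normal(size=64)
W2, b2 = rng.normal(size=(64, 128)), rng.normal(size=128)

def point_features(pts):
    h = np.maximum(pts @ W1 + b1, 0.0)           # ReLU
    return np.maximum(h @ W2 + b2, 0.0)          # (n_points, 128) per-point features

def pool_max(pts):
    return point_features(pts).max(axis=0)

def pool_mix(pts):
    f = point_features(pts)
    return np.concatenate([f.max(axis=0), f.mean(axis=0)])   # max || avg

uniform = sample_sphere(2048)
clustered = resample_nonuniform(uniform)

def rel_change(a, b):
    return np.linalg.norm(a - b) / (np.linalg.norm(a) + 1e-8)

print("max-pooled feature shift under non-uniform resampling:",
      rel_change(pool_max(uniform), pool_max(clustered)))
print("mix-pooled feature shift under non-uniform resampling:",
      rel_change(pool_mix(uniform), pool_mix(clustered)))
```

In this toy setup the average-pooled half of the mix feature typically moves much more than the max-pooled feature when the density changes, which is the intuition behind calling max pooling sampling-insensitive and the max plus avg concatenation sampling-aware.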
### Summary:
|
the paper provides empirical evidence that the sampling strategy used in point cloud gans can drastically impact the generation quality of the network specifically the authors show that discriminators that are not sensitive to sampling lead to clustering artifact errors while those that are sensitive to sampling do not produce reasonable looking point clouds they also provide a simple way ie including avg feature pooling to improve generation quality for insensitive discriminator gan setups the reviewers agree that this is an interesting insight into the problem and this insight can help the community based on the reviewers comments and subsequent discussions it becomes clear that the paper would be stronger and more compelling if the underlying hypothesis ie the idea of a sampling spectrum were more rigorously defined eg ideally with a theoretical grounding and the claims and analyses were tied in with this definition such a grounded and precise setup would help in analyzing future generation discriminators that may not simply fall into the two discrete groups defined in the paper ie sampling oversensitive and samplinginsensitive the results have promise so the authors are encouraged to take the reviewer discussions into consideration to produce a stronger future submission
|
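
Several of the reviews above treat the proposed sampling-related metrics as Frechet-style distances computed on features from a chosen extractor, and argue that the metric inherits whatever that extractor is (in)sensitive to. Below is a minimal sketch of that computation; the max-pool coordinate extractor and the random data are placeholder assumptions, not the setup used in the paper.

```python
# Minimal sketch (assumed, not the paper's code): a Frechet-style distance between
# two sets of point clouds, parameterised by an arbitrary per-cloud feature extractor.
# Whatever the extractor ignores (e.g. point-density artifacts under max pooling),
# the resulting metric ignores too, which is the reviewers' point above.
import numpy as np
from scipy.linalg import sqrtm

def frechet_distance(feats_a, feats_b):
    """feats_*: (n_clouds, d) arrays of per-cloud features."""
    mu_a, mu_b = feats_a.mean(axis=0), feats_b.mean(axis=0)
    cov_a = np.cov(feats_a, rowvar=False)
    cov_b = np.cov(feats_b, rowvar=False)
    covmean = sqrtm(cov_a @ cov_b)
    if np.iscomplexobj(covmean):        # numerical noise can leave tiny imaginary parts
        covmean = covmean.real
    diff = mu_a - mu_b
    return float(diff @ diff + np.trace(cov_a + cov_b - 2.0 * covmean))

# Example usage with a placeholder extractor: max-pooled point coordinates.
def max_pool_extractor(cloud):          # cloud: (n_points, 3)
    return cloud.max(axis=0)

rng = np.random.default_rng(1)
set_a = [rng.normal(size=(512, 3)) for _ in range(64)]
set_b = [rng.normal(size=(512, 3)) * 1.1 for _ in range(64)]
fa = np.stack([max_pool_extractor(c) for c in set_a])
fb = np.stack([max_pool_extractor(c) for c in set_b])
print("frechet distance under this extractor:", frechet_distance(fa, fb))
```

Swapping `max_pool_extractor` for a density-sensitive one (for example, one that also averages per-point features) changes what the same formula can detect, which is exactly the concern that the metrics share the sampling sensitivity of their extractors.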
[
... long run of integer token ids elided: it appears to encode a truncated window of the review and summary text of this example, one integer per token ...
] |
[
... sequence of 1s elided: an all-ones mask of the same length as the token id list above (likely an attention mask) ...
] |
[
... integer token ids elided: this list appears to repeat the first token id list above (likely the training labels) ...
] |
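
The three integer lists above look like standard tokenizer outputs for this example: token ids, an all-ones mask, and target ids of the same length. The sketch below is a hypothetical reconstruction of how such rows are typically produced; the tokenizer choice, the exact concatenation of prompt and summary, and the label construction are all assumptions, since the preprocessing script behind this dump is not shown.

```python
# Hypothetical sketch of how rows like the integer lists above are usually built.
# Assumes the transformers library (and the chosen model files) are available;
# "gpt2" is a placeholder tokenizer, not necessarily the one used for this dump.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")

prompt = "Below is given review of a research paper ... ### Review: ..."   # stand-in text
summary = "### Summary: the paper provides empirical evidence that ..."    # stand-in text

enc = tokenizer(prompt + " " + summary)
input_ids = enc["input_ids"]            # integer token ids, like the first list above
attention_mask = enc["attention_mask"]  # all ones when nothing is padded, like the second list
labels = list(input_ids)                # labels often mirror input_ids for causal LM training

print(len(input_ids), len(attention_mask), len(labels))
```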
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper proposes an extension to the denoising diffusion implicit model ddim a variant of the denoising diffusion probabilistic model ddpm here i refer to it as deqddim specifically it regards the denoising iterations as fixedpoint iterations and solves for all the denoised images efficiently using anderson acceleration this is an interesting exploration however i have concerns about the reduced expressiveness brought about by ddim and deqddim strengths 1 it is a novel and interesting perspective to view the denoising iterations as fixedpoint iterations seeking converged solutions through iteration weaknesses 1 deqddim inherits the limitations of ddim they remove the injected random noises during denoising therefore the results are deterministic given the noise image xt this limitation is fundamental it makes ddim and its variants less probabilistic as models a direct consequence is that the generated images of ddim and its variants have reduced diversity compared with ddpm refer to the recall of ddim in a it is debatable whether such a sacrifice for speedup is worthwhile 2 another possible inherent flaw of deqddim but i might be wrong is that as the authors correctly pointed out the diffusion process is not timeinvariant ie not weighttied in the deq sense more specifically the denoising function takes the time embedding of time t as an extra input in my view the time t controls the variance of the noises to be estimated therefore it is indispensable but if time t is taken as an extra input then can we still view ddim as fixedpoint iterations if not then speedup using anderson acceleration is not applicable a tackling the generative learning trilemma with denoising diffusion gans iclr 2022 na docsepthis paper develops a variant deep equilibrium model deq of an existing method ddim with the goal of efficient sampling and model inversion while still maintaining the performance in particular it treats the sampling step sequence of ddim denoising diffusion implicit model where the sampling step is deterministic and no longer a markov chain as a fixedpoint set so that a fixedpoint solver eg anderson acceleration etc is able to jointly minimize the entire sampling chain the authors show the practical relevance of their proposal in singleshot image sampling and model inversion on several benchmark datasets this paper builds on other works in the recent literature and proposes something useful and novel it is overall well written and reports somewhat competitive results with both quantitative and qualitative validation for image sampling and model inversion while i think the methodological aspect is novel enough for publication the rationale for single image sampling is unclear and needs to be further justified please find my technical comments below the limitations of the proposal are missing or at least not well stated in the current manuscript docsepthe paper shows a deep equilibrium interpretation of denoising diffusion implicit models ddim this approach provides the traditionally sequential diffusion models with parallelization capabilities resulting in faster single image generation and model inversion with less memory consumption strengths wellfounded and strong mathematical viewpoints on diffusion models this paper provides a novel and significant view on diffusion models as uppertriangular iterative systems and shows an interesting method to differentiate through them impressive model inversion capabilities especially as demonstrated qualitatively in figure 4 weaknesses line 53 modern autograd packages would require storing the
entire computational graph for all t states by default while true it is still possible to differentiate through the entire sequential diffusion process without requiring this for example this was done in r1 this fact is not made sufficiently clear to the reader the same point is made on line 159 without stating by default the approach loses its appeal when considering batched image generation in table 1 the reporting of ddim results is inadequate ddim achieves an fid of 407 using a particular setting whereas it can also achieve an fid of 317 using a different parameterization under the same timeframe the fact that it edges out this papers result is okay as this paper provides a significant speedup however this result should be adequately reported in the table similarly ddim can be sped up significantly and achieves remarkable results the authors suggest that their method can be used in tandem with other speedup methods which sounds theoretically plausible but such results are not reported in the paper results seem to only mildly improve upon previous results seen in ddim and others an additional experiment showing the use of both the novel acceleration method and previous speedup methods simultaneously would better demonstrate the capabilities of this method minor comments typo on line 41 join instead of joint typo on line 209 effect instead of affect on lines 266268 some citations are not described correctly citations 45 17 57 32 do not manipulate latent codes to edit highlevel attributes of images they utilize the prevalent understanding of diffusion models for the respective tasks they solve similar to ideas first presented in 57 r2 r3 the text should be reworded and show the existing and suggested citations as methods that edit images or solve inverse problems without requiring full model inversion r1 nie w guo b huang y xiao c vahdat a anandkumar a 2022 diffusion models for adversarial purification arxiv preprint arxiv220507460 r2 kadkhodaie z simoncelli e p 2020 solving linear inverse problems using the prior implicit in a denoiser arxiv preprint arxiv200713640 r3 kawar b vaksman g elad m 2021 snips solving noisy inverse problems stochastically advances in neural information processing systems 34 2175721769 the paper holds no potential negative societal impact in my opinion and the authors have aptly ignored this as it is not applicable as for the limitations the authors touch upon their methods limitations in the paper text addressing the larger memory requirements of each step however ddims results should be made clearer it should also be mentioned that sequential models can be differentiated through with constant memory complexity as well yes current autograd methods cannot automatically achieve this but it still is possible
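
The reviews above describe the core idea of treating the whole deterministic DDIM sampling chain as one fixed-point system over all T intermediate states, which a joint solver such as Anderson acceleration can then iterate in parallel. The toy sketch below is an assumed illustration of that reformulation only: a random linear map stands in for the learned per-step denoiser, and plain fixed-point (Picard) iteration replaces Anderson acceleration for brevity.

```python
# Toy sketch (assumed, not the paper's implementation): the sequential update
# x_{t-1} = f_t(x_t) rewritten as one fixed-point problem over the stacked states
# (x_{T-1}, ..., x_0), updated jointly instead of one step at a time.
import numpy as np

rng = np.random.default_rng(0)
T, d = 20, 8                                   # number of denoising steps, state dimension

# One random linear map per timestep stands in for the learned per-step update f_t.
A = [0.7 * rng.normal(size=(d, d)) / np.sqrt(d) for _ in range(T)]
b = [0.1 * rng.normal(size=d) for _ in range(T)]
x_T = rng.normal(size=d)                       # the initial "noise image"

def step(t, x):                                # x_{t-1} = f_t(x_t)
    return A[t] @ x + b[t]

# 1) ordinary sequential rollout, x_T -> x_{T-1} -> ... -> x_0
xs_seq, x = [], x_T
for t in range(T):
    x = step(t, x)
    xs_seq.append(x)
xs_seq = np.stack(xs_seq)                      # (T, d)

# 2) joint fixed-point view: z stacks all T states and every row is updated in parallel
z = np.zeros((T, d))
for _ in range(T + 5):                         # the triangular structure converges in <= T sweeps
    prev = np.vstack([x_T[None, :], z[:-1]])   # the state that feeds each step
    z = np.stack([step(t, prev[t]) for t in range(T)])

print("max deviation between sequential and joint solutions:", np.abs(z - xs_seq).max())
```

Because the stacked system is triangular, the plain joint iteration matches the sequential rollout exactly after at most T sweeps regardless of the step maps; the appeal of a solver like Anderson acceleration is reaching a good approximation in far fewer sweeps.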
### Summary:
|
this paper develops a variant deep equilibrium model deq of an existing method ddim with the goal of efficient sampling and model inversion while still maintaining the performance in particular it treats the sampling step sequence of ddim denoising diffusion implicit model where the sampling step is deterministic and no longer a markov chain as a fixedpoint set so that a fixedpoint solver eg anderson acceleration etc is able to jointly minimize the entire sampling chain the committee all agree that the methodology proposed in this work although it is built on prior work is novel the presentation of the paper is clear and the reported results are promising the committee appreciates the authors effort in both revising the manuscript and providing a conclusive response therefore we recommend acceptance of this manuscript
|
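
Both the reviews and the summary above lean on Anderson acceleration as the fixed-point solver. For readers unfamiliar with it, here is a compact, assumed-form sketch of the method for a generic update map g; the memory size, ridge constant, and stopping rule are illustrative choices, not the ones used in the paper.

```python
# Assumed, simplified Anderson acceleration for a fixed point x = g(x).
import numpy as np

def anderson_solve(g, x0, m=5, iters=60, lam=1e-8, tol=1e-10):
    xs, gs = [np.asarray(x0, dtype=float)], [g(x0)]
    x = gs[0]                                   # first plain fixed-point step
    for _ in range(1, iters):
        xs.append(x)
        gs.append(g(x))
        n = min(len(xs), m)
        X = np.stack(xs[-n:])                   # last n iterates, shape (n, d)
        G = np.stack(gs[-n:])                   # g applied to them
        F = G - X                               # residuals
        # mixing weights alpha minimising ||alpha @ F|| subject to sum(alpha) = 1
        H = F @ F.T + lam * np.eye(n)
        y = np.linalg.solve(H, np.ones(n))
        alpha = y / y.sum()
        x_new = alpha @ G                       # mixed update
        if np.linalg.norm(x_new - x) < tol:
            return x_new
        x = x_new
    return x

# Tiny usage check on a contraction whose fixed point is 2 * c.
c = np.array([1.0, -2.0, 0.5])
g = lambda x: 0.5 * x + c
print(anderson_solve(g, np.zeros(3)))           # should be close to [2, -4, 1]
```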
[
... long run of integer token ids elided: it appears to encode the prompt, reviews, and summary text of this example, one integer per token ...
] |
[
... sequence of 1s elided: an all-ones mask of the same length as the token id list above (likely an attention mask) ...
] |
[
... integer token ids elided: this list appears to repeat the previous token id list (likely the training labels) and is cut off before its closing bracket, since the source ends mid-array ...
436,
9380,
906,
310,
8261,
347,
436,
2929,
3400,
247,
1534,
3885,
484,
2299,
436,
906,
943,
320,
18212,
2361,
275,
253,
2829,
50276,
3549,
6241,
277,
4528,
476,
320,
653,
264,
598,
3012,
285,
33526,
13406,
1543,
253,
4477,
1804,
326,
616,
1332,
476,
320,
908,
275,
30111,
342,
643,
3885,
484,
3082,
534,
7835,
28055,
21541,
533,
824,
1543,
403,
417,
2361,
275,
253,
2929,
50276,
16680,
1646,
281,
760,
38920,
3157,
2220,
2045,
1543,
2326,
275,
277,
4528,
285,
2571,
271,
3081,
3368,
4645,
253,
897,
273,
1097,
253,
4460,
17680,
1332,
285,
2045,
3885,
484,
3082,
10486,
651,
1805,
7568,
253,
13789,
273,
436,
1332,
50276,
37585,
5701,
50276,
555,
5367,
327,
1386,
7609,
6604,
3185,
273,
6036,
50276,
555,
5367,
327,
1386,
25597,
1055,
3185,
273,
2818,
50276,
251,
3104,
30610,
22913,
690,
30404,
403,
417,
2529,
9113,
30404,
5329,
1722,
8988,
4567,
513,
417,
26526,
21624,
11646,
281,
12921,
1029,
5251,
12474,
273,
3888,
597,
16584,
253,
21270,
4685,
273,
12393,
3210,
323,
253,
9056,
8892,
597,
8415,
2074,
281,
5697,
806,
3559,
275,
8988,
391,
19,
391,
20,
253,
2505,
943,
320,
294,
3418,
264,
285,
921,
253,
5368,
285,
5125,
30404,
347,
3082,
326,
12921,
3888,
390,
8415,
13737,
3237,
1293,
10568,
2120,
1566,
27697,
50276,
83,
18,
15361,
259,
1149,
80,
270,
30287,
606,
340,
1269,
22728,
260,
362,
1240,
8608,
247,
50276,
266,
395,
76,
22711,
247,
1384,
1423,
12393,
3210,
323,
48960,
23609,
549,
32693,
638,
3845,
549,
32693,
14256,
1235,
3566,
1549,
50276,
83,
19,
40303,
17616,
17457,
466,
1182,
50276,
3549,
251,
3992,
74,
299,
268,
9169,
16161,
4872,
13737,
3237,
970,
253,
2720,
15424,
275,
247,
1850,
80,
9141,
549,
32693,
638,
3845,
549,
32693,
8602,
15220,
1449,
50276,
83,
20,
465,
1403,
274,
270,
362,
8765,
1342,
305,
50276,
293,
324,
278,
43425,
3802,
2824,
16161,
27620,
13737,
3237,
331,
3770,
20595,
16424,
275,
11454,
1491,
5162,
2718,
5910,
26517,
3011,
19443,
2090,
253,
2929,
6556,
642,
2442,
4016,
38058,
3486,
275,
619,
4743,
285,
253,
4477,
452,
13390,
314,
12841,
436,
347,
352,
310,
417,
7763,
50276,
284,
323,
253,
7364,
253,
4477,
5181,
2220,
616,
3082,
7364,
275,
253,
2929,
2505,
15974,
253,
4067,
3541,
6095,
273,
1016,
3213,
2299,
277,
4528,
84,
1543,
943,
320,
1160,
30909,
352,
943,
671,
320,
5393,
326,
22453,
3210,
476,
320,
22266,
949,
342,
3638,
3541,
10454,
347,
973,
4754,
1655,
1125,
462,
4614,
3082,
2550,
8356,
5115,
436,
533,
352,
1335,
310,
1896,
2490,
187,
4118,
18435,
27,
2520,
2929,
24357,
247,
12955,
3676,
12902,
1566,
372,
82,
273,
271,
5368,
1332,
277,
4528,
342,
253,
4736,
273,
5919,
10491,
285,
1566,
27697,
1223,
1335,
11850,
253,
3045,
275,
1798,
352,
26574,
253,
10491,
3213,
3425,
273,
277,
4528,
3354,
80,
2182,
12393,
15424,
1566,
835,
253,
10491,
3213,
310,
30027,
285,
642,
3356,
247,
1616,
729,
5931,
347,
247,
4229,
3659,
873,
594,
326,
247,
4229,
3659,
47037,
24088,
285,
3796,
17680,
3966,
310,
2104,
281,
26277,
15338,
253,
2862,
10491,
5931,
50276,
783,
9353,
512,
5194,
326,
253,
16182,
4081,
275,
436,
789,
3738,
310,
4270,
327,
2720,
789,
310,
4460,
253,
9759,
273,
253,
2929,
310,
2590,
285,
253,
2361,
1543,
403,
12532,
50276,
783,
9353,
6373,
28032,
253,
4477,
3434,
275,
1097,
3585,
2182,
253,
7714,
285,
5277,
247,
38662,
2380,
3103,
359,
5583,
14924,
273,
436,
7714,
50275
] |
Below is given a review of a research paper from a conference or journal. Please write a summary of the review.
### Review:
modified 19 july 2021 thank you for the rebuttal and for addressing my questions the dataset is a solid contribution to the community i am confident in my rating and vote to accept this paper the paper presents a dataset of mouse interactions provided annotations include 7 keypoints of the mouse obtained using a pose estimation model along with the data the authors propose three evaluation settings and the motivation behind them the benchmark models proposed are the winning methods of the multiagent behavior workshop the dataset contains a large number of annotated frames 1 million with tracked poses and behavior labels and 6 million unlabelled frames with tracked poses three tasks are proposed along with the dataset the second task poses a challenge of transfer learning to a different annotation style this direction is promising for the type of problems that require expert human annotators the third task requires understanding novel behaviors with limited data proposed tasks on the dataset are useful for realworld applications and the dataset provided will facilitate the research the dataset was to some degree adopted by the community in the recent multiagent behavior mabe challenge 2021 the dataset has an application to a very narrow problem of understanding the behavior of mice in a very controlled environment the methods developed on top of the provided annotations will rely on accurate pose predictions in a realworld application the pose detections might not be as precise docsepthis paper presents a video dataset consisting of pairs of socially interacting mice with 6 million frames of unlabeled tracked poses and over 1 million frames with tracked poses and corresponding framelevel behavior annotations tracked poses are output from their pose tracker mars 43 behavior annotations are designed for three different tasks 1 classical closedset supervised learning task 2 task of mimicking one annotators annotation style 3 prediction of a rare behavior label given only a few shot training samples using this new dataset as benchmark dataset the authors introduce baseline methods as well as a few top performing methods from the cvpr 2021 mabe challenge that the authors hosted recently this dataset will help investigation of an interesting problem domain through dyadic mouse social interaction and has relevance to other research fields in behavior modeling and classification moreover the dataset is designed as a benchmark dataset on different tasks fully supervised learning and fewshot learning which is well motivated by the need for detecting rare and novel behavior in behavior research in terms of size this dataset is quite large while the majority of the videos in the dataset is unlabeled they have value for studying unsupervised feature learning to facilitate future research reports on baseline and top performing models from the mabe challenge are informative and well documented there are number of questions raised around the task 2 proposed in this paper especially regarding l160161 in this sequential classification task we provide six 10minute training videos for each of five new annotators and evaluate the ability of models to produce annotations in each annotators style l167168 we have a source domain with labels from task 1 which needs to be transferred to each annotator in task 2 with comparatively fewer labels typically human manual coding in behavioral study involves training of the human annotators such that they get reliability among themselves interrater reliabilities are 
measured and then repeatedly trained until it reaches certain threshold as this paper describes novice annotators annotated labels for task 2 did these new annotators go through this kind of reliability training in fact it is never mentioned in the paper how the human annotation process differs between task 1 and task 2 in both task 1 and task 2 the label set is the same then how come there is annotator bias in task 2 whereas there isnt in task 1 if you train a machine learning model to mimic one novice annotators bias based on task 2s label how do you tell whether the learned style reflects the untrainedunreliable annotation state vs true environmental bias that may exist in in certain lab please address these issues in the revision docsepupdate july 20 2021 thank you for the responses to my comments and adding additional information about pose tracking approaches overall i think the related work is clearer and improved i stand by the original score and vote to accept animal behavioral analysis is of growing interest to many disciplines but there are a dearth of datasets for benchmarking algorithms for behavioral recognition and other tasks this contribution describes the calms21 dataset which consists of 1 million video frames of tracked animal poses and behavioral annotations for mice interacting in a residentintruder assay the dataset is primarily used for action recognition tasks with 3 and 10 actions but also contains annotations in some cases from multiple annotators making it useful for style transfer across individuals the dataset is thoroughly documented and freely available baselines for performance on three different tasks are presented and moreover the dataset has already been used for a challenge competition at cvpr2021 the presented challenge solutions are interesting and may be more broadly useful in the animal behavioral tracking field relevance animal behavioral tracking is becoming an established field drawing tremendous enthusiasm from multiple fields this includes neuroscience which has long been a subject at neurips algorithm choices for behavioral identification as well as other tasks are often ad hoc in the field and a move towards standardizing datasets for specific challenges makes sense social behavioral experiments are becoming more and more common in neuroscience in particular making this dataset of interest to a growing community significance this work was first used as a challenge competition and i think that it has already demonstrated excellence in that regard i think the majority of users will use it to iterate techniques for action recognition in animal models although it is possible that some will want to analyze the pose coordinates outside of the action labels accessibility and accountability the manuscript is well written the documentation of methods complete and has already been used to organize a challenge competition at cvpr2021 with dozens of participants this is framed as a multiagent behavior dataset but in many ways it is an action recognition dataset and it is somewhat limited in that respect mainly focusing on 3 actions although 10 are given in task 3 it is possible that this could be used for more multiagent style problems for instance predicting the impact individuals have on one anothers trajectories but this is not explored much of the novelty in the dataset provided is in the keypoint tracking as a portion of the authors published a similar dataset of dyadic rodent interactions in 2012 crim13 burgosartizzu et al cited with i believe 
additional behaviors labeled i do think that the dataset is valuable as in many experiments scientists are only interested in a portion of behaviors animals perform however i am not sure it will find a general usage as a multianimal action recognition dataset due to the limited number of actions provided i also am not sure about the utility of the annotator style transfer task there could be more discussion of the relevant human action recognition literature
### Summary:
|
the paper presents a dataset of 1m video frames with tracked animal poses and behavior annotations all reviewers agreed that the paper is wellwritten and that the dataset is useful as evidenced by the fact that it has already been used to organize a challenge reviewers raised some minor concerns including more details about annotators and the relationship between the proposed benchmark and prior work on human action recognition the author response was satisfactory in addressing these concerns and in the end all reviewers voted to accept the paper congratulations on having your paper accepted to the neurips 2021 track on datasets and benchmarks the authors are encouraged to take the feedback from reviewers into account when preparing the cameraready version of the paper
|
[
30003,
310,
1677,
2278,
273,
247,
2561,
2929,
432,
260,
2369,
1793,
6698,
15,
7764,
3630,
247,
6010,
253,
2278,
15,
187,
4118,
8439,
27,
187,
25016,
655,
480,
2988,
43425,
5717,
368,
323,
253,
30080,
22559,
285,
323,
15974,
619,
3533,
253,
10895,
310,
247,
4891,
7680,
281,
253,
3114,
891,
717,
13224,
275,
619,
13716,
285,
6273,
281,
2997,
436,
2929,
50275,
783,
2929,
10262,
247,
10895,
273,
6521,
6355,
2530,
31825,
2486,
818,
2234,
10801,
273,
253,
6521,
2797,
970,
247,
16753,
13418,
1566,
2112,
342,
253,
941,
253,
4477,
12661,
1264,
7103,
7533,
285,
253,
16038,
3212,
731,
253,
22791,
3210,
4081,
403,
253,
9880,
3082,
273,
253,
4471,
12788,
3879,
22586,
253,
10895,
4428,
247,
1781,
1180,
273,
28267,
13009,
337,
3041,
342,
27173,
24543,
285,
3879,
13301,
285,
721,
3041,
440,
47728,
13009,
342,
27173,
24543,
50276,
13524,
8892,
403,
4081,
2112,
342,
253,
10895,
253,
1273,
4836,
24543,
247,
5691,
273,
3700,
4715,
281,
247,
1027,
22581,
3740,
436,
3884,
310,
12532,
323,
253,
1511,
273,
3237,
326,
2430,
6485,
1966,
12182,
2392,
253,
2626,
4836,
4419,
4685,
4460,
13576,
342,
3710,
941,
4081,
8892,
327,
253,
10895,
403,
4217,
323,
1524,
10186,
4893,
285,
253,
10895,
2530,
588,
12454,
253,
2561,
50276,
783,
10895,
369,
281,
690,
4248,
8671,
407,
253,
3114,
275,
253,
3332,
4471,
12788,
3879,
278,
12424,
5691,
43425,
253,
10895,
556,
271,
2898,
281,
247,
1077,
6891,
1895,
273,
4685,
253,
3879,
273,
3754,
275,
247,
1077,
6537,
3126,
253,
3082,
3715,
327,
1755,
273,
253,
2530,
31825,
588,
10725,
327,
7899,
16753,
13650,
275,
247,
1524,
10186,
2898,
253,
16753,
843,
20713,
1537,
417,
320,
347,
10799,
5474,
33032,
2520,
2929,
10262,
247,
3492,
10895,
11253,
273,
8557,
273,
28071,
18745,
3754,
342,
721,
3041,
13009,
273,
440,
22027,
27173,
24543,
285,
689,
337,
3041,
13009,
342,
575,
13712,
264,
24543,
285,
3969,
3665,
5251,
3879,
31825,
27173,
24543,
403,
3453,
432,
616,
16753,
40143,
40584,
7652,
3879,
31825,
403,
4158,
323,
1264,
1027,
8892,
337,
8946,
4581,
1178,
22296,
4715,
4836,
374,
4836,
273,
13892,
12427,
581,
12182,
2392,
22581,
3740,
495,
10554,
273,
247,
7520,
3879,
5203,
1677,
760,
247,
1643,
5103,
3733,
3530,
970,
436,
747,
10895,
347,
22791,
10895,
253,
4477,
9569,
8245,
3082,
347,
973,
347,
247,
1643,
1755,
9591,
3082,
432,
253,
30105,
1087,
43425,
278,
12424,
5691,
326,
253,
4477,
17386,
4102,
436,
10895,
588,
1361,
5839,
273,
271,
4722,
1895,
5028,
949,
17713,
18535,
6521,
2675,
5016,
285,
556,
17200,
281,
643,
2561,
4910,
275,
3879,
14053,
285,
9162,
25761,
253,
10895,
310,
4158,
347,
247,
22791,
10895,
327,
1027,
8892,
4751,
22296,
4715,
285,
1643,
11860,
4715,
534,
310,
973,
17194,
407,
253,
878,
323,
15549,
7520,
285,
4460,
3879,
275,
3879,
2561,
275,
2426,
273,
1979,
436,
10895,
310,
3240,
1781,
1223,
253,
5020,
273,
253,
10556,
275,
253,
10895,
310,
440,
22027,
597,
452,
1318,
323,
12392,
440,
35421,
4735,
4715,
281,
12454,
2852,
2561,
5012,
327,
8245,
285,
1755,
9591,
3210,
432,
253,
278,
12424,
5691,
403,
27096,
285,
973,
14290,
627,
403,
1180,
273,
3533,
5439,
1475,
253,
4836,
374,
4081,
275,
436,
2929,
3340,
5001,
298,
1036,
520,
3832,
275,
436,
22453,
9162,
4836,
359,
2085,
2800,
884,
15505,
3733,
10556,
323,
1016,
273,
2620,
747,
575,
11423,
2392,
285,
7472,
253,
3745,
273,
3210,
281,
4711,
31825,
275,
1016,
12182,
2392,
3740,
298,
18146,
13851,
359,
452,
247,
2603,
5028,
342,
13301,
432,
4836,
337,
534,
3198,
281,
320,
9495,
281,
1016,
12182,
1080,
575,
249,
4836,
374,
342,
31381,
11184,
13301,
5431,
1966,
11595,
12425,
275,
14613,
1263,
8687,
3733,
273,
253,
1966,
12182,
2392,
824,
326,
597,
755,
13367,
2190,
3746,
734,
83,
727,
10926,
6720,
403,
4080,
285,
840,
12889,
10166,
1919,
352,
14190,
2176,
7887,
347,
436,
2929,
8631,
22458,
547,
12182,
2392,
28267,
13301,
323,
4836,
374,
858,
841,
747,
12182,
2392,
564,
949,
436,
2238,
273,
13367,
3733,
275,
958,
352,
310,
1620,
5393,
275,
253,
2929,
849,
253,
1966,
22581,
1232,
19986,
875,
4836,
337,
285,
4836,
374,
275,
1097,
4836,
337,
285,
4836,
374,
253,
5203,
873,
310,
253,
1072,
840,
849,
1705,
627,
310,
12182,
1080,
8492,
275,
4836,
374,
5727,
627,
310,
2649,
275,
4836,
337,
604,
368,
6194,
247,
5145,
4715,
1566,
281,
25066,
581,
22458,
547,
12182,
2392,
8492,
1754,
327,
4836,
374,
84,
5203,
849,
513,
368,
2028,
1880,
253,
6311,
3740,
13806,
253,
440,
32927,
328,
31631,
22581,
1375,
4632,
2032,
6938,
8492,
326,
778,
2226,
275,
275,
2176,
5188,
4496,
2953,
841,
3374,
275,
253,
18520,
5474,
33032,
11183,
480,
2988,
1384,
43425,
5717,
368,
323,
253,
6128,
281,
619,
5701,
285,
6240,
3081,
1491,
670,
16753,
12544,
7274,
4583,
891,
1158,
253,
2905,
789,
310,
30909,
285,
5520,
891,
1462,
407,
253,
3236,
4868,
285,
6273,
281,
2997,
50274,
49655,
14613,
1783,
310,
273,
5675,
1600,
281,
1142,
32870,
533,
627,
403,
247,
372,
5401,
273,
15302,
323,
22791,
272,
11333,
323,
14613,
8981,
285,
643,
8892,
436,
7680,
8631,
253,
1724,
983,
1797,
10895,
534,
8414,
273,
337,
3041,
3492,
13009,
273,
27173,
5893,
24543,
285,
14613,
31825,
323,
3754,
18745,
275,
247,
14544,
40150,
32656,
6981,
253,
10895,
310,
8558,
908,
323,
2250,
8981,
8892,
342,
495,
285,
884,
5231,
533,
671,
4428,
31825,
275,
690,
2219,
432,
2709,
12182,
2392,
2403,
352,
4217,
323,
3740,
3700,
2439,
4292,
253,
10895,
310,
16575,
14290,
285,
15744,
2130,
1666,
25379,
323,
3045,
327,
1264,
1027,
8892,
403,
3559,
285,
25761,
253,
10895,
556,
2168,
644,
908,
323,
247,
5691,
7324,
387,
30105,
1087,
938,
1797,
253,
3559,
5691,
5482,
403,
4722,
285,
778,
320,
625,
21450,
4217,
275,
253,
5893,
14613,
12544,
1673,
50275,
11235,
11828,
5893,
14613,
12544,
310,
7552,
271,
4232,
1673,
10263,
19999,
23027,
432,
2709,
4910,
436,
3797,
6551,
21559,
534,
556,
1048,
644,
247,
2256,
387,
5723,
2824,
5933,
10165,
323,
14613,
8137,
347,
973,
347,
643,
8892,
403,
2223,
519,
26901,
275,
253,
1673,
285,
247,
2118,
4404,
2629,
3006,
15302,
323,
2173,
7881,
2789,
3282,
2675,
14613,
4679,
403,
7552,
625,
285,
625,
1846,
275,
6551,
21559,
275,
1798,
2403,
436,
10895,
273,
1600,
281,
247,
5675,
3114,
50274,
9188,
40348,
436,
789,
369,
806,
908,
347,
247,
5691,
7324,
285,
891,
1158,
326,
352,
556,
2168,
5183,
31925,
275,
326,
2743,
891,
1158,
253,
5020,
273,
4212,
588,
897,
352,
281,
35388,
5609,
323,
2250,
8981,
275,
5893,
3210,
3738,
352,
310,
1896,
326,
50276,
8826,
588,
971,
281,
12106,
253,
16753,
11627,
3345,
273,
253,
2250,
13301,
50274,
10773,
2322,
285,
30990,
253,
7714,
310,
973,
3542,
253,
10097,
273,
3082,
3426,
285,
556,
2168,
644,
908,
281,
23968,
247,
5691,
7324,
387,
30105,
1087,
938,
1797,
342,
18660,
273,
5014,
50275,
2520,
310,
29318,
347,
247,
4471,
12788,
3879,
10895,
533,
275,
1142,
4088,
352,
310,
271,
2250,
8981,
10895,
285,
352,
310,
8489,
3710,
275,
326,
1675,
7194,
13654,
327,
495,
5231,
3738,
884,
403,
1677,
275,
4836,
495,
352,
310,
1896,
326,
436,
812,
320,
908,
323,
625,
4471,
12788,
3740,
3237,
323,
4227,
21565,
253,
3486,
4292,
452,
327,
581,
1529,
84,
24102,
533,
436,
310,
417,
14859,
50275,
25914,
273,
253,
38135,
275,
253,
10895,
2530,
310,
275,
253,
2234,
3659,
12544,
347,
247,
5110,
273,
253,
4477,
3863,
247,
2074,
10895,
273,
17713,
18535,
40815,
6355,
275,
4050,
5435,
1012,
3600,
44654,
435,
11114,
86,
1162,
355,
11106,
342,
891,
2868,
3081,
13576,
13130,
891,
513,
1158,
326,
253,
10895,
310,
9865,
347,
275,
1142,
4679,
10950,
403,
760,
6110,
275,
247,
5110,
273,
13576,
5074,
1347,
2299,
891,
717,
417,
2119,
352,
588,
1089,
247,
2087,
10393,
347,
247,
1554,
757,
1983,
2250,
8981,
10895,
1955,
281,
253,
3710,
1180,
273,
5231,
2530,
891,
671,
717,
417,
2119,
670,
253,
11839,
273,
253,
12182,
1080,
3740,
3700,
4836,
50275,
9088,
812,
320,
625,
5955,
273,
253,
4623,
1966,
2250,
8981,
6239,
50275,
187,
187,
4118,
18435,
27,
783,
2929,
10262,
247,
10895,
273,
337,
78,
10556,
13009,
342,
27173,
5893,
24543,
285,
3879,
31825,
512,
30628,
5821,
326,
253,
2929,
310,
973,
15720,
285,
326,
253,
10895,
310,
4217,
50276,
284,
27007,
407,
253,
958,
326,
352,
556,
2168,
644,
908,
281,
23968,
247,
5691,
30628,
5439,
690,
5884,
7350,
1690,
625,
4278,
670,
12182,
2392,
285,
253,
2954,
875,
253,
4081,
22791,
285,
2720,
789,
327,
1966,
2250,
8981,
253,
2488,
2380,
369,
20297,
275,
15974,
841,
7350,
285,
275,
253,
990,
512,
30628,
14285,
281,
2997,
253,
2929,
28858,
3339,
327,
1907,
634,
2929,
7607,
281,
253,
5723,
2824,
43425,
3540,
327,
15302,
285,
49602,
253,
4477,
403,
14659,
281,
1379,
253,
8680,
432,
30628,
715,
2395,
672,
13828,
253,
4049,
254,
609,
5102,
2715,
273,
253,
2929
] |
[
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1
] |
[
30003,
310,
1677,
2278,
273,
247,
2561,
2929,
432,
260,
2369,
1793,
6698,
15,
7764,
3630,
247,
6010,
253,
2278,
15,
187,
4118,
8439,
27,
187,
25016,
655,
480,
2988,
43425,
5717,
368,
323,
253,
30080,
22559,
285,
323,
15974,
619,
3533,
253,
10895,
310,
247,
4891,
7680,
281,
253,
3114,
891,
717,
13224,
275,
619,
13716,
285,
6273,
281,
2997,
436,
2929,
50275,
783,
2929,
10262,
247,
10895,
273,
6521,
6355,
2530,
31825,
2486,
818,
2234,
10801,
273,
253,
6521,
2797,
970,
247,
16753,
13418,
1566,
2112,
342,
253,
941,
253,
4477,
12661,
1264,
7103,
7533,
285,
253,
16038,
3212,
731,
253,
22791,
3210,
4081,
403,
253,
9880,
3082,
273,
253,
4471,
12788,
3879,
22586,
253,
10895,
4428,
247,
1781,
1180,
273,
28267,
13009,
337,
3041,
342,
27173,
24543,
285,
3879,
13301,
285,
721,
3041,
440,
47728,
13009,
342,
27173,
24543,
50276,
13524,
8892,
403,
4081,
2112,
342,
253,
10895,
253,
1273,
4836,
24543,
247,
5691,
273,
3700,
4715,
281,
247,
1027,
22581,
3740,
436,
3884,
310,
12532,
323,
253,
1511,
273,
3237,
326,
2430,
6485,
1966,
12182,
2392,
253,
2626,
4836,
4419,
4685,
4460,
13576,
342,
3710,
941,
4081,
8892,
327,
253,
10895,
403,
4217,
323,
1524,
10186,
4893,
285,
253,
10895,
2530,
588,
12454,
253,
2561,
50276,
783,
10895,
369,
281,
690,
4248,
8671,
407,
253,
3114,
275,
253,
3332,
4471,
12788,
3879,
278,
12424,
5691,
43425,
253,
10895,
556,
271,
2898,
281,
247,
1077,
6891,
1895,
273,
4685,
253,
3879,
273,
3754,
275,
247,
1077,
6537,
3126,
253,
3082,
3715,
327,
1755,
273,
253,
2530,
31825,
588,
10725,
327,
7899,
16753,
13650,
275,
247,
1524,
10186,
2898,
253,
16753,
843,
20713,
1537,
417,
320,
347,
10799,
5474,
33032,
2520,
2929,
10262,
247,
3492,
10895,
11253,
273,
8557,
273,
28071,
18745,
3754,
342,
721,
3041,
13009,
273,
440,
22027,
27173,
24543,
285,
689,
337,
3041,
13009,
342,
575,
13712,
264,
24543,
285,
3969,
3665,
5251,
3879,
31825,
27173,
24543,
403,
3453,
432,
616,
16753,
40143,
40584,
7652,
3879,
31825,
403,
4158,
323,
1264,
1027,
8892,
337,
8946,
4581,
1178,
22296,
4715,
4836,
374,
4836,
273,
13892,
12427,
581,
12182,
2392,
22581,
3740,
495,
10554,
273,
247,
7520,
3879,
5203,
1677,
760,
247,
1643,
5103,
3733,
3530,
970,
436,
747,
10895,
347,
22791,
10895,
253,
4477,
9569,
8245,
3082,
347,
973,
347,
247,
1643,
1755,
9591,
3082,
432,
253,
30105,
1087,
43425,
278,
12424,
5691,
326,
253,
4477,
17386,
4102,
436,
10895,
588,
1361,
5839,
273,
271,
4722,
1895,
5028,
949,
17713,
18535,
6521,
2675,
5016,
285,
556,
17200,
281,
643,
2561,
4910,
275,
3879,
14053,
285,
9162,
25761,
253,
10895,
310,
4158,
347,
247,
22791,
10895,
327,
1027,
8892,
4751,
22296,
4715,
285,
1643,
11860,
4715,
534,
310,
973,
17194,
407,
253,
878,
323,
15549,
7520,
285,
4460,
3879,
275,
3879,
2561,
275,
2426,
273,
1979,
436,
10895,
310,
3240,
1781,
1223,
253,
5020,
273,
253,
10556,
275,
253,
10895,
310,
440,
22027,
597,
452,
1318,
323,
12392,
440,
35421,
4735,
4715,
281,
12454,
2852,
2561,
5012,
327,
8245,
285,
1755,
9591,
3210,
432,
253,
278,
12424,
5691,
403,
27096,
285,
973,
14290,
627,
403,
1180,
273,
3533,
5439,
1475,
253,
4836,
374,
4081,
275,
436,
2929,
3340,
5001,
298,
1036,
520,
3832,
275,
436,
22453,
9162,
4836,
359,
2085,
2800,
884,
15505,
3733,
10556,
323,
1016,
273,
2620,
747,
575,
11423,
2392,
285,
7472,
253,
3745,
273,
3210,
281,
4711,
31825,
275,
1016,
12182,
2392,
3740,
298,
18146,
13851,
359,
452,
247,
2603,
5028,
342,
13301,
432,
4836,
337,
534,
3198,
281,
320,
9495,
281,
1016,
12182,
1080,
575,
249,
4836,
374,
342,
31381,
11184,
13301,
5431,
1966,
11595,
12425,
275,
14613,
1263,
8687,
3733,
273,
253,
1966,
12182,
2392,
824,
326,
597,
755,
13367,
2190,
3746,
734,
83,
727,
10926,
6720,
403,
4080,
285,
840,
12889,
10166,
1919,
352,
14190,
2176,
7887,
347,
436,
2929,
8631,
22458,
547,
12182,
2392,
28267,
13301,
323,
4836,
374,
858,
841,
747,
12182,
2392,
564,
949,
436,
2238,
273,
13367,
3733,
275,
958,
352,
310,
1620,
5393,
275,
253,
2929,
849,
253,
1966,
22581,
1232,
19986,
875,
4836,
337,
285,
4836,
374,
275,
1097,
4836,
337,
285,
4836,
374,
253,
5203,
873,
310,
253,
1072,
840,
849,
1705,
627,
310,
12182,
1080,
8492,
275,
4836,
374,
5727,
627,
310,
2649,
275,
4836,
337,
604,
368,
6194,
247,
5145,
4715,
1566,
281,
25066,
581,
22458,
547,
12182,
2392,
8492,
1754,
327,
4836,
374,
84,
5203,
849,
513,
368,
2028,
1880,
253,
6311,
3740,
13806,
253,
440,
32927,
328,
31631,
22581,
1375,
4632,
2032,
6938,
8492,
326,
778,
2226,
275,
275,
2176,
5188,
4496,
2953,
841,
3374,
275,
253,
18520,
5474,
33032,
11183,
480,
2988,
1384,
43425,
5717,
368,
323,
253,
6128,
281,
619,
5701,
285,
6240,
3081,
1491,
670,
16753,
12544,
7274,
4583,
891,
1158,
253,
2905,
789,
310,
30909,
285,
5520,
891,
1462,
407,
253,
3236,
4868,
285,
6273,
281,
2997,
50274,
49655,
14613,
1783,
310,
273,
5675,
1600,
281,
1142,
32870,
533,
627,
403,
247,
372,
5401,
273,
15302,
323,
22791,
272,
11333,
323,
14613,
8981,
285,
643,
8892,
436,
7680,
8631,
253,
1724,
983,
1797,
10895,
534,
8414,
273,
337,
3041,
3492,
13009,
273,
27173,
5893,
24543,
285,
14613,
31825,
323,
3754,
18745,
275,
247,
14544,
40150,
32656,
6981,
253,
10895,
310,
8558,
908,
323,
2250,
8981,
8892,
342,
495,
285,
884,
5231,
533,
671,
4428,
31825,
275,
690,
2219,
432,
2709,
12182,
2392,
2403,
352,
4217,
323,
3740,
3700,
2439,
4292,
253,
10895,
310,
16575,
14290,
285,
15744,
2130,
1666,
25379,
323,
3045,
327,
1264,
1027,
8892,
403,
3559,
285,
25761,
253,
10895,
556,
2168,
644,
908,
323,
247,
5691,
7324,
387,
30105,
1087,
938,
1797,
253,
3559,
5691,
5482,
403,
4722,
285,
778,
320,
625,
21450,
4217,
275,
253,
5893,
14613,
12544,
1673,
50275,
11235,
11828,
5893,
14613,
12544,
310,
7552,
271,
4232,
1673,
10263,
19999,
23027,
432,
2709,
4910,
436,
3797,
6551,
21559,
534,
556,
1048,
644,
247,
2256,
387,
5723,
2824,
5933,
10165,
323,
14613,
8137,
347,
973,
347,
643,
8892,
403,
2223,
519,
26901,
275,
253,
1673,
285,
247,
2118,
4404,
2629,
3006,
15302,
323,
2173,
7881,
2789,
3282,
2675,
14613,
4679,
403,
7552,
625,
285,
625,
1846,
275,
6551,
21559,
275,
1798,
2403,
436,
10895,
273,
1600,
281,
247,
5675,
3114,
50274,
9188,
40348,
436,
789,
369,
806,
908,
347,
247,
5691,
7324,
285,
891,
1158,
326,
352,
556,
2168,
5183,
31925,
275,
326,
2743,
891,
1158,
253,
5020,
273,
4212,
588,
897,
352,
281,
35388,
5609,
323,
2250,
8981,
275,
5893,
3210,
3738,
352,
310,
1896,
326,
50276,
8826,
588,
971,
281,
12106,
253,
16753,
11627,
3345,
273,
253,
2250,
13301,
50274,
10773,
2322,
285,
30990,
253,
7714,
310,
973,
3542,
253,
10097,
273,
3082,
3426,
285,
556,
2168,
644,
908,
281,
23968,
247,
5691,
7324,
387,
30105,
1087,
938,
1797,
342,
18660,
273,
5014,
50275,
2520,
310,
29318,
347,
247,
4471,
12788,
3879,
10895,
533,
275,
1142,
4088,
352,
310,
271,
2250,
8981,
10895,
285,
352,
310,
8489,
3710,
275,
326,
1675,
7194,
13654,
327,
495,
5231,
3738,
884,
403,
1677,
275,
4836,
495,
352,
310,
1896,
326,
436,
812,
320,
908,
323,
625,
4471,
12788,
3740,
3237,
323,
4227,
21565,
253,
3486,
4292,
452,
327,
581,
1529,
84,
24102,
533,
436,
310,
417,
14859,
50275,
25914,
273,
253,
38135,
275,
253,
10895,
2530,
310,
275,
253,
2234,
3659,
12544,
347,
247,
5110,
273,
253,
4477,
3863,
247,
2074,
10895,
273,
17713,
18535,
40815,
6355,
275,
4050,
5435,
1012,
3600,
44654,
435,
11114,
86,
1162,
355,
11106,
342,
891,
2868,
3081,
13576,
13130,
891,
513,
1158,
326,
253,
10895,
310,
9865,
347,
275,
1142,
4679,
10950,
403,
760,
6110,
275,
247,
5110,
273,
13576,
5074,
1347,
2299,
891,
717,
417,
2119,
352,
588,
1089,
247,
2087,
10393,
347,
247,
1554,
757,
1983,
2250,
8981,
10895,
1955,
281,
253,
3710,
1180,
273,
5231,
2530,
891,
671,
717,
417,
2119,
670,
253,
11839,
273,
253,
12182,
1080,
3740,
3700,
4836,
50275,
9088,
812,
320,
625,
5955,
273,
253,
4623,
1966,
2250,
8981,
6239,
50275,
187,
187,
4118,
18435,
27,
783,
2929,
10262,
247,
10895,
273,
337,
78,
10556,
13009,
342,
27173,
5893,
24543,
285,
3879,
31825,
512,
30628,
5821,
326,
253,
2929,
310,
973,
15720,
285,
326,
253,
10895,
310,
4217,
50276,
284,
27007,
407,
253,
958,
326,
352,
556,
2168,
644,
908,
281,
23968,
247,
5691,
30628,
5439,
690,
5884,
7350,
1690,
625,
4278,
670,
12182,
2392,
285,
253,
2954,
875,
253,
4081,
22791,
285,
2720,
789,
327,
1966,
2250,
8981,
253,
2488,
2380,
369,
20297,
275,
15974,
841,
7350,
285,
275,
253,
990,
512,
30628,
14285,
281,
2997,
253,
2929,
28858,
3339,
327,
1907,
634,
2929,
7607,
281,
253,
5723,
2824,
43425,
3540,
327,
15302,
285,
49602,
253,
4477,
403,
14659,
281,
1379,
253,
8680,
432,
30628,
715,
2395,
672,
13828,
253,
4049,
254,
609,
5102,
2715,
273,
253,
2929
] |
Below is given a review of a research paper from a conference or journal. Please write a summary of the review.
### Review:
this paper proposes a new method to learn stationary interpretable policies using soft decision trees in partially observed settings the soft decision tree structure is extended to allow for recursion over time and account for policy decisions based on history of collected data an algorithm is presented to optimize the parameters of the soft decision tree as well as the structuretopology of the tree the algorithm mainly proceeds by splitting nodes and locally optimizing the parameters of the the associated probability representation of the soft node and recursively split if local optimization does not improve validation performance and fixed as leaf otherwise a global update step is then used after topology is fixed followed by pruning low probability paths in the trees experimental validation on surveys with clinicians demonstrate reasonable interpretability and improved prediction performance on imitating clinician policy the paper is well written and the problem well motivated the key contribution seems to be extension of soft decision trees to the recurrent setting which is a nice and clinically useful contribution the algorithm used to train is certainly a heuristic but seems reasonable empirical results are comparable although the algorithm does not seem to improve over state of the art in terms of interpretability always weaknesses 1 i am not sure how partial observability plays a role and the contributions from the partial observability perspective are unclear is the claim that the representations learned due to the recurrent setup is better for overcoming challenges of partial observability if so i believe an additional evaluation is warranted for comparison if not then i might have misunderstood and i believe for completeness authors should comment on partial observability 2 no additional discussion regarding the algorithm and its behavior is provided i believe a discussion on potential failure cases and also choice of hyperparameters of the cost function will be helpful especially for experiments 3 i might have missed this but what exactly is the source of partial observability in all the empirical evaluation in the paper please add comments regarding the source and how the proposed method addresses it overall i believe this paper provides an interesting contribution for interpretable policy learning particularly for clinical decisionmaking i do have a few followup questions that will make the contribution regarding some aspects clear docsepthis paper proposes a novel approach for learning and representing human decisionmaking policies from observed behavioral data the proposed approach emphasizes interpretability as a primary aim while nevertheless seeking to maintain reasonable modeling accuracy the decision tree model proposed extends canonical decision tree approaches to the probabilistic setting allow for optimization of leafspecific parameters via stochastic gradient descent the proposed approach is evaluated both in terms of its interpretability subjective measurements from a panel of licensed physicians as well as its accuracy in recapitulating actions conditioned patient observations the utility of the approach is demonstrated on both synthetic and realworld datasets i am admittedly not an expert in imitation learningbehavioral cloning policy learning or softprobabilistic decision tree models nevertheless having more familiarity with offline reinforcement learning and healthcare broadly i was excited to read this paper the central premise of the paper distilling 
inherent clinician policies down to an easily interpretable and easily followable decision tree model seems like a compelling and important task and the authors motivate this well in their introduction highlighting the unnecessary costs of medical practice variability the algorithm is explained reasonably well in section 2 problem formalism and section 3 interpretable policy learning with decision trees some of the notation was a bit confusing but this is more of a minor point for example at is a onehot encoded target top of page 4 but then at kl is also the output probability for action class k in leaf l why not denote the predictions using hatat kl there were also a few confusing sentences consider this on in the first paragraph of section 32 finally as third leaf output with parameters thetazl in mathbbrd our model also predicts emphpatient evolution or observations at the next timestep tildezt 1 to formalise a consideration of expected treatment effects yau et al 2020 not only is the line after the semicolon not an independent clause making the use of the semicolon incorrect its also unclear to me where this line about expected treatment effects came from what does patient evolution have to do with expected treatment effects theres another related line in the action value quantification through counterfactual evolution section that reads overall thanks to its integral part of model design assessment of counterfactual evolution becomes intuitive i was very confused by this line what is the counterfactual here and what does this have to do with the integral part of model design on that note it was unclear to me why a hyperbolic tangent function is used to estimate tildezt 1l0 i understand the value of introducing differentiable nonlinearities for the goal of flexible function approximation but i thought that zt was supposed to be in the original output space is tildezt 1l0 not also supposed to be in that same space but using the hyperbolic tangent would map everything to 1 1 some clarity here would be helpful i thought the comparison to related work was done well and despite not being intimately familiar with the field i was able to appreciate both the existing challenges and opportunities present at the time the work was carried out table 2 was especially helpful for contextualizing the papers contributions admittedly it does seem that this paper is more of a convex combination of a few different ideas softprobabilistic decision trees from frosst hinton 2017 cascaded trees from ding et al 2021 for recurrence rather than a seminal and novel work in its own right the authors highlight differences between their approach and vanilla cartsstochastic decision trees sdts in appendix b2 but i didnt see a comparable exposition of differences and explanation of noveltycontribution relative to ding et al 2021 that being said if you believe as i do that creativity is just synthesizing existing ideas in novel ways then i suppose you could call this paper creative the question then is how much more useful the proposed amalgamation of ideas is over each idea in isolation i was impressed by the experimental results it does seem that theres a tradeoff between interpretability and performance eg auroc but this particular approach at least seems to be on the current pareto frontier of this tradeoff and provides a useful approach for practitioners i thought adni dataset application was a compelling one and the problem itself was explained adequately however i really had to sit and scratch my head to work 
through figure 3 for a paper nominally dedicated to producing interpretable decision policies i had a hard time interpreting the policy represented in figure 3a i think the vignettes of patient abc is insightful but would need some additional guideposts to add to the paper rather than detract from it what if for example you were able to color and label each of the tree branches with a different color for patient a b and c or at very least indicate the path through the tree for each patient separately in the appendix also it was unclear to me whether cdrsb was being treated as a categorical or an ordinal random variable for example in the mci subtree there is a leaf for cdrsb questionable if the answer to this is no then does that imply that the value could be either cdrsb severe or cdrsb normal is an mri really warranted in either case i was also a bit lost with regards to the decisionmaking uncertainty and anomalous behavior detection sections i think most practitioners would interpret the extracted policy as a deterministic one not a probabilistic one yet the underlying construction of the tree is inherently probabilistic how are we to reconcile those two in terms of interpretation as a concrete example the statement is made visits where an mri is predicted with 90 certainty make up 84 of adni but this certainty is a reflection of both the entropy in the path to the leaf in the decision tree as well as uncertainty in the action conditioned on the leaf itself disambiguating the two seems important for being able to decide whether an action is truly anomalous because taking action 1 in leaf l is inappropriate or just unlikely because there are very few patients that are represented by a particular leaf node maybe im misunderstanding something more fundamental here but id welcome any additional clarity here one other note the authors state we must highlight the similarity between our decision tree policy and published guidelines for alzheimers diagnosis reproduced in appendix f these two policies were not at all similar in my reading figure 10 makes no explicit mention of mri hippocampal volume or the cdrsb nor is it recurrent figure 11 exhibits all of the above postrebuttal comments the authors have thoughtfully addressed most of my concerns i particularly appreciate the update to figure 3a and i think it makes the figure much easier to parse with the revisions im happy to increase my score ill note though that i believe a typo was introduced in equation 2 check the parentheses in the far right dkl term of the equation also the term to the left of equality should take ht1 rather than ht as an argument i believe ive had a chance to read through the other reviewers comments and the authors responses to said comments i agree with reviewer kc79s original concerns and feel that those have been adequately addressed the same with reviewer frrv honestly and this is more for the metareviewer i was left scratching my head at the comments made by reviewer wnrz the request to compare the proposed approach to svms and gbms doesnt make any sense to me svms and gbms are inherently not as interpretable as decision trees in my opinion and the whole point of the paper was to learn a policy representation in decision tree form that closely matched clinician behavior this is contrary to wnrzs concerns very well suited to tools from behavioral cloning and imitation learning no reward is needed because the authors arent taking an inverse reinforcement learning this is all fine reviewers can have disagreements 
what was most concerning to me though was the combination of a very low score by the reviewer and a high confidence rating the reviewers comments reflected neither the depth nor understanding that i would expect from such a high confidence rating overall this is an interesting paper the novelty and significance are positive if not overwhelming the utility for practitioners and potential impact in areas outside of machine learning however is nontrivial the exposition of the method and its interpretation would benefit from additional refinement nevertheless in its current state i can recommend a marginal accept docsep the authors argued that many methods failed the merits of interpretability in some important areas eg clinical decisionmaking thus this paper proposed a soft treebased method for synthetic clinical datasets in the matter of interpretability the authors model the clinical decision process as a partially observable markov decision process pomdp which naturally fits the assumption of medical diagnosis overall i dont think the paper meets the requirement of iclr in a few aspects strengths the motivation is strong especially for clinicians the paper is well written and easy to understand the experiments are completed and detailed and the results seem strong to me noted that im not a expert in clinical datasets weakness some important classical machine learning methods are missingeg svm gbms which meet the needs of interpretability rather than deep learning methods the policy learning methodsil al irl are not related to me as the reward function is not important in clinical settings some important works 12 for timeseries data were also missed for discussion 1 ke g meng q finley t et al lightgbm a highly efficient gradient boosting decision treej advances in neural information processing systems 2017 30 31463154 2 chen t guestrin c xgboost a scalable tree boosting systemcproceedings of the 22nd acm sigkdd international conference on knowledge discovery and data mining 2016 785794 novelty the technical contributions are marginal to me as the main aspects of the proposed method are the combination of rnn and soft decision trees id give a rejection based on the comments above and i would like to suggest the authors revise the paper for resubmission to another conference post rebuttal to reviewer kv5i to be honest im still not convinced that the issues raised in this paper are more artificially fabricated than they are actually present clearly if the goal of this paper is to discover machine learning models that can be used to explain then decision tree models and support vector machine models are superior choices when compared to other benchmark models and in fact the interpretability of decision tree models would not lag behind the methods posed by the article it is possible however that the assumption made about the problem encountered in the clinical data a markov process with partial observability is illposed ie that the requirement to take into account the impact brought by the medical regimen is incorrect post postrebuttal i appreciate the authors thorough and precise arguments after several days of consideration i reviewed the entire paper and decided to increase my score thanks for your effort docseppoetree aims to construct an interpretable model for a policy over a time series using decision trees the healthcare domain is particularly targeted as opposed to other works the model directly maps observations of a pomdp to actions poetree creates a decision tree from time series 
data the decision tree can be conditioned on the history allowing the tree to be different at different time steps allowing for example the tree to model that an exam done previously that is no longer informative is no longer likely each tree is a softprobabilistic model first grown incrementally by developing optimized globally as it is differentiable then pruned finally the tree is simplified for interpretability by limiting each condition to a single variable poetree is then empirically evaluated and compared to baselines in terms of distribution modeling interpretability and policy learning strong points rather well written and clear paper the interpretability of the proposed approached is quantified by clinicians interpretability of models is important especially in healthcare experiments on synthetic data and two real data sets experiments include a comparison to what seems to be the most relevant existing method interpretable bcil huyuk et al 2021 i like that at least one example clearly shows a situation where the proposed method is more interpretable than interpretable bcil i liked the illustration of the model based on real dataexamples this is really nice good job balancing the content between the main paper and the appendices weak points i did not understand the first term zt tildezt of equation 2 the input of the tree is zt and ht the observation at the current time step and history why is there a loss over these terms what is tildezt the observation at the next time step is defined as tildezt1 so isnt that first term always 0 the tree is first learned by considering all variables in a node and then simplified by only using one variable in a node even though a l1 regularization is used when learning each node i am concerned about the impact this has and do not understand why a more direct approach such as learning a node directly with a single variable is not considered see questions below i might be mistaken but i think the impact of the different losses is not evaluated in the experiments questions unless i am mistaken the decision tree is modified after learning to make it interpretable by leaving only the variable with the highest weight in each node i read appendix c and saw the reported loss of accuracy to be only 2 yet as i understand it this is a single empirical experiment i am a bit uncomfortable as i wonder whether this could lead to large changes in a policy when many variable have similar weights i am concerned both with skewing the interpretation of the policy and with accumulations of errors within child branches could you please comment on that on a related note is there any particular reason not to learn the tree by using directly a single variable in every node this would guarantee that child branches are directly learned on the partition of the data that correspond to the inference setting likewise is there any reason to not use regular decision trees rather than probabilistic ones recurrent decisions trees could be used details i do not understand the purpose of the comparison to rnn in section 32 postrebuttal thank you for the detailed answers to my questions this is an impressive amount of extra experiments i found the answers very detailed and insightful most of my comments have been addressed beyond my expectation i think the paper is well worth publishing at iclr in its current state and have updated my score accordingly on the topic of learning an axisaligned tree i think the novel approach mentioned in the rebuttal and learning multiple axis aligned 
threshold at every node is very interesting but might be more difficult to interpret as many variables will be considered in each node nevertheless it is impressive you could learn such a tree i now understand that this paper follows previous approaches to align the decision boundaries i might well have missed it but reading the paper initially i thought that trees are not retrained after being axis aligned which still seems strange to me i could not see it said explicitly so perhaps i am wrong maybe this could be clarified on a final note i share the sentiment of reviewer kv5i about other reviewers comments he said it better than i could initial summary i think this is a very interesting research on an important topic experiments are well done my main concern is that the simplification of the model might have some bad impact after discussion concerns have been mostly addressed i think the paper is worth publishing
### Summary:
|
this paper proposes a treebased method for interpretable policy learning for fullyoffline and partiallyobservable clinical decision environments the models are trained incrementally as patient information becomes available the method was overall deemed novel by the reviewers and the interpretability of the model was well validated by clinicians numerous points of clarification were brought up by reviewers related to the notation learning process and result reporting all of the concerns were responded to by the authors in great detail and the manuscript was appropriately revised all the reviewers have raised their scores as a result of the updates thus the paper is ready for acceptance
|
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
summary of the authors claim the authors claim endtoend learning of exploration and exploitation policies can result in a poor local minimum to avoid this problem the authors propose to separate the objectives for exploration and exploitation in metarl theoretical analyses and the experiments on several artificial tasks show the effectiveness of the proposed method and superiority over existing methods overview of the proposed method as is common in the literature the exploitation policy depends not only on observations but also on a taskembedding z to adapt to the environment across the tasks z is generated from a stochastic encoder which is conditioned on the problem id mu by minimizing the mutual information between the problem id mu and the taskembedding z which is called the information bottleneck term the exploitation policy tends to be independent from taskirrelevant information and can reduce the risk of being trapped at a poor local minimum on one hand the lower bound of the mutual information between the exploration experiences tauexp and the taskembedding z is maximized for the learning of the exploitation policy during this optimization the decoder qztauexp is trained and this decoder is used to estimate the taskembedding without knowing the problem id because the exploitation policy depends on the task embedding that is generated from the problem id and not on the exploration experiences tauexp it makes the training of the exploitation policy more robust and successful comments on the paper i appreciate the good experimental results but i think the paper contains inaccurate explanations and the proposed method is not properly characterized although the authors emphasize that the decoupling of the exploration and exploitation is important in section 1 section 41 and also in the title the actual proposed algorithm has coupled loss functions in section 41 the authors wrote to solve the task pitask needs good exploration data from a good exploration policy piexp rigorously speaking this is not true even if the exploration policy is just a uniform random distribution we know tabular qlearning algorithms can solve the task when we have an infinite number of samples if the statement means to efficiently solve the task pitask needs good exploration data from a good exploration policy piexp it will be true but this is also true for the proposed algorithm when z is computed by not only using the encoder fzmu but also by using gtauexp this makes the training of pitask more coupled i wonder how much this mixed training heuristic is important it would be clear if the authors provide the experimental results with and without this heuristic another statement learning piexp relies on gradients passed through pitask is true if the embedding z is not informative for solving the task that is if pitask does not make use of z it is not meaningful to maximize the lower bound of mutual information between z and the experience by the exploration policy piexp tauexp thus it is also true for the proposed method therefore it seems to me the statement not only explains the reason why the existing methods sometimes do not work but also explains the reason why reinforcement learning is difficult i think the stress should be put on the training by using the problem id which does not depend on the progress of the training rather than on the decoupling the condition of proposition 1 is not clearly stated in the proof the authors assume lambda approaches 0 but this is not clearly stated in the condition also in
the experiment this is not performed but lambda is set to be 1 usefulness of the bottleneck as shown in fig5 is not supported by the theoretical analyses it seems for me theoretical analysis does not support the actual proposed method which uses the bottleneck in the optimization instead as i wrote above the authors emphasize the importance of the decoupling too much however based on the theoretical analysis and the experimental results i think the authors should emphasize more the usefulness of the bottleneck to use the proposed algorithm problem id is indispensable this condition should be stressed i also wonder how much the performance of the proposed method is affected by the mapping of the problem id is it robust to the random permutation of the problem id it would be nice if the authors can test the robustness on the permutation of the problem id as written in the introduction the metareinforcement learning is often inspired by the humans quick adaptation ability to new task which shares some properties with the previously experienced tasks however the necessity of the problem id limits the applicability of the method to such situations it would be nice if the authors can exemplify what kind of practical tasks the proposed method can be applied as for the experiments it is good to test on newly designed environments and tasks to highlight the properties of the proposed method however i believe the proposed method should be tested on one of the common benchmarks to make the experimental results more reliable from these experiments the readers will understand that the existing methods are implemented appropriately and how much the proposed method works well on the common benchmarkdocsepthe paper introduces an approach to improve metalearning in rl specifically the approach aims to improve the agents exploration during the training phase so that the agent can better exploit during the test phase where samples collected from the training phase provides useful taskrelevant information to the agent pros the paper is wellwritten the idea is clearly presented the experiments are clear to understand the results presented demonstrate that the approach can either match or improve learning when compared to relevant baselines cons details about the architecture choices used for the baselines are not provided in the main text which makes it difficult to understand the experiment setup and its results drawing out the differences between the baselines and their approach would be useful to help understand the contribution of the work questions 1 it seems like the use of problem id is specific to the approach introduced in the paper if so could the authors describe how this was used in the baseline agents if the problem ids were not used in the baselines then it feels like the baselines considered here are unfair 2 for the domains considered here it seems like the problem ids are sufficient enough to provide any taskrelevant information to the metalearning agent in this case what sort of zs can the encoder produce the domains considered seem less interesting for the introduced approach as it seems like the encoder could simply learn an identity mapping of the problem id which is its input 3 what would be the reason for rl2 to fail in the 3d navigation task considered 4 as a baseline it would be interesting to see the learning performance of a simple rl2 agent on the domains provided they take in the problem id as input this would inform whether the encoderdecoder architecture that is introduced in the 
paper learns something that is beyond the problem iddocsepthe paper investigates the explorationexploitation problem in metalearning the authors explain the problem of coupled exploration and validate it through a toy example to overcome this issue the paper introduces dream a metaalgorithm decoupling exploration and exploitation in the first step dream learns an exploitation policy and a task embedding by maximizing the cumulative reward of the given task task identifier is known at train in the second step dream learns an exploration policy that is compatible with the embeddings generated by the exploitation policy dream outperformed the stateoftheart algorithms in several experiments the paper is well written and easy to follow the idea is clearly explained and justified i have only a few comments concerning the problem of coupled exploration im wondering if you could provide a more formal justification the current justification sec 41 is easy to follow but quite abstract despite being supported by your example sec 52 it is not clear to me whether this is a general problem of coupled exploration eg is it due to gradient updates comparison with the literature could you provide a detailed comparison with humplik et al 2019 kamienny et al 2020 in the first step your algorithm learns an encoding of the task fpsimu by maximizing the task reward ie learning the exploitation policy in the second step it uses this embedding to train an exploration policy to generate trajectories mapping to the relevant information constructed by the embedding ie maximizing the mutual information between z and the trajectories to the best of my understanding this seems very similar to what done in the mentioned approaches in particular kamienny et al 2020 also have exploration and exploitation policies the exploitation policy is trained to maximize the reward while generating an embedding of mu thus similarly to your exploitation step the exploration policy is also trained to maximize the reward while enforcing the rnn state to be similar to fmu more mildly even their approach aims to learn to generate trajectories providing information about z the main difference between the two approaches resides in the fact that their exploration policy is trained to also maximize the reward am i correct will their algorithms suffer from the coupling problem mentioned in section 41 minor comments proposition 1 could you explain in more details the need for ergodicity my understanding from the appendix is that you should be able to generate all possible trajectories to have that pztauexp fstarzmu however you would need to potentially enforce exploration at the level of actions this can be obtained by a random policy am i correct i think mathbbemu sim pmu is missing in the first and second equations of the proof of proposition 1 you mentioned that you could remove the ergodicity assumption by increasing the number of exploration episodes could you clarify this sentence the number of exploration episodes is a term that does not appear in your current analysis since the reasoning seems to be in expectation i didnt check the appendix c2 docsepsummary this paper introduces dream a metarl approach that decouples exploration from exploitation an exploitation policy learns to maximize rewards that are conditioned on an encoder that learns task relevant information then an exploration policy learns to collect data that maximizes the mutual information between the encoder and explored states the work is compared against multiple 
baselines in simple tasks overall i lean towards accepting the paper though i am not as familiar with the metarl literature to have much of an informed opinion about what relevant benchmarks or approaches are the paper was wellwritten and wellmotivated and while the experiments were simple seemed to highlight the problems that the paper was addressing it makes sense to separate out exploration and exploitation and i appreciated the inclusion of tasks that helped motivate this point furthermore the paper provides a theoretical analysis of dream showing that the decoupled policy maximizes returns code and hyperparameters are provided and the paper seems to be reproducible i do think that the paper should have more discussion and evaluation over approaches that aim to explicitly address the exploration exploitation problem the paper only considers exploration in the context of metalearning but of course exploration is a central problem in rl and several approaches have studied it outside of metarl the paper would be improved by discussing such approaches for example intrinsic rewards such as empowerment 1 or surprise 2 andor evaluating how well these approaches compare to dream when trained alone and combined with vanilla algorithms i also would have liked to see more empirical analysis over the exploration policy being learned by piexp questions 1 how was the decay rate for epsilon chosen in figure 3 how would a policy with a fixed decay rate perform 2 i do not quite understand how trajectories from the exploration policy can be used interchangeably with the encodings z when plugged into pitask could the authors provide more insights into this 1 a unified bellman optimality principle combining reward maximization and empowerment leibfried et al 2 curiositydriven exploration by selfsupervised prediction pathak et al
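regarding the question above on how exploration trajectories and the encodings z can be used interchangeably inside pitask, the following toy sketch shows the mechanism as i read it: the exploitation policy only ever consumes an embedding vector, which comes from the problem-id encoder at training time and from the trajectory decoder at test time; the names F, W_PI and the noisy-recovery decoder are stand-ins i made up for illustration, not the papers actual components

```python
import numpy as np

rng = np.random.default_rng(0)
N_TASKS, D_Z, D_OBS, N_ACT = 5, 3, 4, 2

# stand-ins for the learned components (illustrative values, not the paper's)
F = rng.standard_normal((N_TASKS, D_Z))            # encoder table: z = F[mu]
W_PI = rng.standard_normal((N_ACT, D_OBS + D_Z))   # exploitation-policy head

def z_from_problem_id(mu):
    """training-time embedding, computed from the privileged problem id mu"""
    return F[mu]

def z_from_exploration(mu, noise_scale=0.05):
    """test-time embedding: the decoder q(z | tau_exp) applied to exploration data,
    modelled here simply as a noisy recovery of the training-time embedding"""
    return F[mu] + noise_scale * rng.standard_normal(D_Z)

def pi_task(obs, z):
    """exploitation policy: softmax over actions given observation and embedding"""
    logits = W_PI @ np.concatenate([obs, z])
    e = np.exp(logits - logits.max())
    return e / e.sum()

mu, obs = 2, rng.standard_normal(D_OBS)
print(pi_task(obs, z_from_problem_id(mu)))   # behaviour with the privileged z
print(pi_task(obs, z_from_exploration(mu)))  # nearly identical when the decoder is accurate
```

with a large noise scale the two action distributions drift apart, which is one way to see why a poorly trained decoder, or uninformative exploration data, would hurt the exploitation policy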
### Summary:
|
the authors clarified many of r1 and r4s concerns but there were important remaining concerns regarding the presentation on the bright side the approach is novel and the experimental results are solid however the main point raised by r1 is the mismatch between the narrative and the theory and the actual algorithm and results some examples of this mismatch include proposition 1 is proved when lambda goes to 0 which is never mentioned in the main paper one has to look into the appendices to have a discussion of lambda and of the algorithm while the authors could clearly explain that when discussing the theoretical result the added discussion on tuning of lambda based on appendix e does not help because the optimization problem is written as under the constraint that is maximized which is not particularly clear more generally the theoretical result including eg the assumption of ergodicity could be discussed more precisely in terms of what is actually done in the algorithm and the experiments as stated by r3 many important aspects of the methods and the experimental setup are only available in appendices which makes it difficult to understand the similarities and differences in experimental protocol between the current paper and rl2 pearl and import it is unclear what part of dream is critical for performance there is no thorough ablation study nor discussion of the importance of the information bottleneck term and the only signal given in figure 5 is that it is critical the authors could clarify the two aspects decoupling and information bottleneck there was some discussion about this paper but even under the assumption that the authors answered most of r3s concerns r3 didnt engage in discussions the paper is still borderline in the end there was little support for acceptance because of the presentation issues above
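as a reading aid for the information-bottleneck discussion above, the two quantities in question can be written generically as below; the notation (lambda, C, the decoder q) is mine and this is not necessarily the papers exact formulation

```latex
% variational lower bound on the mutual information maximized by the exploration objective
I(\tau^{\mathrm{exp}}; z) \;\ge\; H(z) \;+\;
  \mathbb{E}_{\tau^{\mathrm{exp}},\, z}\big[\log q(z \mid \tau^{\mathrm{exp}})\big]

% a generic information-bottleneck objective: constrained form and its penalized relaxation
\max_{\pi,\, F}\; \mathbb{E}[R] \;\; \text{s.t.} \;\; I(\mu; z) \le C
\qquad\text{relaxed to}\qquad
\max_{\pi,\, F}\; \mathbb{E}[R] \;-\; \lambda\, I(\mu; z), \quad \lambda \ge 0
```

written this way, the role of lambda, and the lambda to 0 regime under which proposition 1 is proved, is easier to relate to what the implemented algorithm actually optimizes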
|
[
15577,
1491,
875,
1182,
285,
253,
2793,
407,
253,
17947,
3646,
3376,
37755,
15307,
489,
37755,
50276,
40622,
352,
310,
671,
2032,
323,
253,
4081,
1332,
3103,
352,
3133,
323,
479,
253,
3908,
417,
760,
11424,
253,
1921,
2139,
253,
5368,
3082,
4536,
1057,
417,
789,
533,
671,
5513,
253,
1921,
2139,
253,
35221,
4715,
310,
2834,
891,
1158,
253,
4073,
943,
320,
1691,
327,
253,
3733,
407,
970,
253,
1895,
2654,
534,
1057,
417,
3469,
327,
253,
4780,
273,
253,
3733,
2581,
685,
253,
34430,
4906,
50276,
783,
1617,
273,
13989,
337,
310,
417,
4518,
4767,
275,
253,
4737,
253,
4477,
5467,
29331,
7274,
470,
533,
436,
310,
417,
4518,
4767,
275,
253,
1617,
671,
275,
253,
3368,
436,
310,
417,
2684,
533,
29331,
310,
873,
281,
320,
337,
50276,
316,
4085,
1255,
273,
253,
3673,
44856,
347,
2011,
275,
3036,
22,
310,
417,
4516,
407,
253,
10527,
6260,
352,
3133,
323,
479,
10527,
1783,
1057,
417,
1329,
253,
4588,
4081,
1332,
534,
4648,
253,
3673,
44856,
275,
253,
13757,
3185,
347,
891,
4159,
1840,
253,
4477,
22175,
253,
6349,
273,
253,
34430,
4906,
1512,
1199,
2299,
1754,
327,
253,
10527,
1783,
285,
253,
5661,
1543,
891,
1158,
253,
4477,
943,
22175,
625,
253,
31471,
273,
253,
3673,
44856,
50273,
936,
897,
253,
4081,
5933,
1895,
2654,
310,
33777,
436,
1617,
943,
320,
21198,
891,
671,
4282,
849,
1199,
253,
3045,
273,
253,
4081,
1332,
310,
5876,
407,
253,
10603,
273,
253,
1895,
2654,
310,
352,
10237,
281,
253,
3632,
29391,
273,
253,
1895,
2654,
352,
651,
320,
5322,
604,
253,
4477,
476,
1071,
253,
31640,
327,
253,
29391,
273,
253,
1895,
2654,
50275,
284,
3542,
275,
253,
10199,
253,
1313,
609,
249,
19503,
4715,
310,
2223,
11797,
407,
253,
7497,
3158,
15644,
3745,
281,
747,
4836,
534,
10764,
690,
3607,
342,
253,
3786,
7407,
8892,
2299,
253,
15504,
273,
253,
1895,
2654,
7787,
253,
30437,
273,
253,
1332,
281,
824,
9534,
352,
651,
320,
5322,
604,
253,
4477,
476,
40924,
6644,
752,
2238,
273,
8542,
8892,
253,
4081,
1332,
476,
320,
3732,
50276,
284,
323,
253,
4679,
352,
310,
1175,
281,
1071,
327,
9841,
4158,
12620,
285,
8892,
281,
6780,
253,
3607,
273,
253,
4081,
1332,
2299,
891,
2868,
253,
4081,
1332,
943,
320,
5762,
327,
581,
273,
253,
1846,
49602,
281,
1056,
253,
5661,
1543,
625,
9630,
432,
841,
4679,
253,
10668,
588,
2096,
326,
253,
5368,
3082,
403,
9009,
20420,
285,
849,
1199,
253,
4081,
1332,
2987,
973,
327,
253,
1846,
22791,
7152,
339,
431,
248,
2929,
23970,
271,
2746,
281,
3157,
5148,
613,
920,
275,
391,
77,
5742,
253,
2746,
13698,
281,
3157,
253,
6083,
17947,
1309,
253,
3733,
3408,
594,
326,
253,
5570,
476,
1805,
22059,
1309,
253,
1071,
3408,
835,
3530,
5728,
432,
253,
3733,
3408,
3400,
4217,
4836,
15477,
1491,
281,
253,
5570,
50276,
856,
84,
253,
2929,
310,
973,
15720,
50276,
783,
2934,
310,
4518,
3559,
50276,
783,
4679,
403,
2590,
281,
2096,
50276,
783,
1543,
3559,
7568,
326,
253,
2746,
476,
2057,
3761,
390,
3157,
4715,
672,
2429,
281,
4623,
1666,
25379,
50276,
5040,
50276,
23454,
670,
253,
10336,
10165,
908,
323,
253,
1666,
25379,
403,
417,
2530,
275,
253,
2022,
2505,
534,
2789,
352,
2834,
281,
2096,
253,
3368,
9978,
285,
697,
1543,
50276,
6553,
272,
562,
253,
3910,
875,
253,
1666,
25379,
285,
616,
2746,
651,
320,
4217,
281,
1361,
2096,
253,
7680,
273,
253,
789,
50276,
34974,
337,
352,
3133,
751,
253,
897,
273,
1895,
2654,
310,
2173,
281,
253,
2746,
5611,
275,
253,
2929,
604,
594,
812,
253,
4477,
6266,
849,
436,
369,
908,
275,
253,
8245,
6083,
604,
253,
1895,
44077,
497,
417,
908,
275,
253,
1666,
25379,
840,
352,
9193,
751,
253,
1666,
25379,
2783,
1060,
403,
16593,
374,
323,
253,
10625,
2783,
1060,
352,
3133,
751,
253,
1895,
44077,
403,
4209,
2217,
281,
2085,
667,
4836,
15477,
1491,
281,
253,
5148,
613,
920,
5570,
275,
436,
1083,
752,
3686,
273,
1182,
84,
476,
253,
32049,
4711,
253,
10625,
2783,
1646,
1679,
4722,
323,
253,
5611,
2746,
347,
352,
3133,
751,
253,
32049,
812,
3365,
3037,
271,
6489,
10603,
273,
253,
1895,
2654,
534,
310,
697,
3280,
495,
752,
651,
320,
253,
1921,
323,
391,
77,
19,
281,
1891,
275,
253,
495,
69,
15034,
4836,
2783,
577,
347,
247,
8245,
352,
651,
320,
4722,
281,
923,
253,
4715,
3045,
273,
247,
2969,
391,
77,
19,
5570,
327,
253,
10625,
2530,
597,
1379,
275,
253,
1895,
2654,
347,
3280,
436,
651,
4151,
1880,
253,
32049,
48759,
10336,
326,
310,
5611,
275,
253,
2929,
33772,
1633,
326,
310,
4457,
253,
1895,
209,
2016,
406,
339,
431,
248,
2929,
2340,
684,
253,
17947,
15083,
80,
3535,
1895,
275,
5148,
613,
920,
253,
4477,
5513,
253,
1895,
273,
9904,
17947,
285,
17813,
352,
949,
247,
20953,
1650,
281,
11399,
436,
2523,
253,
2929,
23970,
7156,
247,
11419,
41528,
34430,
4906,
17947,
285,
30211,
275,
253,
806,
3213,
7156,
33772,
271,
30211,
3646,
285,
247,
4836,
21496,
407,
46875,
253,
18849,
10921,
273,
253,
1677,
4836,
4836,
21674,
310,
1929,
387,
6194,
275,
253,
1273,
3213,
7156,
33772,
271,
17947,
3646,
326,
310,
13333,
342,
253,
46234,
4561,
407,
253,
30211,
3646,
7156,
41731,
10574,
253,
1375,
23037,
14387,
11333,
275,
2067,
4679,
50276,
783,
2929,
310,
973,
3542,
285,
3477,
281,
956,
253,
2934,
310,
4518,
5544,
285,
17285,
891,
452,
760,
247,
1643,
5701,
50276,
585,
29340,
253,
1895,
273,
9904,
17947,
516,
12371,
604,
368,
812,
2085,
247,
625,
7473,
22861,
253,
1655,
22861,
4706,
7609,
310,
3477,
281,
956,
533,
3240,
12002,
5747,
1146,
4516,
407,
634,
1650,
4706,
8073,
352,
310,
417,
2590,
281,
479,
1880,
436,
310,
247,
2087,
1895,
273,
9904,
17947,
24088,
310,
352,
1955,
281,
11786,
11269,
50276,
47109,
342,
253,
6239,
812,
368,
2085,
247,
7000,
5301,
342,
1547,
446,
1479,
1162,
355,
6247,
465,
312,
1914,
5134,
1162,
355,
9169,
50276,
249,
253,
806,
3213,
634,
5933,
33772,
271,
9706,
273,
253,
4836,
269,
793,
303,
86,
407,
46875,
253,
4836,
10921,
26332,
4715,
253,
30211,
3646,
275,
253,
1273,
3213,
352,
4648,
436,
21496,
281,
6194,
271,
17947,
3646,
281,
6635,
24102,
10603,
281,
253,
4623,
1491,
8818,
407,
253,
21496,
26332,
46875,
253,
15577,
1491,
875,
1182,
285,
253,
24102,
281,
253,
1682,
273,
619,
4685,
436,
3133,
1077,
2074,
281,
752,
2218,
275,
253,
5393,
7274,
275,
1798,
465,
312,
1914,
5134,
1162,
355,
9169,
671,
452,
17947,
285,
30211,
7823,
253,
30211,
3646,
310,
10166,
281,
22950,
253,
10921,
1223,
11365,
271,
21496,
273,
12910,
3021,
12014,
281,
634,
30211,
3213,
253,
17947,
3646,
310,
671,
10166,
281,
22950,
253,
10921,
1223,
37703,
253,
391,
9866,
1375,
281,
320,
2074,
281,
269,
1906,
625,
38920,
1014,
616,
2746,
13698,
281,
3037,
281,
6635,
24102,
5277,
1491,
670,
1182,
253,
2022,
3064,
875,
253,
767,
7274,
31951,
275,
253,
958,
326,
616,
17947,
3646,
310,
10166,
281,
671,
22950,
253,
10921,
717,
891,
3451,
588,
616,
11333,
11089,
432,
253,
8789,
1895,
5393,
275,
2593,
7609,
50275,
37585,
5701,
13989,
337,
812,
368,
5513,
275,
625,
4278,
253,
878,
323,
21651,
351,
5755,
619,
4685,
432,
253,
30762,
310,
326,
368,
943,
320,
2104,
281,
6635,
512,
1896,
24102,
281,
452,
326,
268,
91,
893,
489,
37755,
50276,
71,
7873,
91,
1906,
2299,
368,
651,
878,
281,
7826,
7767,
17947,
387,
253,
1268,
273,
5231,
436,
476,
320,
2797,
407,
247,
3632,
3646,
717,
891,
3451,
50276,
74,
1158,
14168,
4482,
36812,
948,
268,
1906,
310,
5816,
275,
253,
806,
285,
1273,
7424,
273,
253,
4737,
273,
13989,
337,
50276,
5658,
5393,
326,
368,
812,
5386,
253,
21651,
351,
5755,
9376,
407,
3629,
253,
1180,
273,
17947,
13305,
812,
368,
19148,
436,
6197,
253,
1180,
273,
17947,
13305,
310,
247,
1307,
326,
1057,
417,
3176,
275,
634,
1655,
1783,
1580,
253,
14720,
3133,
281,
320,
275,
15355,
50276,
74,
42126,
2451,
253,
30762,
260,
19,
5474,
339,
793,
360,
3454,
436,
2929,
23970,
7156,
247,
1313,
7694,
2746,
326,
34430,
1868,
17947,
432,
30211,
271,
30211,
3646,
33772,
281,
22950,
23267,
326,
403,
27039,
327,
271,
32049,
326,
33772,
4836,
4623,
1491,
840,
271,
17947,
3646,
33772,
281,
4822,
941,
326,
11903,
4219,
253,
15577,
1491,
875,
253,
32049,
285,
14859,
3054,
253,
789,
310,
2429,
1411,
2709,
1666,
25379,
275,
2969,
8892,
50275,
1189,
455,
891,
9644,
4404,
18738,
253,
2929,
2167,
891,
717,
417,
347,
7615,
342,
253,
1313,
7694,
6239,
281,
452,
1199,
273,
271,
8191,
4743,
670,
752,
4623,
49602,
390,
7274,
403,
253,
2929,
369,
973,
15720,
285,
973,
24013,
8550,
285,
1223,
253,
4679,
497,
2969,
4455,
281,
6780,
253,
3237,
326,
253,
2929,
369,
15974,
352,
2789,
3282,
281,
4858,
562,
17947,
285,
30211,
285,
891,
14109,
253,
11250,
273,
8892,
326,
6518,
41509,
436,
1127,
33810,
253,
2929,
3400,
247,
10527,
1783,
273,
7156,
4645,
326,
253,
34430,
6216,
3646,
11903,
4219,
6548,
2127,
285,
4373,
22041,
403,
2530,
285,
253,
2929,
3133,
281,
320,
41374,
50274,
74,
513,
1158,
326,
253,
2929,
943,
452,
625,
5955,
285,
7103,
689,
7274,
326,
4388,
281,
11120,
2953,
253,
17947,
30211,
1895,
253,
2929,
760,
19401,
17947,
275,
253,
3634,
273,
5148,
613,
920,
533,
273,
2282,
17947,
310,
247,
4275,
1895,
275,
391,
77,
285,
2067,
7274,
452,
5421,
352,
3345,
273,
1313,
7694,
253,
2929,
651,
320,
5520,
407,
16585,
824,
7274,
323,
1650,
15276,
23267,
824,
347,
49952,
337,
390,
9326,
374,
285,
263,
16344,
849,
973,
841,
7274,
7277,
281,
7156,
672,
10166,
3815,
285,
5678,
342,
26724,
11333,
50274,
74,
671,
651,
452,
10490,
281,
923,
625,
16774,
1783,
689,
253,
17947,
3646,
1146,
6311,
407,
3376,
37755,
50274,
34974,
50275,
18,
849,
369,
253,
10027,
2281,
323,
299,
4277,
6777,
275,
4677,
495,
849,
651,
247,
3646,
342,
247,
4229,
10027,
2281,
1347,
50274,
19,
891,
513,
417,
3240,
2096,
849,
24102,
432,
253,
17947,
3646,
476,
320,
908,
28961,
1598,
342,
253,
2349,
351,
723,
1182,
672,
43867,
715,
8483,
1945,
812,
253,
4477,
2085,
625,
16039,
715,
436,
50275,
18,
247,
27998,
17487,
1342,
5556,
1319,
8063,
16248,
10921,
11903,
1320,
285,
49952,
458,
487,
41717,
1162,
355,
50274,
19,
24536,
17477,
17947,
407,
1881,
35421,
10554,
1854,
518,
1162,
355,
2490,
187,
4118,
18435,
27,
783,
4477,
31637,
1142,
273,
391,
18,
285,
391,
21,
84,
7350,
533,
627,
497,
1774,
5780,
7350,
5001,
253,
9759,
50276,
251,
253,
6627,
1930,
253,
2746,
310,
4460,
285,
253,
5661,
1543,
403,
4891,
50275,
35529,
253,
2022,
1127,
5439,
407,
391,
18,
310,
253,
29713,
875,
253,
14511,
285,
253,
3762,
285,
253,
4588,
5933,
285,
1543,
690,
40924,
1868,
273,
436,
29713,
2486,
50276,
856,
3321,
337,
310,
8058,
672,
29331,
4566,
281,
470,
534,
310,
1620,
5393,
275,
253,
2022,
2929,
581,
556,
281,
1007,
715,
253,
14801,
1271,
281,
452,
247,
5955,
273,
29331,
285,
273,
253,
5933,
1223,
253,
4477,
812,
4518,
5513,
326,
672,
16585,
253,
10527,
906,
253,
2879,
5955,
327,
25184,
273,
29331,
1754,
327,
30762,
299,
1057,
417,
1361,
984,
253,
13757,
1895,
310,
3542,
347,
762,
253,
7658,
326,
50276,
261,
11903,
1025,
534,
310,
417,
3782,
2590,
50275,
3062,
3839,
253,
10527,
906,
1690,
24088,
253,
9376,
273,
21651,
351,
5755,
812,
320,
5469,
625,
10534,
275,
2426,
273,
752,
310,
2686,
2218,
275,
253,
5933,
285,
253,
4679,
50276,
284,
4767,
407,
391,
20,
1142,
1774,
7794,
273,
253,
3082,
285,
253,
5661,
9978,
403,
760,
2130,
275,
14801,
1271,
534,
2789,
352,
2834,
281,
2096,
253,
22620,
285,
3910,
275,
5661,
7241,
875,
253,
8514,
624,
2929,
285,
391,
77,
19,
27887,
77,
285,
1395,
50276,
262,
310,
12744,
752,
629,
273,
7156,
310,
4619,
323,
3045,
627,
310,
642,
11080,
28913,
1263,
4543,
5955,
273,
253,
6349,
273,
253,
1491,
3673,
44856,
1307,
285,
253,
760,
2625,
1677,
275,
4677,
608,
310,
326,
352,
310,
4619,
253,
4477,
812,
19148,
253,
767,
7794,
34430,
4906,
285,
1491,
3673,
44856,
50274,
9088,
369,
690,
5955,
670,
436,
2929,
533,
1014,
762,
253,
9376,
326,
253,
4477,
9577,
954,
391,
20,
84,
7350,
391,
20,
42126,
11377,
275,
11985,
253,
2929,
310,
1335,
45210,
275,
253,
990,
627,
369,
1652,
1329,
323,
14924,
984,
273,
253,
9759,
3374,
1840
] |
[attention_mask: array of 1s omitted]
[labels: token ID array omitted]
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper as the name suggests tries to figure out whether and through which ways bpe can affect transformers memorization capacity it evaluates transformers memorization under these 4 settings memorizing random synthetic data memorizing random labeled natural language data recognizing training data with lower output entropy and training data recovery qa experiments show that if we have more merge times for bpe larger vocabulary size the models performance on all the four settings would improve on 3 architectures which shows that bpe indeed can affect the models memorization capacity larger the vocabulary size is the better the model would perform on memorizing then the paper tries to figure out why more merges in bpe would lead to better memorization by excluding two other hypotheses the paper concludes that more merges in bpe results in shorter input sequence and that makes memorizing easier for transformers i think this paper presents a thorough evaluation on how bep affects transformers memorization my main concern is that their conclusion shorter input sequences lead to better memorization is drawn a bit hastily the paper draws this conclusion indirectly by disproving the other two assumptions that they gave however are the 3 assumptions the only possible candidate theories or are there ways to directly prove that it is the sequence length that makes a difference one possible experiment can be in the experiment of training data recovery you train two different language models with the same data set while for one you cut the long input sentences into multiple short sequences or concatenate sentences to make the input sequence longer for the other you train the language model with the original data set thorough empirical study on whether bpe affects memorization in transformers while the final conclusion shorter input sequences lead to better memorization is not well supported by experiments docsepthis paper studies the memorization properties of an nlp models conditioned on how large is the vocabulary size they concentrate on widely used bpe algorithm in order to split original data into subword units further they construct a testbed consisting on several tasks where each task is related with a specific memorization aspect connected to the model such as capacity or preference which authors introduced themself experiments empirically validate the change of memorization properties as the vocabulary size is changing where it exhibits better memorization with larger vocab size further authors make multiple hypotheses of what might be the underlying explanation hidden behind the improved memorization after checking these out they conjecture that the reduced sequence length is likely the major contributor explaining the underlying observation this work is placed on the border for me as it is investigating well crafted question and does some empirical validation of multiple hypotheses on the other hand there is not much of discussion on what can we do with long sequences when we require memorization authors provide very highlevel paths such as our findings can provide guidance for tuning the learned data structures but i would be very curious to read more details i think it would nicely fit as findings paper rather than work with the novel contribution when it comes to strengths i like the construction of memorization properties which allows to quantify the memorization by using the proposed probing tasks in addition authors tried to provide many details on specific hyperparameters they used 
although i left some comments below where more details would be welcome regarding weaknesses while the memorization properties are clearly outlined the construction of model parameterization in probing tasks is written on very highlevel without any discussion of why this specific method of casting the model as classifier is chosen and how it may or may not provide necessary information towards research questions about memorization this is mainly addressing section 33 in addition authors mention sota state of the art several times during the work while they do not much relate to the actual sota models please correct me here if i am wrong but considered models are not in the same ballpark as sota models i want to stress that the latter fact is not a bad thing but then there is no need to relate with sota then below i have noted multiple comments where it was hard to me to get all the details or where i felt the minor change addon would improve the presentation page 3 at testtime we prompt the trained lm by a query followed with the separator token and check whether the trained lm reproduces the correct answer within top1 or top5 results returned by beam search beam size 5 this is very specific design choice why did you choose it model conditioned on the q gives the entire distribution on sequencelevel and beam search is very sensitive to the beam size i see an alternative as doing unbiased sampling and computing expected similarity with the answer some discussion on that would help to clarify this bottom of page 3 you used word setups to refer to both tasks and models right which is a bit confusing section 33 as i wrote above in my opinion this section needs to be larger as it defines very important design of the model parameterization right now it is not even clear enough what are target class in the context of vocabulary tokens i believe explaining this would improve the presentation of that part a lot page 5 achieves 100 train accuracy in cases of mlm was acc measured only on task class prediction or on other masked tokens as well page 5 hence in this experiment we used a 1layer could you explain why you chose to reduce the model size in my understanding the task itself might be too easy so the other way is to alter the underlying task since you specifically construct it page 5 which was enough for the training loss to converge converge to what was there an early stopping criteria on loss or acc page 8 thus if the vocabulary growth alone can explain the observed effect we will see that the datasets with duplicated tokens would also be easier to memorize in your proposed construction the frequency distribution of tokens doesnt change relatively to each other right is it the same for bpe learning when number of merges is growing would be great to show this otherwise it is hard to buy this statement i set the score as marginally below the threshold given my concerns above where the main one is that the overall conclusions represents empirical findings without much of a contribution towards specific takeaways or practices which would be handy for the community nonetheless i am open to discuss how this work can be positioned in this conference if there will be strong positive feedback from others docsepthis paper investigates the impact of subword vocabulary size on the memorization ability of transformer models the authors designed three types of tasks to evaluate the changes of model memorization namely learning mappings with random labels membership inference and training data recovery 
experimental results show that the memorization ability of transformer model is stronger with the increase of subword vocabulary size which the authors attribute to the reduction in the sequences length pros 1 the experiments are comprehensive the authors design three different types of experiments to thoroughly compare the memorization ability differences between three different models encoder lm mlm 2 the paper explores the relationship between the memorization and generalization capabilities of the model and shows that generalization is not directly at odds with memorization which can inspire future model design cons 1 the authors focus on three candidate causes which the authors claimed are principal sideeffects of applying bpe i removing redundancy in the data due to compression ii increase in the number of the unique units used to represent the data or iii reducing the length of the training sequences other factors that are closely related to the bpe vocabulary may be also important such as the subword frequency can the authors explain why the three factors examined in this paper are more likely the causes than other factors eg subword frequency 2 although the paper presents several interesting conclusions they can not be directly used to determine a suitable vocabulary size in practical tasks where the demand of the models memorization is unknowable concerns 1 the conclusions in this study are only validated on the snli dataset paq is only used for the training data recovery experiment which may threat the universality of the findings experiments on other datasets and tasks eg the machine translation task where transformer was invented are necessary the research questions explored in the work are very interesting and important however the experiments are relatively weak and it is not clear how to apply their findings in practical tasks
### Summary:
this paper investigates the role of bpe and vocabulary sizes in memorization in transformer models through a series of experiments on random label prediction training data recovery and membership inference attacks the paper shows that larger vocabulary sizes lead to improved memorization the reviewers all agree that the paper investigates an important question and does so thoroughly the main concerns were about 1 the validity of the conclusion that it is sequence length indeed which affects memorization and 2 the lack of more tasks to validate the findings for 1 the authors added another set of experiments which further rule out frequency effects as a factor but i agree with reviewer kazc that more evidence is needed which directly shows that sequence length is responsible eg are shorter paq questions memorized better for 2 the authors shared a google drive link with additional results on nmt after the deadline which the reviewers appreciated overall however the paper needs more work in order to unify all these results in a single draft
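The reviews and summary above repeatedly tie the number of BPE merges (vocabulary size) to the length of the tokenized sequences. As a minimal editorial illustration of that relationship — not code from the paper under review — the sketch below trains two BPE tokenizers with different vocabulary sizes on the same toy corpus using the Hugging Face `tokenizers` library and compares the average encoded length; the corpus contents and vocabulary sizes are arbitrary placeholders.

```python
# Minimal sketch: larger BPE vocabularies (more merges) tend to yield shorter token sequences.
# Assumes the Hugging Face `tokenizers` package; the corpus below is a stand-in, not real data.
from tokenizers import Tokenizer, models, trainers, pre_tokenizers

corpus = [
    "the model memorizes random labels when capacity is large",
    "byte pair encoding merges frequent symbol pairs into subwords",
    "shorter input sequences may be easier for a transformer to memorize",
] * 100  # repeat so the BPE trainer has enough pair statistics

def train_bpe(vocab_size):
    tok = Tokenizer(models.BPE(unk_token="[UNK]"))
    tok.pre_tokenizer = pre_tokenizers.Whitespace()
    trainer = trainers.BpeTrainer(vocab_size=vocab_size, special_tokens=["[UNK]"])
    tok.train_from_iterator(corpus, trainer)
    return tok

for vocab_size in (60, 300):  # few merges vs. many merges
    tok = train_bpe(vocab_size)
    lengths = [len(tok.encode(s).ids) for s in corpus[:3]]
    print(f"vocab={vocab_size:4d}  avg tokens per sentence={sum(lengths) / len(lengths):.1f}")
```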
[input_ids: token ID array omitted]
[attention_mask: array of 1s omitted]
[labels: token ID array omitted]
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper proposes an algorithm to solve pdes thats agnostic to the input mesh precisely the proposed method follows the encoderinterpolationforcast pipeline this pipeline will first extract features for each of the sampled mesh points then these features will be passed into the interpolator to create features for the query points the query points features will passed through the forcaster to make the final prediction the experimental results show that magnet is capable of performing un par with baselines such as fno and mpnn the results also show the proposed interpolation in feature space does lead to improvement the results also show that the model is able to test on a mesh resolution thats different from the trained mesh resolution strengths i think the main strength of the proposed idea is that they did interpolation and interaction in a learned feature space instead of in the physics space directory this allows the encoder and the interpolator to work together to make a potentially better interpolation schema which is agnostic to the sampling of the mesh vertices the ablation studies seem to suggest that the learned interpolation is indeed bringing some advantages per figure 4a i suspect such advantage will be larger ie comparing to cubic interpolation on the original frames when it comes to more irregular input meshes weakness 1 is the model actually mesh agnostic my understanding of meshagnostic is that if both meshes represent the same geometry then the method should output the same results the experiments seem to show that if two meshes are representing the same geometry then the method will output equally accurate results which is a weaker guarantee its not clear to me that the encoder architecture will provide exactly the same feature if the input mesh is slightly changed 2 how well is the method robust to irregular mesh one major concern i have toward this method is that if the input mesh is sample in a extremely nonuniform way which seems to be the case for many physics problem as higher numerical precision will be required in one place than the other this will require the interpolation to work with different descretization density which might potentially lead to interpolation error the experiment regarding irregular mesh is in section 42 there is very little description about how the irregular mesh is sampled or visualization also to the bet of my understanding its only tested on the irregular mesh but not using irregular mesh as input i suspect this also seems to be related to the limitation of the ablation study on interpolation schema where the difference between the cubic interpolation is not much larer than that of the feature space interpolation more clarification toward this point will be helpful the limitation is properly addressed in section 5 docsepthis paper introduces an architecture for learning meshagnostic neural pde solvers the architecture consists of encoding interpolation and forecasting trained via interpolation loss and forecasting loss the authors show in a 1d pde dataset it outperforms stateoftheart model mpnn and fno most of the time in terms of superresolution on regular and irregular meshes the authors also investigated how different aspects of the model influences the performance strengths the paper tackles an important problem of learning neural pde solvers which go beyond the resolution it is trained on the paper is written clearly the proposed method makes sense weaknesses the most important weakness in this paper is its limited evaluation 
which only evaluates on 1d pdes since the papers main contribution is mesh agnosticism we should see experiments showing how the method performs on 2d meshes which is the standard understanding of a mesh in 1d a mesh reduces to connected line segments compared with 1d a 2d mesh introduces many new aspects of difficulty such as more diversity and variations in the shape of the cells which leads to more complex interpolations etc the results in 1d may not generalize to 2d i would expect experiments on 2d meshes both on a regular grid and on an irregular mesh to see how the model compares with sota eg using a 2d experiment where fno or mpnnmeshgraphnet is evaluated but evaluated on superresolution also in 2d where problems are more difficult the strength of the method compared to baselines may be better exhibited update the authors have addressed my main concerns in the revision which adds experiments in 2d in the 2d experiments the mpnn still outperforms the proposed method in fig 15 in all cases which may not show the relative strength of the proposed method in light of the above i have updated my rating from 4 to 5 the main limitation is indicated in the above weaknesses part docsep1 summary the authors propose a pde solver method based on implicit neural representations addressing the general limitation of generic numerical pde solvers that assume the underlying pde is discretized and solve it with the finite element method more specifically the proposed method lifts such assumptions while increasing prediction accuracy and resolution over generic numerical pde solvers the proposed method uses a graph neural network to infer the next state of the system in addition to a physicsbased inductive bias in its loss term to constrain the neural network behavior by encouraging it to produce more physicallyaccurate results the experiments show that the proposed method is able to estimate a spatially continuous solution of the underlying pde up to 250 frames with physicallyconsistent accuracy 2 paper strengths the proposed pde solver can generalize to pdes with higher resolution than the pdes it was trained on the experiments show that the proposed method performs better than or similarly to sota methods 3 paper weaknesses note that some of the weaknesses below are not meant to suggest that the authors should address them during the discussion period in order to change my opinion this is because i understand that addressing some of the weaknesses might take a long time or might require significant changes to the proposed method although addressing any of these might change my judgment about the work for the time being the authors may prioritize addressing questions the authors start the paper by talking about climate forecasting as a challenging highresolution problem but their model is far from handling anything related to weather forecasting and their experiments also do not include anything for climate forecasting 4 additional comments related work the related work section is written a bit mechanistically and does not highlight some of the distinctive differences of the prior methods in contrast to the authors proposed method results it would be good if the authors could provide a summary of the results at the beginning of this section i am personally hesitant to question societal impacts of any scientific discovery that is not purposefully aiming to promote certain viewsdirections that directly and objectively harm society this is because i believe in the longterm this limits the natural evolution of discovery of the truth similar to how protection of the free expression principle leads to democracy arguably any genuine scientific discovery could be eventually used for negative purposes therefore i do not think the reviewers subjective opinion should be used in any way as an objective to decide the direction of scientific discoveries regardless i do not think the proposed method would have any direct negative impact on society
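Editor's note: the encode, interpolate, forecast pipeline discussed in the reviews above can be illustrated with a short sketch. This is not the authors' implementation: it assumes a plain MLP encoder instead of the paper's graph network, uses simple inverse-distance weighting for the feature-space interpolation, and all layer sizes and names are illustrative; the only point it shows is how predictions at query points are decoupled from the sampled mesh points.

```python
# Minimal sketch of an encode -> interpolate -> forecast pipeline for a 1d pde.
# Assumes torch; the MLP encoder, inverse-distance weights and the MLP
# forecaster are illustrative choices, not the paper's exact architecture.
import torch
import torch.nn as nn

class EncodeInterpolateForecast(nn.Module):
    def __init__(self, state_dim=1, hidden=64):
        super().__init__()
        # encoder: per-node feature from (coordinate, current state)
        self.encoder = nn.Sequential(nn.Linear(1 + state_dim, hidden), nn.ReLU(),
                                     nn.Linear(hidden, hidden))
        # forecaster: next-step state at a query point from its interpolated feature
        self.forecaster = nn.Sequential(nn.Linear(hidden, hidden), nn.ReLU(),
                                        nn.Linear(hidden, state_dim))

    def interpolate(self, x_mesh, z_mesh, x_query, eps=1e-6):
        # interpolation happens in feature space: each query feature is a
        # distance-weighted average of the sampled-node features
        d = torch.cdist(x_query, x_mesh)              # (q, n) pairwise distances
        w = 1.0 / (d + eps)
        w = w / w.sum(dim=1, keepdim=True)            # normalize weights per query
        return w @ z_mesh                             # (q, hidden)

    def forward(self, x_mesh, u_mesh, x_query):
        z_mesh = self.encoder(torch.cat([x_mesh, u_mesh], dim=-1))
        z_query = self.interpolate(x_mesh, z_mesh, x_query)
        return self.forecaster(z_query)               # predicted next state at queries

# usage: 40 sampled mesh points, predictions requested on a finer 200-point grid
x_mesh = torch.rand(40, 1)
u_mesh = torch.sin(2 * torch.pi * x_mesh)
x_query = torch.linspace(0, 1, 200).unsqueeze(-1)
model = EncodeInterpolateForecast()
u_next = model(x_mesh, u_mesh, x_query)               # shape (200, 1)
```

Because the interpolation acts on learned features rather than on the physical field, the same trained model can be queried at any resolution, which is the property the reviewers probe when they ask how robust this is to irregular sampling.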
### Summary:
|
the paper proposes an architecture that maps from an input function to an output function and that can handle unstructured meshes in a set of extensive experiments the effectiveness and robustness of the model are shown as in other models like the fourier neural operator the pde itself is not present in the loss so the solution has to be used with caution however the overall design and presentation are impressive and there is general agreement among the reviewers about the importance of the work
|
[input_ids: token-id list omitted (tokenized copy of the review and summary text above)] |
[attention_mask: list of all 1s omitted] |
[labels: token-id list omitted (duplicate of the tokenized text above)] |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper suggests a gradient averaging strategy for distributed adaptive optimization gradient compression is used for reducing the communication costs in transmission of gradients and the tool of error feedback is used for correcting the bias injected by the compression step the convergence analysis results demonstrate that the proposed strategy has a linear speedup as the number of workers increases while keeping the same convergence rate as standard amsgrad the experimental results on realworld datasets successfully validate the theoretical results strengths the topic considering communication efficiency for adaptive distributed optimization is of timeliness and importance theoretical results look promising and the numerical results efficiently support the validity of the theoretical results in practice the paper is wellorganized and contains the introduction of related studies with fairly broad coverage weaknesses minor comments are there any systematic ways to choose the parameters beta1 beta2 epsilon in the proposed algorithm algorithm 2 page 4 in section 4 5 → in sections 4 and 5 how do we define the data heterogeneity in page 2 does this mean different chii on each local node it would be better if the authors could clearly state this page 6 under assumption 1 to assumption 4 → under assumptions 1 to 4 overall i believe that this paper could be a meaningful addition to the theories behind distributed optimization numerical results also look solid enough to support the theoretical results docsepthe paper proposes a new distributed optimization framework compams based on gradient averaging and compression with a convergence guarantee in particular theoretical discussions have shown that the proposed algorithm shares the same convergence rate as amsgrad with a linear speedup effect numerical experiments on several realworld data sets have demonstrated that the proposed method can significantly reduce the communication costs while reaching comparable performance with fullprecision amsgrad major strengths are listed as follows 1 the paper is easy to follow and provides sufficient background on compressed gradients and adaptive optimization comparisons with two related works are important and clear 2 the proposed errorfeedback strategy is interesting in the distributed optimization scenario 3 empirical results on some benchmark data sets are supportive and convincing in addition there are several minor issues that the authors could address to improve the clarity and quality of the paper 1 in sec 22 it seems there is no problem description for algorithm 1 which may cause confusion for firsttime readers the paragraph in the beginning of sec 3 could be moved here or even before sec 22 also the parameter space could be explained or detailed with an example 2 in sec 22 different learning rate → different learning rates elementwisely → elementwise previous gradient magnitude → previous gradient magnitudes 3 in definition 2 signxmathcalb1 is not defined 4 in theorem 1 under algorithm 1 to algorithm 4 → under algorithms 1 to 4 overall the proposed work is important and interesting and the paper deserves publication the compressed gradient averaging technique with error feedback and its related convergence discussions could shed light on other related works docsepthe authors propose the compams algorithm for a distributed optimization framework the algorithm is based on gradient averaging and adaptive algorithms the application of gradient compression helps to reduce the communication complexity and the tool of error feedback is used for
the bias correction the authors study the convergence rate of the proposed algorithm the theoretical results are justified by the numerical experiments strength the authors extend the adaptive optimization framework to distributed approach with a compressed gradient averaging the convergence analysis implies a linear speedup in terms of the number of workers and shows that in the singlemachine case it can achieve the same convergence rate as the standard fullgradient sgd the paper is wellwritten and easy to follow weakness i have some concerns as follows 1 the authors do not assume the finitesum of fitheta to be convex then how to avoid converging to a local optimal 2 if data heterogeneity exists for local workers is it possible that the local gradient may drag others back when taking the gradient averaging in the central server 3 in algorithm 2 how to determine parameters beta1 beta2 and k is it possible that k differs for different local workers 4 in the algorithm when applying ef technique to reduce the bias each local worker still needs extra storage for the error term et which is the same dimension as the local gradient why such an approach can be efficient in terms of the memory space when training largescale learners 5 in the experiments how many workers are considered since data samples are randomly assigned to the workers it can be regarded as the nondata heterogeneous case it could be better if the authors can add the results for cases with data heterogeneity eg workers with different sample sizes different class distributions 6 there seem to be some typos for example in algorithm 1 line 8 i suppose the numerator should be mt instead of thetat in theorem 1 corollary 1 and 2 what is c2 in the denominator of the constrain for eta i believe the paper has its value in extending the adaptive optimization framework to distributed approach with compressed gradient averaging i mainly have concerns about the assumption on convexity and the impact on the performance brought by data heterogeneity
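Editor's note: the compressed gradient averaging with error feedback that the reviews describe can be sketched as follows. This is a toy single-process simulation, not the paper's compams algorithm: the top-k compressor, the amsgrad-style server update and all constants (k, beta1, beta2, epsilon, learning rate, problem size) are illustrative assumptions; it only shows where the per-worker error term is stored and re-injected.

```python
# Toy numpy sketch of the error-feedback idea: each worker compresses
# (gradient + accumulated error), transmits the compressed vector, and keeps
# the residual locally; the server averages and takes an amsgrad-like step.
import numpy as np

def topk_compress(v, k):
    out = np.zeros_like(v)
    idx = np.argsort(np.abs(v))[-k:]       # keep the k largest-magnitude entries
    out[idx] = v[idx]
    return out

def compams_like_step(grads, errors, state, lr=0.01, k=2,
                      beta1=0.9, beta2=0.999, eps=1e-8):
    sent = []
    for i, g in enumerate(grads):
        corrected = g + errors[i]           # error feedback: re-inject past residual
        c = topk_compress(corrected, k)
        errors[i] = corrected - c           # store what was not transmitted
        sent.append(c)
    avg = np.mean(sent, axis=0)             # server-side averaging
    state["m"] = beta1 * state["m"] + (1 - beta1) * avg
    state["v"] = beta2 * state["v"] + (1 - beta2) * avg**2
    state["vhat"] = np.maximum(state["vhat"], state["v"])   # amsgrad max step
    return -lr * state["m"] / (np.sqrt(state["vhat"]) + eps)

dim, workers = 10, 4
errors = [np.zeros(dim) for _ in range(workers)]
state = {"m": np.zeros(dim), "v": np.zeros(dim), "vhat": np.zeros(dim)}
theta = np.ones(dim)
for _ in range(5):
    # quadratic toy objective with per-worker gradient noise
    grads = [2 * theta + 0.1 * np.random.randn(dim) for _ in range(workers)]
    theta += compams_like_step(grads, errors, state)
```

The residual stored in errors[i] is what the reviewer's memory question refers to: it has the same dimension as the local gradient, so the saving is in communication rather than in worker-side storage.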
### Summary:
|
the paper considers the setting of distributed optimization and proposes an adaptive gradient averaging and compression scheme to reduce the communication cost the proposed scheme is shown to achieve the same convergence rate as the fullgradient amsgrad algorithm but due to the reduced cost it exhibits a linear speedup as the number of workers grows the reviewers appreciated the clear presentation of the results the technical soundness and the convincing numerical experiments the paper is a solid contribution to distributed optimization thus i recommend acceptance
|
[input_ids: token-id list omitted (tokenized copy of the review and summary text above)] |
[attention_mask: list of all 1s omitted] |
[labels: token-id list omitted (duplicate of the tokenized text above)] |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the paper provides an explanation of the phenomenon of sgd selecting flat minima by relating the noise structure of sgd to its linear stability in overparameterized models with the square loss it shows exploiting the geometry awareness of sgd noise that is provable for linear networks and rfms that the hessian of an accessible minimum for sgd is bounded by a term depending only on the batch size and the learning rate strengths the topic of study is relevant the quantitative analysis is extensive going from simple models such as rfms and linear models to modern deep neural networks the paper is well written and clear the role of overparameterization and the specific loss type quadratic are not discussed potential negative societal impacts are not discussed docsepthe authors of the paper present the following contributions they provide an indepth study of the importance of the sgd noise both in terms of geometry and in terms of scale they show that if a certain alignment property is satisfied a global minimum is linearly stable if and only if the frobenius norm of the hessian at the optimum is upper bounded by a constant independent of model and sample size strengths the first thing i have to say is that the paper is very well written the exposition is clearly conducted and all assumptions and estimations are detailed announced and commented one may or may not validate the authors noise model but at least it is not hidden as too often and keys are given to appreciate the results the main strength and difference of the article is the particular attention given to the noise model both in terms of noise and of geometry from this even if the results seem easy to prove the exposition is crystal clear and more convincing than the previous literature weaknesses there are a lot of results and approximations stated in the article but overall the crux of the paper is to show that the approximations of equation 2 at the beginning of the paper are valid that is to say that either alpha beta or mu are close to 1 if figure 1 is pretty convincing from this point of view i have to say that figure 5 is not as the scale of these constants can be way smaller than 1 i would really appreciate if the authors comment more on this point because the sentence this comparison suggests that the alignment strength significantly depends on the intrinsic complexity of the problem nearly independent of the model size is not very convincing this is a minor weakness but overall even if the paper is not overselling and is clear about their study i am still not convinced by the stability argument of sgd indeed one never sees plots like figure 4 in real training dynamics of neural networks and this suggests at least to me that taking into account the noise is not a stability issue but a dynamical one maybe the authors could comment a bit on this fact minor typos line 65 lambdai and not lambda1 line 227 in the current paper lines after 235 there is a confusion between v and nu i use this paragraph to conclude as i already discussed the limitations in the previous boxes the paper tries to conduct both theoretical and experimental explanations of the wide minima selection of sgd ill weakly accept the paper for its clarity and detail about the noise to raise my score i would like the authors to justify more the fact that the escaping phenomenon is important and the alignment phenomenon of the covariance structures docsepdeep learning folklore holds that the gradient noise in sgd causes it to prefer flat minima an important open question in
deep learning theory is to make this folklore mathematically precise taking a step in that direction this paper describes a mechanism by which sgd escapes exponentially quickly from minima where the frobenius norm of the hessian is too large this mechanism is orthogonal to the curvaturedriven process that causes fullbatch gradient descent to escape from minima where the hessian spectral norm exceeds 2 divided by the step size strengths to the best of my knowledge the idea of an exponentially fast escape that is driven purely by noise is novel it is interesting that a required condition for this phenomenon noise magnitude proportional to loss value is provably satisfied in linear nets and random feature models weaknesses the analysis only yields a sufficient condition for instability not a necessaryandsufficient condition while in general the stability will depend on both the fullbatch component and the noise component the analysis here considers only the noise component in isolation accounting for both simultaneously is going to be hard so i dont begrudge the authors for this simplification i think the paper would be clearer if the authors first presented the escape analysis section 3 before the sufficient conditions section 2 when i read section 2 i spent a lot of time scratching my head wondering why alpha beta and mu are defined the way they are later when i got to section 3 i realized that mu is precisely what is needed to trigger exponentially fast escape yes the authors adequately addressed the limitations
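Editor's note: the stability mechanism the reviews discuss can be made concrete with a small check. This assumes a 1d quadratic loss and a loss-proportional noise model in the spirit of the reviewed paper's linear-network and rfm settings, but the constants and the resulting threshold are illustrative, not the paper's bound; the point is only that the sgd noise term tightens the stability condition beyond the full-batch 2 / (learning rate) rule.

```python
# Small numpy check of the second-moment (linear) stability factor for a 1d
# quadratic loss 0.5 * a * x**2 with loss-proportional sgd noise of variance
# a**2 * x**2 / batch. Illustrative of the mechanism, not the paper's constants.
import numpy as np

def second_moment_factor(curvature, lr, batch):
    # E[x_{t+1}^2] = factor * E[x_t^2]; the minimum is linearly stable for sgd
    # (in the second-moment sense) only if this factor is <= 1
    mean_part = (1.0 - lr * curvature) ** 2        # full-batch gd contraction
    noise_part = (lr * curvature) ** 2 / batch     # extra term from sgd noise
    return mean_part + noise_part

lr, batch = 0.1, 4
for a in [1.0, 10.0, 18.0, 25.0]:
    f = second_moment_factor(a, lr, batch)
    print(f"curvature {a:>5}: factor {f:5.2f} ->",
          "stable for sgd" if f <= 1 else "escaped by sgd",
          "(gd alone stable)" if lr * a < 2 else "(gd alone unstable)")
```

With these illustrative values the minimum with curvature 18 is still stable for full-batch gradient descent (lr times curvature is below 2) yet is escaped by sgd, which is the noise-driven, curvature-orthogonal effect the last review highlights.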
### Summary:
|
the paper investigates an important topic of why sgd converges to flat minima overall the reviewers felt that this is a nicely written paper with a nice contribution to the state of the art
|
[input_ids: token-id list omitted (tokenized copy of the review and summary text above)] |
[attention_mask: list of all 1s omitted] |
[
30003,
310,
1677,
2278,
273,
247,
2561,
2929,
432,
260,
2369,
1793,
6698,
15,
7764,
3630,
247,
6010,
253,
2278,
15,
187,
4118,
8439,
27,
187,
783,
2929,
3400,
271,
8813,
273,
253,
11562,
273,
256,
35333,
17221,
6507,
46836,
407,
12600,
253,
6046,
2605,
273,
256,
35333,
281,
697,
4872,
7882,
275,
689,
19484,
1025,
3210,
342,
253,
6278,
2957,
352,
2722,
38883,
253,
12087,
11891,
273,
256,
35333,
6046,
326,
310,
872,
494,
323,
4872,
6928,
285,
391,
71,
983,
326,
253,
344,
859,
757,
273,
271,
12482,
5927,
323,
256,
35333,
310,
11542,
407,
247,
1307,
7293,
760,
327,
253,
14604,
1979,
271,
253,
4715,
2281,
4056,
384,
84,
50276,
783,
9400,
273,
1263,
310,
4623,
50276,
783,
11745,
1783,
310,
9470,
1469,
432,
2969,
3210,
331,
391,
71,
983,
285,
4872,
3210,
281,
4980,
3676,
11454,
6928,
50276,
783,
2929,
310,
973,
3542,
285,
2590,
253,
2554,
273,
689,
19484,
1320,
285,
253,
2173,
2957,
1511,
21396,
403,
417,
5469,
50276,
33177,
4016,
38058,
3486,
403,
417,
5469,
5474,
339,
431,
248,
4477,
273,
253,
2929,
1246,
253,
1563,
9021,
50276,
9328,
2085,
271,
801,
554,
394,
1263,
273,
253,
6349,
273,
253,
256,
35333,
6046,
1097,
275,
1307,
273,
12087,
285,
275,
2426,
273,
4311,
50276,
9328,
921,
326,
604,
247,
2176,
12420,
2867,
310,
10048,
247,
4156,
5927,
310,
23352,
6474,
604,
285,
760,
604,
253,
8954,
2275,
249,
3750,
5222,
273,
253,
344,
859,
757,
387,
24571,
310,
5170,
11542,
407,
3638,
3907,
273,
1566,
285,
3410,
1979,
50274,
296,
3755,
20556,
50275,
783,
806,
2181,
891,
452,
281,
1333,
310,
326,
253,
2929,
310,
1077,
973,
3542,
253,
47284,
310,
4518,
5196,
285,
512,
13260,
285,
3311,
569,
403,
7000,
6138,
285,
20503,
581,
778,
390,
778,
417,
17813,
253,
4477,
6046,
1566,
533,
387,
1878,
352,
310,
417,
8763,
347,
1512,
2223,
285,
10149,
403,
1677,
281,
11435,
253,
1543,
50276,
783,
2022,
4757,
285,
3064,
273,
253,
3929,
310,
253,
1798,
4116,
1677,
327,
253,
6046,
1566,
1097,
275,
2426,
273,
6046,
285,
273,
12087,
432,
436,
1014,
604,
253,
1543,
1646,
3477,
281,
5276,
253,
47284,
310,
1531,
382,
267,
2590,
285,
625,
21414,
342,
253,
2045,
6239,
50275,
20881,
1255,
265,
50275,
9088,
403,
247,
2257,
273,
1543,
285,
11193,
4767,
275,
253,
3929,
533,
4583,
253,
5385,
89,
273,
253,
2929,
310,
767,
921,
326,
253,
34754,
273,
253,
5150,
374,
387,
253,
5068,
273,
253,
2929,
310,
3588,
326,
310,
281,
1333,
326,
2057,
9765,
9840,
390,
12910,
403,
2810,
281,
337,
604,
4677,
337,
310,
50276,
38256,
21414,
432,
436,
1127,
273,
1859,
891,
452,
281,
1333,
326,
4677,
608,
310,
417,
347,
253,
4311,
273,
841,
3638,
476,
320,
1039,
4577,
685,
337,
891,
651,
1663,
11435,
604,
253,
4477,
4385,
625,
327,
436,
1127,
984,
253,
6197,
436,
5301,
5936,
326,
253,
12420,
4757,
3012,
7024,
327,
253,
15276,
10454,
273,
253,
1895,
4829,
3907,
273,
253,
1566,
1979,
310,
417,
1077,
21414,
50276,
2520,
310,
247,
5884,
14855,
533,
4583,
1014,
604,
253,
2929,
310,
417,
689,
23708,
285,
2590,
670,
616,
1263,
891,
717,
1335,
417,
13762,
407,
253,
7882,
4154,
273,
256,
35333,
6296,
327,
1620,
923,
14777,
751,
4677,
577,
275,
1524,
3733,
8062,
273,
11454,
6928,
285,
436,
5936,
387,
1878,
281,
479,
326,
253,
3192,
715,
2395,
253,
6046,
310,
417,
247,
7882,
2523,
533,
247,
18525,
581,
5046,
253,
4477,
812,
4385,
247,
2372,
327,
436,
958,
50273,
37585,
963,
993,
50275,
77,
525,
7251,
29331,
74,
285,
417,
29331,
18,
50276,
77,
525,
26472,
275,
253,
1655,
2929,
50276,
77,
525,
84,
846,
23540,
627,
310,
247,
13775,
875,
362,
285,
8794,
50274,
74,
897,
436,
12494,
281,
7525,
347,
891,
2168,
5469,
253,
7364,
327,
253,
2045,
12783,
253,
2929,
14177,
281,
2589,
1097,
10527,
285,
5661,
22909,
273,
253,
4618,
46836,
5438,
273,
256,
35333,
2853,
22112,
2997,
253,
2929,
323,
697,
19843,
285,
2508,
670,
253,
6046,
281,
7164,
619,
4868,
891,
651,
751,
253,
4477,
281,
15249,
625,
253,
958,
326,
253,
34528,
11562,
310,
1774,
285,
253,
12420,
11562,
273,
253,
26677,
5289,
50275,
7152,
33032,
22412,
4715,
6305,
7261,
410,
6556,
326,
253,
11786,
6046,
275,
256,
35333,
5997,
352,
281,
4510,
6507,
46836,
50276,
266,
1774,
1527,
1953,
275,
3676,
4715,
3762,
310,
281,
1056,
436,
6305,
7261,
410,
11076,
1037,
10799,
50276,
29114,
247,
3213,
275,
326,
3884,
436,
2929,
8631,
247,
5122,
407,
534,
256,
35333,
44716,
28596,
4541,
432,
46836,
835,
253,
8954,
7564,
3750,
5222,
273,
253,
344,
859,
757,
310,
1512,
1781,
50276,
2520,
5122,
310,
19627,
281,
253,
15340,
30387,
1069,
257,
1232,
326,
5997,
2120,
23941,
11786,
18499,
281,
8773,
432,
46836,
835,
253,
344,
859,
757,
9879,
5222,
23141,
374,
50276,
10539,
1979,
50276,
296,
3755,
20556,
50273,
936,
253,
1682,
273,
619,
3640,
253,
2934,
273,
271,
28596,
3809,
8773,
326,
310,
8877,
15846,
407,
6046,
310,
4460,
50276,
262,
310,
4722,
326,
247,
2424,
1617,
323,
436,
11562,
6046,
9777,
14495,
281,
2957,
1318,
310,
872,
1598,
10048,
275,
4872,
37507,
285,
3632,
4735,
3210,
50276,
20881,
1255,
265,
50272,
783,
1783,
760,
11026,
247,
4209,
1617,
323,
17620,
417,
247,
3309,
2287,
86,
2276,
1617,
50273,
6050,
275,
2087,
253,
7882,
588,
3469,
327,
1097,
253,
2120,
23941,
4445,
285,
253,
6046,
4445,
253,
1783,
1060,
19401,
760,
253,
6046,
4445,
275,
12940,
50276,
15793,
272,
323,
1097,
10486,
310,
1469,
281,
320,
1892,
594,
891,
13414,
320,
737,
15414,
253,
4477,
323,
436,
8077,
1877,
50272,
74,
1158,
253,
2929,
651,
320,
30909,
604,
253,
4477,
806,
3559,
253,
8773,
1783,
2593,
495,
1078,
253,
4209,
2515,
2593,
374,
50276,
9453,
891,
1239,
2593,
374,
891,
5262,
247,
2257,
273,
673,
47348,
619,
1481,
12371,
2139,
9765,
9840,
285,
12910,
403,
2931,
253,
1039,
597,
403,
50276,
31312,
672,
891,
1694,
281,
2593,
495,
891,
8156,
326,
12910,
310,
10534,
752,
310,
3058,
281,
9632,
28596,
3809,
8773,
50276,
9820,
253,
4477,
18212,
9713,
253,
7364,
2490,
187,
4118,
18435,
27,
783,
2929,
2340,
684,
271,
1774,
9400,
273,
2139,
256,
35333,
26414,
281,
6507,
46836,
4583,
253,
30628,
3543,
326,
436,
310,
247,
23395,
3542,
2929,
342,
247,
5322,
7680,
281,
253,
1375,
273,
253,
1445,
50275
] |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper proposes a general agepath algorithm for various selfpaced regularizers based on solving ordinary differential equations ode prior methods in selfpaced learning spl are prone to be underfitting and sensitive to noise and outliers mostly due to the difficulty of choosing the optimal model age moreover some existing solution path algorithms such as lasso and svm cannot apply to spl due to the optimization difficulty to tackle these issues this paper establishes a theoretical analysis based on the kkt condition and some fair assumptions and proposes a novel algorithm based on solving ode the methodology seems solid and the authors also introduce some guidance in practice strengths this paper proposes an indepth theoretical analysis of agetracking in spl the proposed method incorporates kkt condition and ode to advance the empirical bilevel solution of the prior method leading to more stable sampleefficient and robust optimization weaknesses the main concerns are clarity and experimental evaluation in fact i am confused by some notations and formulations for example 0 cannot belong to another scalar in eqn4 since the righthand side of eqn4 is a scalar secondly the proposed method needs to solve an ode in each iteration as shown in algorithm 2 it is important to discuss the training efficiency as solving an ode in each iteration may be timeconsuming moreover the experimental results in table 2 and 3 are quite misleading the proposed method actually underperforms the baseline methods on many datasets but its results are in bold meanwhile a comparison with the recent sota is also needed not applicable docsepthis paper proposes an agepath algorithm named gaga to tackle general spregularizers in selfpaced learning during the optimization process gaga detects critical points via solving ode experimental results further verify the superiority of gaga over traditional spl paradigms in multiple smallscale datasets strengths 1 this studied problem is interesting and would attract attention from different machine learning topics weaknesses 1 this paper is not easy to follow especially for the method section many variables are not well defined m in eq1 and ir is not mentioned in section 31 different sets in line 160 are also explained with the right equation 2 overall the motivation of the proposed method is not clear 3 the experiment part is not enough to validate the effectiveness of the proposed method since most of these datasets in table 1 are small and easy largescale datasets like dog120 in a are suggested to be used in the experiment a wu xiaoping et al bispl bidirectional selfpaced learning for recognition from web data ieee transactions on image processing 30 2021 65126527 na docsepthis paper tackles the critical problem of how to optimally choose the age parameter and stop the increasing learning process in selfpaced learning spl grounded on the biconvexity of the problem the authors observe and prove that previous spl methodologies are closely connected to the partial optimum of the spl objective function thus they reformulate the spl paradigm as searching the partial optimum with optimal age parameter which is merely related to a single group of variables then a generalized agepath algorithm that exactly traces the path of the partial optimum is proposed with the technique of ordinary differential equations odes despite the general framework two detailed algorithms on svm and lasso are discussed in the paper numerical results demonstrate the correctness and efficiency of the algorithm
strengths 1 the problem that this paper tries to address is significant in the spl but lacks sufficient investigations both in the theoretical and practical aspects before unlike the existing studies the framework derived in this paper is able to address most currently used selfpaced regularizers having explicit closed forms and following definitions in 1 while enjoying theoretical guarantees 2 the paper proposes a rigorous theoretical framework all detailed proofs of all the results are given in the paper except theorem 5 in appendix d the experimental results are also fairly abundant in different views 3 this paper provides a novel and interesting perspective to reconsider the previous spl regime wrt the concept of partial optimum the authors provide various discussions which are supported by both intuitions and theoretical results and dive much deeper beyond the concept itself 4 the technique that utilizes the closedform in alternating optimization to derive the age path is interesting since the problems of optimizing two groups of variables attract more and more focus eg bilevel problems i think this idea is of great potential to be studied further weaknesses 1 it seems that the limitations of the proposed method are not explicitly discussed in the paper 2 the theoretical framework and its derivation look quite complicated to be rigorous and solid and thus somewhat difficult to understand since there are many definitions assumptions as well as reformulations but the specific algorithms on the svm and lasso make up for this part in some sense 3 the proposed framework is based on the earliest spl regime which involves only one hyperparameter hence the application scenarios of gaga are restricted the author should consider the proposed algorithm in more recent spl formulations with more hyperparameters please refer to the above discussions and questions during the authors response period more discussions have been provided in the appendix part docsepthis paper proposes gaga an exact agepath algorithm to tackle general spregularizers for the selfpaced learning spl paradigm the authors build the relationship between the gaga framework and the existing theoretical analysis on spl to provide a more indepth understanding of the principle behind spl the proposed gaga is the first exact agepath algorithm to tackle the general spregularizer for the selfpaced learning spl paradigm the authors build the relationship between the proposed gaga framework and the existing theoretical analysis on spl to provide a more indepth understanding of the principle behind spl the proposed gaga is more computationally efficient than existing methods missing related works about the relation of the work to the general curriculum learning models missing related works and experimental comparison to the general robustness model under label noise missing justifications about many specific parameter choices in the experimental validation not applicable the authors did not provide any limitations or potential negative societal impact of their work
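to make the spl loop these reviews describe more concrete, the following is a minimal illustrative sketch in python, not the paper's gaga implementation: it alternates the closed-form sample-selection step of the hard selfpaced regularizer with a weighted ridge step, then sweeps the age parameter over a grid, which is the naive age path that, per the reviews, the proposed method replaces with an exact ode-traced path of the partial optimum. all names, the squared loss, and the grid of ages are assumptions chosen only for illustration.

```python
import numpy as np

def spl_age_path_sketch(X, y, ages, n_inner=20, ridge=1e-2):
    """Toy self-paced least squares (illustrative only).

    For each value of the age parameter lam, alternate between
    (1) the closed-form selection step of the hard SP-regularizer,
        v_i = 1 if the i-th sample loss is below lam, else 0, and
    (2) a weighted ridge update of the model w on the selected samples.
    Sweeping lam over a grid gives a crude "age path"; an exact path
    algorithm would instead track how (w, v) changes with lam directly.
    """
    n, d = X.shape
    w = np.zeros(d)
    path = []
    for lam in ages:                               # age parameter (loss threshold)
        for _ in range(n_inner):
            losses = (X @ w - y) ** 2              # per-sample losses
            v = (losses < lam).astype(float)       # hard self-paced weights
            A = X.T @ (v[:, None] * X) + ridge * np.eye(d)
            w = np.linalg.solve(A, X.T @ (v * y))  # weighted ridge step
        path.append((lam, w.copy(), v.copy()))
    return path

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = X @ rng.normal(size=5) + 0.1 * rng.normal(size=200)
y[:20] += 5.0                                       # outliers that SPL should exclude
path = spl_age_path_sketch(X, y, ages=np.linspace(0.5, 10.0, 8))
print([(round(lam, 2), int(v.sum())) for lam, _, v in path])
```

the sketch also shows why the reviewers ask about cost: every grid value restarts an inner alternating loop, and replacing the grid with a single traced path is exactly where an ode-based scheme would save work, provided each ode step is itself cheap.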
### Summary:
|
the paper received 4 positive reviews after the rebuttal the technical concerns raised by the reviewers were addressed properly overall this work introduces a challenging and realistic setting that can be of broad interest to the community working on selfpaced learning
|
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
in this work the authors propose to combine two speedup techniques for significantly speeding up the computation of persistence diagrams on large graphs the proposed approach first reduces the graph by strong collapsing every dominated vertex and then further reduces the graph by computing its (k+1)-core when dealing with kdimensional homology the authors prove that such operations preserve the persistence diagram and then provide significant improvements in running times for various graph data sets i think the paper tackles an important question since persistent homology is known to be hard to compute and that its writing is very good and clear the mathematical arguments are simple and easy to follow making the paper mathematically sound at least to me however the paper suffers from two weaknesses first some more efforts have to be put in the positioning although coraltda seems relevant and original it is quite difficult to understand what are the differences between prunit and the method of strong collapses for computing the persistence of flag complexes https://arxiv.org/abs/1809.10945 it really looks like prunit is a mere application of an existing method to graph data second i think the experiments are nice but are not that convincing since all of them show the reduction in the number of vertices and edges but only a few actually display the running time improvements which is the main motivation of this work i think a more extensive comparison of running times is necessary it would be good to discuss the fact that the proposed method is limited to flag complexes built from graphs can coraltda be extended to more general types of graph filtrations docsepthe authors give two methods for data reduction that do not alter the calculations needed for persistent homology the strengths of the paper are that the algorithms are straightforward and very general the weaknesses are that the full computational complexity is not explored it is not clear to me how costly these algorithms are in the preprocessing step yes the authors have addressed limitations docsepthis paper presents a new method for calculating persistence diagrams ie topological descriptors of large graphs in an efficient manner this is accomplished by two fundamental insights the first one being that only specific types of nodes can contribute to highdimensional topological descriptors the second one being that some nodes are essentially redundant for the calculation more precisely the first insight necessitates simplifying the graph to its kcore ie the subgraph induced by all vertices whose degree is at least k this graph is shown to be sufficient for calculating topological descriptors up to dimension k-1 the second insight necessitates removing specific vertices that are dominated by other vertices in terms of a filtration ie an ordering of the nodes in the graph thus reducing the size of the graph experiments demonstrate that the proposed method can often reduce the size of graphs from realworld and benchmark data sets to a substantial extent the main strength of the paper is that it derives a new principled way of reducing the size of graphs that are commonly used as inputs for persistent homology algorithms by demonstrating that certain substructures of the graph can be ignored for higherorder persistent homology calculations the paper potentially opens the door towards a more widespread adoption of these methods in practice the fact that the two pruning techniques proposed in the paper are easy to implement thus being readily
deployable is another strength of the paper in addition i find the writeup easy to follow with some minor issues in terms of clarity but more about that later the main weakness of the paper is that it does not make use of the potential insights afforded by persistent homology when applied to the reduced and pruned graphs while i appreciate that large reduction or compression ratios can be achieved for the graph data sets it would strengthen the paper if the methods were also used in a direct application for instance suppose for the sake of the argument that existing graph learning techniques reach a certain accuracy on one data set and it requires knowledge about higherorder topological features to provide a better classification if the paper could find a case in which previous work did not rely on such higherorder features as most tda works are wont to do because of computational concerns but higherorder features turn out to be crucial for good predictive performance a strong case for the proposed reduction schemes would be made i believe that even a single such example which would be easy to generate even within the review cycle would add an enormous amount of scientific value and utility to this paper in summary i believe that with a few additional modifications this paper could be a very good contribution to the conference detailed comments consider citing hofer et al deep learning with topological signatures as one of the earliest examples of how to combine topological concepts with machine learning concepts in the introduction a brief explanation of terminology such as filtration would be helpful to make the paper more accessible to nonexpert readers please use the spelling čech complex not cech complex for consistency reasons the term filtration function should be preferred instead of filtering function section 42 requires a better introduction it should be made clear that the filtration function does not change here potentially the section could also be rewritten to explain more clearly what coraltda actually entails the syn data set unless i am mistaken is broken in that it only consists of n copies of the same graph the syntheticnew data set has been created as the fixed version of this data set in figure 5 change the scale to for consistency all other figures use in their scales typos and minor comments the paper is well written and clear with some mild caveats concerning the accessibility for nonexperts the authors are to be commended here it is higher betti numbers are prevalent higher betti numbers are prevalent see figure 3 see figure 3 in simplicial complex setting in a simplicial complex setting take kcore take the kcore the closing parenthesis is sometimes cut off in figures such as figure 6 there are no specific negative societal impacts arising from this work in general network analysis can lead to problematic results when applied to networks that influence social behaviour the proposed methods are of a generic nature though and other methods for scalable network analysis exist so the risk is mitigated that being said the algorithmic limitations of the proposed pruning schemes could be discussed better i am in particular interested in understanding whether the size of the graph becomes prohibitive for the method at some point moreover are there any specific limitations on the filtration function also asked above
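as a rough illustration of the two graph reductions these reviews describe, collapsing dominated vertices and restricting to a core, here is a small python sketch built on networkx; it is a plausible reading of the reviews rather than the authors' code, it ignores the filtration-compatibility conditions the paper needs before a dominated vertex may be collapsed, and the (k+1)-core statement is taken from the reviews' description of the method.

```python
import networkx as nx

def remove_dominated_vertices(G):
    """Iteratively delete vertices whose closed neighborhood is contained in
    the closed neighborhood of a neighbor (illustrative, filtration ignored)."""
    H = G.copy()
    changed = True
    while changed:
        changed = False
        for u in list(H.nodes):
            closed_u = set(H[u]) | {u}
            for v in list(H[u]):
                if closed_u <= (set(H[v]) | {v}):  # u is dominated by v
                    H.remove_node(u)
                    changed = True
                    break
    return H

def reduce_for_dimension_k(G, k):
    """Keep only the (k+1)-core, which (per the reviewed paper's claim)
    suffices for the k-dimensional persistence of the clique complex of G."""
    return nx.k_core(G, k + 1)

G = nx.karate_club_graph()
H = reduce_for_dimension_k(remove_dominated_vertices(G), k=1)
print(G.number_of_nodes(), "->", H.number_of_nodes())
```

in practice such a reduction would run once as preprocessing, after which the smaller graph can be handed to a persistence solver such as gudhi or ripser.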
### Summary:
|
this paper proposes a method of reducing the size of graphs that are commonly used as inputs for persistent homology algorithms this addresses a fundamental scalability problem in the case of graphs and is likely to enable further work in the area on the negative side the paper is similar to the prior work on strong collapses also the experimental results do not make a strong case that the proposed algorithms are actually effective in reducing the running time for computing persistent homology at dimensions more than 0 there is only one plot on time reduction improvement fig 4b which provides results for only two datasets in homology dimension 0 which is not satisfactory given that the 0dimensional case is already known to be very efficient given the above my recommendation is a weak accept and i urge the authors to address these issues in their final version better clarify the connection with the prior work and provide experimental evidence that the algorithm improves the efficiency of persistent homology computation at dimensions more than 0
|
1127,
25761,
403,
627,
667,
2173,
7364,
327,
253,
24077,
1159,
671,
2546,
1840,
2490,
187,
4118,
18435,
27,
2520,
2929,
29328,
247,
1332,
273,
8493,
253,
1979,
273,
14580,
326,
403,
7744,
908,
347,
14800,
323,
15663,
23117,
11333,
436,
12453,
247,
7936,
9171,
1430,
1895,
275,
253,
1083,
273,
14580,
285,
310,
2779,
281,
8046,
2007,
789,
275,
253,
2170,
50276,
251,
253,
4016,
1930,
253,
2929,
310,
2074,
281,
253,
2720,
789,
327,
2266,
3007,
23508,
671,
253,
5661,
1543,
513,
417,
1056,
247,
2266,
1083,
326,
253,
4081,
11333,
403,
2686,
3576,
275,
8493,
253,
3515,
673,
323,
12672,
15663,
23117,
387,
10103,
625,
685,
470,
627,
310,
760,
581,
7484,
327,
673,
5141,
7756,
3036,
577,
67,
534,
3400,
1543,
323,
760,
767,
15302,
275,
23117,
7877,
470,
534,
310,
417,
20297,
1677,
326,
470,
6967,
1083,
310,
2168,
1929,
281,
320,
1077,
5919,
50276,
28821,
253,
1840,
619,
17401,
310,
247,
5075,
2997,
285,
891,
21434,
253,
4477,
281,
2953,
841,
3374,
275,
616,
2457,
2715,
1805,
19148,
253,
4602,
342,
253,
2720,
789,
285,
2085,
5661,
1941,
326,
253,
5933,
19132,
253,
6733,
273,
15663,
23117,
13782,
387,
10103,
625,
685,
470,
209
] |
Below is given review of a research paper from cnoference journal. Please write a summary the review.
### Review:
apologies for the late review summary of the paper the paper provides a potential theoretical explanation of the known empirical observation that cold or tempered posteriors improve predictive performance of deep bayesian neural networks the provided explanation is simple and it leads to additional predictions which the authors check empirically as far as possible with existing data sets the empirical results agree with the predictions main strengths i believe the main message of the paper is so relevant and seems so simple at least in hindsight that it has the potential of becoming a kind of common knowledge in bayesian neural networks community caveat i cant judge if these findings had already been informally known to a larger group of researchers but unless someone has explicitly written them down somewhere i wouldnt hold this against the paper researchers have been trying to increase predictive performance of deep neural networks by applying scalable bayesian methods to deep learning for a while but even replicating the performance of point estimated models with bayesian neural networks has proven surprisingly difficult to my knowledge it was found out only recently that bayesian neural networks systematically outperform their point estimated counterparts if the prior is made artificially sharper than what probability theory would predict ie by lowering the temperature this empirical result has been puzzling from a theoretical perspective but the present paper provides a simple potential explanation for this effect i believe the findings in this paper go beyond a theoretical justification of an empirically known fact the findings may also have implications on model robustness the authors argue that the tempering effect is a result of curation of the training set validations and tests sets are typically curated in the same way as the training set in the machine learning community however when models are deployed in the field they typically see uncurated data points i would be curious to know if explicitly modeling the curation process as the authors do in this paper would also address this issue potential weaknesses theres one caveat to my review i am not an expert on bayesian neural networks and as stated above the argument made in this paper seems so simple in hindsight that i cannot say with absolute certainty that it hasnt been made before i personally havent heard this argument before but if other reviewers can point to a reference that already made this argument then that would probably be the only thing that could convince me to lower my rating otherwise i would consider the simplicity of the authors argument a strength of the paper questions to the authors what do the authors mean with the phrase finite networks in the first paragraph is it the same as networks with point estimated parameters as opposed to bayesian neural networks as mentioned above the paper models the curation of the training set but i didnt understand how or if curation of the test set is modeled could the authors clarify this specifically what changes for a model trained on curated data when it is either a tested on an equally curated test set or b applied to uncurated data in the wild would the optimal lambda during training differ between cases a and b or would the posterior have to be changed after training minor issues i like the short section 2 which compares cold and tempered posteriors very much i think it could even be improved by adding one reference each for cold and tempered posteriors 
respectively more importantly as far as i understand the two are really essentially the same if one uses eg a gaussian prior unless im mistaken the missing factor of frac1t in eq 2 could then be absorbed into a rescaling of the prior covariance unless the prior covariance itself is learned with expectation maximization if this is correct then id add a corresponding statement at the end of section 2 it would make the papers claims more widely applicable i think figure 2 is never discussed in the paper but there should be enough space left to discuss it the figure caption says schematic diagram could the authors clarify what this means do the points come from some toy model with a 2d parameter space or does the figure show a 2d pca of the parameter space or were the points really just drawn manually to visualize the idea i think both would be fine but i would be very curious to know how the figure looks with real data in figure 4 the last panel is labelled f but referred to as e also in the last two panels the left dashed vertical bar is not discussed is it the theoretically expected optimal value of lambda ie lambdafrac14 or the empirically found optimal valuedocsepthis paper addresses the perplexing issue of cold posterior having better predictive performance than the ideal bayesian posterior in bayesian deep learning wenzel et al 2020 and offers a possible explanation in terms of a misspecified likelihood function that deviates from the true generative process of the data by considering the data curation process and augmenting the likelihood model accordingly the effect of cold posterior is shown to diminish significantly and the ideal posterior is again optimal empirical results on both a toy problem and image classification support the theory pros 1 given the prevalence of bayesian deep learning and the issue of cold posterior this paper offers a timely contribution that bridges theory and practice 1 the paper is well written and motivated the method appears sound but see questions below and concepts are explained in a clear and pedagogical manner 2 the experiments are well thought out and offer clear empirical support of the proposed hypothesis cons this might be due to my limited understanding of the paper but i think there are still some limitations to the papers proposed theory eg it doesnt explain the observation that extremely cold posterior 0 doesnt seem to hurt the performance of bnn which should according to the proposed theory as there is only one optimal temperature 1 s where s is the true number of underlying labelers and more below questions and comments 1 my biggest confusion is this the paper argues that its incorrect to assume a simple categorical likelihood pyx as it doesnt take into account the data curation process however under the extended likelihood model as proposed when conditioning on the event that ynone and xnone as we do when training on standard datasets and after marginalizing out the intermediate variables and renormalizing isnt the conditional distribution pyx still just a categorical distribution except parameterized in a different way now if so then the difference between the two likelihoods is really just a different parameterization and im no longer sure what to make of the suggested theory and the supporting results i find it very surprising that the more complex parameterization significantly reduced the tempering effect and if the we take the ground truth likelihood pyx to be as in the standard curated dataset which the paper argues is in some 
sense artificially tempered then why cant a wellspcified bnn just adapt to this still categorical likelihood and learn an optimal posterior under it im happy to raise my score if the authors can clarify these issues for me 2 since point estimation with sgd optimizes the same likelihood function why dont we observe the tempering effect in sgd perhaps there is some effect but rather minimal in any case some experiments on sgd with without the corrected likelihood would be interesting 3 related to my comment about extremely cold posterior in the gp experiment when trained and tested on the corrected likelihood considering curation the test performance seems to really prefer the optimal 1 whereas on the image experiment more tempering 0 doesnt seem to affect test performance is there an explanation for this 4 finally does the proposed theory explain the observation that the cold posterior effect is more prominent in bnn with higher capacity wenzel et al 2020 possible typos and minor mistakes 1 p3 under eq 7 this likelihood is equivalent to labeling each datapoint s times with the same label and therefore has exactly the effect of setting s in a tempered posterior should be 1s instead 2 the rightmost subfigure in figure 4 should be labeled e instead of f to match the caption below update ive raised my score in light of author response and new results docsepthe work propose a theory suggesting that the cold posterior phenomena arises solely due the the curated nature of image benchmarks a generative model is proposed where multiple annotators label datapoints and only unanimously labeled datapoints are accepted into a dataset this theory is studied under a toyproblem using vi and a relabelled version of the cifar10 test set with sgld however many questions remain unanswered and the proposed theory is not sufficiently studied q the cold posterior problem was highlighted in the sgmcmc case but this works main toy problem only explores tempered posteriors as prevalent in vi it would be beneficial if the work highlights why these results should extend to the cold posterior or better yet run experiments in this scenario q 41 strongly suggests that there is a relationship between between lambda and s in the toy problem it should be an easy addition to study this connection for a range of values for s to explore if this holds q the work claims that the consensus protocol for standard datasets is not available but it would appear that this is a a simple manner of reaching out to the authors of the datasets q the main experiment presented in figure 5 is missing some important ablations what happens when the cifar10 baseline is trained under the same conditions learning rate as cifar10h q why is it acceptable to use the original cifar10 trainingset as the testset for cifar10h this seems like a problematic shift in data distribution q the theory of dataset curation is interesting but makes a broad claim more datasets should be explored from varying modalities the sole focus here on cifar10 provides too little evidence as a suggestion curation processes are different for eg medical imaging datasets and typically well documented q its unclear to me why it is acceptable to increase the training set size by the number of annotators an increase of factor 50 is effectively setting the temperature to 1e2 at this temperature the baseline performs just as well does this large gap still hold if just a much smaller subset of annotators is taken from the cifar10h dataset the core idea proposed in this work is thought 
provoking and contributes to the discussion on this topic the work is relatively short which is not a problem in its own right but the experiment section needs to provide more evidence and analysis i vote for reject nitpicks f instead of e in figure 4 as expected we when update ive increased my scoredocsepthe authors propose the idea that cold posteriors in bayesian neural networks could be caused by the likelihood instead of the prior they argue theoretically that the curation process of popular benchmark data sets would lead to a different weighting of the likelihood in the posterior they show in some experiments that the cold posterior effect can be reduced when accounting for this major comments the paper title suggests that it is about cold posteriors and it quite prominently references 1 in the introduction however in sec 2 it is then clarified that the paper is in fact not about cold posteriors but about tempered ones it is just briefly mentioned in passing that the results should transfer but this is never tested i think an experiment on actual cold posteriors similar to the one in 1 would be warranted to support such a statement and the usage of the current paper title the theory suggests that the optimal posterior performance should be achieved at lambdas that is cooling down beyond that point should deteriorate performance again this is an interesting prediction since it does not seem to fit the observations in 1 it would be nice to see this confirmed on an actual bnn experiment similarly to what can be seen in the toy gp experiment the related work section seems awfully short it does not even mention 1 although it is cited heavily elsewhere moreover for a paper that is proposing a statistical theory of tempered posteriors works such as 2 and 3 should probably be mentioned minor comments sec 42 we when when summary the idea that coldtempered posterior effects can be caused by the data set curation instead of by misspecified priors is very interesting and definitely deserves a theoretical and empirical investigation however the investigation at hand seems a bit incomplete in some places especially the related work and experiments sections also the title does not currently seem to fit the experiments given that the current manuscript is comfortably within the iclr page limit im hopeful that these points can be addressed in a revised version during the discussion phase update i increased my score following the clarifications and addition of the bnn experiment during the discussion phase 1 wenzel f roth k veeling b s witkowski j tran l mandt s nowozin s 2020 how good is the bayes posterior in deep neural networks really arxiv preprint arxiv200202405 2 grnwald p 2012 october the safe bayesian in international conference on algorithmic learning theory pp 169183 springer berlin heidelberg 3 grnwald p van ommen t 2017 inconsistency of bayesian inference for misspecified linear models and a proposal for repairing it bayesian analysis 124 10691103
### Summary:
|
this is an interesting controversial paper that contributes to an ongoing debate in bayesian deep learning bayesian inference with artificially cooled posteriors eg trained with langevin dynamics with downweighted noise was recently found to outperform over both point estimation and fullybayesian treatments wenzel et al 2020 this paper proposes a new explanation for these observed phenomena in terms of a data curation mechanism that popular benchmark data sets such as cifar underwent the analysis boils down to an evidence overcountingundercounting argument and takes into account that curated data sets only contain data points for which all labelers agreed on a label the authors claim that when modeling the true generative process of the data the cold posterior effect partially vanishes the paper is wellwritten and provides a consistent analysis by modeling the data curation mechanism in terms of an underlying probabilistic graphical model of the labeling mechanism unfortunately several observed phenomena of wenzel et al 2020 remain unexplained by the theoretical arguments eg the fact that very cold t 0 posteriors dont hurt performance or the observation that the optimal temperature seems to depend on the model capacity while the proposed explanation doesnt capture the full picture upon which both authors and reviewers agree the papers focus on the data curation process supported extensive experiments gives a partial explanation and provides an interesting perspective that will spur further discussion and should be of broad interest to the bayesian deep learning community
|
[ 849, 253, 4677, … (input_ids token list for this row, abridged) ] |
[ 1, 1, 1, … (attention_mask for this row, all ones, abridged) ]
[ 849, 253, 4677, … (labels token list for this row, abridged) ]
Below is given review of a research paper from cnoference journal. Please write a summary the review.
### Review:
this paper extends the pate approach to the text generation problem the authors introduce additional steps to help boosting the performance of pate in the text generation setting as the oiriginal approach does not bode well with the large output space of vocabulary the paper further studies phraselevel privacy beyond the regularly studied samplelevel privacy the paper is wellwritten and easy to follow unfortunately my main concern is on the novelty of the paper the approach is heavily based on the pate algorithm with few tricks to work better for the text generation task it utilizes a pretrained lm to generate pseudo completions and reduces the output space by filtering the tail distribution without a privacy requirement and finally the privacy loss is reduced by acquiring the teacher supervision only when the student is not good at a certain prediction the latter idea has also appeared in the more recent pate paper while i believe these extensions are valuable in improving the performance of pate in this scenario i do not think they provide sufficient novelty for this venuei have one critical comment about the users secret phrases section the authors took the route of group privacy for this scenario which i do not think might bethe effective way with dp dpsgd algorithm can easily be adapted to have userlevel privacy by batching users instead of samples i find it an unfair comparison in the sense that the authors have not employed this approach but took the naive way of applying group privacy on userlevel na docsepthe paper proposes an extension of pate a private learning algorithm to text generation tasks the extensions are simple yet effective they generate pseudo inputs and reduce the sequence generation problem to next word predictions they also propose a strategy to dynamically filter out candidates to reduce the large output space in the text decoder experiments in the sentence completion task show that the proposed model is effective in protecting samples and sensitive phrases strengths the proposed extension is very simple yet intuitive and effective for differentially private text generation weaknesses the paper could have been more convincing if the model is tested on multiple text generation tasks such as dialog response generation generate a response given previous utterances where privacy is more crucial as the work focuses on privacy i think it would be nice to have a specific section on limitations and societal impact this is currently absent in the main paper docsepin this paper authors propose a novel framework seqpate an extension of pate on text generation as a differentially private dp learning algorithm for text generations seqpate aims to protect the privacy of both training samples and sensitive phrases in samples and employs a teacherstudent framework additionally authors propose several strategies for seqpate to handle text generations with a sequence of classifications over large spaces strengths privacy protections is important for text generation models and other tasks motivation and problem setting are clear previous work survey is enough weaknesses several claims have not been adequately verified for example the effectiveness of seqpate in protecting both samples and sensitive phrases training corpora with a moderate privacy cost no qualitative and error analysis runtime analysis is lacked without qualitative analysis it is a quantitative comparison of similar models and does not support the authors claim only the usual text generation model metrics ie ppl and 
bleuare used docsepthis paper extends pate into the field of text generation to do so the following technical challenges must be properly addressed 1 in addition to protecting individual words we need to protect phrases too 2 compared to other task the output space is huge for text generation 3 we need to control the privacy loss this paper has done a solid work to address these challenges this paper is well written this work is original it is based on the theory of differentially private dp so its potential and quality are pretty high the important difficult points are well explained but as a reader i think one area can be improved the connection between theory of dp and the proposed seqpate method can be explained more explicitly doing so can greatly decrease the barriers to new researchers who are interested in this area privacy is an area with great social impact this paper focuses solely on the technical aspect of privacy but its too early to give an assessment on its social impact so in my humble opinion it is acceptable that its social impact are absent in the paper
### Summary:
|
the paper studies pate framework for text generation models and proposes algorithm based on kd to handle large output space reviewers think that proposed methods should generate interest among the neurips audience we encourage the authors to incorporate comments of the reviewers to improve the paper
|
[ 30003, 310, 1677, … (input_ids token list for this row, abridged) ]
[
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1
] |
[
30003,
310,
1677,
2278,
273,
247,
2561,
2929,
432,
260,
2369,
1793,
6698,
15,
7764,
3630,
247,
6010,
253,
2278,
15,
187,
4118,
8439,
27,
187,
2520,
2929,
8725,
253,
268,
366,
2746,
281,
253,
2505,
5978,
1895,
253,
4477,
9569,
3081,
5018,
281,
1361,
43124,
253,
3045,
273,
268,
366,
275,
253,
2505,
5978,
4758,
347,
253,
258,
343,
10019,
2746,
1057,
417,
270,
853,
973,
342,
253,
1781,
3453,
2317,
273,
30318,
253,
2929,
2007,
2175,
12616,
5251,
11068,
4457,
253,
11719,
5421,
3410,
5251,
11068,
253,
2929,
310,
973,
15720,
285,
3477,
281,
956,
19235,
619,
2022,
4468,
310,
327,
253,
38135,
273,
253,
2929,
253,
2746,
310,
11306,
1754,
327,
253,
268,
366,
5933,
342,
1643,
24866,
281,
789,
1805,
323,
253,
2505,
5978,
4836,
352,
29820,
247,
3215,
11273,
298,
78,
281,
6635,
17927,
2535,
621,
285,
11355,
253,
3453,
2317,
407,
19690,
253,
8105,
3268,
1293,
247,
11068,
8284,
285,
4720,
253,
11068,
2957,
310,
3777,
407,
28635,
253,
9732,
20446,
760,
672,
253,
5974,
310,
417,
1175,
387,
247,
2176,
10554,
253,
6158,
2934,
556,
671,
5420,
275,
253,
625,
3332,
268,
366,
2929,
1223,
891,
2868,
841,
18149,
403,
9865,
275,
11138,
253,
3045,
273,
268,
366,
275,
436,
10076,
891,
513,
417,
1158,
597,
2085,
4209,
38135,
323,
436,
18767,
74,
452,
581,
4619,
4385,
670,
253,
4212,
4279,
25491,
2593,
253,
4477,
2335,
253,
7622,
273,
1387,
11068,
323,
436,
10076,
534,
891,
513,
417,
1158,
1537,
701,
248,
3576,
1039,
342,
33234,
20093,
35333,
5933,
476,
4354,
320,
12956,
281,
452,
2608,
5251,
11068,
407,
14604,
272,
4212,
3185,
273,
3530,
891,
1089,
352,
271,
16593,
5301,
275,
253,
3282,
326,
253,
4477,
452,
417,
7091,
436,
2746,
533,
2335,
253,
27785,
1039,
273,
9433,
1387,
11068,
327,
2608,
5251,
5549,
5474,
339,
431,
248,
2929,
29328,
271,
6880,
273,
268,
366,
247,
3055,
4715,
5933,
281,
2505,
5978,
8892,
253,
18149,
403,
2969,
2568,
3576,
597,
6635,
17927,
14800,
285,
4796,
253,
3425,
5978,
1895,
281,
1735,
3159,
13650,
597,
671,
12661,
247,
5700,
281,
23043,
5806,
562,
9183,
281,
4796,
253,
1781,
3453,
2317,
275,
253,
2505,
29810,
4679,
275,
253,
6197,
12240,
4836,
921,
326,
253,
4081,
1566,
310,
3576,
275,
15233,
3530,
285,
7996,
25491,
20544,
50276,
783,
4081,
6880,
310,
1077,
2969,
2568,
27350,
285,
3576,
323,
21673,
3055,
2505,
5978,
50276,
20881,
1255,
265,
50276,
783,
2929,
812,
452,
644,
625,
21414,
604,
253,
1566,
310,
5762,
327,
2709,
2505,
5978,
8892,
824,
347,
10756,
2380,
5978,
6635,
247,
2380,
1677,
2045,
13894,
1972,
835,
11068,
310,
625,
9560,
347,
253,
789,
16633,
327,
11068,
891,
1158,
352,
651,
320,
5322,
281,
452,
247,
2173,
2593,
327,
7364,
285,
38058,
3486,
436,
310,
4390,
12125,
275,
253,
2022,
2929,
5474,
339,
9852,
436,
2929,
4477,
12661,
247,
4460,
7792,
22510,
81,
366,
271,
6880,
273,
268,
366,
327,
2505,
5978,
347,
247,
21673,
3055,
33234,
4715,
5933,
323,
2505,
14649,
22510,
81,
366,
13698,
281,
4017,
253,
11068,
273,
1097,
3733,
3530,
285,
7996,
25491,
275,
3530,
285,
50276,
12837,
16376,
247,
9732,
39095,
7792,
23000,
4477,
12661,
2067,
8130,
323,
22510,
81,
366,
281,
6016,
2505,
14649,
342,
247,
3425,
273,
43394,
689,
1781,
8470,
20544,
50276,
13552,
1974,
29110,
310,
1774,
323,
2505,
5978,
3210,
285,
643,
8892,
50276,
24013,
7639,
285,
1895,
4758,
403,
2590,
50276,
35065,
789,
6630,
310,
2217,
50275,
20881,
1255,
265,
50276,
43249,
3916,
452,
417,
644,
18212,
16058,
323,
1650,
253,
12510,
273,
22510,
81,
366,
275,
15233,
1097,
3530,
285,
7996,
25491,
3733,
5944,
66,
342,
247,
10290,
11068,
2105,
50276,
2369,
18276,
285,
2228,
1783,
50276,
21005,
1783,
310,
20296,
50276,
14920,
18276,
1783,
50276,
262,
310,
247,
11745,
5301,
273,
2074,
3210,
285,
1057,
417,
1329,
253,
4477,
1750,
760,
253,
7312,
2505,
5978,
1566,
17082,
26332,
268,
446,
285,
7387,
86,
609,
908,
5474,
33032,
2520,
2929,
8725,
268,
366,
715,
253,
1673,
273,
2505,
5978,
50276,
936,
513,
594,
253,
1563,
7681,
7881,
1364,
320,
6283,
9713,
337,
275,
1635,
281,
15233,
2060,
3000,
359,
878,
281,
4017,
25491,
1512,
374,
2429,
281,
643,
4836,
253,
3453,
2317,
310,
5699,
323,
2505,
5978,
495,
359,
878,
281,
1453,
253,
11068,
2957,
50276,
2520,
2929,
556,
2218,
247,
4891,
789,
281,
2953,
841,
7881,
436,
2929,
310,
973,
3542,
50276,
2520,
789,
310,
3236,
50276,
262,
310,
1754,
327,
253,
3762,
273,
21673,
3055,
33234,
594,
697,
2442,
285,
3290,
403,
3965,
1029,
253,
1774,
2834,
2792,
403,
973,
5544,
533,
347,
247,
9414,
891,
1158,
581,
2170,
476,
320,
5520,
253,
4602,
875,
3762,
273,
33234,
285,
253,
4081,
22510,
81,
366,
1332,
476,
320,
5544,
625,
11120,
50276,
25163,
594,
476,
10260,
6379,
253,
15938,
281,
747,
8607,
665,
403,
6110,
275,
436,
2170,
11068,
310,
271,
2170,
342,
1270,
2675,
3486,
50276,
2520,
2929,
16633,
12718,
327,
253,
7681,
4809,
273,
11068,
50276,
2858,
697,
1512,
2393,
281,
1918,
271,
6803,
327,
697,
2675,
3486,
50276,
601,
275,
619,
26896,
4743,
352,
310,
12207,
326,
697,
2675,
3486,
403,
12125,
275,
253,
2929,
2490,
187,
4118,
18435,
27,
783,
2929,
2175,
268,
366,
7792,
323,
2505,
5978,
3210,
285,
29328,
5933,
1754,
327,
465,
69,
281,
6016,
1781,
3453,
2317,
50276,
15337,
398,
1158,
326,
4081,
3082,
943,
6635,
1600,
2190,
253,
5723,
2824,
8446,
359,
11907,
253,
4477,
281,
19071,
5701,
273,
253,
30628,
281,
3157,
253,
2929
] |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
under the assumption of a smooth loss function this paper investigates a new onlinetobatch conversion method for private stochastic optimization this paper offers a fresh look at stochastic optimization with dp guarantees however it appears to have been rushed and does not demonstrate its claims well even though the smoothness assumption is a limitation experiments for the smooth loss case and comparison with other existing methods are preferred to provide a sense of the utility advantage claimed in this paper the writing could be improved to make it more readerfriendly and experiments are encouraged docsepthis paper proposes a novel onlinetobatch conversion algorithm with theoretical guarantees which aims to achieve the optimal convergence rate of expected risk under differential privacy constraints relationships between the regret bound and expected risk are also provided under different conditions of the loss function strengths 1 this work is wellmotivated with theoretical analyses 2 onlinetobatch conversion technology is an important topic in online learning and differential privacy is also important in streaming data 3 regret bound and expected risk are analyzed under different conditions of the loss function weaknesses 1 some efforts have been made on the topic of differentially private online learning dpol in jain et al 2012 the comparison with dpol is missing 2 lack of analysis of computational complexity 3 the paper mostly lacks discussion and insights for the theoretical results 4 this paper lacks empirical analyses which limits the reproducibility of the proposed algorithms jain et al 2012 differentially private online learning colt 2012 my major concern is about the reproducibility of the proposed algorithm this paper proposes a novel onlinetobatch algorithm with theoretical guarantees rather than only providing theoretical analyses for existing learning algorithms thus empirical analyses are essential for demonstrating the effectiveness and efficiency of the novel algorithm with differential privacy constraints besides some important related works about differentially private online learning are missing docseponline to batch conversions are a classical reduction from adversarial online learning to stochastic optimization this work introduces the use of such reductions in the context of differentially private stochastic convex optimization dpsco this extension is nontrivial as it is known that in this context classical online to batch leads to suboptimal rates therefore in this work a more careful weighted regret scheme known as anytime online to batch conversions cutkosky 2019 is used this method is used in conjunction with a tree aggregation dp mechanism dwork et al 2010 that allows for optimal noise addition of partial sums and with a recursive stochastic gradient estimator which has been proved useful in dpsco asi et al 2021 bassily et al 2021 it is interesting that the technique is quite general and it is applicable to settings beyond ell2 dpsco which by now is well understood the work includes extensions to strongly convex losses and rates for which the nonprivate o1sqrtn term adapts to the level of noise asi feldman koren talwar private stochastic convex optimization optimal rates in l1 geometry icml21 httpsarxivorg abs210301516 bassily guzman nandi noneuclidean differentially private stochastic optimization colt21 httpsarxivorgpdf210301278pdf cutkosky anytime onlinetobatch optimism and acceleration icml19 httpsarxivorgpdf190300974pdf dwork naor pitassi rothblum differential privacy under
continual observation stoc10 strengths 1 the unifying perspective on dpsco capable of recovering optimal rates without specialized arguments or reductions between convex and strongly convex losses through regularization 2 the perspective of online to batch conversions provides a rather straightforward extension to noneuclidean norms for which optimal methods are still scarce bassily et al 2021 3 adaptivity to the noise level all kinds of adaptivity in differential privacy are challenging and not well understood so this result in itself is significant 4 the paper is fairly well written and the proofs and analyses are uncomplicated and principled weaknesses 1 i think this unifying perspective becomes more interesting in geometries which are not ell2 more exploration of the consequences of the results of the paper for these settings would have been an interesting addition 2 aside from the noise adaptive rates and the parameter free algorithm the rates derived in the paper are not new namely optimal rates have already been established i dont think this is a major problem though mostly because making these methods more broadly applicable is interesting and necessary to better understand dpsco there are two technical aspects of this work that are worth a more extensive discussion 1 are the new online to batch conversion results presented in this paper capable of addressing ell1geometry asi et al 2021 bassily et al 2021 this is an important question since this is the only general case where we can avoid polynomial dependence on d in the rates this is not a direct corollary of what is currently done in the paper as the stochastic frankwolfe algorithms of the references above are not clearly applicable to the online setting 2 the issue about the smoothness constant h is currently lacking a more thorough discussion
### Summary:
|
the paper made original contributions to the differentially private learning of convex and smooth problems by connecting to the vast parameterfree online learning literature one of the reviewers read the paper in great detail and carefully checked the correctness of the results the ac also took a close look and found the results very nice the reviewers and the ac further discussed the work and clarified some of the concerns raised eg regarding computation but the missing experiments make it hard to vouch for the practicality of the approach based on the theoretical contribution alone we believe the paper is above the bar and would happily recommend accept the authors are encouraged to take into account the points below and consider adding benchmarking experiments some additional feedback comments out of the discussion 1 for dpsco as the optimal rates are known to be achievable by a linear time algorithm fkt20 the proposed new algorithm thus does not improve over existing methods in either statistical or computational complexity 2 if computation is not a concern noisy gradient descent with on2 iterations thus n3 incremental gradient oracle calls is known to provide an even stronger excess empirical risk bound that is optimal without that additional log t in this paper it also is very hard to beat in practice 3 the main contribution of the proposed algorithm is then about new algorithmic techniques borrowed from adaptive online learning this approach gives rise to the unified treatment of general convex and strongly convex problems and they discussed how it helps to tap into other more adaptive more parameterfree online learners 4 treeaggregation for releasing gradient sequences by leveraging smoothness and stability implied by the anytime onlinetobatch reduction is cute i think technically this is the most interesting idea 5 i think the paper contains enough good results to be considered as a purely theoretical work for acceptance that said i do think the algorithm has the potential to be practical it is a pity that the authors did not try though with the several layers of reductions and the binarytree approach it wont surprise me if the proposed method is not competitive against methods such as noisygd or noisysgd 6 another positive aspect about this paper is that it is polished and the writing is pretty good i particularly liked how the authors accurately describe their contribution in the title and abstract by clearly highlighting the key assumption on smoothness
|
[input_ids: token IDs encoding this example, full list omitted] |
[attention_mask: all ones, full list omitted] |
[labels: token IDs encoding this example, full list omitted] |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper addresses a challenging problem in 3d representation learning of point clouds which aims to generalize the representations well to arbitrary orientations the proposed csgnn uses a convolutional framework to learn over concentric spherical feature maps which is incorporated with both intrasphere and intersphere information the idea is interesting and reasonably well presented in general the proposed csgnn is practical on spherical representation learning however there still remains some issues that should be addressed strengths 1 the author proposes a new volumetric representation as the basis for convolutional learning which consists of multiple nested spheres discretized by the icosahedral grid 2 the proposed csgnn integrates both intrasphere and intersphere convolutions to learn volumetric representation over concentric spheres 3 the proposed csgnn is practical in classifying arbitrarily rotated 3d objects and is also effective in molecular environment description weaknesses 1 the architecture is not novel enough which is only based on hierarchical graph convolutions followed by radial convolutions and the performance is restricted by the size of radius as illustrated in fig 1 and the experimental results on table 2 2 it is timeconsuming and redundant to adopt all the points to calculate the contribution to vertices the reviewer is curious about how to select the effective points or how to avoid the redundant information when converting the point cloud to concentric spheres are there any adaptive strategies other than thresholding on some scores 3 there are some adjustable hyperparameters ie the threshold of rbf the number of spatial concentric spheres the input channels the kernel size etc it is not clear whether the method is robust 4 how to calculate the dos and fdos loss items the author is expected to give explanations 5 experimental results in some aspects are not stateofthearts and some failure cases should also be discussed if there exists this paper presents a new volumetric representation to learn 3d representations however some technical details should be further discussed and analyzed strengths and weaknesses are both discussed on the proposed csgnn in terms of data process method design and experimental settings and results therefore the reviewer gives rating of 6 marginally above the acceptance threshold docsepthis paper proposes a graph convolution based multiscale spherical deep neural network in particular radial convolutions over concentric spheres of different radii are used in conjunction with the typical spherical convolution a simple 5layer architecture is presented to fuse the rich features extracted by both convolutions experiments on point cloud classification under arbitrary so3 transformations as well as predicting electronic state density of graphene allotropes demonstrate the validity of the devised method pros the paper makes a very sensible contribution that extends a variant of spherical cnn to be multiscale as such better accuracy is expected because the finer grained details can be better captured the presentation is quite clear and i did not spot significant problems the chosen experiments are quite relevant and span two different communities cons i feel like the paper presents what i would call a multiresolution spherical cnn in other words the only addition to the known graph based spherical cnn is the notion of multiple concentric spheres however the experiments seem to distinguish between spherical cnn and the csgnn with r1 c1 this 
confuses me a little bit would it be possible to discuss the differences or can we recover spherical cnns as a special case of the proposed approach representation is a fundamental aspect in 3d analysis i would suggest the paper to be more careful or conscious with the use of the word volumetric first why are volumetric features desirable to begin with second i feel like the concentric spheres can better capture the narrow band around the point set while still leaving parts of the space unexplained this is not necessarily a negative as it might amount to avoiding the storage of unnecessary information in that sense the representation might fall somewhere between a point cloud and a dense volume say a sweet spot maybe i might also be wrong but these points could be better described in the paper the related work does not really discriminate the proposed approach from the others so as to position this method among the state of the art in particular it could be good to discuss what propertiesgoals are not met by the prior approaches regarding the related works it is also good to mention the initial attempts to develop invariant point cloud classification networks thomas nathaniel et al tensor field networks rotationand translationequivariant neural networks for 3d point clouds arxiv preprint arxiv180208219 2018 zhao yongheng et al quaternion equivariant capsule networks for 3d point clouds european conference on computer vision springer cham 2020 the paper seems to lack some details of the rotational invariance capabilities of the convolutions especially invariantequivariant design of the radial convolutions carry importance as being the main contributions of this paper the paper seems to decouple two convolutions radial and angular would it be possible to have a single convolution fusing information from both dimensions eg just like volumetric 3d convolutions could the paper explain why such a design choice is better the paper argues for a spherical distribution to represent the point cloud however this introduces some discretization artifacts so it is of wonder to see whether the benefit brought by concentric spheres outdoes the discretization error as shown in fig 2 it is possible to obtain a subset of the spheres at certain radii as far as i understand this is somewhat a pruning mechanism is this only hypothetical or does it also happen with real point clouds could we see an example the spheres seem to have more structure than a simple graph i feel like resorting to gnns can yield some information loss have we ever thought about this aspect can the approach estimate the pose of the object or is invariance property all we get in case the pose can be estimated can we see a study on this in fig 6 i notice an architecturaldesign difference compared to the point cloud one in fact this seems to have spheres anchored on local patches an operation also applicable to point sets do we necessarily need this change can we cover both problems with similar architectures i feel that some kind of a theoretical analysis on the minimum number of concentric spheres required for rich feature extraction would be a nice to have minor remarks such the underlying such that the underlying challenging to invariance challenging to introduce invariance maybe over over over are all point sets normalized to the unit ball can we hear some more details here what about the connections in fig 5 are those used or is the discretization same as the point cloud experiment simple idea clear writing but lacks thorough 
discussions and evaluations docsepthis paper proposed a concentric spherical representation of 3d space formed by nesting spatiallysampled spheres resulting from the highly regular icosahedral discretization it utilizes separate intrasphere and intersphere convolutions over the resulting concentric spherical grid which are combined into a convolutional framework for learning volumetric and rotationally equivariant representations over point clouds this paper proposes a new volumetric representation that consists of multiple nested spheres each discretized by the icosahedral grid it proposes to learn the 3d volumetric representation over concentric spheres by combining intrasphere and intersphere convolutions the proposed convolutions are rotationally equivariant and also scale near linearly with grid resolution it shows some experiments in 3d object classification and resolving electronic structure of atomistic systems the paper overall is technical sound with some interesting results the application in atomistic systems is interesting and seems new at least to the reviewer for the 3d point clouds the results in 3d object classification is not very detailed or sufficient there are a lot of datasets i wished the paper can test also it doesnt reference sufficient papers in the 3d point clouds deep learning domain which is related and very relevant eg the kpconv paper should be referred the shellnet paper which also uses concentric shells should be mentioned it seems to me the authors might not be very familiar with the literature in this area i would like to see the paper has side by side comparisons with some of these leading papers in some of the standard benchmark datasets overall the paper is sound and interesting with good results the application in atomic system is interesting and new however the paper doesnt do a good comparison with leading papers in this field thus it is hard to judge the effectiveness of the proposed method docsepthe paper presents a concentric sphere representation and icosahedronbased spherical cnns for 3d point clouds the contributions are two folds first the concentric sphere representation enables learning features volumetrically second two types of convolutions intrasphere and intersphere are combined towards rotation equivariant and scalable computations strengths 1 motivation the motivation to use multiple resolution spheres with varying radius sounds valid and may help feature learning of 3d shapes comparing to previous approaches using single unit sphere 2 architecture the multiple resolution approach is widely used in image deep learning architecture this is the first time it is extended to spherical cnn and the proposed architecture could lead to many followup work in both 3d point cloud processing and spherical cnns weakness 1 rotationequivariant authors claims the proposed method to be rotation equivariant and results on modelnet40 also support this claim this is the most important property of the proposed method it is advisable to have a better explanation of both proposed method and related method mentioned in related work ie why some methods are not rotationequivariant 2 modelnet40 results of 1 cohen etal 2019 is missing in table 1 3 dos experiments 1 meaning of parameters rc in the experiments r1 and c32 are used does this mean 32 spheres are combined into 1 2 loss functions from a2 and eq5 alpha01 is used does this mean lossdos is less important than lossfdos if previous works using alpha10 could the authors provides the results using the same 
setting in other words how does disabling lossfdos hurt im also keen to know if it is fair to compare to other methods using an additional loss term 3 effectiveness of the methods on dos experiments the authors use a simplified version of the architecture r1 and report stronger performance than baselines does this mean that dos estimation is too easy for the proposed method 4 spherical cnns baseline considering the baselines used in the dos experiments nearly all sota spherical cnns are not compared ref 1 gauge equivariant convolutional networks and the icosahedral cnn in summary the paper is easy to read and follow the description of the method is clear and complete experiments include one for 3d point cloud classification and one for dos estimation for the latter one im not familiar with the experiment difficulty and details and thus could not evaluate the methods from the results overall i appreciate the efforts of the work to propose a multiresolution version of spherical cnns the downside is that evaluations are limited and some baselines are not compared
### Summary:
|
this paper addresses the problem of learning representations of 3d point clouds and introduces an interesting approach of a concentric spherical gnn with the property of rotational equivariance it shows some promising results on point cloud classification under so3 transformations and on predicting electronic state density of graphene allotropes the reviews suggest that while it does not suffer from any major flaws the paper has a fairly large number of minor issues that add up to make it subpar for publication the proposed approach has several hyperparameters but the authors do not seem to be up front about how the parameters are selected except for stating that they use standard tuning techniques this is not a satisfactory answer and appears to be dodging the question many technical details and specific choices could use more thorough explanation and analysis the distinction of the proposed approach in relation to the large body of existing literature could be more clearly spelled out collectively these issues made the contribution of this paper less clear
|
[
... input_ids: token ID values omitted ...
79,
2224,
323,
495,
69,
1127,
16173,
253,
9021,
403,
767,
34579,
806,
253,
2786,
695,
15269,
6779,
13276,
4715,
3386,
1936,
360,
11656,
1037,
1273,
767,
3510,
273,
2410,
17009,
4996,
4938,
1568,
285,
734,
40196,
403,
5678,
4404,
9381,
32270,
6410,
285,
44755,
30745,
50276,
296,
3755,
20556,
50275,
18,
16038,
253,
16038,
281,
897,
2709,
6064,
28394,
342,
11962,
9941,
7835,
3588,
285,
778,
1361,
4735,
4715,
273,
495,
69,
15029,
10941,
281,
2045,
7274,
970,
2014,
3943,
15269,
50276,
19,
10336,
253,
2709,
6064,
2746,
310,
7561,
908,
275,
2460,
3676,
4715,
10336,
436,
310,
253,
806,
673,
352,
310,
6508,
281,
19474,
260,
9866,
285,
253,
4081,
10336,
812,
1421,
281,
1142,
956,
484,
789,
275,
1097,
495,
69,
1127,
9005,
5162,
285,
19474,
260,
79,
2224,
50276,
20881,
1255,
50275,
18,
9381,
8275,
6410,
4477,
3916,
253,
4081,
1332,
281,
320,
9381,
32270,
6410,
285,
1543,
327,
1566,
3024,
1449,
671,
1329,
436,
1750,
436,
310,
253,
954,
1774,
2867,
273,
253,
4081,
1332,
352,
310,
15237,
494,
281,
452,
247,
1805,
8813,
273,
1097,
4081,
1332,
285,
2905,
1332,
5393,
275,
2905,
789,
26332,
2139,
690,
3082,
403,
417,
9381,
8275,
6410,
50276,
19,
1566,
3024,
1449,
1543,
273,
337,
820,
864,
1162,
267,
6247,
310,
5816,
275,
2829,
337,
50276,
20,
9500,
4679,
50276,
18,
4495,
273,
3602,
27657,
275,
253,
4679,
391,
18,
285,
260,
1237,
403,
908,
1057,
436,
1599,
4567,
28394,
403,
5678,
715,
337,
374,
2957,
3470,
432,
247,
19,
285,
16186,
22,
9765,
520,
310,
908,
1057,
436,
1599,
2957,
38755,
310,
1679,
1774,
685,
2957,
9194,
375,
604,
2045,
2987,
970,
9765,
740,
812,
253,
4477,
3400,
253,
1543,
970,
253,
1072,
4758,
275,
643,
3000,
849,
1057,
44276,
2957,
9194,
375,
8513,
516,
671,
19497,
281,
871,
604,
352,
310,
4344,
281,
7277,
281,
643,
3082,
970,
3081,
2957,
1307,
50276,
20,
12510,
273,
253,
3082,
327,
9500,
4679,
4477,
897,
21010,
2715,
273,
253,
10336,
391,
18,
285,
2361,
2266,
3045,
685,
1666,
25379,
1057,
436,
1599,
326,
9500,
13418,
310,
1512,
3477,
323,
253,
4081,
1332,
50276,
21,
19474,
260,
79,
2224,
8245,
7296,
253,
1666,
25379,
897,
275,
9500,
4679,
4829,
512,
256,
5503,
19474,
260,
79,
2224,
403,
417,
2429,
50275,
709,
50276,
18,
11206,
32270,
6410,
27311,
267,
6928,
285,
253,
17857,
6859,
16232,
260,
9866,
275,
6010,
253,
2929,
310,
3477,
281,
1239,
285,
956,
5740,
273,
253,
1332,
310,
2590,
285,
3426,
4679,
2486,
581,
323,
495,
69,
1127,
9005,
9162,
285,
581,
323,
9500,
13418,
323,
253,
6158,
581,
516,
417,
7615,
342,
253,
3368,
10183,
285,
4278,
285,
3021,
812,
417,
7472,
253,
3082,
432,
253,
1543,
4583,
891,
11435,
253,
6031,
273,
253,
789,
281,
12661,
1554,
2731,
2241,
2715,
273,
19474,
260,
79,
2224,
253,
42719,
310,
326,
27163,
403,
3710,
285,
690,
1666,
25379,
403,
417,
2429,
50276,
187,
187,
4118,
18435,
27,
2520,
2929,
12453,
253,
1895,
273,
4715,
6779,
273,
495,
69,
1127,
16173,
285,
23970,
271,
4722,
2746,
273,
2786,
695,
19474,
305,
9866,
342,
253,
2867,
273,
9381,
595,
32270,
6410,
352,
2722,
690,
12532,
1543,
327,
1127,
9005,
9162,
762,
594,
20,
21257,
285,
327,
21565,
7051,
1375,
4038,
273,
23076,
512,
41916,
265,
253,
10123,
1804,
326,
1223,
352,
1057,
417,
11089,
432,
667,
2201,
32138,
253,
2929,
556,
247,
9648,
1781,
1180,
273,
5884,
3374,
326,
823,
598,
281,
1056,
352,
749,
1148,
323,
9311,
253,
4081,
2746,
452,
2067,
4373,
22041,
533,
253,
4477,
513,
417,
1646,
281,
320,
598,
2914,
670,
849,
253,
3602,
403,
4236,
3707,
323,
14851,
326,
597,
897,
2629,
25184,
5609,
50276,
2520,
310,
417,
247,
20297,
3662,
285,
4620,
281,
320,
33413,
3390,
253,
1953,
1142,
7681,
4278,
285,
2173,
10165,
812,
897,
625,
11080,
8813,
285,
1783,
253,
13812,
273,
253,
4081,
2746,
275,
5886,
281,
253,
1781,
2133,
273,
5368,
6239,
812,
320,
625,
4518,
43997,
562,
26708,
841,
3374,
1160,
253,
7680,
273,
436,
2929,
1679,
2590
] |
[1, 1, 1, ..., 1]  (2048 entries, all 1) |
[
327,
690,
7363,
495,
627,
403,
690,
31445,
4373,
22041,
26332,
253,
7887,
273,
391,
3342,
253,
1180,
273,
8820,
2786,
695,
28394,
253,
3280,
8123,
253,
10295,
1979,
3966,
352,
310,
417,
2590,
1880,
253,
1332,
310,
10237,
577,
849,
281,
10173,
253,
9500,
285,
29439,
375,
2957,
4957,
253,
2488,
310,
3264,
281,
1918,
22909,
608,
5661,
1543,
275,
690,
7794,
403,
417,
1375,
23037,
248,
12863,
285,
690,
4433,
2219,
943,
671,
320,
5469,
604,
627,
4961,
50276,
2520,
2929,
10262,
247,
747,
1936,
45558,
6779,
281,
3037,
495,
69,
14237,
2299,
690,
7681,
4278,
943,
320,
2007,
5469,
285,
5867,
20544,
285,
32213,
403,
1097,
5469,
327,
253,
4081,
29180,
3757,
79,
275,
2426,
273,
941,
1232,
1332,
2216,
285,
5661,
7533,
285,
1543,
3103,
253,
37317,
4245,
13716,
273,
721,
42876,
1840,
253,
14924,
7887,
5474,
33032,
2520,
2929,
29328,
247,
4216,
27311,
1754,
1554,
2865,
1079,
19474,
3676,
11454,
2990,
275,
1798,
14599,
2410,
17009,
689,
2786,
695,
28394,
273,
1027,
32285,
403,
908,
275,
17385,
342,
253,
6867,
19474,
27311,
247,
2969,
608,
12026,
10336,
310,
3559,
281,
34824,
253,
6793,
3386,
10375,
407,
1097,
2410,
17009,
4679,
327,
1127,
9005,
9162,
762,
10341,
594,
20,
21257,
347,
973,
347,
21565,
7051,
1375,
4038,
273,
23076,
512,
41916,
265,
7568,
253,
13091,
273,
253,
32434,
1332,
5847,
50276,
783,
2929,
2789,
247,
1077,
24600,
7680,
326,
8725,
247,
12955,
273,
19474,
260,
9866,
281,
320,
1554,
2865,
1079,
347,
824,
1805,
7200,
310,
3264,
984,
253,
40259,
7098,
967,
4278,
476,
320,
1805,
10848,
50275,
783,
9759,
310,
3240,
2590,
285,
891,
858,
417,
6308,
1534,
3237,
50276,
783,
6777,
4679,
403,
3240,
4623,
285,
13905,
767,
1027,
7888,
50275,
5040,
50275,
74,
1928,
751,
253,
2929,
10262,
752,
891,
651,
1067,
247,
1554,
2731,
2241,
19474,
260,
9866,
275,
643,
3000,
253,
760,
1635,
281,
253,
1929,
4216,
1754,
19474,
260,
9866,
310,
253,
10732,
273,
2709,
2786,
695,
28394,
2299,
253,
4679,
1646,
281,
12129,
875,
19474,
260,
9866,
285,
253,
29180,
3757,
79,
342,
391,
18,
260,
18,
436,
1461,
5123,
479,
247,
1652,
2372,
651,
352,
320,
1896,
281,
2319,
253,
3910,
390,
476,
359,
9295,
19474,
260,
79,
2224,
347,
247,
2714,
1083,
273,
253,
4081,
2746,
50274,
37626,
310,
247,
7936,
4809,
275,
495,
69,
1783,
891,
651,
1804,
253,
2929,
281,
320,
625,
10182,
390,
9680,
342,
253,
897,
273,
253,
3159,
1936,
45558,
806,
2139,
403,
1936,
45558,
3386,
11408,
281,
3135,
342,
1273,
891,
1928,
751,
253,
2786,
695,
28394,
476,
1805,
9232,
253,
6891,
3961,
1475,
253,
1127,
873,
1223,
1335,
6108,
4243,
273,
253,
2317,
49374,
436,
310,
417,
7933,
247,
4016,
347,
352,
1537,
2408,
281,
17816,
253,
5718,
273,
15279,
1491,
275,
326,
3282,
253,
6779,
1537,
2965,
9366,
875,
247,
1127,
9005,
285,
247,
14086,
4644,
50276,
19506,
247,
7353,
6308,
5046,
891,
1537,
671,
320,
3430,
533,
841,
2792,
812,
320,
1805,
2529,
275,
253,
2929,
50275,
783,
2905,
789,
1057,
417,
1663,
30530,
253,
4081,
2746,
432,
253,
2571,
594,
347,
281,
1899,
436,
1332,
2190,
253,
1375,
273,
253,
1445,
275,
1798,
352,
812,
320,
1175,
281,
2319,
752,
3607,
2184,
932,
403,
417,
1313,
407,
253,
2720,
7274,
5001,
253,
2905,
2987,
352,
310,
671,
1175,
281,
3748,
253,
3302,
9437,
281,
1287,
13727,
1127,
9005,
9162,
6928,
50276,
394,
4921,
295,
10511,
928,
1162,
355,
13148,
1673,
6928,
9381,
395,
10234,
8275,
6410,
11454,
6928,
323,
495,
69,
1127,
16173,
549,
32693,
638,
3845,
549,
32693,
1093,
9992,
3507,
746,
4765,
50276,
91,
31035,
340,
543,
24176,
1162,
355,
40163,
279,
32270,
6410,
26661,
6928,
323,
495,
69,
1127,
16173,
19454,
266,
8059,
327,
4382,
8113,
7203,
254,
45909,
9169,
50275,
783,
2929,
3133,
281,
3480,
690,
4278,
273,
253,
22090,
31429,
13789,
273,
253,
2410,
17009,
3340,
13727,
8275,
6410,
2216,
273,
253,
14599,
2410,
17009,
4459,
6349,
347,
1146,
253,
2022,
9021,
273,
436,
2929,
50275,
783,
2929,
3133,
281,
34430,
713,
767,
2410,
17009,
14599,
285,
12336,
651,
352,
320,
1896,
281,
452,
247,
2014,
27311,
269,
5302,
1491,
432,
1097,
10103,
24088,
816,
751,
1936,
45558,
495,
69,
2410,
17009,
812,
253,
2929,
5513,
2139,
824,
247,
2216,
4327,
310,
1805,
50275,
783,
2929,
8219,
323,
247,
19474,
3268,
281,
1957,
253,
1127,
9005,
2299,
436,
23970,
690,
35132,
1320,
24165,
594,
352,
310,
273,
4282,
281,
923,
1880,
253,
5649,
3982,
407,
2786,
695,
28394,
562,
18566,
253,
35132,
1320,
2228,
50274,
284,
2011,
275,
3036,
374,
352,
310,
1896,
281,
4044,
247,
8578,
273,
253,
28394,
387,
2176,
32285,
347,
2080,
347,
891,
2096,
436,
310,
8489,
247,
819,
25004,
5122,
310,
436,
760,
27710,
390,
1057,
352,
671,
5108,
342,
1524,
1127,
16173,
812,
359,
923,
271,
1650,
50275,
783,
28394,
1646,
281,
452,
625,
2605,
685,
247,
2969,
4216,
891,
1928,
751,
501,
12655,
281,
18976,
2224,
476,
4917,
690,
1491,
2957,
452,
359,
2455,
1869,
670,
436,
4809,
50275,
5092,
253,
2746,
6642,
253,
16753,
273,
253,
1789,
390,
310,
31429,
2867,
512,
359,
755,
275,
1083,
253,
16753,
476,
320,
5998,
476,
359,
923,
247,
1263,
327,
436,
50275,
249,
3036,
721,
891,
4366,
271,
27934,
19417,
3064,
2429,
281,
253,
1127,
9005,
581,
275,
958,
436,
3133,
281,
452,
28394,
39574,
327,
1980,
20412,
271,
4254,
671,
7763,
281,
1127,
5239,
513,
359,
7933,
878,
436,
1818,
476,
359,
3835,
1097,
3237,
342,
2074,
35615,
50275,
74,
1928,
326,
690,
2238,
273,
247,
10527,
1783,
327,
253,
5927,
1180,
273,
2786,
695,
28394,
2424,
323,
6793,
4735,
11998,
651,
320,
247,
5322,
281,
452,
50275,
37585,
16157,
50276,
10328,
253,
6944,
50276,
10328,
326,
253,
6944,
50276,
40893,
272,
281,
31429,
50276,
40893,
272,
281,
9569,
31429,
5046,
50276,
1189,
689,
50276,
1189,
50276,
609,
512,
1127,
5239,
12650,
281,
253,
3943,
4023,
476,
359,
4089,
690,
625,
4278,
1060,
50276,
5371,
670,
253,
10291,
275,
3036,
608,
403,
1110,
908,
390,
310,
253,
35132,
1320,
1072,
347,
253,
1127,
9005,
3368,
2969,
2934,
2590,
4028,
533,
19756,
11080,
11985,
285,
27163,
5474,
33032,
2520,
2929,
4081,
247,
2786,
695,
19474,
6779,
273,
495,
69,
2317,
4447,
407,
47847,
28819,
22163,
6216,
28394,
4795,
432,
253,
4122,
3963,
17857,
6859,
16232,
35132,
1320,
352,
29820,
4858,
4996,
4938,
1568,
285,
734,
40196,
2410,
17009,
689,
253,
4795,
2786,
695,
19474,
9860,
534,
403,
5678,
715,
247,
27311,
267,
7792,
323,
4715,
1936,
45558,
285,
9381,
595,
32270,
6410,
14237,
689,
1127,
16173,
436,
2929,
29328,
247,
747,
1936,
45558,
6779,
326,
8414,
273,
2709,
20494,
28394,
1016,
35132,
1025,
407,
253,
17857,
6859,
16232,
9860,
352,
29328,
281,
3037,
253,
495,
69,
1936,
45558,
6779,
689,
2786,
695,
28394,
407,
16248,
4996,
4938,
1568,
285,
734,
40196,
2410,
17009,
253,
4081,
2410,
17009,
403,
9381,
595,
32270,
6410,
285,
671,
4311,
2822,
23352,
342,
9860,
6064,
352,
2722,
690,
4679,
275,
495,
69,
1789,
9162,
285,
30426,
7051,
2605,
273,
13112,
2531,
2718,
50276,
783,
2929,
4583,
310,
7681,
3590,
342,
690,
4722,
1543,
253,
2898,
275,
13112,
2531,
2718,
310,
4722,
285,
3133,
747,
387,
1878,
281,
253,
37317,
323,
253,
495,
69,
1127,
16173,
50276,
783,
1543,
275,
495,
69,
1789,
9162,
310,
417,
1077,
7000,
390,
4209,
627,
403,
247,
2257,
273,
15302,
891,
16632,
253,
2929,
476,
1071,
671,
352,
36908,
3806,
4209,
9380,
275,
253,
495,
69,
1127,
16173,
3676,
4715,
5028,
534,
310,
2905,
285,
1077,
4623,
24088,
253,
465,
81,
13118,
2929,
943,
320,
6289,
253,
8135,
3024,
2929,
534,
671,
4648,
2786,
695,
24383,
943,
320,
5393,
352,
3133,
281,
479,
253,
4477,
1537,
417,
320,
1077,
7615,
342,
253,
6239,
275,
436,
2170,
891,
651,
751,
281,
923,
253,
2929,
556,
1930,
407,
1930,
14023,
342,
690,
273,
841,
4283,
9380,
275,
690,
273,
253,
2629,
22791,
15302,
4583,
253,
2929,
310,
3590,
285,
4722,
342,
1175,
1543,
253,
2898,
275,
13805,
985,
310,
4722,
285,
747,
2299,
253,
2929,
36908,
513,
247,
1175,
5301,
342,
4283,
9380,
275,
436,
1673,
3021,
352,
310,
1892,
281,
5963,
253,
12510,
273,
253,
4081,
1332,
5474,
339,
431,
248,
2929,
10262,
247,
2786,
695,
15269,
6779,
285,
17857,
6859,
45938,
3169,
19474,
260,
79,
2224,
323,
495,
69,
1127,
16173,
253,
9021,
403,
767,
34579,
806,
253,
2786,
695,
15269,
6779,
13276,
4715,
3386,
1936,
360,
11656,
1037,
1273,
767,
3510,
273,
2410,
17009,
4996,
4938,
1568,
285,
734,
40196,
403,
5678,
4404,
9381,
32270,
6410,
285,
44755,
30745,
50276,
296,
3755,
20556,
50275,
18,
16038,
253,
16038,
281,
897,
2709,
6064,
28394,
342,
11962,
9941,
7835,
3588,
285,
778,
1361,
4735,
4715,
273,
495,
69,
15029,
10941,
281,
2045,
7274,
970,
2014,
3943,
15269,
50276,
19,
10336,
253,
2709,
6064,
2746,
310,
7561,
908,
275,
2460,
3676,
4715,
10336,
436,
310,
253,
806,
673,
352,
310,
6508,
281,
19474,
260,
9866,
285,
253,
4081,
10336,
812,
1421,
281,
1142,
956,
484,
789,
275,
1097,
495,
69,
1127,
9005,
5162,
285,
19474,
260,
79,
2224,
50276,
20881,
1255,
50275,
18,
9381,
8275,
6410,
4477,
3916,
253,
4081,
1332,
281,
320,
9381,
32270,
6410,
285,
1543,
327,
1566,
3024,
1449,
671,
1329,
436,
1750,
436,
310,
253,
954,
1774,
2867,
273,
253,
4081,
1332,
352,
310,
15237,
494,
281,
452,
247,
1805,
8813,
273,
1097,
4081,
1332,
285,
2905,
1332,
5393,
275,
2905,
789,
26332,
2139,
690,
3082,
403,
417,
9381,
8275,
6410,
50276,
19,
1566,
3024,
1449,
1543,
273,
337,
820,
864,
1162,
267,
6247,
310,
5816,
275,
2829,
337,
50276,
20,
9500,
4679,
50276,
18,
4495,
273,
3602,
27657,
275,
253,
4679,
391,
18,
285,
260,
1237,
403,
908,
1057,
436,
1599,
4567,
28394,
403,
5678,
715,
337,
374,
2957,
3470,
432,
247,
19,
285,
16186,
22,
9765,
520,
310,
908,
1057,
436,
1599,
2957,
38755,
310,
1679,
1774,
685,
2957,
9194,
375,
604,
2045,
2987,
970,
9765,
740,
812,
253,
4477,
3400,
253,
1543,
970,
253,
1072,
4758,
275,
643,
3000,
849,
1057,
44276,
2957,
9194,
375,
8513,
516,
671,
19497,
281,
871,
604,
352,
310,
4344,
281,
7277,
281,
643,
3082,
970,
3081,
2957,
1307,
50276,
20,
12510,
273,
253,
3082,
327,
9500,
4679,
4477,
897,
21010,
2715,
273,
253,
10336,
391,
18,
285,
2361,
2266,
3045,
685,
1666,
25379,
1057,
436,
1599,
326,
9500,
13418,
310,
1512,
3477,
323,
253,
4081,
1332,
50276,
21,
19474,
260,
79,
2224,
8245,
7296,
253,
1666,
25379,
897,
275,
9500,
4679,
4829,
512,
256,
5503,
19474,
260,
79,
2224,
403,
417,
2429,
50275,
709,
50276,
18,
11206,
32270,
6410,
27311,
267,
6928,
285,
253,
17857,
6859,
16232,
260,
9866,
275,
6010,
253,
2929,
310,
3477,
281,
1239,
285,
956,
5740,
273,
253,
1332,
310,
2590,
285,
3426,
4679,
2486,
581,
323,
495,
69,
1127,
9005,
9162,
285,
581,
323,
9500,
13418,
323,
253,
6158,
581,
516,
417,
7615,
342,
253,
3368,
10183,
285,
4278,
285,
3021,
812,
417,
7472,
253,
3082,
432,
253,
1543,
4583,
891,
11435,
253,
6031,
273,
253,
789,
281,
12661,
1554,
2731,
2241,
2715,
273,
19474,
260,
79,
2224,
253,
42719,
310,
326,
27163,
403,
3710,
285,
690,
1666,
25379,
403,
417,
2429,
50276,
187,
187,
4118,
18435,
27,
2520,
2929,
12453,
253,
1895,
273,
4715,
6779,
273,
495,
69,
1127,
16173,
285,
23970,
271,
4722,
2746,
273,
2786,
695,
19474,
305,
9866,
342,
253,
2867,
273,
9381,
595,
32270,
6410,
352,
2722,
690,
12532,
1543,
327,
1127,
9005,
9162,
762,
594,
20,
21257,
285,
327,
21565,
7051,
1375,
4038,
273,
23076,
512,
41916,
265,
253,
10123,
1804,
326,
1223,
352,
1057,
417,
11089,
432,
667,
2201,
32138,
253,
2929,
556,
247,
9648,
1781,
1180,
273,
5884,
3374,
326,
823,
598,
281,
1056,
352,
749,
1148,
323,
9311,
253,
4081,
2746,
452,
2067,
4373,
22041,
533,
253,
4477,
513,
417,
1646,
281,
320,
598,
2914,
670,
849,
253,
3602,
403,
4236,
3707,
323,
14851,
326,
597,
897,
2629,
25184,
5609,
50276,
2520,
310,
417,
247,
20297,
3662,
285,
4620,
281,
320,
33413,
3390,
253,
1953,
1142,
7681,
4278,
285,
2173,
10165,
812,
897,
625,
11080,
8813,
285,
1783,
253,
13812,
273,
253,
4081,
2746,
275,
5886,
281,
253,
1781,
2133,
273,
5368,
6239,
812,
320,
625,
4518,
43997,
562,
26708,
841,
3374,
1160,
253,
7680,
273,
436,
2929,
1679,
2590
] |
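The three bracketed integer arrays that follow each review/summary pair in this dump are the tokenized form of that same text: a sequence of token ids, an all-ones array of the same length, and a second token-id array that, in the fully visible record below, is element-for-element identical to the first. The sketch below shows one plausible way such a record could be produced. The tokenizer name, the 2048-token cap, the column names, and the helper function are assumptions for illustration only; the dump itself does not say which tokenizer or preprocessing script generated these ids.

```python
from transformers import AutoTokenizer

# Assumption: a GPT-2-style byte-pair tokenizer. The dump does not say which
# tokenizer produced these ids, so "gpt2" here is only an illustrative stand-in.
tokenizer = AutoTokenizer.from_pretrained("gpt2")

MAX_LEN = 2048  # assumed cap; the longest record visible in this dump has exactly 2048 ids


def build_record(review_text: str, summary_text: str) -> dict:
    """Tokenize one review/summary pair into the three array columns of this dump."""
    prompt = (
        "Below is a review of a research paper from a conference journal. "
        "Please write a summary of the review.\n### Review:\n"
    )
    text = prompt + review_text + "\n### Summary:\n" + summary_text
    ids = tokenizer(text, truncation=True, max_length=MAX_LEN)["input_ids"]
    return {
        "input_ids": ids,
        "attention_mask": [1] * len(ids),  # no padding, so every position is real
        "labels": list(ids),               # mirrors input_ids, as in the record below
    }


example = build_record("the authors propose a method ...", "the paper considers ...")
assert example["labels"] == example["input_ids"]
assert all(m == 1 for m in example["attention_mask"])
```

Because the labels simply repeat the input ids, a model fine-tuned on such records would be trained on the prompt and review tokens as well as the summary; masking the prompt portion of labels would be the usual alternative if only the summary should contribute to the loss.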
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the authors propose a method to perform subspace splitting that is the task of clustering the entries of an input vector into sets of coherent subspaces the contribution of the work is twofold 1 the theoretical characterization of the problem and its wellposedness and 2 the presentation of three algorithms for tackling the problem of subspace splitting quantitative analysis of the performance of the three algorithms is provided by means of synthetic experiments the paper is well written with sound mathematical formulation the contributions proposed by the authors seem to have enough novelty and relevance for the community and both theoretical and practical contributions are thoroughly motivated and discussed major concerns i find the structure and content of the paper somehow unbalanced a fairly large portion is dedicated to motivational applications which feels like an extension of the related work section but are not experimentally explored in section 5 most discussion is devoted to approaches with clear drawbacks ransas greedys while the final proposal ksplits is hardly discussed while it is true that a large contribution of the work is strictly theoretical the practical aspect of the paper could be further backed by experimental validation the experiments section show limited results regarding the choice of the number of clusters only 2 or 4 the performance of ransas and ksplits is saturated in all noisefree experiments which makes them hard to compare or identify failure cases given the attention paid in the paper to motivating applications a full section i miss a section in the experiments where the proposed approach is validated with real data from any of the mentioned applications i feel this would tremendously increase the value of the work and would legitimize the claim of practical importance that the authors make in introduction and conclusions minor concerns at the end of the first paragraph of related work the authors claim that random sampling is not applicable to subspace splitting with no further explanation of why this is the case at the same time the algorithm presented ransas is based on random sampling which seems contradictory as i mentioned earlier i miss more discussion regarding the ksplits approach did the authors find any clear drawbacks besides initialization what about the kmeans assumption of isotropic clusters the choice of just l1 as a baseline for comparison is a bit arbitrary i wonder why the other mentioned approaches mixedinteger programming or random sampling were not added to the evaluation the paper could use further review to avoid typos and missing references the paper doesnt follow the citation convention authors last names and year docsepthis paper introduced a new setting in which variables in feature vector are divided into different groups and in each group variables are generated by sampling within linear subspace they characterized the sufficient and necessary condition for the system to be identifiable three algorithms are provided to cluster the variables into their generative subspace they also provide motivating applications for this new setting in metagenomics recommender systems and robust learning i think the new setting is potentially interesting and the proposed algorithms are good however compared with for example subspace clustering the generative subspace ui is known to the users a prior and the goal is to classify variables of a single data point which might be rarely the case in practice comments my main concern is on the proof 
of theorem 1 which seems a little bit sloppy and not convincing enough to me for example on page 14 in the mid of eq1 and eq2 the authors argue uomegak is invertible as subspace uk is in socalled general space please elaborate on that i am confused as if we set uomegak any full rank rbyr matrix and let ur1k be a copy of any row in uomegak and do a row swap between r1 and any row in r then the new rbyr uomegak cannot be invertible as it is not full rank in row it is not clear to me the set of this type of subspace a zeromeasure set on the uniform distribution of drgrassmannian please consider rewriting the proof in a more rigorous waydocsepsummary the paper introduces the problem of subspace spitting in which an observed mixedfeatures vector is to be partitioned such that the identified partitions match with given subspaces the main results of the paper lie in deriving sufficient and necessary conditions for identifiability of these partitions when the subspaces and the entries of the features are randomly positioned in the ambient dimension and the subspaces respectively the conditions simply require that there are more entries associated with each subspace than the dimension of the subspace the paper also presents algorithms to perform the splitting strengths the problem statement is novel i did not see previous formulations of this problem the paper is generally wellwritten the paper has a good balance of theory and algorithm development weaknesses the experimental results section is rather weak it would be good to include some realistic examples from some concrete applications while the paper has a dedicated section on motivating applications they are not that convincing the metagenomics application is more plausible but i do not think this model applies well to recommender systems the examples provided seem to be included to justify the proposed model perhaps there are meaningful relevant applications but for the current version the problem setup is not sufficiently justified and seems somewhat contrived the assumptions about the subspaces need to be more explicit especially when discussing the algorithms for example the random sampling algorithm seems to require full knowledge of the dimensions of these subspaces which may be impractical i have not fully verified this argument but i am under the impression that the result of the main theorem is trivial isnt it obvious that the span of the restriction of a subspace to a given partition of size m the whole rm i believe the main result can just follow from this simple observation reproducibility the authors did not include code for their developed algorithms in the supplementary material impact provided that the problem setup and model are better justified i believe this work could open up new research questions in machine learning and data analysis including mixture variants of wellstudied problems on matrix completion and robust learning overall i like the paper especially that the problem setup itself seems novel however the model is not sufficiently justified the assumptions are somewhat questionable and the experimental section is lacking as such i would rate this as a marginal acceptance
### Summary:
|
the paper considers a new linearalgebraic problem motivated by applications such as metagenomics which requires the algorithm to partition the coordinates of a long noisy vector according to a few known subspaces a number of theoretical questions were asked eg identifiability efficient algorithms and their error bounds etc the reviewers generally liked the paper for what it does specific suggestions were raised by the reviewers including how the paper went into length about the motivating applications but did not end up evaluating the proposed algorithms on any motivating applications and that the main theoretical results were not technically challenging nor surprising although the authors provided a fair justification in their rebuttal the ac finds the paper an outlier in terms of the topics among papers typically received by iclr but liked the paper precisely because it is different the authors are encouraged to discuss the connections of the specific problem to the context of representation learning and machine learning in general overall i believe the paper is a solid borderline accept
|
[
30003,
310,
1677,
2278,
273,
247,
2561,
2929,
432,
260,
2369,
1793,
6698,
15,
7764,
3630,
247,
6010,
253,
2278,
15,
187,
4118,
8439,
27,
187,
783,
4477,
12661,
247,
1332,
281,
1347,
24822,
19860,
326,
310,
253,
4836,
273,
17524,
253,
12028,
273,
271,
3280,
4972,
715,
5239,
273,
18893,
749,
31748,
253,
7680,
273,
253,
789,
310,
767,
8089,
337,
253,
10527,
14846,
273,
253,
1895,
285,
697,
973,
7334,
1255,
285,
374,
253,
9759,
273,
1264,
11333,
323,
46710,
253,
1895,
273,
24822,
19860,
11745,
1783,
273,
253,
3045,
273,
253,
1264,
11333,
310,
2530,
407,
2097,
273,
13506,
4679,
50276,
783,
2929,
310,
973,
3542,
342,
3590,
15965,
15895,
253,
9021,
4081,
407,
253,
4477,
1646,
281,
452,
2217,
38135,
285,
17200,
323,
253,
3114,
285,
1097,
10527,
285,
8542,
9021,
403,
16575,
17194,
285,
5469,
50276,
24330,
7350,
50275,
74,
1089,
253,
2605,
285,
2600,
273,
253,
2929,
10380,
440,
30063,
247,
9648,
1781,
5110,
310,
9940,
281,
49956,
4893,
534,
9193,
751,
271,
6880,
273,
253,
2905,
789,
2593,
533,
403,
417,
21657,
14859,
275,
2593,
608,
954,
5955,
310,
16222,
281,
7274,
342,
2590,
30453,
391,
10642,
37819,
656,
1223,
253,
2457,
10419,
465,
23336,
953,
310,
10693,
5469,
50275,
6050,
352,
310,
2032,
326,
247,
1781,
7680,
273,
253,
789,
310,
13714,
10527,
253,
8542,
4809,
273,
253,
2929,
812,
320,
2007,
17245,
407,
5661,
12820,
253,
4679,
2593,
921,
3710,
1543,
5001,
253,
4327,
273,
253,
1180,
273,
9959,
760,
374,
390,
577,
253,
3045,
273,
391,
10642,
285,
465,
23336,
953,
310,
23543,
275,
512,
642,
261,
832,
658,
4679,
534,
2789,
731,
1892,
281,
7277,
390,
4271,
4433,
2219,
50274,
28821,
253,
4116,
5087,
275,
253,
2929,
281,
15265,
839,
4893,
247,
2120,
2593,
891,
2985,
247,
2593,
275,
253,
4679,
835,
253,
4081,
2746,
310,
17618,
342,
1524,
941,
432,
667,
273,
253,
5393,
4893,
891,
1928,
436,
651,
17623,
4087,
2572,
253,
1318,
273,
253,
789,
285,
651,
29933,
907,
253,
1750,
273,
8542,
6349,
326,
253,
4477,
1056,
275,
10199,
285,
11815,
50276,
37585,
7350,
50275,
255,
253,
990,
273,
253,
806,
12494,
273,
2905,
789,
253,
4477,
1750,
326,
3632,
10491,
310,
417,
7763,
281,
24822,
19860,
342,
642,
2007,
8813,
273,
2139,
436,
310,
253,
1083,
387,
253,
1072,
673,
253,
5933,
3559,
391,
10642,
310,
1754,
327,
3632,
10491,
534,
3133,
34126,
50276,
284,
891,
5393,
4321,
891,
2985,
625,
5955,
5001,
253,
465,
23336,
953,
2746,
858,
253,
4477,
1089,
667,
2590,
30453,
16280,
31850,
752,
670,
253,
465,
30799,
9376,
273,
29436,
9959,
50276,
783,
4327,
273,
816,
298,
18,
347,
247,
8245,
323,
5301,
310,
247,
2372,
10341,
891,
4282,
2139,
253,
643,
5393,
7274,
6804,
18743,
10717,
390,
3632,
10491,
497,
417,
2879,
281,
253,
7103,
50276,
783,
2929,
812,
897,
2007,
2278,
281,
3693,
963,
993,
285,
5816,
10414,
50276,
783,
2929,
36908,
956,
253,
25577,
5008,
4477,
1390,
4454,
285,
807,
5474,
33032,
2520,
2929,
5611,
247,
747,
4758,
275,
534,
4903,
275,
4735,
4972,
403,
4272,
715,
1027,
2390,
285,
275,
1016,
1387,
4903,
403,
4561,
407,
10491,
1561,
4872,
24822,
597,
7943,
253,
4209,
285,
3309,
1617,
323,
253,
985,
281,
320,
38640,
1264,
11333,
403,
2530,
281,
7368,
253,
4903,
715,
616,
1006,
800,
24822,
597,
671,
2085,
15265,
839,
4893,
323,
436,
747,
4758,
275,
1313,
6533,
19177,
3818,
3109,
2718,
285,
10237,
4715,
891,
1158,
253,
747,
4758,
310,
7826,
4722,
285,
253,
4081,
11333,
403,
1175,
2299,
2429,
342,
323,
1650,
24822,
17524,
253,
1006,
800,
24822,
28243,
310,
1929,
281,
253,
4212,
247,
2720,
285,
253,
4736,
310,
281,
30215,
4903,
273,
247,
2014,
941,
1127,
534,
1537,
320,
11766,
253,
1083,
275,
3946,
50276,
26122,
50276,
2577,
2022,
4468,
310,
327,
253,
4737,
273,
10012,
337,
534,
3133,
247,
1652,
2372,
1499,
45695,
285,
417,
21414,
2217,
281,
479,
323,
1650,
327,
3239,
1638,
275,
253,
4260,
273,
16186,
18,
285,
16186,
19,
253,
4477,
9059,
1484,
485,
72,
518,
310,
42275,
347,
24822,
42487,
310,
275,
9267,
18859,
2087,
2317,
4496,
21184,
327,
326,
891,
717,
13477,
347,
604,
359,
873,
1484,
485,
72,
518,
667,
2120,
5958,
391,
1615,
83,
4315,
285,
1339,
2936,
18,
76,
320,
247,
3491,
273,
667,
4194,
275,
1484,
485,
72,
518,
285,
513,
247,
4194,
22101,
875,
391,
18,
285,
667,
4194,
275,
391,
840,
253,
747,
391,
1615,
83,
1484,
485,
72,
518,
2550,
320,
42275,
347,
352,
310,
417,
2120,
5958,
275,
4194,
352,
310,
417,
2590,
281,
479,
253,
873,
273,
436,
1511,
273,
24822,
247,
1182,
254,
485,
5849,
873,
327,
253,
6447,
3268,
273,
1837,
34495,
8420,
757,
4496,
1908,
294,
17695,
253,
4737,
275,
247,
625,
26565,
1039,
7152,
339,
793,
360,
3454,
253,
2929,
23970,
253,
1895,
273,
24822,
653,
2835,
275,
534,
271,
2540,
6804,
28862,
4972,
310,
281,
320,
10883,
264,
824,
326,
253,
3636,
27959,
3761,
342,
1677,
749,
31748,
253,
2022,
1543,
273,
253,
2929,
7027,
275,
44190,
4209,
285,
3309,
2515,
323,
1548,
18279,
1430,
273,
841,
27959,
672,
253,
749,
31748,
285,
253,
12028,
273,
253,
3386,
403,
12421,
15471,
275,
253,
18509,
7877,
285,
253,
749,
31748,
2975,
253,
2515,
3365,
2430,
326,
627,
403,
625,
12028,
2330,
342,
1016,
24822,
685,
253,
7877,
273,
253,
24822,
253,
2929,
671,
10262,
11333,
281,
1347,
253,
19860,
50275,
296,
3755,
20556,
50275,
783,
1895,
3908,
310,
4460,
891,
858,
417,
923,
2045,
26850,
273,
436,
1895,
50275,
783,
2929,
310,
3839,
973,
15720,
50275,
783,
2929,
556,
247,
1175,
6654,
273,
3762,
285,
5933,
2440,
50275,
20881,
1255,
265,
50275,
783,
5661,
1543,
2593,
310,
2581,
5075,
352,
651,
320,
1175,
281,
2486,
690,
15958,
6667,
432,
690,
11859,
4893,
50276,
6050,
253,
2929,
556,
247,
9940,
2593,
327,
15265,
839,
4893,
597,
403,
417,
326,
21414,
253,
1313,
6533,
19177,
2898,
310,
625,
21541,
533,
891,
513,
417,
1158,
436,
1566,
10384,
973,
281,
3818,
3109,
2718,
253,
6667,
2530,
1646,
281,
320,
2908,
281,
15249,
253,
4081,
1566,
4931,
627,
403,
14282,
4623,
4893,
533,
323,
253,
1655,
2715,
253,
1895,
9978,
310,
417,
10481,
17285,
285,
3133,
8489,
523,
30487,
50275,
783,
13260,
670,
253,
749,
31748,
878,
281,
320,
625,
6843,
3340,
672,
16585,
253,
11333,
323,
1650,
253,
3632,
10491,
5933,
3133,
281,
2430,
2120,
3640,
273,
253,
10103,
273,
841,
749,
31748,
534,
778,
320,
45783,
50276,
74,
452,
417,
4751,
16058,
436,
4154,
533,
891,
717,
762,
253,
13214,
326,
253,
906,
273,
253,
2022,
10012,
310,
14916,
310,
2649,
352,
4755,
326,
253,
13905,
273,
253,
12400,
273,
247,
24822,
281,
247,
1677,
10883,
273,
1979,
278,
253,
2644,
40373,
891,
2868,
253,
2022,
906,
476,
816,
956,
432,
436,
2969,
8310,
50273,
250,
5551,
33593,
253,
4477,
858,
417,
2486,
2127,
323,
616,
3715,
11333,
275,
253,
24864,
2144,
50274,
48276,
2530,
326,
253,
1895,
9978,
285,
1566,
403,
1805,
17285,
891,
2868,
436,
789,
812,
1527,
598,
747,
2561,
3533,
275,
5145,
4715,
285,
941,
1783,
1690,
7802,
11640,
273,
973,
14091,
728,
3237,
327,
4315,
12240,
285,
10237,
4715,
50273,
1189,
455,
891,
751,
253,
2929,
3340,
326,
253,
1895,
9978,
3139,
3133,
4460,
2299,
253,
1566,
310,
417,
10481,
17285,
253,
13260,
403,
8489,
30455,
285,
253,
5661,
2593,
310,
14999,
347,
824,
891,
651,
2281,
436,
347,
247,
16888,
14924,
50276,
187,
187,
4118,
18435,
27,
783,
2929,
19401,
247,
747,
4872,
20190,
280,
1895,
17194,
407,
4893,
824,
347,
1313,
6533,
19177,
534,
4419,
253,
5933,
281,
10883,
253,
11627,
273,
247,
1048,
27620,
4972,
2556,
281,
247,
1643,
1929,
749,
31748,
247,
1180,
273,
10527,
3533,
497,
2546,
24088,
1548,
18279,
1430,
50276,
20246,
11333,
285,
616,
2228,
14493,
3966,
50274,
783,
30628,
3839,
10490,
253,
2929,
323,
752,
352,
1057,
2173,
13991,
497,
5439,
407,
253,
30628,
1690,
849,
253,
2929,
2427,
715,
2978,
670,
253,
15265,
839,
4893,
533,
858,
417,
990,
598,
16344,
253,
4081,
11333,
327,
667,
15265,
839,
4893,
285,
326,
253,
2022,
10527,
1543,
497,
417,
22335,
11132,
50276,
15387,
10084,
3738,
253,
4477,
2530,
247,
4344,
22861,
275,
616,
30080,
22559,
50276,
783,
913,
9010,
253,
2929,
271,
562,
3623,
275,
2426,
273,
253,
12989,
2190,
9380,
5431,
2959,
407,
17857,
32888,
533,
10490,
253,
2929,
10534,
984,
352,
310,
1027,
50276,
783,
4477,
403,
14659,
281,
2319,
253,
10291,
273,
253,
2173,
1895,
281,
253,
3634,
273,
6779,
4715,
285,
5145,
4715,
275,
2087,
50276,
1189,
455,
891,
2868,
253,
2929,
310,
247,
4891,
45210,
2997,
50274
] |
[1, 1, 1, ..., 1]  (1432 entries, all 1) |
[
30003,
310,
1677,
2278,
273,
247,
2561,
2929,
432,
260,
2369,
1793,
6698,
15,
7764,
3630,
247,
6010,
253,
2278,
15,
187,
4118,
8439,
27,
187,
783,
4477,
12661,
247,
1332,
281,
1347,
24822,
19860,
326,
310,
253,
4836,
273,
17524,
253,
12028,
273,
271,
3280,
4972,
715,
5239,
273,
18893,
749,
31748,
253,
7680,
273,
253,
789,
310,
767,
8089,
337,
253,
10527,
14846,
273,
253,
1895,
285,
697,
973,
7334,
1255,
285,
374,
253,
9759,
273,
1264,
11333,
323,
46710,
253,
1895,
273,
24822,
19860,
11745,
1783,
273,
253,
3045,
273,
253,
1264,
11333,
310,
2530,
407,
2097,
273,
13506,
4679,
50276,
783,
2929,
310,
973,
3542,
342,
3590,
15965,
15895,
253,
9021,
4081,
407,
253,
4477,
1646,
281,
452,
2217,
38135,
285,
17200,
323,
253,
3114,
285,
1097,
10527,
285,
8542,
9021,
403,
16575,
17194,
285,
5469,
50276,
24330,
7350,
50275,
74,
1089,
253,
2605,
285,
2600,
273,
253,
2929,
10380,
440,
30063,
247,
9648,
1781,
5110,
310,
9940,
281,
49956,
4893,
534,
9193,
751,
271,
6880,
273,
253,
2905,
789,
2593,
533,
403,
417,
21657,
14859,
275,
2593,
608,
954,
5955,
310,
16222,
281,
7274,
342,
2590,
30453,
391,
10642,
37819,
656,
1223,
253,
2457,
10419,
465,
23336,
953,
310,
10693,
5469,
50275,
6050,
352,
310,
2032,
326,
247,
1781,
7680,
273,
253,
789,
310,
13714,
10527,
253,
8542,
4809,
273,
253,
2929,
812,
320,
2007,
17245,
407,
5661,
12820,
253,
4679,
2593,
921,
3710,
1543,
5001,
253,
4327,
273,
253,
1180,
273,
9959,
760,
374,
390,
577,
253,
3045,
273,
391,
10642,
285,
465,
23336,
953,
310,
23543,
275,
512,
642,
261,
832,
658,
4679,
534,
2789,
731,
1892,
281,
7277,
390,
4271,
4433,
2219,
50274,
28821,
253,
4116,
5087,
275,
253,
2929,
281,
15265,
839,
4893,
247,
2120,
2593,
891,
2985,
247,
2593,
275,
253,
4679,
835,
253,
4081,
2746,
310,
17618,
342,
1524,
941,
432,
667,
273,
253,
5393,
4893,
891,
1928,
436,
651,
17623,
4087,
2572,
253,
1318,
273,
253,
789,
285,
651,
29933,
907,
253,
1750,
273,
8542,
6349,
326,
253,
4477,
1056,
275,
10199,
285,
11815,
50276,
37585,
7350,
50275,
255,
253,
990,
273,
253,
806,
12494,
273,
2905,
789,
253,
4477,
1750,
326,
3632,
10491,
310,
417,
7763,
281,
24822,
19860,
342,
642,
2007,
8813,
273,
2139,
436,
310,
253,
1083,
387,
253,
1072,
673,
253,
5933,
3559,
391,
10642,
310,
1754,
327,
3632,
10491,
534,
3133,
34126,
50276,
284,
891,
5393,
4321,
891,
2985,
625,
5955,
5001,
253,
465,
23336,
953,
2746,
858,
253,
4477,
1089,
667,
2590,
30453,
16280,
31850,
752,
670,
253,
465,
30799,
9376,
273,
29436,
9959,
50276,
783,
4327,
273,
816,
298,
18,
347,
247,
8245,
323,
5301,
310,
247,
2372,
10341,
891,
4282,
2139,
253,
643,
5393,
7274,
6804,
18743,
10717,
390,
3632,
10491,
497,
417,
2879,
281,
253,
7103,
50276,
783,
2929,
812,
897,
2007,
2278,
281,
3693,
963,
993,
285,
5816,
10414,
50276,
783,
2929,
36908,
956,
253,
25577,
5008,
4477,
1390,
4454,
285,
807,
5474,
33032,
2520,
2929,
5611,
247,
747,
4758,
275,
534,
4903,
275,
4735,
4972,
403,
4272,
715,
1027,
2390,
285,
275,
1016,
1387,
4903,
403,
4561,
407,
10491,
1561,
4872,
24822,
597,
7943,
253,
4209,
285,
3309,
1617,
323,
253,
985,
281,
320,
38640,
1264,
11333,
403,
2530,
281,
7368,
253,
4903,
715,
616,
1006,
800,
24822,
597,
671,
2085,
15265,
839,
4893,
323,
436,
747,
4758,
275,
1313,
6533,
19177,
3818,
3109,
2718,
285,
10237,
4715,
891,
1158,
253,
747,
4758,
310,
7826,
4722,
285,
253,
4081,
11333,
403,
1175,
2299,
2429,
342,
323,
1650,
24822,
17524,
253,
1006,
800,
24822,
28243,
310,
1929,
281,
253,
4212,
247,
2720,
285,
253,
4736,
310,
281,
30215,
4903,
273,
247,
2014,
941,
1127,
534,
1537,
320,
11766,
253,
1083,
275,
3946,
50276,
26122,
50276,
2577,
2022,
4468,
310,
327,
253,
4737,
273,
10012,
337,
534,
3133,
247,
1652,
2372,
1499,
45695,
285,
417,
21414,
2217,
281,
479,
323,
1650,
327,
3239,
1638,
275,
253,
4260,
273,
16186,
18,
285,
16186,
19,
253,
4477,
9059,
1484,
485,
72,
518,
310,
42275,
347,
24822,
42487,
310,
275,
9267,
18859,
2087,
2317,
4496,
21184,
327,
326,
891,
717,
13477,
347,
604,
359,
873,
1484,
485,
72,
518,
667,
2120,
5958,
391,
1615,
83,
4315,
285,
1339,
2936,
18,
76,
320,
247,
3491,
273,
667,
4194,
275,
1484,
485,
72,
518,
285,
513,
247,
4194,
22101,
875,
391,
18,
285,
667,
4194,
275,
391,
840,
253,
747,
391,
1615,
83,
1484,
485,
72,
518,
2550,
320,
42275,
347,
352,
310,
417,
2120,
5958,
275,
4194,
352,
310,
417,
2590,
281,
479,
253,
873,
273,
436,
1511,
273,
24822,
247,
1182,
254,
485,
5849,
873,
327,
253,
6447,
3268,
273,
1837,
34495,
8420,
757,
4496,
1908,
294,
17695,
253,
4737,
275,
247,
625,
26565,
1039,
7152,
339,
793,
360,
3454,
253,
2929,
23970,
253,
1895,
273,
24822,
653,
2835,
275,
534,
271,
2540,
6804,
28862,
4972,
310,
281,
320,
10883,
264,
824,
326,
253,
3636,
27959,
3761,
342,
1677,
749,
31748,
253,
2022,
1543,
273,
253,
2929,
7027,
275,
44190,
4209,
285,
3309,
2515,
323,
1548,
18279,
1430,
273,
841,
27959,
672,
253,
749,
31748,
285,
253,
12028,
273,
253,
3386,
403,
12421,
15471,
275,
253,
18509,
7877,
285,
253,
749,
31748,
2975,
253,
2515,
3365,
2430,
326,
627,
403,
625,
12028,
2330,
342,
1016,
24822,
685,
253,
7877,
273,
253,
24822,
253,
2929,
671,
10262,
11333,
281,
1347,
253,
19860,
50275,
296,
3755,
20556,
50275,
783,
1895,
3908,
310,
4460,
891,
858,
417,
923,
2045,
26850,
273,
436,
1895,
50275,
783,
2929,
310,
3839,
973,
15720,
50275,
783,
2929,
556,
247,
1175,
6654,
273,
3762,
285,
5933,
2440,
50275,
20881,
1255,
265,
50275,
783,
5661,
1543,
2593,
310,
2581,
5075,
352,
651,
320,
1175,
281,
2486,
690,
15958,
6667,
432,
690,
11859,
4893,
50276,
6050,
253,
2929,
556,
247,
9940,
2593,
327,
15265,
839,
4893,
597,
403,
417,
326,
21414,
253,
1313,
6533,
19177,
2898,
310,
625,
21541,
533,
891,
513,
417,
1158,
436,
1566,
10384,
973,
281,
3818,
3109,
2718,
253,
6667,
2530,
1646,
281,
320,
2908,
281,
15249,
253,
4081,
1566,
4931,
627,
403,
14282,
4623,
4893,
533,
323,
253,
1655,
2715,
253,
1895,
9978,
310,
417,
10481,
17285,
285,
3133,
8489,
523,
30487,
50275,
783,
13260,
670,
253,
749,
31748,
878,
281,
320,
625,
6843,
3340,
672,
16585,
253,
11333,
323,
1650,
253,
3632,
10491,
5933,
3133,
281,
2430,
2120,
3640,
273,
253,
10103,
273,
841,
749,
31748,
534,
778,
320,
45783,
50276,
74,
452,
417,
4751,
16058,
436,
4154,
533,
891,
717,
762,
253,
13214,
326,
253,
906,
273,
253,
2022,
10012,
310,
14916,
310,
2649,
352,
4755,
326,
253,
13905,
273,
253,
12400,
273,
247,
24822,
281,
247,
1677,
10883,
273,
1979,
278,
253,
2644,
40373,
891,
2868,
253,
2022,
906,
476,
816,
956,
432,
436,
2969,
8310,
50273,
250,
5551,
33593,
253,
4477,
858,
417,
2486,
2127,
323,
616,
3715,
11333,
275,
253,
24864,
2144,
50274,
48276,
2530,
326,
253,
1895,
9978,
285,
1566,
403,
1805,
17285,
891,
2868,
436,
789,
812,
1527,
598,
747,
2561,
3533,
275,
5145,
4715,
285,
941,
1783,
1690,
7802,
11640,
273,
973,
14091,
728,
3237,
327,
4315,
12240,
285,
10237,
4715,
50273,
1189,
455,
891,
751,
253,
2929,
3340,
326,
253,
1895,
9978,
3139,
3133,
4460,
2299,
253,
1566,
310,
417,
10481,
17285,
253,
13260,
403,
8489,
30455,
285,
253,
5661,
2593,
310,
14999,
347,
824,
891,
651,
2281,
436,
347,
247,
16888,
14924,
50276,
187,
187,
4118,
18435,
27,
783,
2929,
19401,
247,
747,
4872,
20190,
280,
1895,
17194,
407,
4893,
824,
347,
1313,
6533,
19177,
534,
4419,
253,
5933,
281,
10883,
253,
11627,
273,
247,
1048,
27620,
4972,
2556,
281,
247,
1643,
1929,
749,
31748,
247,
1180,
273,
10527,
3533,
497,
2546,
24088,
1548,
18279,
1430,
50276,
20246,
11333,
285,
616,
2228,
14493,
3966,
50274,
783,
30628,
3839,
10490,
253,
2929,
323,
752,
352,
1057,
2173,
13991,
497,
5439,
407,
253,
30628,
1690,
849,
253,
2929,
2427,
715,
2978,
670,
253,
15265,
839,
4893,
533,
858,
417,
990,
598,
16344,
253,
4081,
11333,
327,
667,
15265,
839,
4893,
285,
326,
253,
2022,
10527,
1543,
497,
417,
22335,
11132,
50276,
15387,
10084,
3738,
253,
4477,
2530,
247,
4344,
22861,
275,
616,
30080,
22559,
50276,
783,
913,
9010,
253,
2929,
271,
562,
3623,
275,
2426,
273,
253,
12989,
2190,
9380,
5431,
2959,
407,
17857,
32888,
533,
10490,
253,
2929,
10534,
984,
352,
310,
1027,
50276,
783,
4477,
403,
14659,
281,
2319,
253,
10291,
273,
253,
2173,
1895,
281,
253,
3634,
273,
6779,
4715,
285,
5145,
4715,
275,
2087,
50276,
1189,
455,
891,
2868,
253,
2929,
310,
247,
4891,
45210,
2997,
50274
] |
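Two consistency properties can be read directly off the arrays above: the all-ones array always has the same length as the token-id arrays (1432 entries in the record just above, 2048 in the longer record before it), and the two token-id arrays appear to be identical. A minimal sketch of how one might verify this after loading the dump follows; the file name and the JSON-lines layout are assumptions, since the original storage format is not shown here.

```python
import json


def check_record(rec: dict) -> None:
    """Sanity-check one record: aligned lengths, all-ones mask, labels mirroring input_ids."""
    ids, mask, labels = rec["input_ids"], rec["attention_mask"], rec["labels"]
    assert len(ids) == len(mask) == len(labels), "all three arrays must have equal length"
    assert all(m == 1 for m in mask), "no padding is used, so the mask is all ones"
    assert labels == ids, "labels mirror input_ids in the records shown above"


# Assumption: the records have been exported one JSON object per line.
with open("reviews_tokenized.jsonl") as f:
    for line in f:
        check_record(json.loads(line))
```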
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
aim to improve the interpretability and the accuracy of the neural network this paper takes a step further on the integration of nn with a decision tree it will replace the final linear layer of the nn with a decision tree induced by pretrained model weights it takes advantage of both hard and soft decision trees and designs suitable tree supervision loss thereon extensive experiments verify the design choice of the proposed components on both smallscale and largescale datasets it beats the decision tree counterparts also on the aspects of generalization and interpretability it shows the strength compared to nn this work is a good try to combine the two techniques nn and decision tree it finally makes the combination to achieve comparable accuracy with the nn and also enjoy the benefit in the aspects of generalization and interpretability recent sota of capsule networks which are based on the nn backbone and this work are both achieved comparable performance with nn they show a promising direction for studying representation learning researchers can delve deeper based on this work to further exploit how to integrate decision tree into nn and the characteristics of the combination eg adversarial examples with the decision tree we can visualize the decision process the bring the benefits of interpretability the paper proposes to label the decision nodes with wordnet and show the applications of zeroshot generalization highlevel concepts dataset debugging and improved human trust there are lots to do on the aspects also the zeroshot and highlevel concept experiments are really intriguing using the pretrained model weights to construct the tree and the proposed tree losses to train can help the generalization in such a significant way though the performance would depend on the accuracy of the superclasses labeling and the agglomerative clustering where the benefits come from the method is only used the same information as the nn and the tree is also constructed based on the pretrained weights does the way of making hierarchy decisions help here if you do not enforce the second term of the equation 3 will the phenomenon be the same overall the paper is very easy to follow and the figures really help understanding extensive experiments help to know the performance effectiveness of the proposed components and also its unusual applications some concerns and comments are listed below will you update the weights of the intermediate nodes on largescale datasets the paper currently only tests using the efficientnet the reviewer wonders if the author can use more advanced backbones to see the performance changes the reviewer is unsure of the specific way to label the decision nodes will you use the wordvec provided in the wordnet and compare it with the decision nodes feature since your structure is different from the wordnet how do you match the classes with the nodes after rebuttal i would like to keep my origin score due to the pros listed abovedocsepthis paper proposes a neuralbacked decision tree that aims to improve both the accuracy and the interpretability of deep learning models training under a newly introduced tree supervision loss the authors show that nbdts can outperform and outgeneralize some modern architectures on several image datasets overall this paper is well written and established the idea of using a differentiable oblique decision tree to replace the final linear layer is interesting the authors provide clear illustration of the procedure and promising experimental results 
questions: 1 what is the main intuition for why nbdts can outperform the original network? 2 given that the classes are in the leaves, does the ordering of classes in the leaf layer matter, and how should one determine which two classes belong in the same bottom subtrees? minor comments: 1 figure 2 step a: yd, yk or y4? 2 why are there many na results in table 1? 3 section 31, in the compute node probabilities paragraph, the definition of ri seems confusing. 4 what does nn mean in table 1, and how different is it from cnn/rnn?
docsep the authors did a fantastic job of answering questions, revising their manuscript in accord with reviewer feedback (sec 34, title), and even adding new experimental results based on reviewer suggestions (midtraining hierarchy) and reflecting best practices in interpretability research. i was really impressed by their nimbleness and responsiveness; i will raise my score to a 7. i think this is a very solid paper and an excellent research effort around a nascent idea. in particular, i think its impact is limited by its close coupling to naturally hierarchical problems (eg multiclass classification with a taxonomy), its close coupling to image data and tasks, and its heuristic nature (fully train a neural net, infer a hierarchy via clustering, retrain the neural net, then map a priori labels onto the inferred hierarchy); this is the 1.0 version of this paper, and maybe future work would propose a way to infer the hierarchy on the fly and show how to apply it to other kinds of data and problems with different structures. this submission proposes a modification of neural networks that replaces the final linear layer with a decision tree; the term decision tree is applied somewhat loosely to a hierarchical neural architecture akin to a hierarchical softmax. in the current work, as i understand it, this hierarchy is induced from a pretrained multioutput (eg multiclass) neural network via a hierarchical clustering and subsequent averaging of the output weights. at inference time, path probabilities can be computed based on the chain rule; predictions can be made based on either a greedy traversal of the tree, choosing the most likely child at each step a la hierarchical softmax, or by choosing the most probable leaf, which requires computing all path probabilities. empirical results across three standard image datasets are suggestive if not conclusive, and the paper concludes with some interesting albeit cursory examples of potential interpretability applications. the submission summarizes its contributions at the end of section 1 as follows: 1 it proposes a treestructured loss to augment supervised neural network training, predominantly for multiclass classification problems; 2 it describes a heuristic to induce a hierarchy in the output weights of a pretrained multioutput neural network, enabling decision treelike inference, and provides evidence it is more effective than other approaches for inducing hierarchies; 3 it presents simple case studies of how the induced hierarchy can be used for traditional interpretability tasks like debugging and generating explanations. i appreciate the idea at the center of this paper, adding simple hierarchical structure to a multioutput neural network with the aim of increased interpretability, but i feel the work as it is presented is nascent and the manuscript itself is flawed. i lean toward rejection at the moment, but i could be persuaded to change my mind by some combination of solid revisions, convincing author response, or vociferous advocacy from other reviewers. i will briefly extol the papers strengths before providing a longer discussion
of what i consider to be its key weakness first i really like the last sentence in the paper this challenges the conventional supposition of a dichotomy between accuracy and interpretability paving the way for jointly accurate and interpretable models in realworld deployments weaknesses in the evaluation of its interpretability claims aside i agree with this statement i think the case studies presented do provide evidence of improved interpretability alongside small accuracy improvements i think this paper does succeed in demonstrating that accuracy and interpretability are not necessarily competing objectives at least for certain tasks multiclass classification of images a laundry list of other strengths the motivation is strong modulo weakness discussed below there is a growing need to provide humanunderstandable insights into decisions made by complex machine learning models the proposed approach is simple and elegant easy to implement and empirically effective im quite impressed that the proposed tree loss appears to improve accuracy on multiple tasks i also think this paper lays groundwork for a direction of research that the community could continue to build on i think that the manuscripts largest flaw ironically regards interpretability its primary motivation the works central claim is that the treestructured decision layer delivers improved interpretability with comparable or slightly improved accuracy in its discussion of this claim the manuscript provides no precise definition of interpretable making it difficult to verify the claim qualitatively or quantitatively section 5 presents a vignette of case studies but the discussion of each is quite limited in particular none of the use cases is fully motivated or placed in the context of previous research on interpretability definitions 12 the cursory presentation of results for each do the results a disservice by making it difficult for the reader to recognize and assess their significance to quote the introduction from liptons the mythos of model interpretability 1 despite the absence of a definition papers frequently make claims about the interpretability of various models from this we might conclude that either i the definition of interpretability is universally agreed upon but no one has managed to set it in writing or ii the term interpretability is illdefined and thus claims regarding interpretability of various models may exhibit a quasiscientific character i believe the paper would be strengthened by focusing on one use case eg debugging or human trust using the 1 page dedicated to section 5 to motivate it more fully and to present the results in detail if the primary use case is generalization or debugging then i suggest designing a quantitative analysis so defend against claims of cherry picking the best results a common problem in presenting example interpretability results section 54 includes a quantitative evaluation but i question whether mere human preference is evidence of human trust more recent research on trust appear to use more elaborate studies in which trust is measured by subjects rate of success in performing a particular task aided by the machine learning model 3 i want to caveat the above i really appreciate this line of work and think it has value there is an ongoing discussion in our community about rewarding good ideas rather than punishing imperfect or incomplete execution i also acknowledge that i am far from an expert in the latest interpretability research nonetheless my understanding is that 
interpretability researchers have grown more skeptical of interpretability claims about new methods absent a rigorous framework definitional andor experimental for evaluating those claims when i read this paper i find it hard to escape the conclusion that its interpretability claims rest on the presupposition that trees are naturally more interpretable and further that readers will accept this dogma i disagree with this assertion see below but even if it were generally true i still think the paper would be strengthened by adding a more rigorous discussion and analysis of its claims propose a definition or criterion see 12 for ideas ideally one that could be assessed qualitatively and evaluated empirically then apply it regarding the claim about trees in section 5 the interpretability of a decision tree is wellestablished when input features are easily understood eg tabular data in medicine or finance i would dispute that this is wellestablished for anything but the simplest decision tree models with a single tree consisting of a small number of splits using a handful of features which are rare in realistic settings the most commonly used treestructured models gradient boosted decision trees and random forests are not readily interpretable even for tabular data and especially for high dimensional inputs this has made research like shap 4 of great interest to practitioners what is more even for tabular data the neural decision trees described in this paper are to my understanding basically a cascade of linear classifiers with split each having access to all features at once this does do not lend itself to the same kind of interpretation one gets for classic decision trees that use one feature per split with even modestly deep hierarchies the resulting explanations would rapidly become quite complex i see one other weakness in the proposed method itself as i understand things it requires access to a pretrained neural network at the very least one needs preexisting output weights to cluster in order to induce a hierarchy and the induced hierarchy is a necessary component in the presented results this isnt a fatal flaw learning a hierarchy on the fly could be left for future work nonetheless it limits the works usefulness and potential impact what is more i dont think the manuscript is sufficiently clear about this requirement on my first pass through the paper i came away with the impression that there was a way to learn the hierarchy while training the neural net the inclusion of a section entitled training with tree supervision loss seems to imply this i suggest revising the text to make it crystal clear that it is not possible to use the tree loss to train a neural net from scratch at least not without a predefined hierachy perhaps from a previous training run or prior knowledge i will now summarize the improvements i suggest for strengthening the manuscript 1 focus on one definition of interpretability and then analyze central claims through that lens introduce it early in the paper introduction and then dedicate section 5 to it rather than trying to cover lots of use cases superficially 2 make the limitations of the proposed approach very clear in particular you need a predefined hierarchy to train with the tree loss and you need pretrained neural net or preexisting weights at least to induce a hierarchy based on clustering 3 if the intention is for this approach to be used exclusively for finetuning or adapting an existing neural network then this should be made clear in the text 
consider renaming section 33 4 justify or reword statements like the interpretability of a decision tree is wellestablished or neural features are visually interpretable a single reference does not sufficethe olah distill survey draws no such definitive conclusions i have a few questions one thing that is not clear when training with tree loss are weights shared across nodes in particular the weight vector for an inner node is the average of its descendent leaf node weight vectors when training with tree loss do we then treat that inner node weight vector as a set of independent parameters with separate updates or do we continue to treat it as a sum of leaf parameters so that leaf and inner node updates affect the same parameters as in an rnn or recursive network is there possibly a heuristic that could approximate learning the hierarchy for example train with a basic loss for enough iterations until the output weights start to converge then pause training to induce the hierarchy and then resume training with the tree loss part of the heuristic could be guidance about how to detect when the output weights have sufficiently converged what are the key differences between this approach and a hierarchical softmax my understanding is that theyre basically equivalent at inference time except maybe traditional hierarchical softmax uses hard decisions what about during training is it maybe the use of negative sampling for hierarchical softmax how would this approach perform for extremely high dimensional output spaces one of the primary motivations for hierarchical softmax i imagine that for some output cardinality soft inference becomes computationally infeasible 1 lipton the mythos of model interpretability httpsarxivorgabs160603490 2 doshivelez and kim towards a rigorous science of interpretable machine learning httpsarxivorgabs170208608 3 poursabzisangdeh et al manipulating and measuring model interpretability httpsarxivorgabs180207810 4 lundberg et al from local explanations to global understanding with explainable ai for trees httpswwwnaturecomarticless4225601901389docsepsignificance this article seems to be a useful contribution to the literature on interpretable deep networks however the paper could be strengthened by demonstrating and analyzing the interpretability of approaches to other types of data such as sequential data novelty this papers main contribution is to offer a new hybrid model that combines a deep neural network with a tree the authors used the weights of the last layer of a dnn to build a tree from bottom to top where each inner node in the tree is the average weight of each child then the authors used softmax to compute the probability of routing for each child potential impact the approach presented in this paper is wellevaluated in computer vision but potentially useful in many other settings technical quality the technical content of the paper appears to be correct albeit there is some room for improvement page 2 the authors said these models likewise limit interpretability by supporting no more than depth2 trees having the depth2 tree actually improves the interpretability since it is easier to follow the model prediction for example a tree of depth2 considers more interpretable than a tree of depth4 the authors should rephrase this sentence while the authors claim that linearly increasing the weight in nbdt is the superior method why the nbdt with the constant rate overperforms on cifar10 as shown in table 3 based on the nbdts explanation all leaf nodes should have 
the same depth but in the example shown in figure 6 and supplement the leaf nodes are in different depths the authors need to explain why the final tree has leaf nodes with different depths the authors did not compare their methods interpretability with a similar methods such as adaptive neural trees tanno et al 2019 i suggest running this experiment since adaptive neural trees has a different interface than nbdt while the number of participants is noticeable in the interpretability study it seems that participants only answer one question adding more questions could strengthen the paper further the authors should provide a summary of participants eg age education and gender the paper will be strengthened if the authors run an experiment without using a pretrained neural network on a small dataset like mnist to demonstrate their algorithms effectiveness presentationclarity while the paper is fairly readable there is room for improvement in the clarity page 3 the last paragraph forgot a period after the parentheses path figure 1 c appendix table 6 this page 7 figure 4 i believe that the authors mean without nbdt instead of without resnet while the authors explained how they label the tree nodes with wordnet in the supplement there is no explanation in the main paper labeling the trees nodes is an important part of the algorithm and should be included in the main paper page 5 the authors used hierarchy ablation and loss ablation subtitle in the paper the word ablation seems inappropriate in this context reproducibility the paper describes all the algorithms in full detail and provides enough information for an expert reader to reproduce its results however the authors did not discuss when they start to increase the w in equation 3 how to determine to stop the w from increasing and with what rate the w should be increaseddocsepthe paper proposes a method to make neural networks more accurate and interpretable by replacing their final layers with a probabilistic decision tree as a result the network can produce a sequence of decisions that leads to the final classification result given an input image the method is trained with soft decisions by assigning probabilities to each leaf which are associated with a single class the tree decision hyperplanes are constructed automatically from the backbone networks final dense layer and finetuned the fact that decisions are soft solves the differentiablility problem of decisions as in various other similar papers cited or uncited more below the paper is not written very clearly so it would be hard to reproduce its not clear in places if indices correspond to nodes or classes as it is used interchangeably the text misses a proper mathematical formulation of the operations done in inner nodes and this all makes it difficult to understand what the loss is and how it can update the decisions in the tree perhaps its possible to understand all the details by rereading the text several times but the paper definitely lacks clarity a nice comparison would be with the deep neural decision forest dndf paper id expect that level of clarity from an iclr paper the justification behind using the hyperplanes from the last layer of the cnn for constructing inner nodes is not explained the visualization in figure 2 indicates that the averages used for the parent nodes act like clusters and somehow averaging them forms bigger clusters but what really happens is that the rows of the final layer which are unnormalized hyperplanes are averaged to form new hyperplanes that 
are assumed to cover both classes which would not be the case most of the time im not sure if this has a reasonable geometric meaning but the visualization gives the wrong idea the paper makes the claim unlike previous decision trees or hierarchical classifiers nbdts use path probabilities for inference there are certainly many papers that use path probabilities for inference in fact its been the norm for discrete decisions since hard decisions are much harder to formulate in a differentiable manner dfdn uses path probabilities as do older papers like decision forests convolutional networks and the models inbetween by ioannou et al uncited i dont understand this claim the biggest difference here is that each class gets exactly one dedicated leaf instead of each leaf storing a distribution it is not clear to me why this is a good idea though its clearly not preferred in the decision tree literature another interesting point most papers i know on the subject actually try to enable sparse activation at test time for efficiency which is the harder problem to solve see condconv neurips 2019 uncited splinenets neurips 2018 uncited outrageously large neural networks by shazeer et al uncited conditional information gain by bicici et al etc activating all the branches of the tree or graph becomes prohibitively costly for deeper trees interpretability is another major claim the way it works is by using wordnet to assign higher level compound classes to images such as animal turtle this way by grouping similar leaves together the inner nodes are assigned a meaning this needs to be done when constructing the hierarchy so the tree structure is manually given from what i understand but then how are the pairs of nodes selected its not clear from the text at least it is not explained clearly in the figure 5 a decision tree is given which could only have been constructed by hand ie with someone knowing that cats and dogs are the closest pairs where does this information come from for imagenet when there are 1000 classes wordnet another concern about interpretability is that it is claimed that other good performing methods like dfdn are not interpretable i dont see why thats the case if we use wordnet to assign meaning for the inner nodes of such solutions id agree that it is not straightforward to do so but with wordnet i can imagine how it could be done so im not convinced about the claim that this solution is the most interpretable one when the paper does not explain why other solutions arent in a persuasive manner one final concern is with the comparisons with similar models the paper says these were reimplemented and tested with different backbones where are these implementations taken from for instance for dfdn which is quite complex how can the reader trust these numbers i find the claims unconvincing and the results unpersuasive i think the paper needs i a better mathematical formulation to clarify the method ii better explanation of how the trees are constructed eg if wordnet is used iii better understanding of differences from similar work currently i dont think it meets the bar for iclr treesup result is likely wrong in table 3 after rebuttal the authors did a great job in addressing most of the issues i have and made many changes that helped with the clarity of the paper there are still some remaining issues like figure 2 assigning a wrong geometric meaning to the clusters formed by taking means of hyperplanes and the uncited references which are simply added to the references section which 
should be fixed, but i think the added survey results are a great addition and a persuasive proof of the increased interpretability of these models; therefore i will increase my score to 6
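several of the reviews above refer to the tree supervision loss (equation 3), to the schedule on its second term (linear ramp versus constant rate), and to training against path probabilities. a minimal sketch of how such a combined objective might be written, assuming per-class log path probabilities are already available; the blending scheme, the schedule, and the default values are assumptions for illustration, not the paper's exact formulation.

```python
import torch.nn.functional as F

def tree_supervision_loss(logits, leaf_log_path_probs, targets, tree_weight):
    """Standard cross-entropy on the network's logits plus a 'tree
    supervision' term: a cross-entropy over the per-class path
    probabilities, blended in with a weight."""
    standard = F.cross_entropy(logits, targets)
    tree = F.nll_loss(leaf_log_path_probs, targets)   # expects log-probabilities
    return standard + tree_weight * tree

def linear_tree_weight(epoch, total_epochs, max_weight=1.0):
    """One of the schedules the reviews debate (linear ramp vs. a constant
    rate); max_weight is an illustrative value, not taken from the paper."""
    return max_weight * epoch / max(total_epochs, 1)
```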
### Summary:
this paper initially received mixed ratings, but after the rebuttal all reviewers recommended acceptance; reviewers appreciated the novel technical ideas and the extensive experimental results
Below is given review of a research paper from conference journal. Please write a summary of the review.
### Review:
the authors propose a method to learn a joint representation of an image with a very generic description of the object present in the image this representation makes it possible to associate the discriminating elements of the textual description with visual elements of the image the objective is to improve the zeroshot learning approaches and to classify images containing objects not seen during the learning phase only from the textual description the proposed model is composed of two parts a first part combines an imagebased transformer and a textbased transformer through a scoring function that allows to compute the similarity between text embedding and image embedding a second part allows alignment using visual queries and textual keys in a combined text and image transformer experiments are conducted on standard datasets for zeroshot learning wikipedia articles were collected to serve as class descriptions the collected dataset will be made public after the review process experiments show that the proposed model outperforms stateoftheart models and allows the classification of images containing objects never seen during the learning process based on their textual description only unseen classes different types of semantic embedding are tested glove longformer mpnet tfidf an ablation study and examples of qualitative results are presented i am not familiar with the field of zeroshot learning but it seems to me that the model combining visual queries and textual keys in a transform model is original the performances of the model seem to exceed the state of the art results and the model obtains interesting performances with simple embedding glove which puts in perspective the contribution of more complex models like longformer for zeroshot learning problems the bibliography seems to be very complete again with citations of simple but efficient models tfidf the proposed model seems to be able to be trained on a single a100 gpu in one day which is accessible the experimental part is very complete with comparison to the state of the art testing of different embeddings ablation study and qualitative analysis na docsepthe authors propose an attentionbased model for zeroshot learning from text documents knowledge sources such as wikipedia and birdflower information websites the proposed approach relies on two transformer models one for processing image patches and another one for text documents each of the transformers outputs a sequence of features which is fed into two different components of the proposed model one of them fuses two sequences with an attention mechanism i2d attention and produces an imagetext matching score described as local the other component i2d global only uses special classification tokens cls from both text and image and similarly produces an imagetext matching score which the authors describe as global the approach optimizes for both local and global matching but only the global one is used later on for classificationgenerating predictions the authors evaluate their model on standard zsl datasets awa2 cub and flo they separately evaluate the performance of both their learned text features and their entire model that uses the global matching scores when using their learned text features as input to existing four different zsl models they observe quite a consistent improvement over alternative glove or vgse features often by a big margin also using their entire proposed model with global textimage matching scores used for classification generally compares well against 
the other zsl models additionally the authors argue that their model is more interpretable showing examples of attention maps over words in a document or matching imagetext element pairs strengths s1 the proposed approach is conceptually simple it consists mostly of dotproduct attention it doesnt require many extra hyperparameters s2 the experimental results the learned document text features i2demb quite consistently both among different datasets and different methods outperform glove and vgse features often by a large margin additionally the entire proposed model that uses a very simple similarity score between text and image cls token often outperforms the remaining zsl methods even models that use the text features proposed in this work i2demb s3 rich ablation studies and a good set of experiments the experiments show the importance of different components of the proposed model the authors evaluate their model with different types of textual features but also use their textual features with different models the set of experiments covers the most important aspects of the model s4 attention maps provide some extra interpretability which can be important for understanding predictions especially when using text documents as a source of information about classes although see w3 weaknesses w1 possibly suboptimal settings for zsl models used to compare against gazsl was introduced using tfidf features eg apn was introduced for attribute features this paper however compares against them when using either glove or vgse features which might be suboptimal for those models a more fair comparison should at least additionally include tfidf features as well especially important for gazsl w2 the authors seem to miss a little bit of context regarding the attribute data the usage of text documents instead of attributes in general might have many advantages however eg fvaegand2 apn papers using attribute data report much higher performance on eg cub dataset the authors however do not mention any attributebased results which might be misleading for the readers as if attribute features were not competitive even on these datasets additionally the claims of outperforming sota should be formulated more precisely eg with respect to a specific type of datasource of information w3 the qualitative evaluation samples are not indicated as random samples as opposed to selected additionally for more convincinginsightful results an analysis of failure cases would be needed w4 the authors without justifying collect their own text documents instead of already standard existing ones already extracted there is no motivation for it present in the paper and no discussion or analysis of the differences and the impact of using these vs standard documents additionally the authors mention performing some filtering of document sections which could potentially be very important but is not discussedanalyzed further existing extracted documents wikipedia allaboutbirds etc flo cub mohamed elhoseiny ahmed elgammal and babak saleh write a classifier predicting visual classifiers from unstructured text tpami 2016 cub mohamed elhoseiny yizhe zhu han zhang and ahmed elgammal link the head to the beak zero shot learning from noisy text description at part precision cvpr 2017 w5 the direction of the attention seems somewhat arbitrary and not analyzed the authors basically weight the text features attention values by the attention compatibility between image patches text sequence elements one could do the opposite use image patch features 
and weight them by attention maps instead or do both directions like some crossattention works no motivation behind this choice is discussed and no comparison of alternative choices is present the authors do not discuss the computational cost of training evaluating their model as opposed to alternative approaches the qualitative analysis seems to include only selected positive example with no failcase analysis see w3 additionally the authors use some very strong overexaggerated language in l45 they claim that their model is able to develop understanding of different parts of an animal docsepin this paper authors propose to address the zeroshot image classification problem under a more realistic setting namely each class is descried with one document collected online to address the issues with online textual documents contain certain noise and different parts of the document may correspond to the different regions of the images authors propose a transformerbased zsl framework that jointly learns to encode images and documents by aligning both modalities in a shared embedding space the crossmodality attention mechanism is introduced to suppress the noise and extensive experiments are conducted to validate the proposed modules strengths the transformbased framework is proposed for the zsl problem the i2d module is proven effective to capture the relevant image regions based on the collected documents the experimental results are considered favorable compared to sotas on three benchmark datasets and the results are highlyinterpretable the learned text embedding can be utilized with existing zsl for further improvement weakness the discussions regrading the model complexity needs further clarification the novelty of cross modality attention among different tokens is somewhat limited being the main contribution of the whole paper the fairness of utilizing the online document compared to other sota zsl frameworks needs further clarification the more detailed comments id like authors to address are summarized in the questions part the main concern regarding the limitation is the model complexity docsepthe paper proposes to use multimodal crossattention to align image representation with word representation by unsupervised training on imagedocument pairs the model is later validated on a zeroshot learning task and an interpretability study is conducted pros 1 the interpretability study demonstrates that the model learned to align modalities together 2 the performance over the baselines was improved 3 the idea of image2document attention seems natural using the asymmetric attentionimagequery on textimage pairs and aligning them into the same embedding space cons 1 motivation task description is short and appears relatively late in the paperzeroshot learning is a very vague statement it is hard to assume what reallife problem the paper is trying to solve which requires considerable mental effort to judge the model correctly for example the articles structure makes it impossible to understand the contribution while reading for the first time not enough context on them is provided before both those issues make it hard to understand the significance of the contribution 2 training it remains unclear how the training proceeds eg how is the output of the similarity function classified how are the images aligned to the text where does the class information come from 3 model as i understand the textswords are processed by converting them into glove embeddings then through the mlp layer to be consumed by a 
transformer network this approach added a couple of processing steps but the decision to do so remained unexplained and is not grounded in the experimental results anywhere in the paper how much did using glovemlp help additionally in some aspects the usefulness of this method counters the conclusion the paper reaches especially the following documents of unseen classes use the same and additional vocabulary in new sentences causing a distribution shift in their input representation this issue has been previously tackled in other areas of nlp with the subword tokenization eg bpe why the authors use static word embeddings in such a situation remains highly mysterious 4 model is there any reason not to try to unfreeze the pretrained visual encoder 5 model the proposed model trains a single attention layer while the ablation on scaling the model deeper is lacking but may be significant 6 model the choice of asymmetric attentionimagequery is not explained nor studied one can imagine using the coattention or merging the inputs before passing to the selfattention see eg 1 for comparison is there any specific reason to design it that way 7 clarity the caption in table 1 is not clear how were semantic embeddings learned from different sources and were they represented 8 clarity the paper presentation is aesthetically pleasing but the structure makes it hard to understand and stack all the concepts together sadly after reading the paper and having it in front of me i would not be able to reproduce and reimplement the models training the authors partially analyze the results and provide some details on when the model fails
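For reference, the image-to-document attention the reviews above describe (visual queries attending over textual keys, with dot-product compatibility scores used to weight the text features) can be sketched roughly as follows; the embedding dimension, the single unnormalised head, and the way the global matching score is pooled are illustrative assumptions rather than the paper's actual implementation:

```python
import torch
import torch.nn.functional as F

def image_to_document_attention(img_patches: torch.Tensor, word_feats: torch.Tensor):
    """Visual queries attend over textual keys/values.

    img_patches: (P, d) image-patch embeddings (queries)
    word_feats:  (W, d) word embeddings of the class document (keys and values)
    Returns the attended text features per patch, the attention map, and a scalar score.
    """
    d = img_patches.shape[-1]
    # Dot-product compatibility between every patch and every word.
    scores = img_patches @ word_feats.T / d ** 0.5        # (P, W)
    attn = F.softmax(scores, dim=-1)                       # normalise over words
    attended_text = attn @ word_feats                      # text features weighted per patch
    # One simple global image-document score (an illustrative pooling choice).
    score = F.cosine_similarity(img_patches, attended_text, dim=-1).mean()
    return attended_text, attn, score
```

In this form the `attn` matrix (one weight per image-patch/word pair) is roughly what the qualitative attention-map examples mentioned in the reviews would correspond to.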
### Summary:
|
the authors propose a method to learn a joint representation of an image with a document of the object present in the image experiments show that the proposed model outperforms stateoftheart models although the final reviews between reviewers are not aligned i think authors solved most of their proposed questions
|
input_ids: [1620, 2326, 1309, 253, 4715, 1232, 1754, 327, …, 273, 616, 4081, 3533] |
attention_mask: [1, 1, 1, …, 1] |
labels: [1620, 2326, 1309, 253, 4715, 1232, 1754, 327, …, 273, 616, 4081, 3533] |
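In the three arrays above, the attention_mask is all ones and the labels appear to simply repeat the input_ids, which is the usual causal-language-model preprocessing for this kind of review-to-summary pair. A minimal sketch of how such a row could be produced is shown below; the tokenizer checkpoint, the maximum length, and the helper name are placeholder assumptions, not details taken from this dataset:

```python
from transformers import AutoTokenizer

# Placeholder checkpoint: the dataset does not state which vocabulary produced these ids.
tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-neox-20b")

def build_row(input_text: str, output_text: str, max_length: int = 2048) -> dict:
    """Turn a (review prompt, summary) pair into input_ids / attention_mask / labels."""
    enc = tokenizer(input_text + " " + output_text,
                    truncation=True, max_length=max_length)
    return {
        "input_ids": enc["input_ids"],
        "attention_mask": enc["attention_mask"],   # all ones when nothing is padded
        "labels": list(enc["input_ids"]),          # labels mirror input_ids for LM loss
    }
```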
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the authors propose to train deep convolutional neural networks in a layerwise fashion this is contrary to the traditional joint endtoend training of deep cnns as their motivation the authors quote computational and memory benefits at the time of training in addition to being able to extend shallownetwork analytical frameworks to the individual network layers thus allowing for a theoretical interpretation of their optima their method is simple and clearly explained note in the 10th line on pg 4 is there a j missing in the subscript of xn the experimental results are interesting the authors are able to demonstrate some architecture that in solely a layerwise training is able to show competitive results with respect to alexnet when trained in an endtoend manner on imagenet these results can seem questionable as both the architectures and training routines are being varied and hence the precise contribution of the layerwise training is unclear however as per my understanding the aim of the paper wasnt to show that a layerwise training can work better that endtoend training the aim was on the contrary to show that even a layerwise training can offer competitive performance for some network and hence may come handy when memory is limited their underlying claim which could be more clearly stated is that the memory benefits during training can be enjoyed when the individual layers of a network net1 are smaller in parameter count as compared to another net net2 although the net1 in totality maybe larger than net2 this is because net1 will be trained in a layerwise fashion while net2 would be trained in an endtoend manner i would like to see the authors confirm or reject this understanding and rationalise their experimental regimen further i would like to know how their work compares to the following httpsarxivorgabs170307115 httpsarxivorgabs161102185 finally while the authors state that the layerwise training makes the individual layers amenable to theoretical analysisinterpretation no such discussion is presentedinitiated in the paper the only analysis presented is on the ability of the individually trained layers to linear separate the data to round the analysis it should also be extended to the representations learned by endtoend trained networks all in all while the paper raises some interesting ideas its execution in terms of a method that learns a classifier on each individual layer is rather simplistic dont get me wrong simple can indeed be elegant but at the minute the comparisons are not very convincing docsepthis paper is of reasonable quality and clarity rather modest originality perhaps considerable significance in some applications strengths i think this kind of method could be useful for data of very high dimensionality when it is not possible to train everything end to end the experiments seem to be conducted correctly the paper is well written weaknesses minor abstract its kind of funny to say that cifar10 is a large scale image recognition problem what the authors are proposing is quite similar to lee et al 1 which was not mentioned in the paper as well as a wide range of papers which were mentioned i think it is kind interesting for people to revise these techniques from 10 years ago but this method is just not that novel the authors highlight that their goal is not using this method as a pretraining strategy but it would be interesting to see whether it would indeed work better if after the layerwise training the whole network would be trained endtoend maths in this paper is 
mostly decorative when comparing different models or training methods eg layerwise trained alexnet and endtoend trained alexnet it would make sense to do some hyperparameter search it is very risky to conclude anything otherwise i would like to see a wall clock time comparison between this and endtoend training 1 lee et al deeplysupervised nets 2015docsepsummary this paper proposes layer wise training of neural networks using classification auxiliary tasks for training each layer experiments are presented on cifar10 and imagenet accuracies close to end to end training are obtained the layer wise training is repeated for j steps the auxiliary tasks are trained on top of the shallow one layer of width m with a network of depth k and width tildem layerwise training is done using sgd with momentum and the learning rate is decayed through epochs note that the layer wise training is done with large width m than typical end to end networks in use the authors argue and test the hypothesis that auxiliary tasks encourage the linear separability of cnn features to reduce the size of the learned network the author propose a layer wise compression using the filter removal technique of molchanov et al reproducibility this empirical work has been investigated for a while with mild success the authors should make their code available to the community to confirm and reproduce their findings i encourage the authors to make their code available during the reviewdiscussion period significance of the work from reading the paper it is not clear what is the main ingredient that makes this layer wise training successful negative results would help in understanding what is important for the accuracy some more ablation studies and negative results will be insightful and here are few suggestions in that direction authors claim that they used invertible downsampling as max or average pooling lead to information loss does the layer wise training give worst results with average or max pooling if so please report those numbers to know what is the implications of this choice of pooling on the width of the networks seems it is key for the success of the approach what if you train wider networks with j that is small j3 for instance but much wider networks instead of j8 now for imagenet to answer the same question above one needs also to see what are the accuracies for j8 with thiner networks smaller m would the accuracy with the layer wise training reach a plateau if one uses an architecture with j higher than 8 transferability of the learned features end to end features are know to be transferable it would be good to see if this still holds using the network layer wise trained on imagenet for cifar10 or other datasets other questions section 32 is vague in proposition 31 and proposition 32 can you add some text to explain what are the implicitions of the claims thus our training permits to extend some results on shallow cnn to deeper cnns which shallow results for k1 batch normalization was useful is this only on the auxiliary problems networks or you used also batch norm for the layer the ensemble used is zsumj1j 2j zj this uses the network of j layers also the oj auxiliary networks of depth k please report the number of parameters for all models single and ensembles in table 1 and table 2 in the conclusion the framework can be extendable to the parallel training how would this possible since one needs the output of the first training to do the training of the next layer can you elaborate on what is meant here minor page 
2 bottom have competitorsand have competitors and the non linearity rho in equation 1 and throughout the paper put a bracket for its argument rhox not rho x page 6 imagenet paragraph w we section 42 we define linear separability etc a space is missing before further section 43 we report our results imagenet a space is missing after imagenet overall assessment this is a good paper making the code available and adding more ablation studies and explanations of width versus depth and the choice of pooling will make the contributions easier to understand
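For concreteness, the greedy layer-wise procedure the reviews above discuss (train one block at a time with a small auxiliary classifier on top, then freeze it before moving to the next block) can be sketched as follows; the optimiser settings, epoch count, and the shape of the auxiliary head are placeholder assumptions rather than the paper's exact training recipe:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def train_layerwise(blocks, make_aux_head, loader, epochs_per_layer=5, lr=0.1):
    """Greedy layer-wise training: each block gets its own auxiliary classifier,
    and all previously trained blocks stay frozen while it is optimised.

    blocks:        list of nn.Module, one per layer/stage
    make_aux_head: callable returning a small auxiliary classifier (e.g. depth-k CNN + linear)
    loader:        iterable of (inputs, targets) batches
    """
    trained = []
    for block in blocks:
        frozen = nn.Sequential(*trained)                  # already-trained prefix
        head = make_aux_head()
        opt = torch.optim.SGD(list(block.parameters()) + list(head.parameters()),
                              lr=lr, momentum=0.9)
        for _ in range(epochs_per_layer):
            for x, y in loader:
                with torch.no_grad():
                    x = frozen(x)                         # features from frozen layers
                loss = F.cross_entropy(head(block(x)), y)
                opt.zero_grad()
                loss.backward()
                opt.step()
        for p in block.parameters():                      # freeze before the next layer
            p.requires_grad_(False)
        trained.append(block)
    return nn.Sequential(*trained)
```

The ensemble the review mentions (roughly z = sum over j of 2^j z_j) would additionally keep each auxiliary head and combine their weighted predictions, rather than using only the final block's classifier.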
### Summary:
|
the paper discusses layerwise training of deep networks the authors show that its possible to achieve reasonable performance by training deep nets layer by layer as opposed to now widely adopted endtoend training while such a training procedure is not novel the authors argue that this is an interesting result considering that such a training procedure is often dismissed as suboptimal and leading to inferior results however the results show exactly that as the performance of the models is significantly worse than the state of the art and it is unclear what other advantages such a training scheme can offer the authors mention that layerwise training could be useful for theoretical understanding of deep nets but they dont really perform such analysis in this submission and its also unclear whether conclusions of such analysis would extend to deep nets trained endtoend in its current form the paper is not ready for acceptance i encourage the authors to make a more clear case for the method either by improving results to match endtoend training or by actually demonstrating that layerwise training has certain advantages over endtoend learning
|
input_ids: [30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, …] |
attention_mask: [1, 1, 1, …, 1] |
[30003, 310, 1677, 2278, …] |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
summary this paper aims to combine visual attention mechanisms to deep rl by proposing a simple attention module in the convolutional encoder of the deep rl agent twostream encodings and adaptive scaling of balance between nonattentionalattentional masks are major components of this module the authors present empirical results of their algorithm and compares with sota baselines in deep mind control suite pros 1 this paper is fairly well written and easy to follow one of the main contributions of the attention module is twostream encoding where there are separate nonattentional features and attentional features extractor attentional features have more taskspecific information while nonattentional features have taskagnostic information this attention module can be combined with any rl baseline algorithm like sac the structure of attention module is simple and easy to understand 2 rlatte shows competitive performance in deepmind control suite rlatte performs much better than sacpixel and almost achieves as good performance with sacstate when compared with sota algorithms drq and dreamer rlatte shows comparable performance in many domains 3 visualization of attention module is helpful as shown in figure 5 the attended output hatxf green box captures relevant activated locations and the agent can dynamically change the attention focus depending on each action cons questions major 1 adaptive scaling of sigma the sigma is an alternative parameter that should be tuned to balance between attentional and nonattentional features i think the role of this parameter is consequential but the paper lacks detailed description about this parameter could you plot how sigma changes over the course of training sigma is cdimensional vector and initialized as zero so you could plot the average of sigma over the course of training and how it evolves over time considering that adaptive scaling of balance between two encoder is one of the two major contributions of this work the paper lacks sufficient explanations on this part 2 need more justification for shared encoder for consecutive frames one of the difference of this encoder from other conventional encoders is the use of sharedencoder p the original algorithms stack the consecutive frames first and then put them altogether into the encoder whereas in this work each frame is encoded by p and then stacked after the encoding although the difference between stackfirstbaselines and your algorithm is shown in the ablation studies figure 6 i dont see reasonable explanations on why this would benefit the performance do you have any intuition of why separateencodingthenstacking performs better to my knowledge there is a reason in stacking consecutive frames first eg in dqn because stacking consecutive frames contains its own information mnih et al 2015 and separating them might break those information minor 3 the performance of rlatte is comparable with sota dreamer drq but doesnt outperform them in most case rlatte performs slightly worse than drq or dreamer its totally fine but i think you should change the phrase like significantly improve sampleefficiency and final performance of the agents 4 does your method show lower variances than sota baselines as mentioned in page 5 only looking from the graph plots i dont see that could you support this claim with statisticalnumerical values typos citation suggestions in section 51 setup i think you cited a wrong paper for dreamer you cited kostrikov et al 2020 for both dreamer and drq but you should fix it to hafner et al 
2018 for dreamer consider citing these works at the related work section and adding some discussions about them too 1 mott a zoran d chrzanowski m wierstra d rezende d j 2019 towards interpretable reinforcement learning using attention augmented agents in advances in neural information processing systems 1235012359 2 greydanus s koul a dodge j fern a 2018 visualizing and understanding atari agents in international conference on machine learning 17921801 3 yuezhang l zhang r and ballard d h 2018 an initial attempt of combining visual selective attention with deep reinforcement learning arxiv preprint arxiv181104407 overall i think this is a nice work but given the concerns i have above and marginalincremental advances in empirical results i think this paper doesnt yet meet the threshold of iclr however if these concerns are addressed i am happy to raise my score after the rebuttal period also related work section can be improved by covering more past works there are a lot of works on visual attentionsaliency and there have been some recent works attempting to use attention for deep rl agents not just the ones that i mentioned above docsepthe paper proposes an alternative encoder architecture for an imagebased deep rl agent that leverages attention the authors suggest to compute attentional and nonattentional convolutional features which later are combined together via a weighted learned residual connection the designed architecture seems to improve performance of the agent on standard continuous control tasks from the deepmind suite significance the overall novelty and significance of the paper is low while i appreciate the authors drive to research alternative network architectures in deep rl im unfortunately unable to find any significant insight either from a theoretical not studied or empirical the experimental setup is questionable perspectives more details pros interesting investigation into an architecture choice for a deep rl agent i think this is an important direction to tackle and unfortunately there has been very little attention given to this problem interesting ablation that demonstrates the learned attention map and what different features are responsible for cons the novelty insight and contribution are limited moreover the empirical evidence provides inconclusive support for the proposed method weak baselines in fig 2 performance of sacpixels is weaker than in the literature see fig 6a in httpsarxivorgpdf191001741pdf in fig 3 performance of drq is significantly weaker than reported in the original paper httpsarxivorgpdf200413649pdf and given that drqs code is publicly available and more importantly fully reproducible personal experience this casts doubt on the rigor and validity of conducted experiments this work goes against a common practice in rl and reports performance of their algorithm using only 3 random seeds there is very little information provided about the exact architecture of the attention block i would appreciate more details here absence of the source code makes me skeptical that the reported results are reproducible given that the authors build on a publicly available code base sacae httpsgithubcomdenisyaratspytorchsacae the supervised setting experiment sect 53 is also inconclusive since the authors compare train loss performance for each of the agents obviously agents that use data augmentation would overfit less and thus report higher loss on training dataset besides it is not clear if the dataset for this experiment came from the agent with attention 
module or not if so this will also make learning harder for rad as it has weaker affinity with the generated data the authors suggest that they dont use data augmentation but in appendix b table 1 they also suggest that random crops are being used this needs to be further clarified quality the papers quality is mediocre a fully empirical paper requires greater experimental rigor and clarity clarity the paper would benefit from providing more transparency over the experimental setup and clear reporting of the used hyper parameters especially given the simplicity of the proposed method several places in the paper that i detailed above were either conflicting or ambiguous docsep summary this paper proposes to apply the soft attention mechanism in a cnn network to boost the learning speed of an rl agent in environments such as deepmind control suite the key idea is that the attention mechanism is a better network architecture and can extract interpretable taskrelevant information to improve the learning however attention has been used in several past rl works and paper misses some important comparisons overall the contributions of the paper are limited strengths the proposed method shows comparable performance against the baselines such as dreamer and drq and outperforms pure visionbased sac weaknesses the motivation in the introduction section does not seem to be strong in particular the second paragraph seems to be disconnected from the paper why do authors spend many words explaining selfsupervisedunsupervised learning modelbased learningplanning etc using attention in rl is not novel and has been explored in many works 15 for example 2 also proposed a cnn architecture with a selfattention mechanism how does the proposed architecture in the paper compare to the one in 2 its also not clear how good the proposed attention block is compared to other attention architectures that are proposed in previous works selfattention attention on flattened feature vectors 1 etc the proposed method uses a stack of 3 observations which can provide the policy more temporal information while the baselines do not seem to use a stack of historical observations hence its not clear whether the claim that the proposed method achieves similar performance with sota still hold if sota algorithms are also given the same amount of information as input the keypointbased method 6 has shown that the network can capture taskrelevant information such as keypoint locations this paper also shows some visualization about the attended regions on the observations in figure 5 from what both papers show in the visualization it seems the keypointbased method can capture better interpretable taskrelevant information it would provide more a more comprehensive view of the series of works if the authors can comment on 6 the paper compares to sac with images as input such learning is typically slow 7 shows that using an asymmetric actor and critic can significantly speed up the learning while requiring no state information at test time how does the attentionbased policy compare to this line of works missing ablations 1 for the ablation on sharedencoder the paper provides a comparison between two settings 3 rgb images as input 3 channels as input each of them are encoded using the same encoder 10 channels for each image output and the outputs are then stacked 30 channels as output in the end 3 rgb images are stacked first 9 channels as input and processed by an encoder 30 channels as output these two settings have a different number of 
network parameters it could be the latter one has more parameters and hence takes a bit longer time to train to provide a more thorough analysis one should also compare to the following setting convert each rgb image into grayscale and stack the three grayscale images 3 channels as input and then process them by an encoder that produces an output with 30 channels while converting rgb images to grayscale lose some color information such information should not affect the learning or final performance in the environments that the paper experimented with also i wonder how the curves look after 02m or 03m steps as shown in figure 6 7 8 could you provide the training curves for at least 1m training steps for the ablation study figure figure 6 7 8 2 how does the number of times steps in the paper it is 3 in the observation input affect the learning an ablation on different stacking length will better demonstrate the importance and effect of having a history of observations minor points in section 41 page 4 it should be figure 6 rather than figure 5 to support the claim that using a shared encoder gives better performance the structure of the writing is a bit off shouldnt section 6 be a subsection for section 54 reference 1 mishra nikhil et al a simple neural attentive metalearner arxiv preprint arxiv170703141 2017 2 manchin anthony ehsan abbasnejad and anton van den hengel reinforcement learning with attention that works a selfsupervised approach international conference on neural information processing springer cham 2019 3 fang kuan et al scene memory transformer for embodied agents in longhorizon tasks proceedings of the ieee conference on computer vision and pattern recognition 2019 4 sorokin ivan et al deep attention recurrent qnetwork arxiv preprint arxiv151201693 2015 5 mott alexander et al towards interpretable reinforcement learning using attention augmented agents advances in neural information processing systems 2019 6 kulkarni tejas d et al unsupervised learning of object keypoints for perception and control advances in neural information processing systems 2019 7 pinto lerrel et al asymmetric actor critic for imagebased robot learning arxiv preprint arxiv171006542 2017
### Summary:
|
this paper proposes an attentionendowed architecture for deep imagebased rl while some positive points were raised by the reviewers most comments were on the negative side the reviewers noted marginalincremental advances in terms of empirical results and low novelty and significance moreover the provided baselines seem weak because of this the present submission unfortunately does not meet the publication bar i recommend the authors take into account the constructive feedback from reviews and discussion and submit an improved version to another venue
|
[13414, 923, 5272, 22909, …] |
[1, 1, 1, …, 1] |
[13414, 923, 5272, 22909, …] |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the authors present their own take on a variational image compression model based on ball et al 2018 with some interesting extensionsmodifications the combination of an autoregressive and a hierarchical approach to define the prior as in klopp et al 2018 and minnen et al 2018 a simplified hyperprior replacing the flowbased density model with a simpler gaussian breaking the strict separation of stochastic variables denoted with a tilde and used during training and deterministic variables denoted with a hat used during evaluation and instead conditioning some of the distributions on the quantized variables directly during training in an effort to reduce potential training biases the paper is written in clear language and generally well presented with a great attention to detail it is unfortunate that as noted in the comments above two prior peerreviewed studies have already explored extensions of the prior by introducing an autoregressive component obtaining similar results as far as i can see this reduces the novelty of the present paper to the latter two modifications the bitfree vs bitconsuming terminology is simply another way of presenting the same concept in my opinion it is not sufficiently novel to consider acceptance of this work into the paper track at iclr the authors should consider to build on their work further and consider publication at a later time possibly highlighting the latter modifications however the paper would need to be rewritten with a different set of claims update incorporating the acpc decision to treat the paper as concurrent workdocsepupdate i have updated my review to mention that we should accept this work as being concurrent with the two papers that are discussed below original review this paper is very similar to two previously published papers as pointed by david minnen before the review period was opened learning a codespace predictor by exploiting intraimagedependencies klopp et al from bmvc 2018 and joint autoregressive and hierarchical priors for learned image compression minnen et al from nips 2018 the authors have already tried to address these similarities and have provided a list in their reply and my summary of the differences is as follows dear authors please comment if i am misrepresenting what you said 1 the context model is slightly different 2 parametric model for hyperprior vs nonparametric 3 this point is highly debatable to be considered as a difference because the distinction between using noisy outputs vs quantized outputs is a very tiny detail any any practitioner would probably try both and test which works better 4 this is not really a difference the fact that you provide details about the method should be a default i want all the papers i read to have enough details to be able to implement them 5 not relevant for the discussion here if the results were significantly different from previous work these differences would indeed be interesting to discuss but they didnt seem to change much vs previously published work if the other papers didnt exist this would be an excellent paper on its own however i think the overlap is definitely there and as you can see from the summary above its not really clear to me whether this should be an iclr paper or not i am on the fence because i would expect more from a paper to be accepted to this venue ie more than an incremental update to an existing set of models which have already been covered in two papers docsepsummary the paper is an improvement over balle et al 2018 for endtoend image compression 
using deep neural networks it relies on a generalized entropy model and some modifications in the training algorithm experimental results on the kodak photocd dataset show improvements over the bpg format in terms of the peak signaltonoise ratio psnr it is not said whether the code will be made available pros deep image compression is an active field of research of interest for iclr the paper is a step forward wrt balle et al 2018 the paper is well written experimental results are promising cons differences with balle et al 2018 should be emphasized it is not easy to see where the improvements come from from the new entropy model or from modifications in the training phase using discrete representations on the conditions i am surprised that there is no discussion on the choice of the hyperparameter lambda what are the optimal values in the experiments are the results varying a lot depending on the choice is there a strategy for an a priori choice also is one dataset enough to draw conclusions on the proposed method evaluation as a non expert in deep learning compression i have a positive opinion on the paper but the paper seems more a fine tuning of the method of balle et al 2018 therefore i am not convinced that the improvements are sufficiently innovative for publication at iclr despite the promising experimental results some details typos the p1 the their p2 and p10 while whereas p3 and and figure 2 p8 lower configurations higher configurations rd configurations
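for reference, the reviews above repeatedly mention the learned entropy model and the rate-distortion trade-off controlled by the hyperparameter lambda whose choice the last reviewer questions; the sketch below states that objective in the generic balle et al 2018 style formulation. this is an assumption about the general setup, not necessarily the submission's exact loss, and the symbols are generic rather than taken from the paper.

```latex
% Minimal sketch of the rate--distortion objective referred to above
% (generic Balle et al. 2018-style formulation; the submission's exact loss
% may differ, and all symbols here are generic).
\begin{align}
  \mathcal{L} &= R + \lambda D, \\
  R &= \mathbb{E}\!\left[-\log_2 p(\hat{y} \mid \hat{z}) - \log_2 p(\hat{z})\right], \\
  D &= \mathbb{E}\!\left[\lVert x - g_s(\hat{y}) \rVert_2^2\right].
\end{align}
% \hat{y}: quantized latent code, \hat{z}: quantized hyper-latent,
% g_s: synthesis transform (decoder), \lambda: the trade-off hyperparameter the
% reviewer asks about -- larger \lambda weights distortion more heavily, giving
% higher-rate, higher-PSNR operating points.
```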
### Summary:
this paper proposes an algorithm for endtoend image compression outperforming previously proposed annbased techniques and typical image compression standards like jpeg strengths all reviewers agreed that this is a well written paper with careful analysis and results weaknesses one of the points raised during the review process was that 2 very recent publications propose very similar algorithms since these works appeared very close to iclr paper submission deadline within 30 days the program committee decided to treat this as concurrent work the authors also clarified the differences and similarities with prior work and included additional experiments to clarify some of the concerns raised during the review process overall the paper is a solid contribution towards improving image compression and is therefore recommended to be accepted
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper looks at learning to represent edits for text revisions and code changes the main contributions are as follows they define a new task of representing and predicting textual and code changes they make available a new dataset of code changes text edit dataset was already available with labels of the type of change they try simple neural network models that show good performance in representing and predicting the changes the nlp community has recently defined the problem of predicting atomic edits for text data faraqui et al emnlp 2018 cited in the paper and that is the source of their wikipedia revision dataset although it is an interesting problem it is not immediately clear from the introduction of this paper what would be enabled by accurate prediction of atomic edits ie simple insertions and deletions and i hope the next version would elaborate on the motivation and significance for this new task the fixer dataset that they created is interesting those edits supposedly make the code better so modeling those edits could lead to better code having that as labeled data enables a clean and convincing evaluation task of predicting similar edits the paper focuses on the novelty of the task and the dataset so the models are simple variations of the existing bidirectional lstm and the gated graph neural network because much of the input text or code does not change the decoder gets to directly copy parts of the input for code data the ast is used instead of flat text of the code these small changes seem reasonable and work well for this problem evaluation is not easy for this task for the task of representing the edits they show visualizations of the clusters of similar edits and conduct a human evaluation to see how similar these edits actually are this human evaluation is not described in detail as they do not say how many people rated the similarity who they were how they were recruited how they were instructed and what the interrater agreement was the edit prediction evaluation is done well but it is not clear what it means when they say better prediction performance does not necessarily mean it generalizes better that may be true but then without another metric for better generalization one cannot say that better performance means worse generalization despite these minor issues the paper contributes significantly novel task dataset and results i believe it will lead to interesting future research in representing text and code changesdocsepthe authors state nicely and clearly the main contributions they see in their work intro last paragraph specifically the state the paper 1 present a new and important machine learning task 2 present a family of models that capture the structure of edits and compute efficient representations 3 create a new source code edit dataset 4 perform a set of experiments on the learned edit representations and present promising empirical evidence that the models succeed in capturing the semantics of edits we decided to organize this review by commenting on the abovestated contributions one at a time a new and important machine learning task regarding new task pro we are unfamiliar with past work which presents this precise task the task is new section 5 makes a good case for the novelty of this work con none regarding important task pro the authors motivate the task with tantalizing prospective applications automatically editing text and code eg for grammar clarity and style conceptualizing edits as nlp objects of interest that can be concretely 
represented clustered and used for prediction is an advance con many text editors office suites and coding ides already include features which automatically suggest or apply edits for grammar clarity and style the authors do not describe shortcomings in existing tools that might be better addressed using distributed representations of edits consequently the significance of the proposed contribution is unclear a family of models that capture the structure of edits and compute efficient representations regarding a family of models pro the family of models presented by the authors clearly generalizes such models may be utilized for computational experiments on datasets and edit types beyond those specifically utilized in this evaluation the authors apply wellutilized neural network architectures that may be trained and applied to large datasets the architecture of the neural editor permits evaluation of the degree to which the editor successfully predicts the correct edit given a preedit input and a known representation of a similar edit con the authors do not propose any scheme under which edit representations might be utilized for automatically editing text or code when an edit very similar to the desired edit is not already known and its representation available as input hence we find the authors do not sufficiently motivate the input scheme of their neural editor the input scheme of the neural editor makes trivial the case in which no edit is needed as the editor would learn during training that the output x should be the same as the input x when the representation of the zero edit is given as input while the authors discuss the importance of bottlenecking the edit encoder so that it does not simply learn to encode the desired output x they do not concretely demonstrate that the edit encoder has done otherwise in the final experiments related to that if the authors aimed to actually solve automated edits in textcode then it seems crucial their data contained negative examples ie segments which require no edits in such an evaluation one would test also when the algorithm introduces unnecessaryerroneous edits regarding capture structure of edits pro the authors present evidence that edit encoders tightly cluster relatively simple edits which involve adding or removing common tokens the authors present evidence that relatively simple edits completed automatically by a fixer often cluster together ie a known signal is retained in clustering the authors present evidence that the nearest neighbors of edits in an editrepresentation space often are semantically or structurally similar as judged by human annotators section 43 includes interesting observations comparing edit patterns better captured by the graph or seq edit encoders con the details of the human annotation tasks which generated the numerical results in tables 1 and 2 are unclear were unbiased third parties utilized were the edits stripped of their sourceencoder label when evaluated objectively what separates an unrelated from a similar edit and what separates a similar from a same edit did multiple human annotators undertake this task in parallel and what was their overall concordance eg intercoder reliability without concrete answers to these questions the validity and significance of the dcgndcg results reported in tables 1 and 2 are unclear it is not clear from the two examples given in table 1 that the three nearest neighbors embedded by the seq encoder are better ie overall more semantically andor syntactically similar to the 
example edit than those embedded by the bag of words model it is unclear which specific aspects of edit structure are better captured by the seq encoder than the bag of words model the overall structure of tables 1 and 2 is awkward with concrete numerical results dominated by a spatially large section containing a small number of examples create a new source code edit dataset pro the authors create a new source code edit dataset an important contribution to the study of this new task con minor is the provided dataset large enough to do more than simple experiments see note below on sample size present promising empirical evidence that the models succeed in capturing the semantics of edits pro the experiment results show how frequently the endtoend system successfully predicted the correct edit given a preedit input and a known representation of a similar edit gold standard accuracies of more than 70 and averaged transfer learning accuracies of more than 30 suggest that this system shows promise for capturing the semantics of edits con due to concerns expressed above about the model design and evaluation of the edit representations it remains unclear to what degree the models succeed in capturing the semantics of edits table 11 shows dramatic variation in success levels across fixer id in the transfer learning task yet the authors do not propose ways their endtoend system might be adjusted to address areas of weak performance the authors do not discuss the impact of training set size on their evaluation metrics the authors do not discuss the degree to which their model training task would scale to larger language datasets such as those needed for the motivating applications based on the authors response revisions and disucssions we have updated the review and the score docsepthe main contributions of the paper are an edit encoder model similar to guu et al 2017 httpaclweborganthologyq181031 a new dataset of treestructured source code edits and thorough and well thoughtout analysis of the edit encodings the paper is clearly written and provides clear support for each of their main claims i think this would be of interest to nlp researchers and others working on sequence and graphtransduction models but i think the authors could have gone further to demonstrate the robustness of their edit encodings and their applicability to other tasks this would also benefit greatly from a more direct comparison to guu et al 2017 which presents a very similar neural editor model some more specific points i really like the idea of transferring edits from one context to another the oneshot experiment is welldesigned however it would benefit from also having a lower bound to get a better sense of how good the encodings are if im reading it correctly the edit encoder has access to the full sequences x and x in addition to the alignment symbols i wonder if this hurts the quality of the representations since its possible albeit not efficient to memorize the output sequence x and decode it directly from the 512dimensional vector have you explored more constrained versions of the edit encoder such as the bagofedits from guu et al 2017 or alternate learning objectives to control for this the wikiatomicedits corpus has 137 million english insertions why did you subsample this to only 1m there is also a humanannotated subset of that you might use as evaluation data similar to the cfixers set on the human evaluation who were the annotators the categories similar edit and semantically or syntactically same edit seem to 
leave a lot to interpretation were more specific instructions given it also might be interesting if possible to separately classify syntactically similar and semantically similar edits on the automatic evaluation accuracy seems brittle for evaluating sequence output did you consider reporting bleu rouge or another soft sequence metric it would be worth citing existing literature on classification of wikipedia edits for example yang et al 2017 httpswwwcscmuedudiyiydocsemnlp17pdf an interesting experiment would be to correlate your edit encodings with their taxonomy
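to make the setup discussed in these reviews concrete, the sketch below shows a minimal edit encoder plus neural editor in the spirit of guu et al 2017 and the paper under review: an edit representation is computed from the pre-edit and post-edit sequences and is then fed, together with a new pre-edit input, to a decoder that predicts the edited output. all module, dimension, and variable names here are hypothetical illustrations, not the authors' architecture or code; the projection to a fixed edit vector stands in for the bottleneck that, as noted above, is meant to keep the encoder from simply memorizing the output.

```python
# Minimal sketch of an edit encoder + neural editor (hypothetical names, not the
# authors' code). The edit vector f(x, x') is bottlenecked through `edit_dim`.
import torch
import torch.nn as nn


class EditEncoder(nn.Module):
    """Encodes an aligned (x, x') pair into a single fixed-size edit vector."""

    def __init__(self, vocab_size=1000, emb=64, hidden=128, edit_dim=512):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb)
        self.rnn = nn.GRU(2 * emb, hidden, batch_first=True, bidirectional=True)
        self.proj = nn.Linear(2 * hidden, edit_dim)  # the bottleneck discussed above

    def forward(self, x, x_prime):
        # Assumes x and x_prime were pre-aligned to equal length (e.g. padded with
        # a keep/copy symbol); that alignment step is itself an assumption here.
        pair = torch.cat([self.embed(x), self.embed(x_prime)], dim=-1)
        _, h_n = self.rnn(pair)                      # h_n: (2, batch, hidden)
        return self.proj(torch.cat([h_n[0], h_n[1]], dim=-1))


class NeuralEditor(nn.Module):
    """Predicts the edited sequence from a pre-edit input and an edit vector."""

    def __init__(self, vocab_size=1000, emb=64, hidden=128, edit_dim=512):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb)
        self.rnn = nn.GRU(emb + edit_dim, hidden, batch_first=True)
        self.out = nn.Linear(hidden, vocab_size)

    def forward(self, x, edit_vec):
        e = edit_vec.unsqueeze(1).expand(-1, x.size(1), -1)
        h, _ = self.rnn(torch.cat([self.embed(x), e], dim=-1))
        return self.out(h)                           # per-position output logits


# One-shot transfer, as in the experiment the reviews discuss: encode the edit
# observed on (x_a, x'_a) and apply it to a different pre-edit input x_b.
encoder, editor = EditEncoder(), NeuralEditor()
x_a = torch.randint(0, 1000, (1, 12))
xp_a = torch.randint(0, 1000, (1, 12))
x_b = torch.randint(0, 1000, (1, 12))
logits_b = editor(x_b, encoder(x_a, xp_a))           # (1, 12, vocab_size)
```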
### Summary:
this paper investigates learning to represent edit operations for two domains text and source code the primary contributions of the paper are in the specific task formulation and the new dataset for source code edits the technical novelty is relatively weak pros the paper introduces a new dataset for source code edits cons reviewers raised various concerns about human evaluation and many other experimental details most of which the rebuttal has successfully addressed as a result r3 updated their score from 4 to 6 verdict possible weak accept none of the remaining issues after the rebuttal is a serious deal breaker eg task simplification by assuming the knowledge of when and where the edit must be applied simplifying the realworld application of the automatic edits however the overall impact and novelty of the paper is relatively weak
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
cyclesgym is a new benchmark environment for reinforcement learning rl applications to agriculture and farming the benchmark allows for different crops to be produced over a span of multiple years the environment can also be modified and customized to the needs of the experimenter with new crops locations etc cyclesgym is the first benchmark to evaluate and train rl algorithms on multiyear multicrop agriculture as such the benchmark could potentially have positive societal impacts by fostering the development of rl algorithms which are better suited for applications to agriculture the benchmark is useful to the community studying applications of rl to agriculture current benchmarks focus on singleyear agriculture cyclesgym allows for multiyear multicrop environments where the decisions of previous years have longlasting effects for the agronomist this environment could provide a more realistic simulation of the real world than current benchmarks the experiments in the paper are intended to show the utility of cyclesgym and how this environment is useful over existing environment these experiments should have been conducted with more rigour experiment 1 attempts to outline the benefits of cyclesgym compared to current benchmarks in terms of the multiyear feature of cyclesgym ppo is trained for both a 5 year ppo5 and 1 year ppo1 period and then evaluated on a 5 year period if the performance of ppo5 was significantly higher than ppo1 then this would provide evidence that cyclesgym is useful to study crop management over multiple years or that the multiyear feature of cyclesgym results in a harder problem unfortunately the benefit of cyclesgym over current benchmarks in this experiment is unclear in table 1 mean performance with minmax as uncertainty in performance is reported the conclusion is drawn that ppo1s performance deteriorates with more years nevertheless this conclusion is unclear since the reported uncertainty overlaps in all cases even if the paper reported confidence intervals i am uncertain that ppo5 would statistically significantly outperform ppo1 bringing the utility of cyclesgym into question at least with respect to the multiyear feature with only 5 random seeds used the estimated mean performance is most likely inaccurate it could be the case that ppo1 saw a number of lowperforming runs but that with more runs the mean performance of ppo1 and ppo5 would be nearly the same this experiment would be improved by increasing the number of runs used and reporting statistical measures of confidence furthermore the results for a number of baselines do not include any uncertainty measurements for both experiment 1 and 2 were these baselines only run using a single seed are both these baselines and the experiments with cyclesgym deterministic such that a single experiment was needed ppo in these experiments does not take into account the cost of supplies operating equipment etc do the baselines takes these factors into account if so this could give ppo an advantage in experiment 1 generalization between locations is performed by training with the weather of one location and testing with the weather of another location is weather the only differences in locations in the benchmark if so why refer to this setting as locations and not weather or climate location brings with it many connotations other than weather eg soil quality prices of supplies air quality average uv index etc docsepin this manuscript turchetta et al propose a new environment for crop growth models cgm simulation 
where agronomists and researchers can try different reinforcement learning rl or ml strategies the simulator is as described a kind of wrapper around the software cycle one of many cgm that can be used for simulation of crop growth and crop rotation this is not a novel subject since gcm and decision support systems dss like cropgro hoogenborm et al 1991 stics dssat dndc and holos are used by different state agencies eg usda inrae to predict outcomes for different crops during the growing season one key limitation is often the real dataset eg weather information real crop yield financial data that needs to be used to confirm the validity of the different models as such different models are normally run concurrently since they all have some biais and this is probably what is the biggest drawback of this study and proposed benchmark 1 the support of only one cgm 2 the use of unrealistic agronomic practices for their simulations eg multiple n applications during a season only some crop rotation 3 looking at the code on the github it is unclear how one could change the simulation parameters to fit its need eg changing the weather crop using the provided documentation nevertheless a more open system with easier configuration options eg weather location crop could provide an interesting benchmarks environment for rl in agriculture 1 application of an openai gym to gcm could allow new research opportunities in agronomy 2 multipleyear simulations and possibility of adding policies 3 usage of the openai gym allows future optimisations and new rl models to be tested 1 limited documentation for the benchmark and how the planted superficies ha are handled in the simulations 2 cyclegym architecture should be more defined in section 5 with some examples 3 limited geographical locations us new holland and rock springs and rotation for the simulation not representative of the whole world and agronomic practices 4 the research question is not well defined in the introduction there are mentions of the ukraine war pest control co2 emission and n pollution while the subject of crop rotation and management strategies central to this benchmark are not mentioned and explained 5 no real description of each years weather patterns which can influence the results presented in the benchmark 6 no mention of the openai gym in the document as being the rl engine only in the github 7 the rl model policies used by the authors are not well described maybe a schematic table view in terms of the goal and rewards this information is covered in part in section b but should be in the main part of the manuscript 8 new simulations and results in table 5 annexe rlfw and nonadaptivefw not explained 9 rl reward is based on profit ky ha however raw yield should be used since cornsoybean prices also fluctuate on a yearlybasis 10 the title of this manuscript is longterm crop management strategies however this is not well documented in this manuscript while the manuscript is oriented in computer science this aspect should be discussed in greater length 11 some of the figure captions and table descriptions needs more details 12 conclusiondiscussion section is too short docsepthis paper introduces cyclesgym an rl environment based on the multiyear multicrop crop grow model cgm cycles for open field agriculture on cyclesgym the authors benchmark two types of crop management practice nitrogen fertilizer n application and crop rotation they perform comparative studies of rl policies vs baselines representing current practices which are 
evaluated with traintest splits in time location horizon the key contributions of this work include 1 it is the first cropmanagement rl environment and benchmark that facilitate the learning of multiyear strategies with complex action spaces and multiple crops 2 for rl researchers this is a novel realistic benchmark that offers the possibility to investigate relevant issues arising in realworld rl while tackling a pressing societal problem 3 for agronomists and environmental scientists cyclesgym allows users to leverage recent advances in rl to improve management practices in agricultural systems 1 the paper is well written and motivated comprehensive review is provided for the related work in machine learning in agriculture crop growth models and crop management with rl 2 the targeted problem of rl applications in crop managementagriculture is understudied and has great potential for societal impact 3 the discussion related to the application of rl in crop managementagriculture is interesting and enlightening such as sec 42 which could be helpful for researchers in both fields 4 the experiments are interesting which encompass relevant crop management strategies under some realistic senarios partial observability temporallocation shift between traintest data 1 limited contributions in datasetsenvironments cyclesgym seems to be a direct wrapper around the multiyear multicrop cgm cycles the contributions to dataenvironment and crop growth modeling are limited 2 experiments could be stronger if datasetsenvironments arent the focus as a benchmark paper on rl application in agriculture this work only evaluates some variants of a single rl algorithm ppo adding more rl or planningoptimization algorithms would make the paper stronger such as genetic algorithms 1 and sac 2 metarl 3 would also help since cyclesgym can in principle create many distinct environmentstasks also it would be nice to provide empirical studies and implementations of the rl research questions discussed in 42 more finegrained control in action space and time scale as in 1 would be interesting to see too moreover i think it would be very helpful to benchmark an rl oracle ie the theoretical upper bound on each set of the experiments an important point of benchmark is to create a task challenging enough to allow for further algorithmic innovation therefore even though rl algorithms like ppo has shown marginal improvement over common strategies its unclear how much room of improvement is left for further development such experiments would be valuable to the machine learning community 3 better accessibility to code and datasets the current access to the cyclesgym repository is hindered by emaildevice verification making it hard to assess the implementation quality and reproducibility of the work 1 xiaoyan cao yao yao lanqing li wanpeng zhang zhicheng an zhong zhang shihui guo li xiao xiaoyu cao and dijun luo igrow a smart agriculture solution to autonomous greenhouse controlassociation for the advancement of artificial intelligence aaai 2022 2 tuomas haarnoja aurick zhou pieter abbeel and sergey levine soft actorcritic offpolicy maximum entropy deep reinforcement learning with a stochastic actor in international conference on machine learning pages 18611870 pmlr 2018 3 duan yan et al rl 2 fast reinforcement learning via slow reinforcement learning arxiv preprint arxiv161102779 2016 docsepthe proposed benchmark improves on previous rl environments for an important domain with widespread consequences the provided experiments and 
evaluation allow the authors to identify the more pressing issues of rl algorithm design when applied to crop management s1 the benchmark includes two important problems in crop management fertilization and crop rotation as well as longterm experiments compared to prior environments suggested for rl application to crop management these allow for a more thorough analysis s2 the baselines for experiments chosen are appropriate and allow for strong evaluation of new rl methods w1 while cycles appears to be a reasonable choice of underlying model it is under active development and supported by research it would be helpful if some justification for the choice was given under related work discussing cgms it appears the choice was arbitrary with other models potentially a better choice as they allow for ground water models and co2 concentration modeling are these other models less suited to modelling longterm effects than cycles docsepthe authors construct an rl environment to enable evaluation of multiyear management decisions on crop performance this environment is based on the cycles agronomic simulator a key challenge all of these agronomic simulatorrl approaches have is that the simulators are not reliable thats not the fault of these authors however because the simulators are so untrustworthy in many cases it reduces a problem which appears to have very meaningful societal impact to a toy problem useful only for studying rl techniques there is still some value to the rl community there it is another environment to test algorithms there however the impact of the work is much less than it would be if these simulators were better the simtoreal domain gap exists everywhere for rl but it is particularly bad with many of these crop simulators either the realworld impact is limited or the authors must address how this gap is to be addressed in order to bring these results into the real world i appreciate the need to take small steps in solving this problem but i think the gap and therefore the impact needs to be realistically addressed since no agronomic insight can be drawn from these models due to the size of the domain gap the only contribution is to the rl community and its not clear that this environment opens up new avenues of exploration improving crop forecasting and understanding of the impact of crop management decisions is an important topic the environment seems to be well architected to allow for relatively easy simulation analysis is performed looking at a very long timehorizon which demonstrates the multiyear capacity of the framework results are given with error bars over multiple runs as a new rl environment this appears to be well architected the idea of creating an rl environment is not new it was done by many others as the authors cite so the main contribution is around extending this to use a simulator which allows for multiyear modeling while including multiyear modeling is very important from an agronomic standpoint i dont know that using a different simulator makes a meaningful enough contribution within the rl space the authors conduct several experiments to show that a policy is learned over many years but the policy isnt meaningful or significantly better a fundamental approach with all of these simulators is that they are woefully uncalibrated to the real world even at the end of the season when all of the actual information for that season is input they can easily be off 30 until that is addressed its unclear that these simulators have any real value the authors 
address that there is a simtoreal gap that they are not addressing however without speaking to just how massive this gap is gives the appearance that this is just the standard rl challenge and that with some effort can be overcome to make real world impact unfortunately the situation is far more complicated than that and until significant work is done these agronomic simulators are no real than training rl to play a video game without this impact its unclear that this simulator is needed to address any of the rl research questions on page 5 ie those questions can be explored with other environments for example in the analysis around n fertilization it is unclear if the learned policy is realistic farmers will only spray so many times it takes effort equipment must be moved around a season additionally the growing stage of the crop dictates how the n is applied and therefore the cost type of n and type of equipment required while these seem like minor details they are fundamental considerations which the farmer addresses therefore if the policy does not take this into account it is again reduced to a toy problem with minimal real world usability nitrogen application cannot be optimized without weather and water considerations doing n alone is fine to demonstrate the code runs properly ie it serves as a unit test in engineering language but its not a realistic result to demonstrate the value of the simulator multiple variables need to be allowed to vary not held fixed yes single variable experiments are useful to ensure it has been coded properly but to demonstrate it has value to the community realistic experiments must be conducted similarly a simple baseline approach is fine to show that it works but the point of this simulator even from just the rl standpoint is to address some of questions the authors articulate i think the authors need to take a step not solve it just one step above baseline in one of those directions again to show what can be done with this framework
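To make the kind of multi-year rollout the reviews describe concrete, here is a minimal sketch of stepping a gym-style crop-management environment through a full episode and scoring it on cumulative profit; the environment id, the observation layout, the fixed-nitrogen baseline, and the reward convention are illustrative assumptions and are not taken from the actual cyclesgym code.

```python
import gymnasium as gym   # gym-style API; the benchmark itself may target the older `gym` package
import numpy as np

# Hypothetical environment id and configuration; the real cyclesgym registration names may differ.
env = gym.make("CyclesGym-CornFertilization-v0")

def fixed_nitrogen_policy(obs):
    """Placeholder baseline: apply one fertilizer dose in a fixed week, otherwise do nothing."""
    week_of_year = int(obs[0])          # assumed observation layout
    return 1 if week_of_year == 20 else 0

obs, info = env.reset(seed=0)
total_profit, done = 0.0, False
while not done:                          # one episode spans the full (possibly multi-year) horizon
    action = fixed_nitrogen_policy(obs)
    obs, reward, terminated, truncated, info = env.step(action)
    total_profit += reward               # reward assumed to be profit: yield revenue minus input costs
    done = terminated or truncated
print(f"episode return (profit proxy): {total_profit:.2f}")
```

A learned controller such as PPO would simply replace `fixed_nitrogen_policy`, and the one-year versus five-year comparison discussed in the reviews corresponds to changing the episode horizon in the environment configuration.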
### Summary:
this proposal introduces cyclesgym the first reinforcement learning rl benchmark targeted at long horizon decision making in agriculture crucially while prior work addresses singleyear decision making cyclesgym captures the long term effects that one years crop has on future generations the benchmark is clearly highly relevant and opens up a new frontier for rl researchers making it a valuable contribution to the field furthermore the benchmark does a good job of highlighting interesting opportunities for rl method development such as costly information gathering and evaluating current algorithms compared to baselines there was an active discussion between the reviewers and authors of the benchmark which resolved the majority of issues raised in the initial reviews as a result there is broad support across the reviewers for the paper a lingering concern is the simtoreal gap which is mentioned at a number of places in the paper but could be emphasised more lastly the paper is well written and the evaluation is sound i believe this benchmark will be welcomed by the community but i recommend that the authors address the concerns regarding the writing raised by reviewer meyg in particular regarding the utility for the agriculture community in general i do not believe that issues which can be addressed in writing should be a reason to reject but those concerns should be addressed for the final version
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper investigate how to design simulation environments so the the agent trained with them can master social rules cons 1 the paper is well written and easy to read and understand thanks 2 the experiments are solid and well defined my major concern of this paper are 1 the authors seems to only considered the noise of sensors and the number of agents and these two factors happened to induce social behaviors like following lanes and stopping at traffic signals in other words i am not quite convinced that with all vehicles being automated sensor noise will cause them to formulate rules that what human drivers follow today since human driving interactions are complex i do not think that sensor noise would be enough to induce them 2 i would be happy to see how this configuration of simulation environment compares with reward guided social behaviors for example we can design a reward to encourage agents follow right of way i think this way might be more direct and powerful 2 the number of agents seems to be too small and may affect the formulation of road social rules 3 it seems like the agents did not take traffic signal status as input for the action selection then how did they formulate the signal control rules 4 figures 2 and 3 needs better explanations for readers to understand overall i think this paper needs better formulation of the logics and also a deep investigation of how the simulation variations lead to those behaviors docsepthe paper proposes to learn traffic rules eg traffic lights speed limits via multiagent rl marl from observations rather than hardcoding the rules into the algorithms to this end the authors claims contributions in defining a multiagent driving environment where each agent has incomplete observation noisy lidar and is rewarded for reaching a destination quickly without colliding experiments show that road rules can be learned ablation on the choice of the mdp and insights that perception noise and spatial density of agents are important to successfully learn the environment authors promise to release a suite of 2d driving environments for future marl research in selfdriving i must admit that i am mainly a computer vision person and i only have limited experience with rl or marl however i hope that my assessment below is still better than an educated guess i apologize in advance if there is any obvious misunderstanding strengths novelty in problem statement and at a high level the problem statement of learning hard traffic rules via observing logs seems new the application of marl to the problem is new too afaik and as the authors stated the majority of multiagent behavior works have been using imitation learning maybe consider citing 1 line of work as well which deals with multiagent imitation learning the proposed ppobased method seems a good alternative extensive experiments there are 7 small but specific tasks such as traffic lights emergence of lanes etc where the authors provided evidences to the claims that perception noise and spatial density of agents are crucial for the methods success 1 multiagent generative adversarial imitation learning weaknesses no comparison to prior work the authors acknowledged imitation learningbased il methods but did any quantitative comparison for them i dont see any evidence why solving the proposed problem using marl is better than il so why are we doing marl over il usefulness for the traffic light use case the handling of red green lights is essentially learned by not driving into other cars which obey the traffic 
rules with other words if there are no other agents to demonstrate how to behave the agent will always prefer to run over red this raises the question of the usefulness of the system lack of novelty in the method itself the paper seems to be using an offtheshelf ppo the centralized critic singlestep ppo and bilevel ppo do not provide sufficient novelty in terms of methods the centralized critic can be seen as a straightforward extension of ppo for the selfdriving application the bilevel extension is a straightforward way to optimize two objectives at the same time which is quite common even in the ages of convex optimization httpsenwikipediaorgwikibiconvexoptimization i think that there is novelty in the method but i am not sure if its enough for this venue dataset nuscenes dataset is a perception dataset and probably does not contain many interesting interactive scenarios still the method does not seem to perform very well in the experiments eg according to fig 2 many cars are still accelerating despite the red light details all axes names and titles in all diagrams are way too small no way people can decipher them if printed on paper no legends in the diagrams barely any caption please make diagrams selfcontained if possible conclusion this works provides an interesting new problem statement and explores the area of using marl for autonomous driving my main concerns regarding this paper are the lack of comparison to prior works in the experiments and the work provides any usefulness and improvements over current autonomous systems which are mostly based on imitation learningdocsepsummary this paper proposes a bilevel marl method that can learn road rules and conventions implicitly without hard coding it is quite interesting to see a simple and straightforward idea that is effective in this task setting however this paper needs to be further polished in terms of its delivery completeness and evaluation methodology how is the noise introduced to the mimicked lidar perception results in spline model does the trajectory shape only depends on the initial state without considering the afterward interaction the clarity and completeness of writing could be significantly improved the equations terms and the algorithm in section 4 need sufficient explanations eg what are vftargett h k1 k2 n what is the complete reward list experiments it seems the route is relatively short and the driving scenarios used for different tasks are distinct i was wondering whether the generalization of the proposed method and the learned rules could be validated no baselines have been evaluated or compared which experimental results are or the fixed track model what are the differences between the two models in the results what are the reward values during the training phase the captions and axis labels are difficult to read what do the colors mean in fig 2 my rating has been updated after the rebuttaldocsepsummary the output of the work is an mdp model that is capable of encoding complex traffic rules including traffic signals lanes right of way fifo etc various different traffic environments such as intersections highways nuscenes are considered the mdp that results from this work is very useful for future research currently there are very few simulators that encode complex traffic rules or provide the flexibility to perform research therefore the possibility of new simulators resulting from this mdp is exciting and useful to encourage and promote research in autonomous driving especially in dense and heterogeneous 
environments i am glad that the authors have provided the source code with extensive documentation and instructions on how to run the code and reproduce the results unless there is some major flaw with this paper that i may have missed there is no reason to reject this paper
### Summary:
|
this paper shows how road rules eg implicit designation of fast lanes on a highway naturally emerge in a multiagent mdp the paper shows that interesting traffic rules do emerge and it presents a detailed analysis of the factors that lead to this emergence the paper is complemented by documented source code with the aim to encourage the community to further work on the topic the reviewers agreed that this is original work and appreciated its simplicity two concerns that were recurrently voiced were that 1 there is no algorithmic innovation and 2 there is no comparison to baseline models or more generally a better placement in the context of existing literature the authors provided a detailed and to my eyes convincing response with respect to the two concerns above i would go as far as saying that 1 no algorithmic innovation is a feature not a bug the paper is interesting exactly because it studies emergent phenomena after framing multiagent driving as a standard rl problem concerning 2 lack of baselines it seems to me somewhat besides the point the paper is not claiming state of the art on some benchmark for a new algorithm but studying how certain implicit rules emerge in a given setup in this sense as the authors point out rather than looking at alternative baselines it is informative to look at which aspects of the setup contribute to rule emergence which is what the paper does although i realize that in proposing this i am going beyond the reviewers ratings i found this to be an original and exciting paper that i would strongly like to see accepted at the conference
|
[
30003,
310,
1677,
2278,
273,
247,
2561,
2929,
432,
260,
2369,
1793,
6698,
15,
7764,
3630,
247,
6010,
253,
2278,
15,
187,
4118,
8439,
27,
187,
2520,
2929,
7409,
849,
281,
2216,
9864,
12620,
594,
253,
253,
5570,
10166,
342,
731,
476,
6303,
2675,
4803,
772,
337,
253,
2929,
310,
973,
3542,
285,
3477,
281,
1239,
285,
2096,
6701,
374,
253,
4679,
403,
4891,
285,
973,
2931,
50275,
2577,
2201,
4468,
273,
436,
2929,
403,
337,
253,
4477,
3133,
281,
760,
2783,
253,
6046,
273,
13479,
285,
253,
1180,
273,
6083,
285,
841,
767,
2616,
4592,
281,
10808,
2675,
13576,
751,
1563,
24914,
285,
15910,
387,
7137,
6298,
275,
643,
3000,
891,
717,
417,
3240,
13762,
326,
342,
512,
9411,
1146,
16644,
8468,
6046,
588,
2847,
731,
281,
36803,
4803,
326,
752,
1966,
10865,
956,
3063,
1580,
1966,
6276,
6355,
403,
2570,
891,
513,
417,
1158,
326,
8468,
6046,
651,
320,
2217,
281,
10808,
731,
50276,
19,
891,
651,
320,
5211,
281,
923,
849,
436,
6661,
273,
9864,
3126,
26662,
342,
10921,
18107,
2675,
13576,
323,
1650,
359,
476,
2216,
247,
10921,
281,
11907,
6083,
956,
987,
273,
1039,
891,
1158,
436,
1039,
1537,
320,
625,
1480,
285,
6422,
50276,
19,
253,
1180,
273,
6083,
3133,
281,
320,
1512,
1355,
285,
778,
2818,
253,
15895,
273,
3971,
2675,
4803,
495,
352,
3133,
751,
253,
6083,
858,
417,
1379,
7137,
2625,
3708,
347,
3280,
323,
253,
2250,
5438,
840,
849,
858,
597,
36803,
253,
2625,
1453,
4803,
50276,
21,
8442,
374,
285,
495,
3198,
1805,
22909,
323,
10668,
281,
2096,
50275,
1189,
455,
891,
1158,
436,
2929,
3198,
1805,
15895,
273,
253,
2412,
982,
285,
671,
247,
3676,
5839,
273,
849,
253,
9864,
10575,
1421,
281,
1110,
13576,
5474,
339,
431,
248,
2929,
29328,
281,
3037,
7137,
4803,
24088,
7137,
10654,
3885,
7787,
3066,
4471,
12788,
391,
77,
2304,
77,
432,
7313,
2581,
685,
1892,
22257,
253,
4803,
715,
253,
11333,
281,
436,
990,
253,
4477,
3916,
9021,
275,
50275,
1545,
1699,
247,
4471,
12788,
6276,
3126,
835,
1016,
5570,
556,
18464,
8310,
27620,
16486,
274,
285,
310,
33302,
323,
10922,
247,
12095,
4541,
1293,
3007,
2821,
4679,
921,
326,
3971,
4803,
476,
320,
6311,
50275,
1752,
318,
327,
253,
4327,
273,
253,
278,
12132,
285,
16039,
326,
13071,
6046,
285,
8820,
4038,
273,
6083,
403,
1774,
281,
8379,
3037,
253,
3126,
50275,
43355,
9023,
281,
3727,
247,
18880,
273,
374,
69,
6276,
12620,
323,
2852,
2304,
77,
2561,
275,
1881,
41571,
50276,
74,
1364,
11476,
326,
891,
717,
7194,
247,
4382,
8113,
1436,
285,
891,
760,
452,
3710,
2793,
342,
391,
77,
390,
2304,
77,
2299,
891,
3524,
326,
619,
6803,
2708,
310,
1335,
1805,
685,
271,
19149,
5476,
891,
26012,
275,
7170,
604,
627,
310,
667,
4755,
40663,
50276,
296,
3755,
20556,
50275,
2369,
652,
555,
275,
1895,
3908,
285,
387,
247,
1029,
1268,
253,
1895,
3908,
273,
4715,
1892,
7137,
4803,
3066,
20764,
20131,
3133,
747,
253,
2898,
273,
2304,
77,
281,
253,
1895,
310,
747,
1512,
6706,
66,
1479,
285,
347,
253,
4477,
4767,
253,
5020,
273,
4471,
12788,
3879,
2987,
452,
644,
970,
45738,
4715,
5046,
1908,
19936,
337,
1386,
273,
789,
347,
973,
534,
13330,
342,
4471,
12788,
45738,
4715,
253,
4081,
7266,
706,
833,
1332,
3133,
247,
1175,
5795,
50275,
2068,
3134,
4679,
627,
403,
818,
1355,
533,
2173,
8892,
824,
347,
7137,
10654,
21313,
273,
24914,
3966,
835,
253,
4477,
2530,
20456,
2979,
281,
253,
3916,
326,
13071,
6046,
285,
8820,
4038,
273,
6083,
403,
9560,
323,
253,
3082,
2323,
50276,
18,
4471,
12788,
1006,
800,
48960,
45738,
4715,
50276,
20881,
1255,
265,
50275,
2369,
5301,
281,
2720,
789,
253,
4477,
14969,
45738,
4715,
3169,
4164,
3082,
533,
858,
667,
11745,
5301,
323,
731,
891,
13414,
923,
667,
1941,
2139,
16161,
253,
4081,
1895,
970,
2304,
77,
310,
1805,
685,
4164,
594,
2139,
403,
359,
2509,
2304,
77,
689,
4164,
50275,
316,
4085,
1255,
323,
253,
7137,
1708,
897,
1083,
253,
10885,
273,
2502,
50276,
11707,
10654,
310,
9093,
6311,
407,
417,
6276,
715,
643,
8458,
534,
20090,
253,
7137,
4803,
342,
643,
3000,
604,
627,
403,
642,
643,
6083,
281,
7568,
849,
281,
21319,
253,
5570,
588,
1900,
4510,
281,
1408,
689,
2502,
436,
16540,
253,
1953,
273,
253,
31471,
273,
253,
985,
50275,
77,
471,
273,
38135,
275,
253,
1332,
3139,
253,
2929,
3133,
281,
320,
970,
271,
273,
649,
1041,
48164,
268,
5367,
253,
36409,
7291,
1625,
46701,
554,
268,
5367,
285,
26413,
652,
268,
5367,
513,
417,
2085,
4209,
38135,
275,
2426,
273,
3082,
253,
36409,
7291,
476,
320,
2326,
347,
247,
15246,
6880,
273,
268,
5367,
323,
253,
1881,
41571,
2898,
253,
26413,
652,
6880,
310,
247,
15246,
1039,
281,
22318,
767,
16566,
387,
253,
1072,
673,
534,
310,
3240,
1846,
1014,
275,
253,
11880,
273,
17133,
13757,
5987,
257,
25842,
2061,
44874,
487,
3557,
30275,
2178,
27996,
891,
1158,
326,
627,
310,
38135,
275,
253,
1332,
533,
891,
717,
417,
2119,
604,
697,
2217,
323,
436,
18767,
50275,
42429,
295,
19387,
24453,
10895,
310,
247,
13071,
10895,
285,
3164,
1057,
417,
3831,
1142,
4722,
18366,
15216,
1335,
253,
1332,
1057,
417,
1646,
281,
1347,
1077,
973,
275,
253,
4679,
24088,
2556,
281,
3036,
374,
1142,
8458,
403,
1335,
38757,
5747,
253,
2502,
1708,
50276,
23454,
50275,
455,
24039,
4454,
285,
14505,
275,
512,
21302,
403,
1039,
1512,
1355,
642,
1039,
952,
476,
1086,
6894,
731,
604,
11462,
327,
2929,
50275,
2369,
38209,
275,
253,
21302,
12345,
667,
11743,
4496,
1056,
21302,
1881,
41010,
604,
1896,
50276,
585,
3444,
50276,
2520,
2987,
3400,
271,
4722,
747,
1895,
3908,
285,
33826,
253,
2170,
273,
970,
2304,
77,
323,
26279,
6276,
619,
2022,
7350,
5001,
436,
2929,
403,
253,
3480,
273,
5301,
281,
2720,
2987,
275,
253,
4679,
285,
253,
789,
3400,
667,
31471,
285,
11701,
689,
1655,
26279,
2718,
534,
403,
6571,
1754,
327,
45738,
4715,
7152,
339,
793,
360,
3454,
436,
2929,
29328,
247,
26413,
652,
2304,
77,
1332,
326,
476,
3037,
3971,
4803,
285,
29793,
29688,
1293,
1892,
12425,
352,
310,
3240,
4722,
281,
923,
247,
2969,
285,
15246,
2934,
326,
310,
3576,
275,
436,
4836,
4758,
2299,
436,
2929,
3198,
281,
320,
2007,
29422,
275,
2426,
273,
697,
6742,
29867,
285,
7103,
50276,
9349,
1497,
849,
310,
253,
6046,
5611,
281,
253,
13892,
14050,
16486,
274,
13071,
1543,
275,
6821,
460,
1566,
1057,
253,
18974,
5281,
760,
7024,
327,
253,
3302,
1375,
1293,
7296,
253,
28279,
5016,
253,
19843,
285,
29867,
273,
4028,
812,
320,
3012,
5520,
253,
7424,
2426,
285,
253,
5933,
275,
2593,
577,
878,
4209,
22909,
24088,
752,
403,
362,
649,
1816,
85,
288,
465,
18,
465,
19,
295,
752,
310,
253,
3426,
10921,
1618,
50276,
16217,
3825,
352,
3133,
253,
7622,
310,
4942,
2159,
285,
253,
6276,
15216,
908,
323,
1027,
8892,
403,
5799,
891,
369,
12371,
1880,
253,
26647,
273,
253,
4081,
1332,
285,
253,
6311,
4803,
812,
320,
17618,
50276,
2369,
1666,
25379,
452,
644,
6760,
390,
2429,
534,
5661,
1543,
403,
390,
253,
4229,
3540,
1566,
752,
403,
253,
3910,
875,
253,
767,
3210,
275,
253,
1543,
752,
403,
253,
10921,
2193,
1309,
253,
3733,
3408,
50276,
783,
3403,
621,
285,
7844,
13301,
403,
2834,
281,
1239,
752,
513,
253,
9830,
1599,
275,
3036,
374,
50274,
2577,
13716,
556,
644,
9300,
846,
253,
30080,
22559,
7152,
339,
793,
360,
3454,
253,
3453,
273,
253,
789,
310,
271,
278,
12132,
1566,
326,
310,
7032,
273,
9706,
2570,
7137,
4803,
1690,
7137,
6298,
24914,
987,
273,
1039,
5813,
80,
3966,
2710,
1027,
7137,
12620,
824,
347,
42320,
40459,
295,
19387,
24453,
403,
2783,
50276,
783,
278,
12132,
326,
1543,
432,
436,
789,
310,
1077,
4217,
323,
2852,
2561,
4390,
627,
403,
1077,
1643,
948,
28457,
326,
22573,
2570,
7137,
4803,
390,
2085,
253,
15840,
281,
1347,
2561,
3103,
253,
6387,
273,
747,
948,
28457,
4795,
432,
436,
278,
12132,
310,
12302,
285,
4217,
281,
11907,
285,
8591,
2561,
275,
26279,
6276,
3340,
275,
14086,
285,
22766,
12620,
50276,
74,
717,
9995,
326,
253,
4477,
452,
2530,
253,
2603,
2127,
342,
9470,
10097,
285,
7997,
327,
849,
281,
1408,
253,
2127,
285,
18302,
253,
1543,
50275,
28558,
627,
310,
690,
2201,
19652,
342,
436,
2929,
326,
891,
778,
452,
9829,
627,
310,
642,
1921,
281,
12009,
436,
2929,
187,
187,
4118,
18435,
27,
2520,
2929,
2722,
849,
3971,
4803,
24088,
15424,
25344,
273,
3809,
24914,
327,
247,
17657,
10748,
20177,
275,
247,
4471,
12788,
278,
12132,
253,
2929,
2722,
326,
4722,
7137,
4803,
513,
20177,
285,
352,
10262,
247,
7000,
1783,
273,
253,
2616,
326,
1421,
281,
436,
21313,
253,
2929,
310,
48912,
407,
14290,
2603,
2127,
342,
253,
4388,
281,
11907,
253,
3114,
281,
2007,
789,
327,
253,
9400,
50276,
783,
30628,
5821,
326,
436,
310,
3236,
789,
285,
14109,
697,
17647,
767,
7350,
326,
497,
18902,
314,
39544,
497,
326,
337,
627,
310,
642,
5933,
280,
15832,
285,
374,
627,
310,
642,
5301,
281,
8245,
3210,
390,
625,
3839,
247,
1805,
14663,
275,
253,
3634,
273,
5368,
6239,
50276,
783,
4477,
2530,
247,
7000,
285,
281,
619,
2927,
21414,
2380,
342,
1675,
281,
253,
767,
7350,
1840,
891,
651,
564,
347,
2080,
347,
3981,
326,
337,
642,
5933,
280,
15832,
310,
247,
4735,
417,
247,
7505,
253,
2929,
310,
4722,
4555,
984,
352,
2175,
47006,
16958,
846,
39926,
4471,
12788,
6276,
347,
247,
2629,
391,
77,
1895,
8664,
374,
3480,
273,
1666,
25379,
352,
3133,
281,
479,
8489,
16280,
253,
1127,
253,
2929,
310,
417,
15081,
1375,
273,
253,
1445,
327,
690,
22791,
323,
247,
747,
5933,
533,
12392,
849,
2176,
15424,
4803,
20177,
275,
247,
1677,
9978,
275,
436,
3282,
347,
253,
4477,
1127,
562,
2581,
685,
2819,
387,
5795,
1666,
25379,
352,
310,
27096,
281,
1007,
387,
534,
7794,
273,
253,
9978,
8162,
281,
4086,
21313,
534,
310,
752,
253,
2929,
1057,
50276,
20261,
891,
8968,
326,
275,
36636,
436,
891,
717,
1469,
4457,
253,
30628,
17503,
891,
1119,
436,
281,
320,
271,
3236,
285,
12302,
2929,
326,
891,
651,
7052,
751,
281,
923,
7607,
387,
253,
8059,
209
] |
[
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1
] |
[
30003,
310,
1677,
2278,
273,
247,
2561,
2929,
432,
260,
2369,
1793,
6698,
15,
7764,
3630,
247,
6010,
253,
2278,
15,
187,
4118,
8439,
27,
187,
2520,
2929,
7409,
849,
281,
2216,
9864,
12620,
594,
253,
253,
5570,
10166,
342,
731,
476,
6303,
2675,
4803,
772,
337,
253,
2929,
310,
973,
3542,
285,
3477,
281,
1239,
285,
2096,
6701,
374,
253,
4679,
403,
4891,
285,
973,
2931,
50275,
2577,
2201,
4468,
273,
436,
2929,
403,
337,
253,
4477,
3133,
281,
760,
2783,
253,
6046,
273,
13479,
285,
253,
1180,
273,
6083,
285,
841,
767,
2616,
4592,
281,
10808,
2675,
13576,
751,
1563,
24914,
285,
15910,
387,
7137,
6298,
275,
643,
3000,
891,
717,
417,
3240,
13762,
326,
342,
512,
9411,
1146,
16644,
8468,
6046,
588,
2847,
731,
281,
36803,
4803,
326,
752,
1966,
10865,
956,
3063,
1580,
1966,
6276,
6355,
403,
2570,
891,
513,
417,
1158,
326,
8468,
6046,
651,
320,
2217,
281,
10808,
731,
50276,
19,
891,
651,
320,
5211,
281,
923,
849,
436,
6661,
273,
9864,
3126,
26662,
342,
10921,
18107,
2675,
13576,
323,
1650,
359,
476,
2216,
247,
10921,
281,
11907,
6083,
956,
987,
273,
1039,
891,
1158,
436,
1039,
1537,
320,
625,
1480,
285,
6422,
50276,
19,
253,
1180,
273,
6083,
3133,
281,
320,
1512,
1355,
285,
778,
2818,
253,
15895,
273,
3971,
2675,
4803,
495,
352,
3133,
751,
253,
6083,
858,
417,
1379,
7137,
2625,
3708,
347,
3280,
323,
253,
2250,
5438,
840,
849,
858,
597,
36803,
253,
2625,
1453,
4803,
50276,
21,
8442,
374,
285,
495,
3198,
1805,
22909,
323,
10668,
281,
2096,
50275,
1189,
455,
891,
1158,
436,
2929,
3198,
1805,
15895,
273,
253,
2412,
982,
285,
671,
247,
3676,
5839,
273,
849,
253,
9864,
10575,
1421,
281,
1110,
13576,
5474,
339,
431,
248,
2929,
29328,
281,
3037,
7137,
4803,
24088,
7137,
10654,
3885,
7787,
3066,
4471,
12788,
391,
77,
2304,
77,
432,
7313,
2581,
685,
1892,
22257,
253,
4803,
715,
253,
11333,
281,
436,
990,
253,
4477,
3916,
9021,
275,
50275,
1545,
1699,
247,
4471,
12788,
6276,
3126,
835,
1016,
5570,
556,
18464,
8310,
27620,
16486,
274,
285,
310,
33302,
323,
10922,
247,
12095,
4541,
1293,
3007,
2821,
4679,
921,
326,
3971,
4803,
476,
320,
6311,
50275,
1752,
318,
327,
253,
4327,
273,
253,
278,
12132,
285,
16039,
326,
13071,
6046,
285,
8820,
4038,
273,
6083,
403,
1774,
281,
8379,
3037,
253,
3126,
50275,
43355,
9023,
281,
3727,
247,
18880,
273,
374,
69,
6276,
12620,
323,
2852,
2304,
77,
2561,
275,
1881,
41571,
50276,
74,
1364,
11476,
326,
891,
717,
7194,
247,
4382,
8113,
1436,
285,
891,
760,
452,
3710,
2793,
342,
391,
77,
390,
2304,
77,
2299,
891,
3524,
326,
619,
6803,
2708,
310,
1335,
1805,
685,
271,
19149,
5476,
891,
26012,
275,
7170,
604,
627,
310,
667,
4755,
40663,
50276,
296,
3755,
20556,
50275,
2369,
652,
555,
275,
1895,
3908,
285,
387,
247,
1029,
1268,
253,
1895,
3908,
273,
4715,
1892,
7137,
4803,
3066,
20764,
20131,
3133,
747,
253,
2898,
273,
2304,
77,
281,
253,
1895,
310,
747,
1512,
6706,
66,
1479,
285,
347,
253,
4477,
4767,
253,
5020,
273,
4471,
12788,
3879,
2987,
452,
644,
970,
45738,
4715,
5046,
1908,
19936,
337,
1386,
273,
789,
347,
973,
534,
13330,
342,
4471,
12788,
45738,
4715,
253,
4081,
7266,
706,
833,
1332,
3133,
247,
1175,
5795,
50275,
2068,
3134,
4679,
627,
403,
818,
1355,
533,
2173,
8892,
824,
347,
7137,
10654,
21313,
273,
24914,
3966,
835,
253,
4477,
2530,
20456,
2979,
281,
253,
3916,
326,
13071,
6046,
285,
8820,
4038,
273,
6083,
403,
9560,
323,
253,
3082,
2323,
50276,
18,
4471,
12788,
1006,
800,
48960,
45738,
4715,
50276,
20881,
1255,
265,
50275,
2369,
5301,
281,
2720,
789,
253,
4477,
14969,
45738,
4715,
3169,
4164,
3082,
533,
858,
667,
11745,
5301,
323,
731,
891,
13414,
923,
667,
1941,
2139,
16161,
253,
4081,
1895,
970,
2304,
77,
310,
1805,
685,
4164,
594,
2139,
403,
359,
2509,
2304,
77,
689,
4164,
50275,
316,
4085,
1255,
323,
253,
7137,
1708,
897,
1083,
253,
10885,
273,
2502,
50276,
11707,
10654,
310,
9093,
6311,
407,
417,
6276,
715,
643,
8458,
534,
20090,
253,
7137,
4803,
342,
643,
3000,
604,
627,
403,
642,
643,
6083,
281,
7568,
849,
281,
21319,
253,
5570,
588,
1900,
4510,
281,
1408,
689,
2502,
436,
16540,
253,
1953,
273,
253,
31471,
273,
253,
985,
50275,
77,
471,
273,
38135,
275,
253,
1332,
3139,
253,
2929,
3133,
281,
320,
970,
271,
273,
649,
1041,
48164,
268,
5367,
253,
36409,
7291,
1625,
46701,
554,
268,
5367,
285,
26413,
652,
268,
5367,
513,
417,
2085,
4209,
38135,
275,
2426,
273,
3082,
253,
36409,
7291,
476,
320,
2326,
347,
247,
15246,
6880,
273,
268,
5367,
323,
253,
1881,
41571,
2898,
253,
26413,
652,
6880,
310,
247,
15246,
1039,
281,
22318,
767,
16566,
387,
253,
1072,
673,
534,
310,
3240,
1846,
1014,
275,
253,
11880,
273,
17133,
13757,
5987,
257,
25842,
2061,
44874,
487,
3557,
30275,
2178,
27996,
891,
1158,
326,
627,
310,
38135,
275,
253,
1332,
533,
891,
717,
417,
2119,
604,
697,
2217,
323,
436,
18767,
50275,
42429,
295,
19387,
24453,
10895,
310,
247,
13071,
10895,
285,
3164,
1057,
417,
3831,
1142,
4722,
18366,
15216,
1335,
253,
1332,
1057,
417,
1646,
281,
1347,
1077,
973,
275,
253,
4679,
24088,
2556,
281,
3036,
374,
1142,
8458,
403,
1335,
38757,
5747,
253,
2502,
1708,
50276,
23454,
50275,
455,
24039,
4454,
285,
14505,
275,
512,
21302,
403,
1039,
1512,
1355,
642,
1039,
952,
476,
1086,
6894,
731,
604,
11462,
327,
2929,
50275,
2369,
38209,
275,
253,
21302,
12345,
667,
11743,
4496,
1056,
21302,
1881,
41010,
604,
1896,
50276,
585,
3444,
50276,
2520,
2987,
3400,
271,
4722,
747,
1895,
3908,
285,
33826,
253,
2170,
273,
970,
2304,
77,
323,
26279,
6276,
619,
2022,
7350,
5001,
436,
2929,
403,
253,
3480,
273,
5301,
281,
2720,
2987,
275,
253,
4679,
285,
253,
789,
3400,
667,
31471,
285,
11701,
689,
1655,
26279,
2718,
534,
403,
6571,
1754,
327,
45738,
4715,
7152,
339,
793,
360,
3454,
436,
2929,
29328,
247,
26413,
652,
2304,
77,
1332,
326,
476,
3037,
3971,
4803,
285,
29793,
29688,
1293,
1892,
12425,
352,
310,
3240,
4722,
281,
923,
247,
2969,
285,
15246,
2934,
326,
310,
3576,
275,
436,
4836,
4758,
2299,
436,
2929,
3198,
281,
320,
2007,
29422,
275,
2426,
273,
697,
6742,
29867,
285,
7103,
50276,
9349,
1497,
849,
310,
253,
6046,
5611,
281,
253,
13892,
14050,
16486,
274,
13071,
1543,
275,
6821,
460,
1566,
1057,
253,
18974,
5281,
760,
7024,
327,
253,
3302,
1375,
1293,
7296,
253,
28279,
5016,
253,
19843,
285,
29867,
273,
4028,
812,
320,
3012,
5520,
253,
7424,
2426,
285,
253,
5933,
275,
2593,
577,
878,
4209,
22909,
24088,
752,
403,
362,
649,
1816,
85,
288,
465,
18,
465,
19,
295,
752,
310,
253,
3426,
10921,
1618,
50276,
16217,
3825,
352,
3133,
253,
7622,
310,
4942,
2159,
285,
253,
6276,
15216,
908,
323,
1027,
8892,
403,
5799,
891,
369,
12371,
1880,
253,
26647,
273,
253,
4081,
1332,
285,
253,
6311,
4803,
812,
320,
17618,
50276,
2369,
1666,
25379,
452,
644,
6760,
390,
2429,
534,
5661,
1543,
403,
390,
253,
4229,
3540,
1566,
752,
403,
253,
3910,
875,
253,
767,
3210,
275,
253,
1543,
752,
403,
253,
10921,
2193,
1309,
253,
3733,
3408,
50276,
783,
3403,
621,
285,
7844,
13301,
403,
2834,
281,
1239,
752,
513,
253,
9830,
1599,
275,
3036,
374,
50274,
2577,
13716,
556,
644,
9300,
846,
253,
30080,
22559,
7152,
339,
793,
360,
3454,
253,
3453,
273,
253,
789,
310,
271,
278,
12132,
1566,
326,
310,
7032,
273,
9706,
2570,
7137,
4803,
1690,
7137,
6298,
24914,
987,
273,
1039,
5813,
80,
3966,
2710,
1027,
7137,
12620,
824,
347,
42320,
40459,
295,
19387,
24453,
403,
2783,
50276,
783,
278,
12132,
326,
1543,
432,
436,
789,
310,
1077,
4217,
323,
2852,
2561,
4390,
627,
403,
1077,
1643,
948,
28457,
326,
22573,
2570,
7137,
4803,
390,
2085,
253,
15840,
281,
1347,
2561,
3103,
253,
6387,
273,
747,
948,
28457,
4795,
432,
436,
278,
12132,
310,
12302,
285,
4217,
281,
11907,
285,
8591,
2561,
275,
26279,
6276,
3340,
275,
14086,
285,
22766,
12620,
50276,
74,
717,
9995,
326,
253,
4477,
452,
2530,
253,
2603,
2127,
342,
9470,
10097,
285,
7997,
327,
849,
281,
1408,
253,
2127,
285,
18302,
253,
1543,
50275,
28558,
627,
310,
690,
2201,
19652,
342,
436,
2929,
326,
891,
778,
452,
9829,
627,
310,
642,
1921,
281,
12009,
436,
2929,
187,
187,
4118,
18435,
27,
2520,
2929,
2722,
849,
3971,
4803,
24088,
15424,
25344,
273,
3809,
24914,
327,
247,
17657,
10748,
20177,
275,
247,
4471,
12788,
278,
12132,
253,
2929,
2722,
326,
4722,
7137,
4803,
513,
20177,
285,
352,
10262,
247,
7000,
1783,
273,
253,
2616,
326,
1421,
281,
436,
21313,
253,
2929,
310,
48912,
407,
14290,
2603,
2127,
342,
253,
4388,
281,
11907,
253,
3114,
281,
2007,
789,
327,
253,
9400,
50276,
783,
30628,
5821,
326,
436,
310,
3236,
789,
285,
14109,
697,
17647,
767,
7350,
326,
497,
18902,
314,
39544,
497,
326,
337,
627,
310,
642,
5933,
280,
15832,
285,
374,
627,
310,
642,
5301,
281,
8245,
3210,
390,
625,
3839,
247,
1805,
14663,
275,
253,
3634,
273,
5368,
6239,
50276,
783,
4477,
2530,
247,
7000,
285,
281,
619,
2927,
21414,
2380,
342,
1675,
281,
253,
767,
7350,
1840,
891,
651,
564,
347,
2080,
347,
3981,
326,
337,
642,
5933,
280,
15832,
310,
247,
4735,
417,
247,
7505,
253,
2929,
310,
4722,
4555,
984,
352,
2175,
47006,
16958,
846,
39926,
4471,
12788,
6276,
347,
247,
2629,
391,
77,
1895,
8664,
374,
3480,
273,
1666,
25379,
352,
3133,
281,
479,
8489,
16280,
253,
1127,
253,
2929,
310,
417,
15081,
1375,
273,
253,
1445,
327,
690,
22791,
323,
247,
747,
5933,
533,
12392,
849,
2176,
15424,
4803,
20177,
275,
247,
1677,
9978,
275,
436,
3282,
347,
253,
4477,
1127,
562,
2581,
685,
2819,
387,
5795,
1666,
25379,
352,
310,
27096,
281,
1007,
387,
534,
7794,
273,
253,
9978,
8162,
281,
4086,
21313,
534,
310,
752,
253,
2929,
1057,
50276,
20261,
891,
8968,
326,
275,
36636,
436,
891,
717,
1469,
4457,
253,
30628,
17503,
891,
1119,
436,
281,
320,
271,
3236,
285,
12302,
2929,
326,
891,
651,
7052,
751,
281,
923,
7607,
387,
253,
8059,
209
] |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper proposes a near realworld offline rl benchmark named neorl the neorl dataset is collected on different realworld tasks with a more conservative strategy compared with the previous offline rl benchmark they evaluate some sota offline rl algorithms on neorl and reveal that current offline rl algorithms are less effective in realworld tasks the authors propose a new workflow for offline rl policies offline training offline validation new and online deployment the neorl benchmark contains sufficient datasets since it involves 7 domains with 52 tasks in total besides the authors build benchmarks for offline and online policy selection moreover they conducted sufficient experiments to verify the performance of existing offline rl algorithms and they pointed out some interesting results which can give the right direction for developing offline rl policies in realworld tasks for the neorl benchmark the authors claim that limited samples 1e4 if enough for policy training however more training data is helpful to train a more robust policy especially when we take the security issues of rl policy into account eg adversary attacks for the offline validation process the proposed validation with offline data before a policy is deployed on a realworld task indeed improves the efficiency of policy selection but from table 1 i noticed that the average rank in online performance is different from the rank in offline performance eg cql i am wondering can the offline validation reflect the correct performance rank of a policy in the realworld task and how to prevent selecting a suboptimal policy docsepthis paper presents a new offline rl benchmark that considers using a conservative working policy ie deterministic for data collection in various tasks that simulate realworld applications and also tests the performance of offline evaluation methods the results show that current offline rl methods dont work well in the benchmark compared to bc suggesting more improvement is needed 1 it is valuable to consider more conservative data composition in offline rl which matches the need of many realworld tasks 2 the authors compare the performances of current methods under online evaluation and offline evaluation respectively this is of great value to the community since performing offline evaluation is pivotal in offline rl research 3 the tasks constructed in the paper are closer to realworld applications 1 the conservative data collection part is actually very similar to the medium data in previous benchmarks eg d4rl where a fixed policy collects the data with mediumlevel performance the paper instead proposes to run the mean of the fixed gaussian policy 80 of the time and the stochastic gaussian policy otherwise im not sure if such a change can lead to more practical applications and make a huge difference in the performances of offline rl algorithms 2 while i generally agree that we need to be more conservative in data collection i dont think that means we should only collect data with a fixed mostly deterministic policy such a setting proposed in the paper could make the data coverage very narrow thus benefitting bc and modelfree methods i think it would be worth studying the dataset that consists of a mixture of policies or random policies that dont violate certain safety criteria those settings should still be practical in realworld tasks and could result in very different outcomes docsepprevious offline rl benchmarks commonly involve significant reality gaps which include rich and overly exploratory 
datasets degraded baseline and missing policy validation this paper proposes neorl a suite of near realworld benchmarks for offline rl 1 the authors collected a lot of real world data including locomotion control industrial control financial trading sales promotion and city management domains with realworld dataset properties 2 to investigate whether these offline rl algorithms are effective the author compare them with the working policy 3 the author adds a section on offline policy validation which feels inspired 1 why did you only choose these three environments i see that the commonality of these three environments is relatively simple halfcheetahv3 walker2dv3 and hopperv3 why not choose humanoid whose state space 376 and action space 17 2 i have great doubts about the authors point of view and i hope the author can further elaborate on this point of view comparison with the working policythe experimental results show that they appear less effective when compared with the working policy docsepthe paper presents a near realworld offline rl benchmark neorl it concentrates on learning from conservative and limited data because many realworld working policies are conservative in addition it evaluates the offline policies in both offline and online ways where offline evaluation is more reasonable in a realworld application the authors investigate popular offline rl algorithms in the nearreal setting and find that many offline rl algorithms may be overestimated i think the topics of learning from conservative data and offline evaluation are very meaningful they are important in deploying realworld rl applications since many realworld policies cannot be too exploratory i agree with the authors that a good offline policy should perform well on conservative data the results show that many offline rl algorithms may be overestimated which will inspire future works the paper is written clearly the authors propose a complete pipeline for deploying offline rl in fig1 but they dont follow such a pipeline to evaluate the rl algorithms instead they show the results of online evaluation and offline evaluation respectively and the results show that online evaluation and offline evaluation could not reach an agreement it is confusing if the policy selected by offline evaluation is better than the deterministic working policy docsepin this paper the authors propose a novel benchmark for offline reinforcement learning neorl neor includes realworld tasks eg industrial benchmark finrl citylearn salespromotion the datasets in neorl are conservative and limited which follows the properties of realworld datasets the authors propose a new method to evaluate the performance of offline rl algorithms instead of comparing with the behavior clone they propose to compare with the working policy the benchmark introduces an offline validation phase which evaluates the learned policy using a test dataset before deploying it to the environment a lot of existing offline methods are evaluated in the benchmark neorl is more practical and challenging compared with previous benchmarks and i believe that it will become a popular benchmark for offline learning 1 neorl includes diverse datasets of realworld tasks which are more conservative and limited than the previous ones 2 neorl introduces offline validation in the offline training pipeline which is not formally considered in the previous ones 3 to clearly investigate the effectiveness of offline rl algorithms neorl compares the performance of learned policy with working 
policy rather than behavior clone policy 4 based on the proposed pipeline of offline training the authors reevaluate many existing offline algorithms i have a question about policy evaluation in cv tasks researchers will select the best model in the validation set however in reinforcement learning the convergence is the most important and researchers usually care about the performance of the converged policy we know the best policy might not be the converged policy so in offline rl should we focus on the best policy or the converged policy
### Summary:
|
the paper introduces new datasets and benchmarks for offline settings in rl the papers contributions are notable for 1 using and highlighting the importance of learning from narrow data distributions 2 treating policy evaluation and validation as a key challenge rather than an afterthought and 3 providing extensive results of existing methods i believe this paper is a significant contribution to the field and hope that future researchers will use it in addition to the existing offline rl benchmarks
|
[ input_ids ... ] | [ attention_mask ... ] | [ labels ... ] |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
in this paper this paper studies the finitetime convergence of nonlinear sa with multiple coupled sequences different from existing multitimescale analysis the authors seek for scenarios where a finegrained analysis can provide the tight performance guarantee for singletimescale multisequence sa stsa when all sequences have strongly monotone increments they establish the iteration complexity of oeps1 to achieve epsaccuracy which improves the existing oeps15 complexity for two coupled sequences when the main sequence does not have strongly monotone increment they establish the iteration complexity of oeps2 strengths 1 this paper provides an improved rate for the case where all sequences have strongly monotone increments in singletimescale multisequence sa 2 this paper further provides an analysis for a singletimescale multisequence sa when the main sequence does not have strongly monotone increment 3 the theoretical analysis is novel which makes significant contribution to the related area weaknesses 1 i think this paper lacks sufficient discussions and comparisons on the assumptions which makes the technical part of this paper not sufficiently clear there is another work studying the singletimescale minimiax optimization 1 which is a special case of the singletimescale twosequence sa when the main sequence does not have strongly monotone increment in their algorithm 1 they show that a momentum updating step is needed in order to obtain a singletimescale algorithm however this paper does not need the momentum updates in my understanding the main technique for the proof to get rid of momentum is in 57 and 59 and 57 and 59 depends on two important assumptions a xi and psi are mutually independent otherwise the second equation in 57 does not hold this means we need to sample twice independently in the application of the algorithm in 1 they do not have this assumption i think this mutual independence should be highlighted as a formal assumption and a comparison with the existing work is needed as well b existing works eg 1 2 only need a lipchitz assumption for y while this work makes a relatively stronger assumption that the gradient of y exists such a comparison is also necessary in the paper to clarify the major difference in the assumptions and the reason why we can obtain a singletimescale algorithm 1 singletimescale stochastic nonconvexconcave optimization for smooth nonlinear td learning 2 on gradient descent ascent for nonconvexconcave minimax problems 2 according to the submission instructions the authors need to complete the check list after the reference section in this paper the authors did not provide limitations of this work according to the submission instructions the authors need to complete the check list after the reference section in this paper the authors did not provide limitations of this work i think the authors can further discuss the limitation of this theoretical analysis in their submission docsepthis work proves the convergence guarantees for sa with doublesequence and also extends to multisequence and the results seem tight besides this paper also applies their general results to sbo and sco problems and show the improvement of their results as im not an expert in optimization area im only able to give some common issues in this paper questions 1 i think assumption 3 is a little bit weird because lhs is a random variable while the rhs is a nonrandom term and without a high probability guarantee and i think it is not the generalized version of assumption 21 in 30 
This work proves convergence guarantees for SA with two coupled sequences, extends them to multiple sequences, and the results seem tight. Besides, the paper applies its general results to SBO and SCO problems and shows the improvement of its results. As I am not an expert in the optimization area, I am only able to raise some common issues with this paper.

Questions: 1. I think Assumption 3 is a little odd, because the LHS is a random variable while the RHS is a non-random term without a high-probability guarantee; I also do not think it is the generalized version of Assumption 2.1 in [30], because the second-order condition in Assumption 3 can induce the moment assumption in [30]. 2. I would appreciate it if the authors could provide an example of a weakly dependent sequence as Assumption 3 describes. 3. Is $y_n$ arbitrary in Assumption 4? If so, please use more precise language. 4. Line 169: it should not be $\mathcal{O}(k^{-2})$ but $\mathcal{O}(\epsilon^{-2})$ or $\mathcal{O}(k^{-1/2})$. In conclusion, in my own opinion this paper provides a solid result, but I will consider changing my score after reading the other reviewers' comments and the authors' responses. See strengths and weaknesses.

The paper studies the finite-time convergence of nonlinear stochastic approximation (SA) with multiple coupled sequences. While there are few works analyzing the performance of SA in this setting, existing analyses all adopt a multi-timescale approach, in the sense that the stepsizes for the different sequences decay at different rates, which leads to a convergence rate slower than that of single-sequence SA. The focus of this paper is on finding settings where it suffices to use single-timescale updates for multi-sequence SA, leading to the same convergence rate as that of classic single-sequence SA. The implications of the new results for applications to bilevel, compositional, and reinforcement learning are discussed.

Strengths: 1. This paper presents the first $k^{-1}$ and $k^{-1/2}$ convergence rates for nonlinear SA under two settings: (a) all sequences have strongly monotone increments, and (b) all but the main sequence have strongly monotone increments. These results are then extended to multiple coupled sequences with the same iteration complexity, which is also new. 2. The paper presents a unified and new perspective for understanding the recent theoretical advances in stochastic bilevel and compositional optimization and reinforcement learning; applying the new results on nonlinear SA with multiple coupled sequences to those special cases leads to either improved iteration/sample complexity or relaxed assumptions. 3. The nice thing about this framework is that the iteration/sample complexity of new SA algorithms developed in these applications can be established by just verifying the assumptions in this paper; if those assumptions are verified, the algorithms automatically enjoy the same convergence rate as single-sequence stochastic approximation or SGD. 4. The simulation, although very simple, gets at the key point of this paper: there is a gap between the existing theory and the actual performance of nonlinear SA.

Weaknesses: 1. The new iteration complexity results for nonlinear SA hold under a smoothness assumption on the fixed point; while the paper has justified it in several applications, it does not improve the existing complexity of SA without this smoothness assumption. 2. The paper could do a better job of discussing and highlighting in the main paper how the new proof techniques improve the existing analysis. Limitations: N/A.
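For readers less familiar with the single- versus multi-timescale distinction discussed above, the difference is only in the stepsize schedules. The sketch below is a hedged illustration; the exponents chosen for the two-timescale schedule are a common textbook-style choice, not necessarily the ones used in the analyses cited by the reviewers.

```python
def single_timescale_steps(k, c=1.0):
    # All coupled sequences share one O(1/k) schedule, as in the paper's STSA setting.
    step = c / (k + 1)
    return step, step

def two_timescale_steps(k, c=1.0):
    # Classic multi-timescale choice: the auxiliary (fast) sequence uses a larger,
    # slower-decaying stepsize than the main (slow) sequence; this separation is
    # what typically slows the overall rate. The exponents 1 and 2/3 are assumed
    # for illustration only.
    main = c / (k + 1)
    auxiliary = c / (k + 1) ** (2.0 / 3.0)
    return main, auxiliary
```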
This paper formulates stochastic approximation with multiple coupled sequences and establishes non-asymptotic convergence rates for both the strongly monotone and the non-monotone case. Multiple applications, including bilevel optimization and compositional optimization, are introduced after the theoretical convergence guarantees are provided.

Strengths: This work is sufficiently complete and well written. It gives a clear definition of multiple-sequence stochastic approximation and establishes convergence rates for both the strongly monotone and the non-monotone cases, and the derived bounds match the best known results. It is nice to see a unifying framework that can include stochastic bilevel optimization and stochastic compositional optimization. I am not sure whether this paper is the first work to do so, but it reveals the significance of studying stochastic approximation with multiple coupled sequences: the single-timescale learning-rate schedule can be better than the standard two-timescale one, and this result is really new.

Weaknesses: I have not found any major weakness in this work. Since this work is purely theoretical, there is no negative impact.
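As a rough illustration of how stochastic bilevel optimization can be read as a two-sequence SA of the kind discussed in these reviews, consider the sketch below. The quadratic lower-level problem and the simplified upper-level gradient are assumptions made for brevity; a real hypergradient estimate would involve second-order terms that are omitted here, and this is not the paper's actual formulation.

```python
import numpy as np

rng = np.random.default_rng(1)
A = np.array([[2.0, 0.3], [0.3, 1.5]])  # assumed strongly convex lower-level curvature

def lower_grad(x, y, noise):
    # Gradient of an assumed quadratic lower-level objective g(x, y) = 0.5*y'Ay - x'y,
    # so the lower-level solution is y*(x) = A^{-1} x.
    return A @ y - x + noise

def upper_grad(x, y, noise):
    # Placeholder stochastic estimate of the upper-level gradient evaluated at (x, y).
    return x + y + noise

x, y = np.zeros(2), np.zeros(2)
for k in range(1, 2001):
    step = 0.5 / k                                                   # single timescale
    y = y - step * lower_grad(x, y, 0.01 * rng.standard_normal(2))   # auxiliary sequence tracks y*(x)
    x = x - step * upper_grad(x, y, 0.01 * rng.standard_normal(2))   # main sequence: SGD on the upper level
```

Under this reading, the auxiliary sequence plays the role of the strongly monotone coupled sequence, and the main sequence is the one whose increment need not be strongly monotone.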
### Summary:
This paper provides a convergence analysis for nonlinear stochastic approximation with a multi-sequence update structure, motivated by applications in reinforcement learning and bilevel learning. When all sequences have strongly monotone increments, the authors provide an iteration complexity of $\mathcal{O}(\epsilon^{-1})$ to achieve $\epsilon$-accuracy, which improves the existing $\mathcal{O}(\epsilon^{-1.5})$ complexity for two coupled sequences. When the main sequence does not have strongly monotone increments, they establish an iteration complexity of $\mathcal{O}(\epsilon^{-2})$. The reviewers agreed that the techniques in this paper are novel and that it is well written. In addition, the paper improves upon existing results when applied to problems in reinforcement learning and bilevel optimization, and hence is likely to have broader impact. However, the reviewers felt that, for the final version, the discussion of the smoothness assumption needs to be expanded and the comparison with prior work needs to be improved.