text (stringlengths 0 to 1.96k)
"However at present, adversarial attacks likely have much larger relevance to AI than neuro" "['non', 'non', 'non', 'non', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg']" "paper quality"
"As well as whether or how adversarial attacks (as framed) might have relevance to neuroscience" "['arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg']" "paper quality"
"The paper provides a broadly useful synthesis of key differences between ANN and SNN approaches" "['arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg']" "paper quality"
"It offers a call to action to do more comp-neuro, in that it could revolutionise AI" "['arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg']" "paper quality"
"Arguably ACh and noradrenaline are more important for network states and dynamics, and equally important for plasticity as dopamine." "['non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non']" "paper quality"
"The dynamics of neuromodulation is largely unknown." "['non', 'non', 'non', 'non', 'non', 'non', 'non', 'non']" "paper quality"
"Hardly what I'd call moderate effort" "['arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg']" "paper quality"
"Its more a series of statements than a cleverly woven argument" "['arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg']" "paper quality"
"But the individual statements are sometimes seductive" "['non', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg']" "paper quality"
"For example ... ""A neuron simply sits and listens." "['non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non']" "paper quality"
"When it hears an incoming pattern of spikes that matches a pattern it knows, it responds with a spike of its own." "['non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non']" "paper quality"
"Repeat this process recursively tens to trillions of times, and suddenly you have a brain controlling a body in the world or doing something else equally clever." "['non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non']" "paper quality"
"Our challenge is to understand how this occurs." "['non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non']" "paper quality"
"The devil is in the details, the ""how"" of ""suddenly""." "['non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non']" "paper quality"
"We require a new class of theories that dispose of the simplistic stimulus-driven encode/ transmit/decode doctrine. """ "['non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non']" "paper quality"
"Largely contradicts this one ""It is probable that revolutionary computational systems can be created in this way with only moderate expenditure of resources and effort"" I felt the paper could have done more to link with current state-of-the-art AI approaches" "['arg', 'arg', 'arg', 'arg', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg']" "paper quality"
"While it covers important ground , I think the arguments need more refinement and focus before they can inspire productive discussion" "['non', 'arg', 'arg', 'arg', 'arg', 'non', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg']" "paper quality"
"The authors consider how biologically motivated synaptic eligibility traces can be used for backpropagation-like learning, in particular by approximating local gradient computations in recurrent neural networks." "['non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non']" "paper quality"
"This sheds new light on how artificial network algorithms might be implementable by the brain" "['arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg']" "paper quality"
"Space is of course limited, but the mathematics presented seem to pass all sanity checks and gives sufficiently rigor to the authors' approach" "['non', 'non', 'non', 'non', 'non', 'non', 'non', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg']" "paper quality"
"Given its technical details it was reasonably straightforward to follow" "['arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg']" "paper quality"
"The authors directly tried to associate biological learning rules with deep network learning rules in AI." "['non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non']" "paper quality"
"Gives important new results about how eligibility traces can be used to approximate gradients when adequately combined with a learning signal" "['arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg']" "paper quality"
"It also would have been nice to comment on the relationship of this work to unsupervised (e.g. Hebbian-based) learning rules." "['arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'non', 'non', 'non', 'non', 'non', 'non', 'non']" "paper quality"
"A final addition that would have made this work more compelling would have been to more thoroughly explore e-prop for computations that unfold on timescales beyond those built-in to the neurons (e.g. membrane or adaptation timescales) and which instead rely on reverberating network activity" "['arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg']" "paper quality"
"Operationally, I'm not quite sure how these are different, so, to me this goal is roughly ""be explainable"", and progress towards it could be measured e.g. in MDLs." "['non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non']" "paper quality"
"3) Suggest testable hypotheses." "['non', 'non', 'non', 'non', 'non', 'non']" "paper quality"
"The technical aspects of the paper seem correct , though I have some higher-level conceptual concerns" "['arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'non', 'non', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg']" "paper quality"
"1) If I understand correctly, attribution is computed only for a single OSR stimulus video" "['non', 'non', 'non', 'non', 'non', 'non', 'non', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg']" "paper quality"
"Is the attribution analysis stable for different stimulus frequencies?" "['non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non']" "paper quality"
"If not, is it really an explanation of the OSR?" "['non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non']" "paper quality"
"2) I agree with a concern raised by reviewer 3: It's difficult to see a 1-layer network as a ""mechanistic explanation"" of a 3-layer network" "['non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg']" "paper quality"
"Explanations are mostly complete , though some details are missing" "['arg', 'arg', 'arg', 'arg', 'non', 'non', 'arg', 'arg', 'arg', 'arg']" "paper quality"
"e.g. what was the nonlinearity used in the model CNN" "['non', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg']" "paper quality"
"Also, do the CNN layers correspond to cell populations , and if so, why is it reasonable to collapse the time dimension after the first layer" "['non', 'non', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'non', 'non', 'non', 'non', 'non', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg']" "paper quality"
"I believe this paper is addressing questions that many of the workshop attendees will find interesting" "['non', 'non', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg']" "paper quality"
"The work would benefit from more detailed discussion of the training algorithm that provides some indication that the results aren't unduly sensitive to these details" "['arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg']" "paper quality"
"In particular, the setting of synaptic decay constants is an important detail in a paper about working memory." "['non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non']" "paper quality"
"The statistical tools are fairly well described and appear to be well-suited for illustrating the phenomena of interest" "['arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg']" "paper quality"
"I feel that more tools should have been used to further support or push the results" "['non', 'non', 'non', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg']" "paper quality"
"For instance, while the heatmaps in Figure 3 provide visual evidence for their claims (except see my comments below), the work could have benefitted from a quantification of this evidence" "['non', 'non', 'non', 'non', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg']" "paper quality"
"The technical details are presented clearly on the whole" "['arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg']" "paper quality"
"However, I feel that the work lacked clarity when it came to interpretation of the results" "['non', 'non', 'non', 'non', 'non', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg']" "paper quality"
"The work would have benefited from a discussion of the implications of longer intrinsic timescale neurons retaining task-relevant information for longer -- in particular, this finding feels a bit ""trivial"" without the case being made for why this should push understanding in the field" "['arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'non', 'non', 'non', 'non', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg']" "paper quality"
"I think the interesting part may be in quantifying just how much of a difference there is between short and long timescale neurons -- for instance, does task-relevant information in both neuron groups fall off in a way that can be well predicted by their intrinsic time constants" "['non', 'non', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'non', 'non', 'non', 'non', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg']" "paper quality"
"How does this relate to their synaptic time constants" "['arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg']" "paper quality"
"Does limiting the synaptic time constants limit the intrinsic time constants, and if so by how much" "['arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg']" "paper quality"
"The same type of comments apply to the second part of the results, which demonstrates that a task that doesn't require working memory results in neurons with shorter intrinsic timescales compared to the working memory task." "['non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non']" "paper quality"
"The authors use an artificial network model to shed light on the biological mechanisms enabling and shaping working memory in the brain." "['non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non']" "paper quality"
"The work is a basic proof-of-concept of results that may not do much to advance understanding since they are what one would expect to see (i.e. the antithesis of their thesis seems very unlikely)." "['arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'non', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'arg', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non', 'non']" "paper quality"