{
"paper_id": "2022",
"header": {
"generated_with": "S2ORC 1.0.0",
"date_generated": "2023-01-19T15:34:26.921955Z"
},
"title": "",
"authors": [
{
"first": "Manling",
"middle": [],
"last": "Li",
"suffix": "",
"affiliation": {},
"email": ""
},
{
"first": "Sha",
"middle": [],
"last": "Li",
"suffix": "",
"affiliation": {},
"email": ""
},
{
"first": "Julius",
"middle": [],
"last": "Cheng",
"suffix": "",
"affiliation": {},
"email": ""
},
{
"first": "Zhen",
"middle": [],
"last": "Han",
"suffix": "",
"affiliation": {},
"email": ""
}
],
"year": "",
"venue": null,
"identifiers": {},
"abstract": "",
"pdf_parse": {
"paper_id": "2022",
"_pdf_hash": "",
"abstract": [],
"body_text": [
{
"text": "Welcome to the Sixth Workshop on Structured Prediction for NLP! Structured prediction has a strong tradition within the natural language processing (NLP) community, owing to the discrete, compositional nature of words and sentences, which leads to natural combinatorial representations such as trees, sequences, segments, or alignments, among others. It is no surprise that structured output models have been successful and popular in NLP applications since their inception. Many other NLP tasks, including, but not limited to: semantic parsing, slot filling, machine translation, or information extraction, are commonly modeled as structured problems, and accounting for said structure has often lead to performance gain.",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "Introduction",
"sec_num": null
},
{
"text": "This year we received 19 submissions, 7 of which were reviewed by the ACL Rolling Review initiative and subsequently commited to our workshop and 12 of which were directly submitted to our workshop and double-blind peer reviewed by our program committee members. Of these 19, 13 were accepted (6 of which are non-archival papers) for presentation in this edition of the workshop, all exploring this interplay between structure and neural data representations, from different, important points of view. The program includes work on structure-informed representation learning, leveraging structure in problems like temporal knowledge graph completion, multilingual syntax-aware language modeling, mention detection models, etc. Our program also includes five invited presentations from influential researchers.",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "Introduction",
"sec_num": null
},
{
"text": "Our warmest thanks go to the program committee -for their time and effort providing valuable feedback, to all submitting authors -for their thought-provoking work, and to the invited speakers -for doing us the honor of joining our program. ",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "Introduction",
"sec_num": null
}
],
"back_matter": [],
"bib_entries": {},
"ref_entries": {
"TABREF1": {
"type_str": "table",
"num": null,
"html": null,
"content": "<table/>",
"text": "Multilingual Syntax-aware Language Modeling through Dependency Tree Conversion Shunsuke Kando, Hiroshi Noji and Yusuke Miyao . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . ."
}
}
}
}