{
"paper_id": "C92-1024",
"header": {
"generated_with": "S2ORC 1.0.0",
"date_generated": "2023-01-19T12:33:40.418392Z"
},
"title": "CHART PARSING LAMBEK GRAMMARS: MODAL EXTENSIONS AND INCREMENTALITY",
"authors": [
{
"first": "Mark",
"middle": [],
"last": "Hepple",
"suffix": "",
"affiliation": {
"laboratory": "",
"institution": "Cambridge University Computer Laboratory",
"location": {
"settlement": "Cambridge",
"country": "UK"
}
},
"email": ""
}
],
"year": "",
"venue": null,
"identifiers": {},
"abstract": "This paper describes a method for chart parsing Lambek grammars. The method is of particular interest in two regards. Firstly, it allows efficient processing of grammars which use necessity operators, an extension proposed for handling locality phenomena. Secondly, the method is easily adapted to allow incremental processing of Lambek grammars, a possibility that has hitherto been unavailable.",
"pdf_parse": {
"paper_id": "C92-1024",
"_pdf_hash": "",
"abstract": [
{
"text": "This paper describes a method for chart parsing Lambek grammars. The method is of particular interest in two regards. Firstly, it allows efficient processing of grammars which use necessity operators, an extension proposed for handling locality phenomena. Secondly, the method is easily adapted to allow incremental processing of Lambek grammars, a possibility that has hitherto been unavailable.",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "Abstract",
"sec_num": null
}
],
"body_text": [
{
"text": "Categorial Grammars (CGs) consist of two components: (i) a lexicon, which assigns syntactic types (plus an associated meaning) to words, and (ii) a calculus which determines the set of admitted type combinations. The set of types (T) is defined recursively in terms of a set of basic types (To) and a set of operators ({\\,/} for standard bidirectional CG), as the smallest set such that (i) To ⊆ T, and (ii) if x, y ∈ T, then x\\y, x/y ∈ T. Intuitively, lexical types specify subcategorisation requirements of words, and requirements on constituent order. We here address a particular flexible CG, the (product-free) Lambek calculus (L: Lambek, 1958) . The rules below provide a natural deduction formulation of L (Morrill et al. 1990 ; Barry et al. 1991) , where dots above a type represent a proof of that type. Proofs proceed from a number of initial assumptions, consisting of individual types, some of which may be \"discharged\" as the proof is constructed. Each type in a proof is associated with a lambda expression, corresponding to its meaning. The elimination rule /E states that proofs of A/B and B may be combined to construct a proof of A. The introduction rule /I indicates that we may discharge an assumption B within a proof of A to construct a proof of A/B (square brackets indicating the assumption's discharge). There is a side condition on the introduction rules, reflecting the ordering significance of the directional slashes. Footnote 1: I am grateful to Esther König for discussion of the paper. The work was done under a grant to the Cambridge University Computer Laboratory, \"Unification-based models of lexical access and incremental interpretation\", SPG 893168. Footnote 2: In this notation, x/y and x\\y are both functions from y into x. A convention of left association is used, so that, e.g.",
"cite_spans": [
{
"start": 629,
"end": 642,
"text": "Lambek, 1958)",
"ref_id": "BIBREF9"
},
{
"start": 706,
"end": 726,
"text": "(Morrill et al. 1990",
"ref_id": "BIBREF12"
},
{
"start": 729,
"end": 747,
"text": "Barry et al. 1991)",
"ref_id": "BIBREF0"
}
],
"ref_spans": [],
"eq_spans": [],
"section": "Introduction",
"sec_num": null
},
{
"text": "((s\\np)/pp)/np may be written s\\np/pp/np. The assumption discharged by /I (resp. \\I) must be the rightmost (resp. leftmost) undischarged assumption in the proof. Elimination and introduction inferences correspond semantically to steps of functional application and abstraction, respectively.",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "Introduction",
"sec_num": null
},
{
"text": "Hypothesis rule:",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "Introduction",
"sec_num": null
},
{
"text": "Elimination rules: ",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "Introduction",
"sec_num": null
},
{
"text": "Introduction",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "Introduction",
"sec_num": null
},
{
"text": "Each proof demonstrates the possibility of combining the types of its undischarged assumptions, in their given order, to yield the type at the bottom of the proof. The following proof of \"simple forward composition\" illustrates the approach.",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": ".~\\ I i",
"sec_num": null
},
{
"text": "a/b:x  b/c:y  [c:z]i  yields  b:(y z) by /E;  a:(x (y z)) by /E;  a/c:λz.(x (y z)) by /Ii",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "a/b:= b/c:~ [c:zli",
"sec_num": null
},
{
"text": "Following Prawitz (1965) , a normal form (NF) for proofs can be defined using the following meaning-preserving contraction rule and its mirror-image dual with \\ in place of /, which, under a version of the Curry-Howard correspondence between proofs and lambda terms, are analogous to the β-contraction rule ((λx.P)Q ▷ P[Q/x]) for lambda expressions.",
"cite_spans": [
{
"start": 10,
"end": 24,
"text": "Prawitz (1965)",
"ref_id": "BIBREF13"
},
{
"start": 41,
"end": 45,
"text": "(NF)",
"ref_id": null
}
],
"ref_spans": [],
"eq_spans": [],
"section": "a/b:= b/c:~ [c:zli",
"sec_num": null
},
{
"text": "[B]i ... A gives A/B by /Ii; combined with B by /E this yields A, which contracts (▷) to the proof of A with the proof of B in place of the discharged assumption [B]i.",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "': \"[B]~ !",
"sec_num": null
},
{
"text": "A/B (by /Ii)  B  yields  A (by /E)",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "': \"[B]~ !",
"sec_num": null
},
{
"text": "Under this system, every L proof has an equivalent 'β-NF' proof. Such β-NF proofs have a straightforward structural characterisation: their main branch (the unique path from the proof's end-type to an assumption that includes no types forming the argument for an elimination step) consists of a sequence of (≥ 0) eliminations followed by a sequence of (≥ 0) introductions.",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "': \"[B]~ !",
"sec_num": null
},
{
"text": "The main approach for parsing L has been sequent calculus theorem proving. Used naively, this approach is inefficient due to 'spurious ambiguity', i.e. the existence of multiple equivalent proofs for combinations. König (1989) and Hepple (1990a) develop a solution to this problem based on defining a NF for sequent proofs. These NF systems as yet cover only the basic calculus, and do not extend to various additions proposed to overcome the basic system's shortcomings as a grammatical framework.",
"cite_spans": [
{
"start": 202,
"end": 228,
"text": "combinations. K6nig (1989)",
"ref_id": null
}
],
"ref_spans": [],
"eq_spans": [],
"section": "': \"[B]~ !",
"sec_num": null
},
{
"text": "Some importance has been attached to the properties of flexible CGs in respect of incremental processing.",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "': \"[B]~ !",
"sec_num": null
},
{
"text": "These grammars typically allow sentences to be given analyses which are either fully or primarily left-branching, in which many sentence-initial substrings are interpretable constituents, providing for processing in which the interpretation of a sentence is generated 'on-line' as the sentence is presented. Incrementality is characteristic of human sentence processing, and might also allow more efficient machine processing of language, by allowing early filtering of semantically implausible analyses.",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "': \"[B]~ !",
"sec_num": null
},
{
"text": "Standard chart methods are inadequate for L because proving that some combination of types is possible may involve 'hypothetical reasoning', i.e. using additional assumptions over and above just the types that are being combined. For example, the above proof of a/b, b/c ⇒ a/c requires an additional assumption c, subsequently discharged. Standard chart parsing involves ordering the edges for lexical categories along a single dimension, and then adding edges for constituents that span wider substretches of this dimension as constituents are combined. The problem for L is that there is no place in this set-up for additional hypothetical elements. Placing edges for them anywhere on the single dimension of a normal chart would simply be incorrect. König (1990, 1991) , in the only previous chart method for L, handles this problem by placing hypothetical elements on separate, independently ordered 'minicharts', which are created ('emitted') in response to the presence of edges that bear 'higher-order' functor types (i.e. seeking arguments having functional types), which may require 'hypothetical reasoning' in the derivation of their argument. Minicharts may 'attach' themselves into other charts (including other minicharts) at points where combinations are possible, so that 'chains of attachment' may arise. Some fairly complicated book-keeping is required to keep track of what has combined with what as a basis for ensuring correct 'discharge' of hypothetical elements. This information is encoded into edges by replacing the simple indices (or vertices) of standard charts with 'complex indices'. Unfortunately, the complexity of this method precludes a proper exposition here. However, some differences between König's method and the method to be proposed will be mentioned at the end of the next section. Footnote 3: Space limits preclude discussion of recent proof net work.",
"cite_spans": [
{
"start": 760,
"end": 778,
"text": "K6nig (1990, 1991)",
"ref_id": null
}
],
"ref_spans": [],
"eq_spans": [],
"section": "Chart Parsing Lambek Grammars",
"sec_num": null
},
{
"text": "A New Chart Approach. I next present a new chart parsing method for L. Its most striking difference to the standard approach is that there is typically more than one ordering governing the association of edges in a chart. These orderings intersect and overlap, making a chart a 'multidimensional object'. A second difference is that the basic unit we adopt for specifying the orderings of the chart is primitive intervals, rather than point-like vertices, where the relative order of the primitive intervals that make up an ordering must be explicitly defined. The span of edges is specified extensionally as the concatenated sum of some number of primitive intervals. The method is perhaps most easily explained by example.",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "Chart Parsing Lambek Grammars",
"sec_num": null
},
{
"text": "To parse the combination x/y, y/z, z ⇒ x, we require a three-element ordering: ordering(a.b.c) (a, b and c being primitive intervals). The three types give three edges, each having three fields: (i) the edge's span (here, a primitive interval), (ii) its type, and (iii) the type's 'meaning' (here a unique constant).",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "Chart Parsing Lambek Grammars",
"sec_num": null
},
{
"text": "[a, x/y, t1] [b, y/z, t2] [c, z, t3] Edges are combined under the following chart rules, corresponding to our elimination rules:",
"cite_spans": [
{
"start": 15,
"end": 25,
"text": "[c, z, t3J",
"ref_id": null
}
],
"ref_spans": [],
"eq_spans": [],
"section": "[~t, :~/y, \u00a31]",
"sec_num": null
},
{
"text": "if [i, X/Y, A] and [j, Y, B] and isa_subord(i.j) then [i.j, X, (A B)]; if [i, Y, B] and [j, X\\Y, A] and isa_subord(i.j) then [i.j, X, (A B)]",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "[~t, :~/y, \u00a31]",
"sec_num": null
},
{
"text": "The rules allow two edges with appropriate types to combine provided that the concatenation of their spans is a substring of some defined ordering (a test made by the predicate isa_subord). Given these rules, our chart will expand to include the following two edges. The presence of an edge with type x that spans the full width of the single defined ordering shows that x/y, y/z, z ⇒ x can be derived.",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "[~t, :~/y, \u00a31]",
"sec_num": null
},
{
"text": "[b.c, y, (t2 t3)] [a.b.c, x, (t1 (t2 t3))]",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "[~t, :~/y, \u00a31]",
"sec_num": null
},
{
"text": "Figure 1: The EMIT procedure. Our next example, x/(y\\p/q), y/z\\p, z/q ⇒ x, requires 'hypothetical reasoning' in its derivation, which is made possible by the presence of the higher-order functor x/(y\\p/q). In the natural deduction approach, deriving the functor's argument y\\p/q might involve introduction inference steps which discharge additional assumptions p and q occurring peripherally within the relevant subproof. To chart parse the same example, we firstly require the following three edges and a three-element ordering:",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": ".i@vl@S)])",
"sec_num": null
},
{
"text": "ordering(a.b.c) [a, x/(y\\p/q), t1] [b, y/z\\p, t2] [c, z/q, t3]",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": ".i@vl@S)])",
"sec_num": null
},
{
"text": "As in König's approach, an 'emit' step is performed on the edge which bears a higher-order type, giving various additions to the chart needed to allow for hypothetical reasoning. Firstly, this gives two new edges, which are assigned new primitive intervals:",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": ".i@vl@S)])",
"sec_num": null
},
{
"text": "[d, p, v1] [e, q, v2]",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": ".i@vl@S)])",
"sec_num": null
},
{
"text": "Some new orderings must be defined to allow these edges to combine. Since the higher-order functor is forward directional, possible arguments for it must occupy non-empty intervals H such that isa_subord(a.H). Hypothetical reasoning with the two new edges is useful only in so far as it contributes to deriving edges that occupy these spans. Hence, the required new orderings are (d.H.e) such that isa_subord(a.H). Such new orderings are most conveniently created by including the following condition on orderings. This condition has the effect that whenever an edge of a certain form is added to the chart, another edge of a related form is also added. The condition completes the process of hypothetical reasoning, by syntactically and semantically abstracting over the hypothetical elements ('discharging' them) to derive the function required as argument by the higher-order functor. Note that, since the intervals d and e are unique to the two edges created in the emit process, any edge spanning an interval (d.H.e) must have involved these two edges in its derivation. The condition 'fires' on the above edge spanning (d.b.c.e) to give the first of the following edges, which by combination gives the second. This final edge demonstrates the derivability of the original goal combination.",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": ".i@vl@S)])",
"sec_num": null
},
{
"text": "[b.c, y\\p/q, v2@v1@((t2 v1)(t3 v2))]",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": ".i@vl@S)])",
"sec_num": null
},
{
"text": "[a.b.c, x, (t1 v2@v1@((t2 v1)(t3 v2)))]",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": ".i@vl@S)])",
"sec_num": null
},
{
"text": "We have now seen all the basic ingredients required for handling hypothetical reasoning in the new approach. Figure 1 shows a general (if still somewhat informal) statement of the emit procedure, which is called on every edge added to the chart, but which only has an effect when an edge bears a higher-order type. The specific consequences depend on the functor's directionality and its forward directional arguments F1, ..., Fm, in any order (Footnote 4). In each case, the procedure simply adds an edge for each required hypothetical element, a condition on orderings (to create all required new orderings), and a condition on edges, which fires to produce an edge for the result of hypothetical reasoning, should it succeed. Note that edges produced by such conditions are there only to be the argument to some higher-order functor, and allowing them to combine with other edges as functors would be unnecessary work. I assume that such edges are marked, and that some mechanism operates to block such combinations (Footnote 5). A slightly modified emit procedure is required to allow for deriving overall combinations that have a functional result type. I will not give a full statement of this procedure, but merely illustrate it. For example, in proving a combination F ⇒ y\\p/q, where an ordering Q had been defined for the edges of the types F, emitting the result type y\\p/q would give only a single new ordering (not a condition on orderings), a condition on edges, and two new edges for the hypothetical elements as follows:",
"cite_spans": [],
"ref_spans": [
{
"start": 109,
"end": 117,
"text": "Figure 1",
"ref_id": null
}
],
"eq_spans": [],
"section": ".i@vl@S)])",
"sec_num": null
},
{
"text": "ordering(a.Q.b) if [a.Q.b, y, S] then [Q, y\\p/q, v2@(v1@S)]",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": ".i@vl@S)])",
"sec_num": null
},
{
"text": "[a, p, v1] [b, q, v2]",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "b, y, S] then [~, y\\p/q, v2@(vl@S)]",
"sec_num": null
},
{
"text": "That completes the description of the new chart method for L. A few final comments. Although the method has been described for proving type combinations, it can also be used for parsing word strings, since lexical ambiguity presents no problems. Note that defining a new ordering may enable certain combinations of edges already present in the chart that were not previously allowed. However, simply checking for all edge combinations that the new ordering allows will result in many previous combinations being redone, since new orderings always share some suborderings with previously defined orderings. One way to avoid this problem is to check only for combinations allowed by substrings of the new ordering that were not previously suborderings.",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "b, y, S] then [~, y\\p/q, v2@(vl@S)]",
"sec_num": null
},
{
"text": "(Footnote 4: This notation is rather clumsy in that it appears to suggest the presence of at least one forward and one backward directional argument and also a relative ordering of these arguments, while neither of these implications is intended. A similar point can be made about abstractions in the schematic semantics, whose order and number will in fact mirror that of the corresponding syntactic arguments. A more satisfactory statement of the emit procedure could be made recursively, but this would take up too much space.) (Footnote 5: An alternative would be not entering such edges at all, but instead to have a condition on edges that creates an edge for the result of combining the emitting higher-order functor with its implicitly derived argument, directly.) Concerning the soundness of this method, note that chart derivations can be easily translated into (correct) natural deduction proofs, given a knowledge of",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "b, y, S] then [~, y\\p/q, v2@(vl@S)]",
"sec_num": null
},
{
"text": "which edges gave rise to which others, i.e. with binary edge combinations corresponding to elimination inferences, and with the creation of an edge by a condition on edges corresponding to some sequence of introduction inferences. In fact, chart derivations all translate to β-NF proofs, i.e. with introductions always made after any eliminations on the main branch of any subproof. This observation provides at least an informal indication of the completeness of the method, since the mechanisms described should allow for chart derivations corresponding to all possible β-NF proofs of a given combination, which (as we noted earlier) are fully representative. Another issue is whether the method is minimal in the sense of allowing only a single chart derivation for each reading of a combination. This is not so, given that distinct but equivalent β-NF proofs of a combination are possible, due to a second source of equivalence for proofs analogous to η-equivalence of lambda expressions (i.e. that f = λx.fx). For example, the combination a/(b/c), b/c ⇒ a has two β-NF proofs, one involving 'unnecessary' hypothetical reasoning. However, having equivalent edges represented on the chart, and the undesirable consequences for subsequent derivations, can be avoided by a simple identity check on edge addition, provided that the meaning terms of edges produced by conditions on edges are subject to η-normalisation.",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "b, y, S] then [~, y\\p/q, v2@(vl@S)]",
"sec_num": null
},
{
"text": "I will finish with some comparisons of the method to that of König (1990, 1991) . The importance of König's method as precursor for the new method cannot be overstated. However, the new approach is, I believe, conceptually much simpler than König's. This is largely due to the use of 'conditions on edges' in the new approach to handle discharge of hypothetical elements, which allows edges to be much simpler objects than in König's approach, where edges instead have to encode the potentially complex information required to allow proper discharge in their 'complex indices'. The complex nature of König's edges considerably obscures the nature of parsing as being simply reasoning about sequences of types, and also makes it difficult to see how the method might be adapted to allow for extensions of L involving additional operators, even ones that have straightforward sequent rules.",
"cite_spans": [
{
"start": 67,
"end": 73,
"text": "(1990,",
"ref_id": null
},
{
"start": 74,
"end": 79,
"text": "1991)",
"ref_id": null
}
],
"ref_spans": [],
"eq_spans": [],
"section": "b, y, S] then [~, y\\p/q, v2@(vl@S)]",
"sec_num": null
},
{
"text": "A second difference of the new method is that the orderings that govern the association of edges are explicitly defined. There is a sense in which the multiple intersecting orderings of the new approach can be seen to express the dimensions of the search space addressed in sequent calculus theorem proving, although collapsing together the parts of that search space that have common structure. In König's method, although the elements that belong together in a minichart are relatively ordered, the attachment of one minichart to another is allowed wherever relevant edges can combine (although subject to some constraints preventing infinite looping). This means that elements may be combined that would not be in sequent calculus theorem proving or in the new chart method. The consequences of this difference for the relative complexity of the two chart methods are at present unknown.",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "b, y, S] then [~, y\\p/q, v2@(vl@S)]",
"sec_num": null
},
{
"text": "Various extensions of the basic Lambek calculus have been proposed to overcome some of its limitations as a grammatical approach. Morrill (1989 , 1990) suggests a unary operator □ for handling locality constraints on binding and reflexivisation. This has the following inference rules, which make it behave somewhat like necessity in the modal logic S4:",
"cite_spans": [
{
"start": 130,
"end": 143,
"text": "Morrill (1989",
"ref_id": "BIBREF10"
},
{
"start": 144,
"end": 160,
"text": "Morrill ( , 1990",
"ref_id": "BIBREF11"
}
],
"ref_spans": [],
"eq_spans": [],
"section": "Parsing Modal Extensions",
"sec_num": null
},
{
"text": "□E: from □A infer A. □I: from A infer □A, where every undischarged assumption is a □-type.",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "Parsing Modal Extensions",
"sec_num": null
},
{
"text": "I will try to briefly suggest how □ may help in handling locality constraints. Boundaries arise where lexical functors seek a modal argument, i.e. are of the form x/□y, the presence of the □ making the argument phrase potentially a bounded domain. In addition, all lexical types are of the form □x, i.e. have a single □ as their outermost operator, which allows them to appear embedded within modal domains (cf. the □I rule's requirement of □-ed assumptions). For example, a lexical NP might be □np, a transitive verb □(s\\np/np), and a sentence-complement verb like believes type □(s\\np/□s). In a standard flexible CG treatment, extraction is handled by functional abstraction over the position of the missing (i.e. extracted) element. The type of the abstracted element is determined by the type of the higher-order lexical type that requires this abstraction, e.g. the relative pronoun type rel/(s/np) abstracts over np. Note that this relative pronoun type cannot extract out of an embedded modal domain, because it abstracts over a bare (i.e. non-modal) np, whose presence would block the □I rule's use in deriving the overall modal constituent. However, a relative pronoun rel/(s/□np), which abstracts over a modal type □np, can extract out of an embedded modal domain.",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "Parsing Modal Extensions",
"sec_num": null
},
{
"text": "Including this operator presents considerable problems for efficient processing. Firstly, it excludes the use of the NF systems devised for the calculus (König, 1989; Hepple, 1990a) . As noted above, spurious ambiguity makes ordinary (i.e. non-normal form) sequent theorem proving of L inefficient. This problem is greatly increased by inclusion of □, largely due to nondeterminism for use of the □E rule. (Footnote 6: Consider a sequent S = □x1, □x2, ..., □xn ⇒ x0, where the related sequent S' = x1, x2, ..., xn ⇒ x0 is a theorem. Nondeterminism for use of [□L] means that there are n! different paths of inference from S to S', so that there are at least n! proofs of S for each proof of S'. In fact, interaction of [□L] with other inference rules means that there are typically many more proofs than this.)",
"cite_spans": [
{
"start": 153,
"end": 166,
"text": "(KSnig, 1989;",
"ref_id": "BIBREF6"
},
{
"start": 167,
"end": 181,
"text": "Hepple, 1990a)",
"ref_id": "BIBREF3"
}
],
"ref_spans": [],
"eq_spans": [],
"section": "Parsing Modal Extensions",
"sec_num": null
},
{
"text": "The new chart method is fairly easily adapted to allow for □, avoiding the non-determinism problem of the sequent system, so that parsing examples with □ is typically only slightly slower than parsing related examples without any □s. Firstly, it is crucial that we can always identify the parent edge(s) of some edge (i.e. the immediate edge(s) from which it is derived), and thereby an edge's more distant ancestors. I ignore here the precise details of how this is done. The following chart rule serves in place of the □E rule:",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "Parsing Modal Extensions",
"sec_num": null
},
{
"text": "if [i, □X, A] then [i, X, A]",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "Parsing Modal Extensions",
"sec_num": null
},
{
"text": "For the combination x/□y, □(y/z), □z ⇒ x, we would require the following ordering and first three edges. The procedure check_modal_history used by this condition checks the edge's 'history' to see if it has appropriate ancestors to license the □-introduction step. Recall that the □I rule requires that the undischarged assumptions of the proof to which it applies are all □-types. The corresponding requirement for the chart system is that the edge must have ancestors with □-types that together span the full width of the edge's span H (i.e. there must be a subset of the edge's ancestor edges that have □-types, and whose spans concatenate to give H). The edge [(b.c), y, (t2 t3)] satisfies this requirement, and so the condition will fire, allowing the parse to proceed to successful completion, as follows:",
"cite_spans": [
{
"start": 666,
"end": 672,
"text": "[(b.c)",
"ref_id": null
}
],
"ref_spans": [],
"eq_spans": [],
"section": "Parsing Modal Extensions",
"sec_num": null
},
{
"text": "[(b.c), □y, (t2 t3)] [(a.b.c), x, (t1 (t2 t3))]",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "Parsing Modal Extensions",
"sec_num": null
},
{
"text": "More complicated cases arise when an emitted functor seeks an argument type that is both functional and modal. As suggested above, a satisfactory statement of the emit process is best given recursively, but there is not sufficient space here. Hopefully, an example will adequately illustrate the method. Consider what is required in emitting an edge [a, w/(□(x\\y)/z), t1], whose type seeks an argument □(x\\y)/z, i.e. a function to a modal form of a further functional type. As before, emitting creates two new edges and a single condition on orderings:",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "Parsing Modal Extensions",
"sec_num": null
},
{
"text": "As a second example, consider the stage of having scanned the first two types of the input sequence (y, x\\y/z, z). Scanning yields the following ordering and the first two edges. Applying emit* to the second edge yields the third edge, and two conditions. Although the method allows for a considerable degree of incrementality, some conceivable incremental constituents will not be created that would be in parsing with alternative categorial frameworks. For example, rules of type raising and composition in Combinatory Categorial Grammar (Steedman, 1987; Szabolcsi, 1987) would allow incremental combination of types vp/s, np ⇒ vp/(s\\np), not allowed by the present approach. The modified chart method instead allows for the construction of incremental constituents in a manner that most closely relates to the notion of dependency constituency argued for by Barry & Pickering (1990) (see also Hepple, 1991) , although since the modified parser is still a complete parser for L it cannot be viewed as implementing a notion of dependency constituency (Footnote 7). Finally, it should be noted that the additional hypothetical reasoning allowed by emit* and combinations involving additional 'incremental constituents' result in many 'spurious' analyses, so that the incremental chart method is in general slower than the non-incremental chart method.",
"cite_spans": [
{
"start": 509,
"end": 556,
"text": "Combinatory Catcgorial Grammar (Steedman, 1987;",
"ref_id": null
},
{
"start": 557,
"end": 573,
"text": "Szabolcsi, 1987)",
"ref_id": "BIBREF14"
},
{
"start": 863,
"end": 887,
"text": "Barry & Pickering (1990)",
"ref_id": null
},
{
"start": 898,
"end": 911,
"text": "Hepple, 1991)",
"ref_id": "BIBREF5"
}
],
"ref_spans": [],
"eq_spans": [],
"section": "Parsing Modal Extensions",
"sec_num": null
},
{
"text": "ordering(a.b) [a, x/y, t1] [b, y/z, t2] [i, z, v2]",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "Parsing Modal Extensions",
"sec_num": null
},
{
"text": "I have presented a chart parsing method for the Lambek calculus, which I would argue has several advantages over that of König (1990 , 1991) . Firstly, I believe that it is considerably conceptually clearer than König's method, and more straightforwardly reflects intuitions about the nature of hypothetical reasoning in proving L combinations. (Footnote 7: However, some version of a chart parser that used only the kind of hypothetical reasoning allowed by the emit* procedure, and not that of the emit procedure, might well implement a notion of dependency constituency.) Secondly, the relatively straightforward nature of the system with respect to reasoning about sequences of types should, I believe, make it easier to adapt the method to allow for additional type-forming operators over those already provided in the (product-free) Lambek calculus, particularly where operators have fairly straightforward sequent rules. We have seen how the method can be extended to allow for Morrill's □ operator. We have also seen how the method may be modified to allow incremental parsing of Lambek grammars.",
"cite_spans": [
{
"start": 121,
"end": 132,
"text": "KSnig (1990",
"ref_id": "BIBREF7"
},
{
"start": 133,
"end": 147,
"text": "KSnig ( , 1991",
"ref_id": null
}
],
"ref_spans": [],
"eq_spans": [],
"section": "Conclusion",
"sec_num": null
},
{
"text": "Paoc. ov COL1NG-92, NANTES, AU(i.[23][24][25][26][27][28] 1992",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "",
"sec_num": null
}
],
"back_matter": [
{
"text": " [i, y, vl] [j, z, vt] if ordering(P.a.q.K) and non_empty(q) then ordering(i.Q, j)However, recursive decomposition of tile type ((i3(x\\y)/z) gives rise to three separate conditions Oil edges (which reflect the three aspects of the description of this type as a 'function to a modal form of a further functional type'):if [(i.H.j) These three conditions 'chain' together to create edges with the type required by tbe emitted functor. Of course in practice, the.three conditions could be collapsed into a single condition, and such a move seems sensible from the viewlmint of efficiency.",
"cite_spans": [
{
"start": 1,
"end": 11,
"text": "[i, y, vl]",
"ref_id": null
},
{
"start": 321,
"end": 329,
"text": "[(i.H.j)",
"ref_id": null
}
],
"ref_spans": [],
"eq_spans": [],
"section": "annex",
"sec_num": null
},
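The recursive decomposition just described — peeling a type like (□(x\y))/z into the chain of aspects that each generate a condition — can be sketched as follows. This is a hedged illustration: the tuple encoding, the 'box' constructor standing for the necessity operator, and the name decompose are my own assumptions, not the paper's notation.

```python
# Hedged sketch: recursively peel a (possibly modal) functor type into the
# sequence of one-step descriptions the text calls its 'aspects'.
# ('/', X, Y) = X/Y, ('\\', X, Y) = X\Y, ('box', X) = necessity-marked X.

def decompose(typ):
    """Return the chain of steps obtained by peeling the type down to its
    final (basic) result type."""
    steps = []
    while isinstance(typ, tuple):
        if typ[0] in ('/', '\\'):
            steps.append(('functor', typ[0], typ[2]))  # needs argument typ[2]
            typ = typ[1]                               # continue on the result
        elif typ[0] == 'box':
            steps.append(('modal',))                   # necessity layer
            typ = typ[1]
        else:
            break
    steps.append(('result', typ))
    return steps

# (box(x\y))/z: three aspects, one per condition, before the result type x.
t = ('/', ('box', ('\\', 'x', 'y')), 'z')
print(decompose(t))
# [('functor', '/', 'z'), ('modal',), ('functor', '\\', 'y'), ('result', 'x')]
```

Collapsing the three chained conditions into one, as the text suggests, would amount to emitting this whole list in a single step rather than one condition per element.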
{
"text": "Despite considerable interest in the theoretical possibility of incremental processing using the Lambek calculus, no incremental parsing methods have as yet been proposed. Indeed, most Lambek parsing work has been based around sequent theorem proving, which might be viewed as antithetical to incremental processing since it fundamentally involves reasoning about complete sequences of types. In fact it is fairly easy to modify the chart method to a|low some extent of incremental processing, i.e. so that scanning left-toright through an input string (or type sequence), the chart will contain edges assigning types to substrings that would not otherwise receive types (luring parsing, including some for initial substrings of the input.The modification of the chart method involves allowing an additional extent of hypothetical reasoning over that so far allowed, so that edges for hypothetical types are added not only for higher-order functors, but also for first-order functors. This is allowe by a new procedure emit*, described below, emit* is called on every edge added to the chart, but only has an effect if the edge's type is functional, creating a new edge for a hypothetical type corresponding to the function's first argmnent, as well as a condition on orderings and one on edges. The condition on orderings creates new orderings allowing the hypothetical edge to cmubine with its 'emitter', and tile result of that combination to he combined with filrther edges. (The requirement J \\= i.K prevents the condition incorrectly reapplying to its own output.) Note that the new edge's interval is peripheral ill the new orderings that are defined since it is only in peripheral position that the new hypothesis cau be discharged (hence, we have (G. H. i) in the condition of tile first case rather than (0. H. i. J)). Such discharge is made by the new condition on edges. Let us look at some examples (where we limit our attention to just edges relevant to the discussion). 
Consider parsing the type sequence (x/y, y/z, z). Since the method should not depend on the parser knowing the length of the input sequence in advauce, an ordering will be defined with each scanning step that covers just tile material so far scanned, and which extends tile ordering of the previous scanning step by one. After scanning the first two types of the input, the chart will include at least tile following two edges and ordering:",
"cite_spans": [],
"ref_spans": [],
"eq_spans": [],
"section": "Incremental Parsing",
"sec_num": null
},
{
"text": "Applying emit* to the second edge (ignoring the first edge here) yields the following edge and conditions:The condition on orderings will fire ou the ordering (a.b) to produce a new ordering (a.b. i), which permits the first two of the following edges to be built, the third being generated from the second by the condition on edges. The type x/z this edge assigns to the initial substring (x/y, y/z) of the input (corresponding to the composition of the two functions) would not have been created during parsing with other Lambek parsing methods. [(b.i) ",
"cite_spans": [
{
"start": 548,
"end": 554,
"text": "[(b.i)",
"ref_id": null
}
],
"ref_spans": [],
"eq_spans": [],
"section": "emit*( [I1 ,T,_] ) :",
"sec_num": null
}
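The emit* mechanism worked through above — positing a hypothetical edge for a functor's first argument, combining it, and discharging the hypothesis at a peripheral position — can be sketched in miniature. This is a hypothetical Python rendering: the edge representation, interval encoding, and function names are illustrative assumptions, not the paper's definitions.

```python
from itertools import count

# Hedged sketch of the emit* idea. Edges are (span, type, term); spans are
# tuples of position labels; ('/', X, Y) encodes X/Y.

_fresh = count()

def emit_star(edge):
    """Posit a hypothetical edge for the first argument of a functor edge."""
    _, typ, _ = edge
    if isinstance(typ, tuple) and typ[0] == '/':
        h = f'h{next(_fresh)}'
        return ((h,), typ[2], ('var', h))
    return None

def apply_edge(f_edge, a_edge):
    """[S1, X/Y, f] + [S2, Y, a] => [S1.S2, X, (f a)] (the /E rule)."""
    s1, tf, f = f_edge
    s2, ta, a = a_edge
    if isinstance(tf, tuple) and tf[0] == '/' and tf[2] == ta:
        return (s1 + s2, tf[1], ('app', f, a))
    return None

def discharge(edge, hyp):
    """Discharge a right-peripheral hypothesis by abstraction (the /I rule):
    [S.h, X, t] becomes [S, X/Y, \\v.t]."""
    span, typ, term = edge
    h_span, h_typ, h_var = hyp
    if span[-len(h_span):] == h_span:  # peripheral position only
        return (span[:-len(h_span)], ('/', typ, h_typ), ('lam', h_var, term))
    return None

# Mirror the (x/y, y/z, z) example: emit* on [b, y/z, t2] posits a
# hypothetical z; discharging it assigns x/z to the initial substring.
e1 = (('a',), ('/', 'x', 'y'), 't1')
e2 = (('b',), ('/', 'y', 'z'), 't2')
hyp = emit_star(e2)          # hypothetical z edge
e3 = apply_edge(e2, hyp)     # [b.h, y, (t2 v)]
e4 = apply_edge(e1, e3)      # [a.b.h, x, (t1 (t2 v))]
e5 = discharge(e4, hyp)      # [a.b, x/z, ...] -- the 'composed' functor
print(e5[0], e5[1])          # ('a', 'b') ('/', 'x', 'z')
```

Note that without the hypothesis, e1 and e2 cannot combine at all by application, which is exactly the gap the emit* step fills.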
],
"bib_entries": {
"BIBREF0": {
"ref_id": "b0",
"title": "Proof Figures and StructurM Operators for Categorial Grammar",
"authors": [
{
"first": "G",
"middle": [],
"last": "Barry",
"suffix": ""
},
{
"first": "M",
"middle": [],
"last": "Hepple",
"suffix": ""
},
{
"first": "N",
"middle": [],
"last": "Leslie",
"suffix": ""
},
{
"first": "G",
"middle": [],
"last": "Morrill",
"suffix": ""
}
],
"year": 1991,
"venue": "Proc. of EACL-5",
"volume": "",
"issue": "",
"pages": "",
"other_ids": {},
"num": null,
"urls": [],
"raw_text": "Barry, G., Hepple, M., Leslie, N. and Morrill, G. 1991. 'Proof Figures and StructurM Operators for Catego- rial Grammar', Proc. of EACL-5..",
"links": null
},
"BIBREF1": {
"ref_id": "b1",
"title": "Studies in Calegorial Grammar",
"authors": [
{
"first": "G",
"middle": [],
"last": "Barry",
"suffix": ""
},
{
"first": "G",
"middle": [],
"last": "Morrill",
"suffix": ""
}
],
"year": 1990,
"venue": "Edinburgh Working Papers in Cognitive Science",
"volume": "5",
"issue": "",
"pages": "",
"other_ids": {},
"num": null,
"urls": [],
"raw_text": "Barry, G. and Morrill, G. 1990. (Eds). Studies in Calegorial Grammar. Edinburgh Working Papers in Cognitive Science, Volume 5. Centre for Cognitive Science, University of Edinburgh.",
"links": null
},
"BIBREF2": {
"ref_id": "b2",
"title": "Dependency and Constituency in Categorial Grammar",
"authors": [
{
"first": "G",
"middle": [],
"last": "Barry",
"suffix": ""
},
{
"first": "M",
"middle": [],
"last": "Piekering",
"suffix": ""
}
],
"year": 1990,
"venue": "",
"volume": "",
"issue": "",
"pages": "",
"other_ids": {},
"num": null,
"urls": [],
"raw_text": "Barry, G. and Piekering, M. 1990. 'Dependency and Constituency in Categorial Grammar', in Barry and Morrill, 1990.",
"links": null
},
"BIBREF3": {
"ref_id": "b3",
"title": "Normal form theorem proving for the Lambek calculus",
"authors": [
{
"first": "M",
"middle": [],
"last": "Hepple",
"suffix": ""
}
],
"year": 1990,
"venue": "Prec. of COLING-90",
"volume": "",
"issue": "",
"pages": "",
"other_ids": {},
"num": null,
"urls": [],
"raw_text": "Hepple, M. 1990a. 'Normal form theorem proving for the Lambek calculus', Prec. of COLING-90.",
"links": null
},
"BIBREF4": {
"ref_id": "b4",
"title": "The Grammar and Processing of Order and Dependency: A Categorial Approach",
"authors": [
{
"first": "M",
"middle": [],
"last": "Hepple",
"suffix": ""
}
],
"year": 1990,
"venue": "",
"volume": "",
"issue": "",
"pages": "",
"other_ids": {},
"num": null,
"urls": [],
"raw_text": "Hepple, M. 1990b. The Grammar and Processing of Order and Dependency: A Categorial Approach. Ph.D. dissertation, Centre for Cognitive Science, University of Edinburgh.",
"links": null
},
"BIBREF5": {
"ref_id": "b5",
"title": "Efficient Incremental Processing with Categorial Grammar",
"authors": [
{
"first": "M",
"middle": [],
"last": "Hepple",
"suffix": ""
}
],
"year": 1991,
"venue": "Proc. of ACL-eT",
"volume": "",
"issue": "",
"pages": "",
"other_ids": {},
"num": null,
"urls": [],
"raw_text": "Hepple, M. 1991. 'Efficient Incremental Processing with Categorial Grammar', Proc. of ACL-eT.",
"links": null
},
"BIBREF6": {
"ref_id": "b6",
"title": "Parsing as natural deduction",
"authors": [
{
"first": "E",
"middle": [],
"last": "Ksnig",
"suffix": ""
}
],
"year": 1989,
"venue": "Proc. of ACL-25",
"volume": "",
"issue": "",
"pages": "",
"other_ids": {},
"num": null,
"urls": [],
"raw_text": "KSnig, E. 1989, 'Parsing as natural deduction', Proc. of ACL-25.",
"links": null
},
"BIBREF7": {
"ref_id": "b7",
"title": "The complexity of parsing with extended categorial grammars",
"authors": [
{
"first": "E",
"middle": [],
"last": "Ksnig",
"suffix": ""
}
],
"year": 1990,
"venue": "Proc. of COLING-90",
"volume": "",
"issue": "",
"pages": "",
"other_ids": {},
"num": null,
"urls": [],
"raw_text": "KSnig, E. 1990, 'The complexity of parsing with ex- tended categorial grammars', Proc. of COLING-90.",
"links": null
},
"BIBREF9": {
"ref_id": "b9",
"title": "The mathematics of sentence structure",
"authors": [
{
"first": "J",
"middle": [],
"last": "Lambek",
"suffix": ""
}
],
"year": 1958,
"venue": "American Mathematical Monthly",
"volume": "65",
"issue": "",
"pages": "154--170",
"other_ids": {},
"num": null,
"urls": [],
"raw_text": "Lambek, J. 1958. 'The mathematics of sentence struc- ture.' American Mathematical Monthly 65. 154-170.",
"links": null
},
"BIBREF10": {
"ref_id": "b10",
"title": "Intensionality, boundedneas~ and modal logic",
"authors": [
{
"first": "G",
"middle": [],
"last": "Morrill~",
"suffix": ""
}
],
"year": 1989,
"venue": "",
"volume": "",
"issue": "",
"pages": "",
"other_ids": {},
"num": null,
"urls": [],
"raw_text": "Morrill~ G. 1989. 'Intensionality, boundedneas~ and modal logic.' Research Paper EUCCS/RP-32, Cen- tre for Cognitive Science, University of Edinburgh.",
"links": null
},
"BIBREF11": {
"ref_id": "b11",
"title": "Intensionality and Boundedness",
"authors": [
{
"first": "G",
"middle": [],
"last": "Morrill",
"suffix": ""
}
],
"year": 1990,
"venue": "Linguistics and Philosophy",
"volume": "",
"issue": "",
"pages": "",
"other_ids": {},
"num": null,
"urls": [],
"raw_text": "Morrill, G. 1990. 'Intensionality and Boundedness', Linguistics and Philosophy, 13.",
"links": null
},
"BIBREF12": {
"ref_id": "b12",
"title": "Categorial Deductions and Structural Operations",
"authors": [
{
"first": "G",
"middle": [],
"last": "Morrill",
"suffix": ""
},
{
"first": "N",
"middle": [],
"last": "Leslie",
"suffix": ""
},
{
"first": "M",
"middle": [],
"last": "Hepple",
"suffix": ""
},
{
"first": "G",
"middle": [],
"last": "Barry",
"suffix": ""
}
],
"year": 1990,
"venue": "",
"volume": "",
"issue": "",
"pages": "",
"other_ids": {},
"num": null,
"urls": [],
"raw_text": "Morrill, G., Leslie, N., Hepple, M. and Barry, G. 1990. 'Categorial Deductions and Structural Operations', in Barry and Morrill, 1990.",
"links": null
},
"BIBREF13": {
"ref_id": "b13",
"title": "Natural Deduction: a Proof Theoretical Study, Almqvist and Wiksell",
"authors": [
{
"first": "D",
"middle": [],
"last": "Prawitz",
"suffix": ""
}
],
"year": 1965,
"venue": "Combinatory Grammars and Parasitic Gaps', NLLT",
"volume": "5",
"issue": "",
"pages": "",
"other_ids": {},
"num": null,
"urls": [],
"raw_text": "Prawitz, D. 1965. Natural Deduction: a Proof Theo- retical Study, Almqvist and Wiksell, Uppsala. Steedman, Mark. 1987. 'Combinatory Grammars and Parasitic Gaps', NLLT, 5:3.",
"links": null
},
"BIBREF14": {
"ref_id": "b14",
"title": "On Combinatory Categorial grammar",
"authors": [
{
"first": "A",
"middle": [],
"last": "Szabolcsi",
"suffix": ""
}
],
"year": 1987,
"venue": "Proc. of the Symposium on Logic and Language",
"volume": "",
"issue": "",
"pages": "",
"other_ids": {},
"num": null,
"urls": [],
"raw_text": "Szabolcsi, A. 1987 'On Combinatory Categorial gram- mar', Proc. of the Symposium on Logic and Lan- guage, Debrecen, Akaddmiai KiadS, Budapest.",
"links": null
}
},
"ref_entries": {
"FIGREF0": {
"text": ":)tx.]'/li A\\B:),x.____....",
"num": null,
"type_str": "figure",
"uris": null
},
"FIGREF1": {
"text": "if ordering(P.a.Q.R) and non_empty(Q) ~hen ordering(d.Q, e) In general, such conditions may fire after tile emit step that creates the condition, when other new orderings are created that include the emitting edge's span. The above condition causes new orderings (d.b.e) and (d. b. c. e) to be defined, allowing combinations that yield the following edges: [d.b, y/z, (t2 vl)] It.e, z, (t3 v2)] ['d.b.c.o, y, ((t2 vl)Ct3 v2))] The final thing created by the emit process is the following condition on edges (where I@B represents tambda abstraction over I in B): if Cd.O.e, y, S] and isa_subord(a.O) then [0, y\\p/q, v2@vl@S]",
"num": null,
"type_str": "figure",
"uris": null
},
"FIGREF2": {
"text": "The ordering condition gives the following new ordering, allowing creation of the subsequent new edges. As before, the last edge assigns a type to the combination of the first two input types which would not otherwise be expected during parsing.ordering(a.b, i) [(b.i), x\\y, (t2 v)] [(a.b.i), x, ((t2 v) tl)][(a.b), x/z, v@((t2 v) tl)]",
"num": null,
"type_str": "figure",
"uris": null
},
"TABREF1": {
"text": "Y\\BI/F1 .... \\Bn/Fm), then(add_edges: [il,Bl,vl] ...... [in,Bn,vn], [jm,Fm,~n] ...... [ji,Fl,wl] add_condltion: if ordering(P.H.~.R) and non_empty(G) then ordering(il...in.~.jm..,jl) add_condition: if [(il...in.K.jm...jl),Y,S] and isa_subord(H.K) then [K, (Y\\B1/FI .... P.~.H.R) and non_empty(~) then orderingCtl...in.~.jm...jl) add_condition: if [(il...in.K.jm...jl),Y,S] and isa_subord(H.K) then [K, (Y\\BI/FI ....",
"html": null,
"type_str": "table",
"num": null,
"content": "<table><tr><td>if</td><td colspan=\"2\">T = X\\CY\\BI/FI .... \\Bn/Fla),</td><td/><td/></tr><tr><td colspan=\"2\">then (add edgoJ:</td><td>[il,Bl,vl],</td><td>.....</td><td>[in,Bn,vn], [Jm,Fn,tm] ......</td><td>[ji,Fl,ul]</td></tr><tr><td/><td colspan=\"2\">add_condition:</td><td/><td/></tr><tr><td/><td colspan=\"2\">if ordering(</td><td/><td/></tr></table>"
}
}
}
}