Natural Language Processing Computational Semantics Dr. Sohaib Latif Assistant Professor The University of Chenab, Gujrat sohaib@cs.uchenab.edu.pk
Semantic Analysis Semantic analysis is the process of taking in some linguistic input and producing a meaning representation for it. ◦ There are many ways of doing this, ranging from completely ad hoc, domain-specific methods to more theoretically founded but not quite useful methods ◦ Different methods make more or less (or no) use of syntax ◦ We’re going to start with the idea that syntax does matter ◦ The compositional rule-to-rule approach 2
Compositional Analysis Principle of Compositionality ◦ The meaning of a whole is derived from the meanings of the parts What parts? ◦ The constituents of the syntactic parse of the input. 3
Compositional Semantics Part of the meaning derives from the people and activities it’s about (predicates and arguments, or, nouns and verbs) and part from the way they are ordered and related grammatically: syntax 4
Specific vs. General-Purpose Rules We don’t want to have to specify for every possible parse tree what semantic representation it maps to We want to identify general mappings from parse trees to semantic representations: ◦ Again (as with feature structures) we will augment the lexicon and the grammar ◦ Rule-to-rule hypothesis: a mapping exists between rules of the grammar and rules of semantic representation 5
Semantic Attachments Extend each grammar rule with instructions on how to map the components of the rule to a semantic representation (grammars are getting complex) S → NP VP {VP.sem(NP.sem)} Each semantic function is defined in terms of the semantic representation of choice Problem: how to define these functions and how to specify their composition so we always get the meaning representation we want from our grammar? 6
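The attachment {VP.sem(NP.sem)} can be sketched directly with Python lambdas standing in for the meaning representations. The lexicon entries (Mary, runs) and the string-based logical forms here are illustrative, not part of any particular system:

```python
# Rule-to-rule sketch: each word's semantics is a constant or a function,
# and the rule S -> NP VP carries the attachment {VP.sem(NP.sem)}.

lexicon = {
    "Mary": "mary",                       # NP.sem: an individual constant
    "runs": lambda subj: f"run({subj})",  # VP.sem: a predicate awaiting its argument
}

def s_sem(np_sem, vp_sem):
    # The attachment for S -> NP VP: apply the VP's semantics to the NP's.
    return vp_sem(np_sem)

print(s_sem(lexicon["Mary"], lexicon["runs"]))  # run(mary)
```

The key point of the rule-to-rule hypothesis survives the toy scale: the semantic operation is paired with the grammar rule, not with any particular sentence.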
Strong Compositionality The semantics of the whole is derived solely from the semantics of the parts. (i.e. we ignore what’s going on in other parts of the tree). 7
Quantifiers and Connectives If the quantifier is an existential, then the connective is an ^ (and) If the quantifier is a universal, then the connective is an -> (implies) 8
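The quantifier/connective pairing above can be written as a small helper that builds the logical-form string; the predicate names (Restaurant, Closed) are hypothetical examples:

```python
# Existential quantifier pairs with ^ (and); universal pairs with -> (implies).

def quantified(quant, var, restrictor, body):
    if quant == "exists":
        return f"∃{var}.({restrictor}({var}) ^ {body}({var}))"
    if quant == "forall":
        return f"∀{var}.({restrictor}({var}) -> {body}({var}))"
    raise ValueError(f"unknown quantifier: {quant}")

# "A restaurant closed": existential, so the connective is ^
print(quantified("exists", "x", "Restaurant", "Closed"))   # ∃x.(Restaurant(x) ^ Closed(x))
# "Every restaurant closed": universal, so the connective is ->
print(quantified("forall", "x", "Restaurant", "Closed"))   # ∀x.(Restaurant(x) -> Closed(x))
```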
Multiple Complex Terms Note that the conversion technique pulls the quantifiers out to the front of the logical form… That leads to ambiguity if there’s more than one complex term in a sentence. 9
Quantifier Ambiguity Consider ◦ Every restaurant has a menu ◦ I took a picture of everyone in the room. ◦ The first could mean that every restaurant has a menu of its own ◦ Or that there’s some super-menu out there and all restaurants have that menu 10
Ambiguity This turns out to be a lot like the prepositional phrase attachment problem The number of possible interpretations goes up exponentially with the number of complex terms in the sentence The best we can do is to come up with weak methods to prefer one interpretation over another 11
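The blow-up in readings comes from the orderings of the quantifier prefix: n complex terms give up to n! scopings. A minimal sketch that enumerates them (the formula strings are illustrative):

```python
from itertools import permutations

def scopings(quantifiers, body):
    """Yield one logical form per ordering of the fronted quantifiers."""
    for order in permutations(quantifiers):
        yield "".join(order) + body

# "Every restaurant has a menu": two complex terms, two readings.
readings = list(scopings(["∀x.", "∃y."], "Has(x, y)"))
for r in readings:
    print(r)
# ∀x.∃y.Has(x, y)  : each restaurant has its own menu
# ∃y.∀x.Has(x, y)  : one super-menu shared by all restaurants
```

With three complex terms there are already 6 orderings, with four there are 24, which is why weak preference heuristics are the practical fallback.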
What do we do with them? As we did with feature structures: ◦ Alter an Earley-style parser so that when constituents (dot at the end of the rule) are completed, the attached semantic function is applied and a meaning representation is created and stored with the state ◦ Or, let the parser run to completion and then walk through the resulting tree, running semantic attachments bottom-up 12
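The second option, walking a finished tree bottom-up, can be sketched in a few lines. The tree encoding (nested tuples) and the attachment table are illustrative assumptions, not a standard format:

```python
# Walk a completed parse tree bottom-up, applying each rule's semantic
# attachment. Leaves are words; internal nodes are (label, child, ...).

tree = ("S", ("NP", "Mary"), ("VP", "runs"))

lexicon = {"Mary": "mary", "runs": lambda subj: f"run({subj})"}

attachments = {
    "S":  lambda np, vp: vp(np),  # S -> NP VP : {VP.sem(NP.sem)}
    "NP": lambda sem: sem,        # unary rules pass the child's semantics up
    "VP": lambda sem: sem,
}

def interpret(node):
    if isinstance(node, str):                      # leaf: lexical lookup
        return lexicon[node]
    label, *children = node
    child_sems = [interpret(c) for c in children]  # children first (bottom-up)
    return attachments[label](*child_sems)

print(interpret(tree))  # run(mary)
```

The integrated (in-parser) approach runs the same attachments, just at the moment each Earley state is completed rather than in a separate pass.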
Integration Two basic approaches ◦ Integrate semantic analysis into the parser (assign meaning representations as constituents are completed) ◦ Pipeline… assign meaning representations to complete trees only after they’re completed 13
Example From BERP ◦ I want to eat someplace near campus ◦ Somebody tell me the two meanings… 14
Non-Compositionality Unfortunately, there are lots of examples where the meaning (loosely defined) can’t be derived from the meanings of the parts ◦ Idioms, jokes, irony, sarcasm, metaphor, metonymy, indirect requests, etc 15
English Idioms Kick the bucket, buy the farm, bite the bullet, run the show, bury the hatchet, etc… Lots of these… constructions where the meaning of the whole is either ◦ Totally unrelated to the meanings of the parts (kick the bucket) ◦ Related in some opaque way (run the show) 16
Constructional Approach Syntax and semantics aren’t separable in the way that we’ve been assuming Grammars contain form-meaning pairings that vary in the degree to which the meaning of a constituent (and what constitutes a constituent) can be computed from the meanings of the parts. 17
Semantic Grammars One problem with traditional grammars is that they don’t necessarily reflect the semantics in a straightforward way You can deal with this by… ◦ Fighting with the grammar ◦ Complex lambdas and complex terms, etc ◦ Rewriting the grammar to reflect the semantics ◦ And in the process give up on some syntactic niceties 18
Example 19
Semantic Grammar The term semantic grammar refers to the motivation for the grammar rules The technology (plain CFG rules with a set of terminals) is the same as we’ve been using The good thing about them is that you get exactly the semantic rules you need The bad thing is that you need to develop a new grammar for each new domain 20
Semantic Grammars Typically used in conversational agents in constrained domains ◦ Limited vocabulary ◦ Limited grammatical complexity ◦ Chart parsing (Earley) can often produce all that’s needed for semantic interpretation even in the face of ungrammatical input. 21
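A semantic grammar for a constrained domain can be sketched with ordinary CFG machinery; only the nonterminal names change. The rules below (INFO_REQUEST, CUISINE, in the spirit of a restaurant domain like BERP) are made up for illustration:

```python
# A toy semantic grammar: plain CFG rules, but nonterminals are domain
# concepts rather than syntactic categories like NP and VP.
grammar = {
    "INFO_REQUEST": [["show", "me", "CUISINE", "restaurants"]],
    "CUISINE": [["italian"], ["chinese"]],
}

def parse(symbol, tokens, i=0):
    """Tiny top-down matcher: return the position after a match, else None."""
    if symbol not in grammar:                     # terminal: match the word
        return i + 1 if i < len(tokens) and tokens[i] == symbol else None
    for rhs in grammar[symbol]:                   # try each expansion
        j = i
        for sym in rhs:
            j = parse(sym, tokens, j)
            if j is None:
                break
        else:
            return j                              # whole right-hand side matched
    return None

tokens = "show me italian restaurants".split()
print(parse("INFO_REQUEST", tokens) == len(tokens))  # True
```

Because the rule that fires *is* the semantic category, no separate attachment step is needed, which is exactly the appeal (and the domain-specificity) noted above.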
