Parsing Probabilistic Context Free Grammars (CMSC 473/673, UMBC)



SLIDE 1

Parsing Probabilistic Context Free Grammars

CMSC 473/673 UMBC November 8th, 2017

SLIDE 2

Recap from last time…

SLIDE 3

Constituents Help Form Grammars

constituent: a span of words that acts (syntactically) as a group, an “X phrase” (e.g., noun phrase).

Baltimore is a great place to be.
This house is a great place to be.
This red house is a great place to be.
This red house on the hill is a great place to be.
This red house near the hill is a great place to be.
This red house atop the hill is a great place to be.
The hill is a great place to be.

S → NP VP
NP → Det Noun
NP → Noun
NP → Det AdjP
NP → NP PP
PP → P NP
AdjP → Adj Noun
VP → V NP
Noun → Baltimore

SLIDE 4

Context Free Grammar

Set of rewrite rules, comprised of terminals and non-terminals.
Terminals: the words in the language (the lexicon), e.g., Baltimore.
Non-terminals: symbols that can trigger rewrite rules, e.g., S, NP, Noun.
(Sometimes) Pre-terminals: symbols that can only trigger lexical rewrites, e.g., Noun.

S → NP VP
NP → Det Noun
NP → Noun
NP → Det AdjP
NP → NP PP
PP → P NP
AdjP → Adj Noun
VP → V NP
Noun → Baltimore
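As a concrete aside, a grammar like the one on this slide might be stored in code roughly as follows. This is a minimal Python sketch; the `GRAMMAR` dict and the pre-terminal check are illustrative, not part of the slides.

```python
# Rules map a left-hand-side non-terminal to its possible right-hand sides.
GRAMMAR = {
    "S":    [("NP", "VP")],
    "NP":   [("Det", "Noun"), ("Noun",), ("Det", "AdjP"), ("NP", "PP")],
    "PP":   [("P", "NP")],
    "AdjP": [("Adj", "Noun")],
    "VP":   [("V", "NP")],
    "Noun": [("Baltimore",)],
}

nonterminals = set(GRAMMAR)

# Pre-terminals: LHS symbols whose expansions are all single terminals
# (a terminal here is any symbol that never appears as a left-hand side).
preterminals = {
    lhs for lhs, rhss in GRAMMAR.items()
    if all(len(rhs) == 1 and rhs[0] not in nonterminals for rhs in rhss)
}
print(preterminals)  # {'Noun'}
```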

SLIDE 5

Generate from a Context Free Grammar

S → NP VP
NP → Det Noun
NP → Noun
NP → Det AdjP
NP → NP PP
PP → P NP
AdjP → Adj Noun
VP → V NP
Noun → Baltimore
…

Baltimore is a great city

[Tree: [S [NP [Noun Baltimore]] [VP [Verb is] [NP a great city]]]]
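Generation from a CFG can be sketched as a top-down random expansion. This is an illustrative Python sketch, not from the slides; the grammar here is a trimmed, non-recursive variant (no NP → NP PP) so that generation always terminates.

```python
import random

# Trimmed toy grammar (assumption: recursive rules omitted so expansion is finite).
GRAMMAR = {
    "S":    [("NP", "VP")],
    "NP":   [("Noun",), ("Det", "AdjP")],
    "VP":   [("V", "NP")],
    "AdjP": [("Adj", "Noun")],
    "Noun": [("Baltimore",), ("city",)],
    "Det":  [("a",)],
    "Adj":  [("great",)],
    "V":    [("is",)],
}

def generate(symbol):
    """Expand a symbol top-down until only terminals (words) remain."""
    if symbol not in GRAMMAR:          # terminal: emit the word itself
        return [symbol]
    rhs = random.choice(GRAMMAR[symbol])  # pick one rewrite rule
    words = []
    for child in rhs:
        words.extend(generate(child))
    return words

print(" ".join(generate("S")))  # e.g. "Baltimore is a great city"
```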

SLIDE 6

Assign Structure (Parse) with a Context Free Grammar

S → NP VP
NP → Det Noun
NP → Noun
NP → Det AdjP
NP → NP PP
PP → P NP
AdjP → Adj Noun
VP → V NP
Noun → Baltimore
…

Baltimore is a great city

[S [NP [Noun Baltimore] ] [VP [Verb is] [NP a great city]]]
bracket notation

(S (NP (Noun Baltimore)) (VP (V is) (NP a great city)))
S-expression
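The S-expression form above can be produced mechanically from a nested-tuple tree. A small Python sketch (the `s_expr` helper and the tuple encoding are my illustration, not from the slides):

```python
# Encode the parse as nested tuples: (label, child, child, ...); leaves are strings.
tree = ("S",
        ("NP", ("Noun", "Baltimore")),
        ("VP", ("V", "is"), ("NP", "a great city")))

def s_expr(t):
    """Render a nested-tuple tree in S-expression notation."""
    if isinstance(t, str):
        return t
    label, *children = t
    return "(" + " ".join([label] + [s_expr(c) for c in children]) + ")"

print(s_expr(tree))
# (S (NP (Noun Baltimore)) (VP (V is) (NP a great city)))
```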

SLIDE 7

Parsing as a Core NLP Problem

[Diagram: sentences 1–4 are fed, as independent operations, into a Parser equipped with a Grammar; parser output is scored in Evaluation against gold (correct) reference trees, or passed on to other NLP tasks (entity coref., MT, Q&A, …).]

SLIDE 8

Grammars Aren’t Just for Syntax

[Morphological derivation of “overgeneralization”:
general (A) + -ize → generalize (V)
generalize (V) + -ation → generalization (N)
over- + generalization (N) → overgeneralization (N)]

SLIDE 9

Clearly Show Ambiguity… But Not Necessarily All Ambiguity

I ate the meal with friends
I ate the meal with gusto
I ate the meal with a fork

PP Attachment (a common source of errors, even still today)

Semantic Ambiguities

Issue 1: Which grammar?
Issue 2: Discourse demands flexibility

SLIDE 10

How Do We Robustly Handle Ambiguities?

SLIDE 11

How Do We Robustly Handle Ambiguities?

Add probabilities (to what?)

SLIDE 12

Probabilistic Context Free Grammar

Set of weighted (probabilistic) rewrite rules, comprised of terminals and non-terminals.
Terminals: the words in the language (the lexicon), e.g., Baltimore.
Non-terminals: symbols that can trigger rewrite rules, e.g., S, NP, Noun.
(Sometimes) Pre-terminals: symbols that can only trigger lexical rewrites, e.g., Noun.

S → NP VP
NP → Det Noun
NP → Noun
NP → Det AdjP
NP → NP PP
PP → P NP
AdjP → Adj Noun
VP → V NP
Noun → Baltimore
…

SLIDE 13

Probabilistic Context Free Grammar

Set of weighted (probabilistic) rewrite rules, comprised of terminals and non-terminals.
Terminals: the words in the language (the lexicon), e.g., Baltimore.
Non-terminals: symbols that can trigger rewrite rules, e.g., S, NP, Noun.
(Sometimes) Pre-terminals: symbols that can only trigger lexical rewrites, e.g., Noun.

Q: What are the distributions? What must sum to 1?

S → NP VP
NP → Det Noun
NP → Noun
NP → Det AdjP
NP → NP PP
PP → P NP
AdjP → Adj Noun
VP → V NP
Noun → Baltimore
…

SLIDE 14

Probabilistic Context Free Grammar

Set of weighted (probabilistic) rewrite rules, comprised of terminals and non-terminals.
Terminals: the words in the language (the lexicon), e.g., Baltimore.
Non-terminals: symbols that can trigger rewrite rules, e.g., S, NP, Noun.
(Sometimes) Pre-terminals: symbols that can only trigger lexical rewrites, e.g., Noun.

1.0 S → NP VP
.4 NP → Det Noun
.3 NP → Noun
.2 NP → Det AdjP
.1 NP → NP PP
1.0 PP → P NP
.34 AdjP → Adj Noun
.26 VP → V NP
.0003 Noun → Baltimore
…

Q: What are the distributions? What must sum to 1?

A: P(X → Y Z | X). For each non-terminal X, the probabilities of all rules rewriting X form one distribution, so they must sum to 1.
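This normalization constraint is easy to verify in code. A minimal Python sketch; only the S, NP, and PP rules are included, since the slide's AdjP, VP, and Noun distributions are truncated by the “…” and do not sum to 1 on their own.

```python
from collections import defaultdict

# Weighted rules from the slide: (lhs, rhs, probability).
PCFG = [
    ("S",  ("NP", "VP"),    1.0),
    ("NP", ("Det", "Noun"), 0.4),
    ("NP", ("Noun",),       0.3),
    ("NP", ("Det", "AdjP"), 0.2),
    ("NP", ("NP", "PP"),    0.1),
    ("PP", ("P", "NP"),     1.0),
]

# P(X -> Y Z | X): rules sharing a left-hand side must sum to 1.
totals = defaultdict(float)
for lhs, rhs, p in PCFG:
    totals[lhs] += p

assert all(abs(t - 1.0) < 1e-9 for t in totals.values())
print(dict(totals))
```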

SLIDE 15

Probabilistic Context Free Grammar

p([S [NP [Noun Baltimore]] [VP [Verb is] [NP a great city]]]) =
product of probabilities of individual rules used in the derivation

SLIDE 16

Probabilistic Context Free Grammar

p([S [NP [Noun Baltimore]] [VP [Verb is] [NP a great city]]]) =
p(S → NP VP) * …

product of probabilities of individual rules used in the derivation

SLIDE 17

Probabilistic Context Free Grammar

p([S [NP [Noun Baltimore]] [VP [Verb is] [NP a great city]]]) =
p(S → NP VP) * p(NP → Noun) * p(Noun → Baltimore) * …

product of probabilities of individual rules used in the derivation

SLIDE 18

Probabilistic Context Free Grammar

p([S [NP [Noun Baltimore]] [VP [Verb is] [NP a great city]]]) =
p(S → NP VP) * p(NP → Noun) * p(Noun → Baltimore) * p(VP → Verb NP) * p(Verb → is) * p(NP → a great city)

product of probabilities of individual rules used in the derivation

SLIDE 19

Log Probabilistic Context Free Grammar

lp([S [NP [Noun Baltimore]] [VP [Verb is] [NP a great city]]]) =
lp(S → NP VP) + lp(NP → Noun) + lp(Noun → Baltimore) + lp(VP → Verb NP) + lp(Verb → is) + lp(NP → a great city)

sum of log probabilities of individual rules used in the derivation
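Why sum logs instead of multiplying? Products of many small rule probabilities underflow floating point; sums of logs do not. A Python sketch of this derivation; the probabilities for Verb → is and NP → "a great city" are made-up placeholders, since the slides leave them unspecified.

```python
import math

# Rule probabilities for the derivation above.
# The last two values are hypothetical (not given on the slides).
rule_probs = {
    ("S", ("NP", "VP")):       1.0,
    ("NP", ("Noun",)):         0.3,
    ("Noun", ("Baltimore",)):  0.0003,
    ("VP", ("Verb", "NP")):    0.26,
    ("Verb", ("is",)):         0.01,      # assumption
    ("NP", ("a great city",)): 0.001,     # assumption
}

prob = math.prod(rule_probs.values())            # product of rule probabilities
logp = sum(math.log(p) for p in rule_probs.values())  # sum of log probabilities

# The two agree (up to floating-point error), but logp stays numerically safe.
assert abs(math.exp(logp) - prob) / prob < 1e-9
print(logp)
```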

SLIDE 20

Estimating PCFGs

Attempt 1:

  • Get access to a treebank (a corpus of syntactically annotated sentences), e.g., the English Penn Treebank
  • Count productions
  • Smooth these counts
  • This gets ~75 F1
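The "count productions" step is relative-frequency (maximum-likelihood) estimation: count(X → α) / count(X). A Python sketch over a made-up toy "treebank" of production occurrences:

```python
from collections import Counter, defaultdict

# Toy production occurrences, as they might be read off annotated trees
# (these counts are illustrative, not real treebank data).
productions = [
    ("S", ("NP", "VP")), ("S", ("NP", "VP")),
    ("NP", ("Det", "Noun")), ("NP", ("Noun",)),
    ("VP", ("V", "NP")), ("VP", ("V", "NP")),
]

counts = Counter(productions)
lhs_totals = defaultdict(int)
for (lhs, rhs), c in counts.items():
    lhs_totals[lhs] += c

# Relative-frequency estimate: P(X -> rhs | X) = count(X -> rhs) / count(X)
probs = {rule: c / lhs_totals[rule[0]] for rule, c in counts.items()}
print(probs[("NP", ("Noun",))])  # 0.5
```

Smoothing (the next bullet) would adjust these raw ratios so that unseen productions get nonzero probability.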
SLIDE 21

Probabilistic Context Free Grammar (PCFG) Tasks

Find the most likely parse (for an observed sequence)
Calculate the (log) likelihood of an observed sequence w1, …, wN
Learn the grammar parameters


SLIDE 23

Probabilistic Context Free Grammar (PCFG) Tasks

Find the most likely parse (for an observed sequence)
Calculate the (log) likelihood of any observed sequence w1, …, wN
Learn the grammar parameters

SLIDE 24

Parsing with a CFG

Top-down backtracking (brute force)
CKY Algorithm: dynamic programming, bottom-up
Earley’s Algorithm: dynamic programming, top-down


SLIDE 26

CKY Precondition

Grammar must be in Chomsky Normal Form (CNF):
non-terminal → non-terminal non-terminal
non-terminal → terminal

SLIDE 27

CKY Precondition

Grammar must be in Chomsky Normal Form (CNF):
non-terminal → non-terminal non-terminal
non-terminal → terminal

X → Y Z
X → a

SLIDE 28

CKY Precondition

Grammar must be in Chomsky Normal Form (CNF):
non-terminal → non-terminal non-terminal
non-terminal → terminal

X → Y Z (binary rules can only involve non-terminals)
X → a (unary rules can only involve terminals)
no ternary (or longer) rules
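The CNF condition can be checked mechanically. A small Python sketch (the `is_cnf` helper is my illustration):

```python
def is_cnf(rules, nonterminals):
    """True iff every rule is X -> Y Z (two non-terminals) or X -> a (one terminal)."""
    for lhs, rhs in rules:
        if len(rhs) == 2 and all(s in nonterminals for s in rhs):
            continue  # binary rule over non-terminals: allowed
        if len(rhs) == 1 and rhs[0] not in nonterminals:
            continue  # unary rule rewriting to a terminal: allowed
        return False  # anything else (ternary, mixed, unary non-terminal) breaks CNF
    return True

rules = [("S", ("NP", "VP")), ("NP", ("Papa",)), ("V", ("ate",))]
print(is_cnf(rules, {"S", "NP", "VP", "V"}))  # True
```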

SLIDE 29

S → NP VP
NP → Det N
NP → NP PP
VP → V NP
VP → VP PP
PP → P NP
NP → Papa
N → caviar
N → spoon
V → spoon
V → ate
P → with
Det → the
Det → a

Example from Jason Eisner

Entire grammar Assume uniform weights

SLIDE 30

“Papa ate the caviar with a spoon”


1 2 3 4 5 6 7


SLIDE 31

“Papa ate the caviar with a spoon”


Goal: (S, 0, 7)

SLIDE 32

“Papa ate the caviar with a spoon”


Check 1: What are the non-terminals?

SLIDE 33

“Papa ate the caviar with a spoon”


Check 1: What are the non-terminals?

S NP VP PP N V P Det

Check 2: What are the terminals?

SLIDE 34

“Papa ate the caviar with a spoon”


Check 1: What are the non-terminals?

S NP VP PP N V P Det

Check 2: What are the terminals?

Papa caviar spoon ate with the a

Check 3: What are the pre-terminals?

SLIDE 35

“Papa ate the caviar with a spoon”


Check 1: What are the non-terminals?

S NP VP PP N V P Det

Check 2: What are the terminals?

Papa caviar spoon ate with the a

Check 3: What are the pre-terminals?

N V P Det

Check 4: Is this in CNF?

SLIDE 36

“Papa ate the caviar with a spoon”


Check 1: What are the non-terminals?

S NP VP PP N V P Det

Check 2: What are the terminals?

Papa caviar spoon ate with the a

Check 3: What are the pre-terminals?

N V P Det

Check 4: Is this in CNF?

Yes

SLIDE 37

“Papa ate the caviar with a spoon”


First: Let’s find all NPs

SLIDE 38

“Papa ate the caviar with a spoon”


First: Let’s find all NPs

(NP, 0, 1): Papa
(NP, 2, 4): the caviar
(NP, 5, 7): a spoon
(NP, 2, 7): the caviar with a spoon

SLIDE 39

“Papa ate the caviar with a spoon”


Second: Let’s find all VPs

(VP, 1, 7): ate the caviar with a spoon
(VP, 1, 4): ate the caviar

SLIDE 40

“Papa ate the caviar with a spoon”


Third: Let’s find all Ss

(S, 0, 7): Papa ate the caviar with a spoon
(S, 0, 4): Papa ate the caviar


SLIDE 44

“Papa ate the caviar with a spoon”


Chart items along the best parse: (NP, 0, 1), (VP, 1, 7), (S, 0, 7)

SLIDE 45

“Papa ate the caviar with a spoon”

[Chart: a table with rows indexed by span start (0–6) and columns by span end (1–7); items such as (NP, 0, 1), (VP, 1, 7), and (S, 0, 7) go in their (start, end) cells.]

SLIDE 46

“Papa ate the caviar with a spoon”

[Chart: NP entered in cell (start 0, end 1).]

SLIDE 47

“Papa ate the caviar with a spoon”

[Chart: NP in cell (0, 1); VP in cell (1, 7).]

SLIDE 48

“Papa ate the caviar with a spoon”

[Chart: NP in cell (0, 1); VP in cell (1, 7); S in cell (0, 7). The goal item (S, 0, 7) is found.]

SLIDE 49

CKY Recognizer

Input: a string of N words; a grammar in CNF.
Output: True (with parse) / False.
Data structure: an N×(N+1) table T.
Rows indicate span start (0 to N−1); columns indicate span end (1 to N).
T[i][j] lists the constituents spanning i to j.

SLIDE 50

CKY Recognizer

Input: a string of N words; a grammar in CNF.
Output: True (with parse) / False.
Data structure: an N×(N+1) table T.
Rows indicate span start (0 to N−1); columns indicate span end (1 to N).
T[i][j] lists the constituents spanning i to j.

For Viterbi in HMMs: build the table left-to-right.
For CKY on trees:
  1. build spans smallest-to-largest, and
  2. left-to-right
SLIDE 51

CKY Recognizer

T = Cell[N][N+1]

SLIDE 52

CKY Recognizer

T = Cell[N][N+1]
for (j = 1; j ≤ N; ++j) {
    T[j-1][j].add(X for non-terminal X in G if X → word_j)
}

SLIDE 53

CKY Recognizer

T = Cell[N][N+1]
for (j = 1; j ≤ N; ++j) {
    T[j-1][j].add(X for non-terminal X in G if X → word_j)
}
for (width = 2; width ≤ N; ++width) {
}

SLIDE 54

CKY Recognizer

T = Cell[N][N+1]
for (j = 1; j ≤ N; ++j) {
    T[j-1][j].add(X for non-terminal X in G if X → word_j)
}
for (width = 2; width ≤ N; ++width) {
    for (start = 0; start ≤ N - width; ++start) {
        end = start + width
    }
}

SLIDE 55

CKY Recognizer

T = Cell[N][N+1]
for (j = 1; j ≤ N; ++j) {
    T[j-1][j].add(X for non-terminal X in G if X → word_j)
}
for (width = 2; width ≤ N; ++width) {
    for (start = 0; start ≤ N - width; ++start) {
        end = start + width
        for (mid = start+1; mid < end; ++mid) {
        }
    }
}

SLIDE 56

CKY Recognizer

T = Cell[N][N+1]
for (j = 1; j ≤ N; ++j) {
    T[j-1][j].add(X for non-terminal X in G if X → word_j)
}
for (width = 2; width ≤ N; ++width) {
    for (start = 0; start ≤ N - width; ++start) {
        end = start + width
        for (mid = start+1; mid < end; ++mid) {
            for (non-terminal Y : T[start][mid]) {
                for (non-terminal Z : T[mid][end]) {
                    T[start][end].add(X for rule X → Y Z : G)
                }
            }
        }
    }
}

SLIDE 57

CKY Recognizer

T = Cell[N][N+1]
for (j = 1; j ≤ N; ++j) {
    T[j-1][j].add(X for non-terminal X in G if X → word_j)
}
for (width = 2; width ≤ N; ++width) {
    for (start = 0; start ≤ N - width; ++start) {
        end = start + width
        for (mid = start+1; mid < end; ++mid) {
            for (non-terminal Y : T[start][mid]) {
                for (non-terminal Z : T[mid][end]) {
                    T[start][end].add(X for rule X → Y Z : G)
                }
            }
        }
    }
}

Q: What do we return?

SLIDE 58

CKY Recognizer

T = Cell[N][N+1]
for (j = 1; j ≤ N; ++j) {
    T[j-1][j].add(X for non-terminal X in G if X → word_j)
}
for (width = 2; width ≤ N; ++width) {
    for (start = 0; start ≤ N - width; ++start) {
        end = start + width
        for (mid = start+1; mid < end; ++mid) {
            for (non-terminal Y : T[start][mid]) {
                for (non-terminal Z : T[mid][end]) {
                    T[start][end].add(X for rule X → Y Z : G)
                }
            }
        }
    }
}

Q: What do we return? A: S in T[0][N]
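The pseudocode above can be made concrete. A runnable Python sketch specialized to the Eisner toy grammar (“Papa ate the caviar with a spoon”); the function name `cky_recognize` and the `LEXICON`/`BINARY` encoding are mine:

```python
# Unary (lexical) rules: word -> the pre-terminals that can produce it.
LEXICON = {
    "Papa": {"NP"}, "caviar": {"N"}, "spoon": {"N", "V"},
    "ate": {"V"}, "with": {"P"}, "the": {"Det"}, "a": {"Det"},
}
# Binary rules in CNF: (X, Y, Z) for X -> Y Z.
BINARY = [
    ("S", "NP", "VP"), ("NP", "Det", "N"), ("NP", "NP", "PP"),
    ("VP", "V", "NP"), ("VP", "VP", "PP"), ("PP", "P", "NP"),
]

def cky_recognize(words):
    n = len(words)
    # T[start][end] holds the non-terminals covering words[start:end].
    T = [[set() for _ in range(n + 1)] for _ in range(n)]
    for j, w in enumerate(words, start=1):        # width-1 spans from the lexicon
        T[j - 1][j] |= LEXICON.get(w, set())
    for width in range(2, n + 1):                 # smallest-to-largest
        for start in range(0, n - width + 1):     # left-to-right
            end = start + width
            for mid in range(start + 1, end):     # every split point
                for lhs, y, z in BINARY:
                    if y in T[start][mid] and z in T[mid][end]:
                        T[start][end].add(lhs)
    return "S" in T[0][n]                         # the goal item (S, 0, N)

print(cky_recognize("Papa ate the caviar with a spoon".split()))  # True
```

Adding backpointers (which rule and split produced each entry) turns this recognizer into a parser.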

SLIDE 59

CKY Recognizer

T = Cell[N][N+1]
for (j = 1; j ≤ N; ++j) {
    T[j-1][j].add(X for non-terminal X in G if X → word_j)
}
for (width = 2; width ≤ N; ++width) {
    for (start = 0; start ≤ N - width; ++start) {
        end = start + width
        for (mid = start+1; mid < end; ++mid) {
            for (non-terminal Y : T[start][mid]) {
                for (non-terminal Z : T[mid][end]) {
                    T[start][end].add(X for rule X → Y Z : G)
                }
            }
        }
    }
}

Q: How do we get the parse?

SLIDE 60

CKY Recognizer

T = Cell[N][N+1]
for (j = 1; j ≤ N; ++j) {
    T[j-1][j].add(X for non-terminal X in G if X → word_j)
}
for (width = 2; width ≤ N; ++width) {
    for (start = 0; start ≤ N - width; ++start) {
        end = start + width
        for (mid = start+1; mid < end; ++mid) {
            for (non-terminal Y : T[start][mid]) {
                for (non-terminal Z : T[mid][end]) {
                    T[start][end].add(X for rule X → Y Z : G)
                }
            }
        }
    }
}

Q: How do we get the parse? A: Follow backpointers (stored where?)

SLIDE 61

CKY Recognizer

T = Cell[N][N+1]
for (j = 1; j ≤ N; ++j) {
    T[j-1][j].add(X for non-terminal X in G if X → word_j)
}
for (width = 2; width ≤ N; ++width) {
    for (start = 0; start ≤ N - width; ++start) {
        end = start + width
        for (mid = start+1; mid < end; ++mid) {
            for (non-terminal Y : T[start][mid]) {
                for (non-terminal Z : T[mid][end]) {
                    T[start][end].add(X for rule X → Y Z : G)
                }
            }
        }
    }
}

SLIDE 62

CKY Recognizer

T = Cell[N][N+1]
for (j = 1; j ≤ N; ++j) {
    T[j-1][j].add(X for non-terminal X in G if X → word_j)
}
for (width = 2; width ≤ N; ++width) {
    for (start = 0; start ≤ N - width; ++start) {
        end = start + width
        for (mid = start+1; mid < end; ++mid) {
            for (rule X → Y Z : G) {
                T[start][end].add(X if Y in T[start][mid] & Z in T[mid][end])
            }
        }
    }
}

SLIDE 63

CKY Recognizer

T = bool[K][N][N+1]
for (j = 1; j ≤ N; ++j) {
    for (non-terminal X in G if X → word_j) {
        T[X][j-1][j] = True
    }
}
for (width = 2; width ≤ N; ++width) {
    for (start = 0; start ≤ N - width; ++start) {
        end = start + width
        for (mid = start+1; mid < end; ++mid) {
            for (rule X → Y Z : G) {
                T[X][start][end] |= T[Y][start][mid] & T[Z][mid][end]
            }
        }
    }
}