SLIDE 1

Preservation of Recognizability for Weighted Extended Top-down Tree Transducers

Andreas Maletti

Institute for Natural Language Processing University of Stuttgart andreas.maletti@ims.uni-stuttgart.de

Nara, Japan — September 7, 2011

Recognizability and Weighted Tree Transducers · A. Maletti

SLIDE 2

Collaborators

reporting joint work with:

• JOOST ENGELFRIET (LIACS, Leiden, The Netherlands)
• ZOLTÁN FÜLÖP (U Szeged, Hungary)
• ERIC LILIN (U Lille, France)
• GIORGIO SATTA (U Padua, Italy)
• HEIKO VOGLER (TU Dresden, Germany)

SLIDE 3

Machine Translation — Unweighted Setup

Schema

Input −→ Machine translation system (here: an extended top-down tree transducer, XTT) −→ Output

Question

What are the translations of sentence f?

Answer

1. take the recognizable language {f}
2. parse f, giving a recognizable tree language L such that L ⊆ {t | yield(t) = f}
3. compute the forward application M(L)

SLIDE 7

Machine Translation — Unweighted Setup

Problem

M(L) is not necessarily recognizable

Answer

in many cases it fortunately is:

model      M(L) recognizable?   M⁻¹(L) recognizable?
ln-XTOP    ✓                    ✓
l-XTOP     ✓                    ✓
XTOP       ✗                    ✓
ln-MBOT    ✗                    ✓
l-MBOT     ✗                    ✓
MBOT       ✗                    ✓
ln-STSSG   ✗                    ✗

SLIDE 9

Machine Translation — Weighted Setup

Schema

weighted Input −→ Machine translation system (here: a weighted extended top-down tree transducer, WXTT) −→ weighted Output

Question

What are the translations of sentence f?

Answer

1. take the recognizable language {f}
2. parse f, giving a recognizable weighted tree language L such that supp(L) ⊆ {t | yield(t) = f}
3. compute the forward application M(L)

SLIDE 15

Machine Translation — Weighted Setup

Problem

Again, M(L) is not necessarily recognizable

Answer

in fewer cases it is:

model      M(L) recognizable?   M⁻¹(L) recognizable?
ln-XTOP    ✓                    ✓
l-XTOP     ✓/✗ (✓)              ✓
XTOP       ✗                    ✗ (✓)
ln-MBOT    ✗                    ✓
l-MBOT     ✗                    ✓ (✓)
MBOT       ✗                    ✗ (✓)
ln-STSSG   ✗                    ✗

SLIDE 16

Contents

1. Motivation
2. Recognizable Weighted Tree Language
3. Weighted Extended Top-down Tree Transducer
4. Preservation of Recognizability
5. Nonpreservation of Recognizability

SLIDE 17

Weight structure

Definition

Commutative semiring (C, +, ·, 0, 1) if
• (C, +, 0) and (C, ·, 1) are commutative monoids
• · distributes over finite (incl. empty) sums
Idempotent if c + c = c for all c ∈ C

Example

• BOOLEAN semiring ({0, 1}, max, min, 0, 1) (idempotent)
• Semiring (N, +, ·, 0, 1) of the natural numbers
• Tropical semiring (N ∪ {∞}, min, +, ∞, 0) (idempotent)
• Any field, ring, etc.
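The examples above can be made concrete in a few lines; a minimal sketch (not part of the talk; names and encoding are our own) representing two of the semirings as (add, mul, zero, one) tuples:

```python
# Two of the example semirings, encoded as (add, mul, zero, one).
# Illustrative sketch only.
BOOLEAN = (max, min, 0, 1)                             # ({0, 1}, max, min, 0, 1)
TROPICAL = (min, lambda a, b: a + b, float("inf"), 0)  # (N ∪ {∞}, min, +, ∞, 0)

def semiring_sum(semiring, values):
    """Fold a finite (possibly empty) sequence with the semiring addition."""
    add, _mul, zero, _one = semiring
    total = zero
    for v in values:
        total = add(total, v)
    return total

def is_idempotent_on(semiring, samples):
    """Check c + c = c on the given sample elements."""
    add = semiring[0]
    return all(add(c, c) == c for c in samples)
```

For instance, semiring_sum(TROPICAL, [3, 5, 2]) is min(3, 5, 2) = 2, and the empty sum returns the zero element ∞.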

SLIDE 19

Weighted tree automaton

Definition (BERSTEL, REUTENAUER 1982)

Weighted tree automaton (WTA) A = (Q, Σ, F, δ) with rules

σ(q1, …, qk) →c q

• states q, q1, …, qk ∈ Q
• rule weight c ∈ C
• k-ary input symbol σ ∈ Σk

[BERSTEL, REUTENAUER: Recognizable formal power series on trees. Theoret. Comput. Sci., 1982]

SLIDE 25

Run

[Tree diagram: parse tree of "Colorless ideas sleep furiously"; every node carries an assigned state and the weight of the rule used there: S: q, 0.4; NP: q′, 0.2; JJ: q′, 0.3; NNS: q′′, 0.3; VP: q1, 0.4; VBP: q′1, 0.2; ADVP: q2, 0.3; RB: q2, 0.2; each word: state w, weight 0.1]

1. arbitrarily assign states
2. look up rule weights

Definition

Weight wt(r) of run r = product of its rule weights

Example (Weight of the run)

wt(r) = 0.4 · 0.2 · 0.3 · 0.1 · 0.3 · 0.1 · 0.4 · 0.2 · 0.1 · 0.3 · 0.2 · 0.1

SLIDE 26

Semantics

Definition

Weight A(t) of tree t = sum of the weights of all runs, scaled by the final weight:

A(t) = Σ_{r run on t} wt(r) · F(root(r))

Definition

Weighted tree language recognizable if computable by WTA

SLIDE 28

Unweighted example

Example

A = ({ℓ, r, ⊤, ⊥}, Σ, {⊤}, δ) with Σ = {σ(2), α(0), β(0)} and rules

σ(⊥, r) → ⊤   σ(ℓ, ⊥) → ⊤
σ(⊥, ⊥) → ⊥   α → ⊥   β → ⊥
σ(ℓ, ⊥) → ℓ   σ(⊥, ℓ) → ℓ   α → ℓ
σ(r, ⊥) → r   σ(⊥, r) → r   β → r

• ⊤ is reached on ℓ in the left or r in the right subtree
• ⊥ can accept any tree
• ℓ and r accept α and β, respectively, and propagate

SLIDE 34

Unweighted example

Example

Accepting runs on the input tree σ(σ(α, α), β): one marking the first α with ℓ, one marking the second α with ℓ, and one marking the β with r (all other nodes get ⊥, the root gets ⊤)

Recognized language

A = {σ(t1, t2) | |t1|α ≠ 0 or |t2|β ≠ 0} = {σ(t1, t2) | |t1|α + |t2|β ≠ 0}

SLIDE 36

Weighted example

Example

A = ({ℓ, r, ⊤, ⊥}, Σ, F, δ) over the field (R, +, ·, 0, 1) of reals
F(⊤) = 1 and F(q) = 0 otherwise
Σ = {σ(2), α(0), β(0)}, rules with weights:

σ(⊥, r) →1 ⊤   σ(ℓ, ⊥) →1 ⊤
σ(⊥, ⊥) →1 ⊥   α →1 ⊥   β →1 ⊥
σ(ℓ, ⊥) →1 ℓ   σ(⊥, ℓ) →1 ℓ   α →1 ℓ
σ(r, ⊥) →1 r   σ(⊥, r) →1 r   β →−1 r

SLIDE 37

Weighted example

Example

Input tree σ(σ(α, α), β)

Recognized weighted language

A(σ(t1, t2)) = |t1|α − |t2|β

Note

Support supp(A) = {σ(t1, t2) | |t1|α ≠ |t2|β} (i.e., the language of trees with non-zero weight) is not recognizable!
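The weighted language A(σ(t1, t2)) = |t1|α − |t2|β can be reproduced by evaluating the automaton bottom-up; a sketch (our own encoding, not from the talk) that, for each subtree, sums run weights per state:

```python
# Sketch: bottom-up evaluation of the weighted example automaton over the
# reals. Trees are nested tuples (symbol, children...); the states
# 'T', 'B', 'l', 'r' stand for ⊤, ⊥, ℓ, r.
LEAF_RULES = {
    'a': [('B', 1.0), ('l', 1.0)],          # α → ⊥ (1), α → ℓ (1)
    'b': [('B', 1.0), ('r', -1.0)],         # β → ⊥ (1), β → r (−1)
}
BINARY_RULES = {}                            # (left state, right state) -> targets
for cs, tgt in [(('B', 'r'), 'T'), (('l', 'B'), 'T'), (('B', 'B'), 'B'),
                (('l', 'B'), 'l'), (('B', 'l'), 'l'),
                (('r', 'B'), 'r'), (('B', 'r'), 'r')]:
    BINARY_RULES.setdefault(cs, []).append((tgt, 1.0))

FINAL = {'T': 1.0}                           # F(⊤) = 1, F(q) = 0 otherwise

def state_weights(tree):
    """Map each state q to the summed weight of runs reaching q at the root."""
    if len(tree) == 1:                       # nullary symbol α or β
        out = {}
        for q, w in LEAF_RULES[tree[0]]:
            out[q] = out.get(q, 0.0) + w
        return out
    left, right = state_weights(tree[1]), state_weights(tree[2])
    out = {}
    for q1, w1 in left.items():
        for q2, w2 in right.items():
            for q, w in BINARY_RULES.get((q1, q2), []):
                out[q] = out.get(q, 0.0) + w1 * w2 * w
    return out

def evaluate(tree):
    """A(t) = Σ over runs of wt(r) · F(root state)."""
    return sum(FINAL.get(q, 0.0) * w for q, w in state_weights(tree).items())

t = ('s', ('s', ('a',), ('a',)), ('b',))     # σ(σ(α, α), β)
# evaluate(t) == 1.0, i.e. |t1|α − |t2|β = 2 − 1
```

On σ(α, β) the two accepting runs carry weights 1 and −1 and cancel, matching |t1|α − |t2|β = 0.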

SLIDE 44

Syntax

Definition (ARNOLD, DAUCHET 1976, GRAEHL, KNIGHT 2004)

Weighted extended top-down tree transducer (WXTT) M = (Q, Σ, ∆, I, R) with finitely many rules

q(ℓ) →c r

• left-hand side ℓ: a Σ-tree over the variables x1, …, xk
• right-hand side r: a ∆-tree containing state calls q′(xi), …, p(xj)
• states q, q′, p ∈ Q
• variable indices i, j ∈ {1, …, k}
• rule weight c

[ARNOLD, DAUCHET: Bi-transductions de forêts. Proc. ICALP, 1976]
[GRAEHL, KNIGHT: Training tree transducers. Proc. NAACL, 2004]

SLIDE 45

Syntax

Definition (ROUNDS 1970, THATCHER 1970)

Weighted top-down tree transducer (WTT) if every left-hand side consists of a single input symbol applied to variables, i.e., all rules have the form

q(σ(x1, …, xk)) →c r

[ROUNDS: Mappings and grammars on trees. Math. Syst. Theory, 1970]
[THATCHER: Generalized sequential machine maps. J. Comput. Syst. Sci., 1970]

SLIDE 47

Semantics

Example

States {qS, qV, qNP}, of which only qS has non-zero initial weight

qS(S(x1, x2)) →0.4 S′(qV(x2), qNP(x1), qNP(x2))
qV(VP(x1, x2)) →1 qV(x1)
qNP(VP(x1, x2)) →1 qNP(x2)

Derivation

qS(S(t1, VP(t2, t3)))
  ⇒0.4 S′(qV(VP(t2, t3)), qNP(t1), qNP(VP(t2, t3)))
  ⇒1 S′(qV(t2), qNP(t1), qNP(VP(t2, t3)))
  ⇒1 S′(qV(t2), qNP(t1), qNP(t3))
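The derivation above can be simulated mechanically; a sketch (our own encoding, not from the talk) applying the three rules left-most, with trees as nested tuples, a pending state call q(t) written as ('q', t), and leaves like t1 as one-element tuples:

```python
# Sketch: left-most rule application for the three example rules.
RULES = {
    ('qS', 'S'):   (0.4, lambda x1, x2: ("S'", ('qV', x2), ('qNP', x1), ('qNP', x2))),
    ('qV', 'VP'):  (1.0, lambda x1, x2: ('qV', x1)),
    ('qNP', 'VP'): (1.0, lambda x1, x2: ('qNP', x2)),
}
STATES = {'qS', 'qV', 'qNP'}

def step(tree):
    """Rewrite the left-most applicable state call; return (tree, weight) or None."""
    if tree[0] in STATES:                      # a pending call q(t)
        q, sub = tree
        rule = RULES.get((q, sub[0]))
        if rule is None:
            return None                        # stuck call, try siblings
        weight, rhs = rule
        return rhs(*sub[1:]), weight
    for i, child in enumerate(tree[1:], 1):    # descend left to right
        result = step(child)
        if result is not None:
            new_child, weight = result
            return tree[:i] + (new_child,) + tree[i + 1:], weight
    return None

def derive(tree):
    """Apply steps until no rule applies; return the tree and weight product."""
    total = 1.0
    result = step(tree)
    while result is not None:
        tree, weight = result
        total *= weight
        result = step(tree)
    return tree, total

t = ('qS', ('S', ('t1',), ('VP', ('t2',), ('t3',))))
# derive(t) ends in S′(qV(t2), qNP(t1), qNP(t3)) with weight 0.4 · 1 · 1 = 0.4
```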

SLIDE 50

Semantics

Definition

Computed transformation (t ∈ TΣ and u ∈ T∆):

M(t, u) = Σ_{q∈Q} Σ_{left-most derivations q(t) ⇒^{c1} ··· ⇒^{cn} u} I(q) · c1 · … · cn

SLIDE 52

Preservation of recognizability

Definition (Forward application)

M : TΣ × T∆ → C and A : TΣ → C

[M(A)](u) = Σ_{t∈TΣ} A(t) · M(t, u)

Approach

1. Input (or output) product followed by projection
2. Direct construction

SLIDE 54

Input product + projection

Definition (Forward application)

M : TΣ × T∆ → C and A : TΣ → C

[M(A)](u) = Σ_{t∈TΣ} A(t) · M(t, u)

Definition (Input product)

Input product of WTA A and WXTT M is a WXTT AM with

AM(t, u) = A(t) · M(t, u)

Definition (Range projection)

For a WXTT M:

[ran(M)](u) = Σ_{t∈TΣ} M(t, u)

SLIDE 56

Input product + projection

Definition (Forward application)

M : TΣ × T∆ → C and A : TΣ → C, and then M(A) = ran(AM)


SLIDE 58

Product + projection

positive

• two simple generic constructions: the BAR-HILLEL construction and projection
• reusable
• explain most known cases

negative

• requirements of the two constructions
• inefficiencies

SLIDE 62

Product + projection

Requirement

model      input product   range projection   output product   domain projection
ln-XTOP    ✓ (✓)           ✓                  ✓ (✓)            ✓
l-XTOP     ✗ (✓/✗)         ✓/✗                ✓ (✓)            ✓
XTOP       ✗ (✗)           ✗                  ✓ (✗)            ✗
ln-MBOT    ✓ (✗)           ✗                  ✓ (✓)            ✓
l-MBOT     ✓ (✗)           ✗                  ✓ (✓)            ✓
MBOT       ✗ (✗)           ✗                  ✓ (✗)            ✗
ln-STSSG   ✓ (✗)           ✗                  ✓ (✗)            ✗

Conclusion

Nondeletion essential for input product!

SLIDE 64

Nondeletion

Example

qS(S(x1, x2)) →0.4 S′(qV(x2), qNP(x1), qNP(x2))   (nondeleting)
qV(VP(x1, x2)) →1 qV(x1)   (linear)
qNP(VP(x1, x2)) →1 qNP(x2)   (linear)

Definition

WXTT M is
• nondeleting if var(ℓ) = var(r) for all rules ℓ → r
• linear if no variable appears twice in r, for all rules ℓ → r

SLIDE 65

Nondeletion

Example

qS(S(x1, x2)) →0.4 S′(qV(x2), qNP(x1), qNP(x2))   (nondeleting)
qV(VP(x1, x2)) →1 qV(x1)   (deletes x2)
qNP(VP(x1, x2)) →1 qNP(x2)   (deletes x1)

Definition

• all-copies nondeleting (= nondeleting): every copy of an input subtree is fully explored
• some-copy nondeleting: one copy of each input subtree is fully explored
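The variable conditions are purely syntactic checks on each rule; a small sketch (hypothetical encoding, not from the talk) with each rule given as (lhs variables, rhs variable occurrences), using the three rules above as the running example:

```python
# Sketch: syntactic checks for nondeletion and linearity of WXTT rules.
def is_nondeleting(rules):
    """var(l) = var(r) for every rule l -> r."""
    return all(set(lhs) == set(rhs) for lhs, rhs in rules)

def is_linear(rules):
    """No variable occurs twice in any right-hand side."""
    return all(len(rhs) == len(set(rhs)) for _lhs, rhs in rules)

rules = [
    (('x1', 'x2'), ['x2', 'x1', 'x2']),  # qS rule: copies x2, so not linear
    (('x1', 'x2'), ['x1']),              # qV rule: deletes x2
    (('x1', 'x2'), ['x2']),              # qNP rule: deletes x1
]
# is_nondeleting(rules) is False (rules 2 and 3 delete a variable);
# is_linear(rules) is False (rule 1 copies x2)
```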

SLIDE 66

Nondeletion

Example

qS(S(x1, x2)) →0.4 S′(qV(x2), qNP(x1), qNP(x2))
qV(VP(x1, x2)) →1 qV(x1)
qNP(VP(x1, x2)) →1 qNP(x2)

is not some-copy nondeleting

Example (Derivation)

qS(S(VP(x1, x2), VP(x3, x4)))
  ⇒0.4 S′(qV(VP(x3, x4)), qNP(VP(x1, x2)), qNP(VP(x3, x4)))
  ⇒ S′(qV(VP(x3, x4)), qNP(x2), qNP(VP(x3, x4)))

SLIDE 69

Nondeletion

Example

qS(S(x1, x2)) →0.4 S′(qV(x2), qNP(x1), qNP(x2))
qV(VP(x1, x2)) →1 qV(x1)
qNP(VP(x1, x2)) →1 VP(qNP(x2), qV(x1))

can be some-copy nondeleting

Example (Derivation)

qS(S(VP(x1, x2), VP(x3, x4)))
  ⇒0.4 S′(qV(VP(x3, x4)), qNP(VP(x1, x2)), qNP(VP(x3, x4)))
  ⇒ S′(qV(x3), qNP(VP(x1, x2)), qNP(VP(x3, x4)))
  ⇒ S′(qV(x3), VP(qNP(x2), qV(x1)), qNP(VP(x3, x4)))
  ⇒ S′(qV(x3), VP(qNP(x2), qV(x1)), VP(qNP(x4), qV(x3)))

SLIDE 74

Scenario 1

Theorem (ENGELFRIET 1977)

For nondeleting WXTT M and WTA A we can construct AM

Proof.

Original rules:
qS(S(x1, x2)) →0.4 S′(qV(x2), qNP(x1), qNP(x2))   and WTA rule S(p1, p2) →c p

Constructed rule:
⟨qS, p⟩(S(x1, x2)) →0.4·c S′(⟨qV, p2⟩(x2), ⟨qNP, p1⟩(x1), qNP(x2))

For the original nondeleting rules construct new rules, marking one automaton state for each variable; one possibility:

x2[a] x1[b] x2[d] → x2[e,a] x1[f,b] x2[d]

SLIDE 76

Scenario 2

Theorem (∼ 2010)

For some-copy nondeleting WXTT M and WTA A

  • ver idempotent semiring we can construct AM

Proof.

For the original nondeleting rules construct new rules, marking one automaton state for each variable, in all possibilities:

x2[a] x1[b] x2[d] → x2[e,a] x1[f,b] x2[d]  |  x2[a] x1[f,b] x2[e,d]

• at least one exploration will succeed (some-copy nondeletion)
• a·e·b·f·d + a·b·f·d·e = a·b·d·e·f if several succeed (idempotency)
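The idempotency step can be sanity-checked in a concrete idempotent semiring; a small sketch (not from the talk) in the tropical semiring, where both alternative markings carry the same product, so their idempotent sum adds nothing new:

```python
# Sketch: the idempotency argument in the tropical semiring, where semiring
# addition is min (idempotent) and semiring multiplication is numeric addition.
def trop_mul(*factors):      # semiring multiplication = ordinary addition
    return sum(factors)

def trop_add(x, y):          # semiring addition = min, idempotent
    return min(x, y)

a, b, d, e, f = 3, 1, 4, 1, 5
lhs = trop_add(trop_mul(a, e, b, f, d), trop_mul(a, b, f, d, e))
rhs = trop_mul(a, b, d, e, f)
# lhs == rhs == 14: the duplicate successful exploration contributes nothing new
```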

SLIDE 79

Scenario 3

Theorem (∼ 2010)

For some-copy nondeleting WXTT M and WTA A over ring we can construct AM

Proof.

For the original nondeleting rules construct several new rules, marking states according to an elimination scheme:

x2[a] x1[b] x2[d] → x2[e,a] x1[f,b] x2[d]  |  x2[a] x1[f,b] x2[e,d]  |  x2[e,a] x1[f,b] x2[−1,d]

at least one exploration will succeed

SLIDE 81

Scenario 3

Theorem (∼ 2010)

For some-copy nondeleting WXTT M and WTA A over ring we can construct AM

Proof.

If several explorations succeed, then with the alternatives

x2[e,a] x1[f,b] x2[d]  |  x2[a] x1[f,b] x2[e,d]  |  x2[e,a] x1[f,b] x2[−1,d]

the contributed weights a·e·b·f·d, a·b·f·d·e and −a·e·b·f·d cancel to a·b·d·e·f

SLIDE 82

Elimination schemes

Question

Do elimination schemes exist?

Answer

[Table: for each pattern 001–111 coding which explorations succeed, a signed combination of entries ±a with column signs +, +, +, −, −, −, +; in every row the signed terms sum to a]

SLIDE 85

Direct construction

Applicability

here only l-XTOP (product + projection fails)

Failure

the input product fails because it cannot attach weights to deleted subtrees, and range projection then disregards the input trees entirely

Solution

assign aggregate weight to transitions deleting subtrees

SLIDE 89

Bonus scenario

Original rule (weight 0.4, deleting x1):

qS(S(x1, x2)) → S′(qV(x2))

WTA transition (weight c):

S(p1, p2) → p

Constructed rule (weight 0.4 · c′, where c′ = c · in(p1)):

⟨qS, p⟩(S(x1, x2)) → S′(⟨qV, p2⟩(x2))

Inside weight of p

in(p) = Σ_{t ∈ TΣ} Σ_{r run on t with root(r) = p} wt(r)
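The weight bookkeeping of the construction can be sketched as follows. This is a minimal illustration under assumed encodings: the helper `constructed_rule_weight`, the state names, and the concrete numbers are hypothetical, not the deck's notation.

```python
# Sketch of the direct construction's weight for a rule that deletes input
# subtrees: the constructed rule multiplies the original rule weight by the
# inside weight of every WTA state paired with a deleted variable,
# mirroring c' = c * in(p1) above.

def constructed_rule_weight(rule_weight, deleted_states, inside):
    w = rule_weight
    for p in deleted_states:
        w *= inside[p]   # aggregate weight of all subtrees deletable at state p
    return w

inside = {"p1": 0.25}    # hypothetical precomputed inside weight of p1
print(constructed_rule_weight(0.4, ["p1"], inside))   # 0.4 * in(p1) = 0.1
```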

SLIDE 90

Bonus scenario

Theorem

For every linear WXTT M and WTA A, the WTA AM can be constructed, provided the inside weights of A can be computed

Computation of inside weights

  • trivial in the BOOLEAN semiring
  • typically simple in extremal semirings (VITERBI algorithms)
  • possible in ℕ (deciding finiteness of support)
  • approximation possible for ℝ (NEWTON's method)

→ possible in many interesting cases
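For the real-weighted case, the inside weights can be approximated by fixpoint iteration. This is a minimal sketch with a hypothetical transition encoding (symbol, child states, target state, weight); it uses naive Kleene iteration rather than the NEWTON method mentioned above.

```python
# Inside weights satisfy in(p) = sum over transitions sigma(p1,...,pk) -> p
# with weight w of w * in(p1) * ... * in(pk). Iterating this system from 0
# converges to the least fixpoint for nonnegative real weights.

def inside_weights(transitions, states, iterations=200):
    inside = {p: 0.0 for p in states}
    for _ in range(iterations):
        new = {p: 0.0 for p in states}
        for _symbol, children, target, weight in transitions:
            term = weight
            for c in children:
                term *= inside[c]
            new[target] += term
        inside = new
    return inside

# Toy WTA: alpha -> p with weight 0.5, sigma(p, p) -> p with weight 0.5;
# in(p) is the least solution of x = 0.5 + 0.5 * x^2, i.e. x = 1.
trans = [("alpha", (), "p", 0.5), ("sigma", ("p", "p"), "p", 0.5)]
print(inside_weights(trans, ["p"])["p"])   # approaches 1 from below
```

Plain iteration converges slowly near a degenerate fixpoint like this one, which is exactly why NEWTON-style acceleration is attractive here.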

SLIDE 93

Contents

1. Motivation
2. Recognizable Weighted Tree Language
3. Weighted Extended Top-down Tree Transducer
4. Preservation of Recognizability
5. Nonpreservation of Recognizability

SLIDE 94

Overview

model    | M(L) recognizable? | M−1(L) recognizable?
ln-XTOP  | ✓                  | ✓
l-XTOP   | ✓/✗ (✓)            | ✓
XTOP     | ✗                  | ✗ (✓)
ln-MBOT  | ✗                  | ✓
l-MBOT   | ✗                  | ✓ (✓)
MBOT     | ✗                  | ✗ (✓)
ln-STSSG | ✗                  | ✗

Limitation

no coverage of unweighted failures

Only backward application of XTOP!

(same phenomenon for MBOT)

SLIDE 99

Counterexample

WXTT M

Example

q(γ(x1)) → σ(q(x1), q(x1))        q(α) → α

Transformation

γ(· · · γ(α)) with n occurrences of γ is mapped to the complete binary σ-tree with 2ⁿ leaves α

WTA A

Example

σ(p, p) → p with weight 1        α → p with weight 2

Weighted tree language

A(u) = 2^(|u|α), where |u|α is the number of occurrences of α in u
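The counterexample transducer can be sanity-checked mechanically. The following sketch uses a toy tuple encoding of trees (the encoding and the helper names `translate` and `count_alpha` are mine, not the deck's):

```python
# M's rules: q(gamma(x1)) -> sigma(q(x1), q(x1)) and q(alpha) -> alpha.
# Applied to gamma^n(alpha), each gamma doubles the output, so the result
# is a complete binary sigma-tree with 2^n alpha-leaves.

def translate(t):
    if t == "alpha":
        return "alpha"
    _, sub = t                 # t = ("gamma", subtree)
    u = translate(sub)
    return ("sigma", u, u)     # copy the translated subtree twice

def count_alpha(u):
    if u == "alpha":
        return 1
    _, left, right = u
    return count_alpha(left) + count_alpha(right)

t = "alpha"
for _ in range(5):             # build gamma^5(alpha)
    t = ("gamma", t)
u = translate(t)
print(count_alpha(u))          # 2^5 = 32, hence A(u) = 2^32
```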

SLIDE 100

Counterexample

Transformation M

t = γ(· · · γ(α)) is mapped to the complete binary σ-tree u with |u|α = 2^(|t|γ)

Weighted tree language A

A(u) = 2^(|u|α)

Backward application

[M−1(A)](t) = 2^(2^(|t|γ))

Theorem

For every WTA A over ℕ there exists n ∈ ℕ such that A(t) ≤ n^(|t|+1) for all t ∈ TΣ. Since 2^(2^(|t|γ)) eventually exceeds every such bound, M−1(A) is not recognizable.

[FÜLÖP, ∼, VOGLER: Weighted extended tree transducers. Fundam. Inform. 2011]
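The growth argument can be illustrated numerically. This is a toy check under assumptions of mine: the helper names and the chosen constants (base n = 1000, tree γ¹⁰(α)) are arbitrary illustrations.

```python
# Backward application assigns weight 2^(2^k) to the input tree gamma^k(alpha),
# which has size |t| = k + 1, while every WTA over N obeys A(t) <= n^(|t|+1)
# for some fixed n. For any fixed n, the doubly exponential weight eventually
# exceeds the singly exponential bound.

def backward_weight(k):
    # [M^{-1}(A)](gamma^k(alpha)) = 2^(2^k)
    return 2 ** (2 ** k)

def wta_bound(n, size):
    # generic growth bound n^(|t|+1) for a WTA over N
    return n ** (size + 1)

n = 1000   # arbitrary fixed base for the bound
k = 10     # input tree gamma^10(alpha), size 11
print(backward_weight(k) > wta_bound(n, k + 1))   # True for this k
```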

SLIDE 103

Overview

model    | M(L) recognizable? | M−1(L) recognizable?
ln-XTOP  | ✓                  | ✓
l-XTOP   | ✓/✗ (✓)            | ✓
XTOP     | ✗                  | ✗ (✓)
ln-MBOT  | ✗                  | ✓
l-MBOT   | ✗                  | ✓ (✓)
MBOT     | ✗                  | ✗ (✓)
ln-STSSG | ✗                  | ✗

SLIDE 104

That’s all, folks!

Thank you for your attention!

SLIDE 105

References

ARNOLD, DAUCHET: Bi-transductions de forêts. Proc. ICALP 1976
BERSTEL, REUTENAUER: Recognizable formal power series on trees. Theor. Comput. Sci. 18, 1982
ENGELFRIET: Top-down tree transducers with regular look-ahead. Math. Systems Theory 10, 1977
ENGELFRIET, FÜLÖP, VOGLER: Bottom-up and top-down tree series transformations. J. Autom. Lang. Comb. 7, 2002
FÜLÖP, MALETTI, VOGLER: Weighted extended tree transducers. Fundam. Inform. 112, 2011
GRAEHL, KNIGHT: Training tree transducers. Proc. HLT-NAACL 2004
MALETTI, SATTA: Parsing algorithms based on tree automata. Proc. IWPT 2009
MALETTI: Input products for weighted extended top-down tree transducers. Proc. DLT 2010
MALETTI: An alternative to synchronous tree substitution grammars. Natural Language Engineering 17, 2011
MAY, KNIGHT: TIBURON — a weighted tree automata toolkit. Proc. CIAA 2006
ROUNDS: Mappings and grammars on trees. Math. Systems Theory 4, 1970
THATCHER: Generalized² sequential machine maps. J. Comput. System Sci. 4, 1970
