Text Summarization Using A Trainable Summarizer and Latent Semantic Analysis


SLIDE 1

Text Summarization Using A Trainable Summarizer and Latent Semantic Analysis

Jen-Yuan Yeh [1], Hao-Ren Ke [2], and Wei-Pang Yang [1]

[1] Department of Computer & Information Science, National Chiao-Tung University, Taiwan, R.O.C.

[2] Digital Library & Information Section of Library, National Chiao-Tung University, Taiwan, R.O.C.

SLIDE 2

2003/9/13 Text Summarization Using A Trainable Summarizer and Latent Semantic Analysis 2/36

Outline

  • Introduction and related work
  • Modified Corpus-based Approach (MCBA)
  • LSA-based Text Relationship Map approach (LSA+T.R.M.)
  • Evaluation
  • Conclusion

SLIDE 3

Text summarization

The process of distilling the most important information from a source (or sources) to produce an abridged version for a particular user (or users) and task (or tasks) [Mani & Bloedorn, 1999].

Stages: Analysis → Transformation → Synthesis

(Documents → Summaries, at a given compression ratio.)

SLIDE 4

Corpus-based Approach:

A Trainable Document Summarizer [Kupiec et al., 1995]

P(s ∈ S | f1, f2, …, fk) = ( ∏_{j=1..k} P(f_j | s ∈ S) ) · P(s ∈ S) / ∏_{j=1..k} P(f_j)

[Pipeline figure:
Training phase: training corpus (source + summary) → feature extractor, labeler → learning algorithm → rules.
Test phase: test corpus (source) → feature extractor → vectors → rule application → machine-generated summary.]
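Under the naive independence assumption, Kupiec et al.'s classifier reduces to multiplying per-feature likelihood ratios. A minimal sketch with made-up probabilities (the feature, its value set, and all numbers below are toy illustrations, not estimates from the paper):

```python
# Kupiec-style naive-Bayes sentence scorer:
# P(s in S | f1..fk) = prod_j P(f_j | s in S) * P(s in S) / prod_j P(f_j)
def bayes_score(features, p_summary, p_feat_given_summary, p_feat):
    """features: dict mapping feature name -> observed value for sentence s."""
    score = p_summary
    for f, v in features.items():
        score *= p_feat_given_summary[f][v] / p_feat[f][v]
    return score

# Hypothetical probabilities estimated from a training corpus:
p_feat_given_summary = {"position": {"first": 0.6, "other": 0.4}}
p_feat = {"position": {"first": 0.2, "other": 0.8}}
score = bayes_score({"position": "first"}, 0.3, p_feat_given_summary, p_feat)
# A first-position sentence is boosted: 0.3 * 0.6 / 0.2 = 0.9
```

Sentences are then ranked by this score and the top ones (per the compression ratio) are emitted as the summary.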

SLIDE 5

Text Relationship Map (T.R.M.) Approach:

Automated Text Structure and Summarization [Salton et al., 1997]

[Map figure: nodes P1–P11 with bushiness counts P1: 6, P2: 2, P3: 7, P4: 3, P5: 7, P6: 6, P7: 5, P8: 9, P9: 8, P10: 3, P11: 2]

Three heuristic methods:

  • Global bushy path
  • Depth-first path
  • Segmented bushy path

Each node is represented as Pi = (k1, k2, …, kn). Pi and Pj are judged to be connected when their similarity is greater than a threshold.

Sim(Pi, Pj) = (Pi · Pj) / (|Pi| |Pj|)
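The map construction above can be sketched directly: compute cosine similarity between every node pair, link pairs above a threshold, and count each node's links (its "bushiness"). The vectors and threshold below are toy values:

```python
import math

def cosine(p, q):
    # Sim(Pi, Pj) = (Pi . Pj) / (|Pi| |Pj|)
    dot = sum(a * b for a, b in zip(p, q))
    norm = math.sqrt(sum(a * a for a in p)) * math.sqrt(sum(b * b for b in q))
    return dot / norm if norm else 0.0

def bushiness(vectors, threshold):
    """Count links per node; a global bushy path keeps the bushiest nodes."""
    n = len(vectors)
    degree = [0] * n
    for i in range(n):
        for j in range(i + 1, n):
            if cosine(vectors[i], vectors[j]) > threshold:
                degree[i] += 1
                degree[j] += 1
    return degree

vecs = [[1, 0, 1], [1, 0, 0], [0, 1, 0]]   # toy keyword vectors
deg = bushiness(vecs, 0.5)                 # only the first two nodes link
```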

SLIDE 6

Modified Corpus-based Approach

We use a score function to measure the significance of a sentence, whereas Kupiec et al. (1995) compute the probability that a sentence will be included in the summary.

Score_Overall(s) = w1·Score_f1(s) + w2·Score_f2(s) − w3·Score_f3(s) + w4·Score_f4(s) + w5·Score_f5(s)

where f1 represents "Position", f2 represents "Positive Keyword", f3 represents "Negative Keyword", f4 represents "Centrality", f5 represents "Resemblance to the Title", and wi indicates the importance of each feature.

By contrast, Kupiec et al. estimate

P(s ∈ S | f1, f2, …, fk) = ( ∏_{j=1..k} P(f_j | s ∈ S) ) · P(s ∈ S) / ∏_{j=1..k} P(f_j)
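Written out, the overall score is a weighted combination with the negative-keyword term subtracted. A minimal sketch with invented scores and weights (the real weights are learned by the GA described later):

```python
# MCBA overall score: w1*f1 + w2*f2 - w3*f3 + w4*f4 + w5*f5
def overall_score(scores, weights):
    """scores/weights keyed by feature; 'nk' (negative keyword) is subtracted."""
    return (weights["pos"] * scores["pos"]
            + weights["pk"] * scores["pk"]
            - weights["nk"] * scores["nk"]
            + weights["c"] * scores["c"]
            + weights["tr"] * scores["tr"])

s = overall_score({"pos": 0.5, "pk": 0.4, "nk": 0.1, "c": 0.3, "tr": 0.2},
                  {"pos": 1, "pk": 1, "nk": 1, "c": 1, "tr": 1})
# 0.5 + 0.4 - 0.1 + 0.3 + 0.2 = 1.3
```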

SLIDE 7

f1: Position

For a sentence s, the position score is defined as

Score_f1(s) = P(s ∈ S | PiSj) × (1/R)

where s comes from PiSj (e.g., P1S1 indicates the first sentence of the first paragraph) and R = Average rank(PiSj), a rank which implies the significance of each sentence.
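As a sketch, the position score is just the position-conditioned inclusion probability discounted by the position's average rank. The probabilities and ranks below are hypothetical:

```python
# Position feature: Score_f1(s) = P(s in S | PiSj) * (1 / average_rank(PiSj))
def position_score(p_in_summary_given_position, average_rank):
    return p_in_summary_given_position / average_rank

# e.g. a first sentence of a first paragraph (P1S1) with high inclusion
# probability and a good (low) average rank, vs. a late, low-ranked position:
s_first = position_score(0.6, 1)
s_late = position_score(0.1, 4)
```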

SLIDE 8

f2: Positive Keyword

For a sentence s, assume s contains Keyword1, Keyword2, …, Keywordn; the positive-keyword score is defined as

Score_f2(s) = (1/length(s)) · Σ_{i=1..n} tf_i · P(s ∈ S | Keyword_i)

where tf_i is the occurrence frequency of Keyword_i in s.
SLIDE 9

f3: Negative Keyword

For a sentence s, assume s contains Keyword1, Keyword2, …, Keywordn; the negative-keyword score is defined as

Score_f3(s) = (1/length(s)) · Σ_{i=1..n} tf_i · P(s ∉ S | Keyword_i)

where tf_i is the occurrence frequency of Keyword_i in s.
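The positive (f2) and negative (f3) keyword scores share one shape: a length-normalized, probability-weighted term-frequency sum. A sketch with an invented keyword-probability table:

```python
from collections import Counter

# (1/length(s)) * sum_i tf_i * P(s in S | Keyword_i); plugging in
# P(s not in S | Keyword_i) instead yields the negative-keyword score.
def keyword_score(sentence_words, keyword_prob):
    tf = Counter(w for w in sentence_words if w in keyword_prob)
    return sum(tf[w] * keyword_prob[w] for w in tf) / len(sentence_words)

# Hypothetical probabilities that a sentence containing the keyword
# appears in the summary:
positive = keyword_score(["tax", "tax", "reform"],
                         {"tax": 0.5, "reform": 0.2})
# (2 * 0.5 + 1 * 0.2) / 3 = 0.4
```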

SLIDE 10

f4: Centrality

For a sentence s, the score is defined as

Score_f4(s) = |Keywords in s ∩ Keywords in other sentences| / |Keywords in s ∪ Keywords in other sentences|

SLIDE 11

f5: Resemblance to the Title

For a sentence s, the score is defined as

Score_f5(s) = |Keywords in s ∩ Keywords in Title| / |Keywords in s ∪ Keywords in Title|
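Centrality (f4) and title resemblance (f5) are the same intersection-over-union ratio applied to different keyword sets, which a short sketch makes concrete (the keyword sets below are toy examples):

```python
# |K(s) ∩ K(other)| / |K(s) ∪ K(other)| -- used for centrality
# (other = keywords of all other sentences) and for title resemblance
# (other = keywords of the title).
def overlap_score(keywords_s, keywords_other):
    union = keywords_s | keywords_other
    return len(keywords_s & keywords_other) / len(union) if union else 0.0

centrality = overlap_score({"tax", "reform", "vote"}, {"tax", "budget"})
# shared: {"tax"}; union has 4 keywords -> 0.25
```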

SLIDE 12

Word Aggregation for f2, f3, f4, and f5

Use Word Co-occurrence to reshape word unit.

Assume A, B, C, D, E are keywords and E is composed of B and C in order; if MI(B, C) > threshold, then replace B and C with E.

MI(x, y) = log( P(x, y) / (P(x) × P(y)) )

where P(x) is the probability that x occurs in the corpus, and P(x, y) is the probability that x and y occur adjacently in the corpus. [Maosong et al., 1998]

Example: with MI(B, C) > threshold, ABCD becomes AED; e.g., 個人 ("personal") and 電腦 ("computer") merge into 個人電腦 ("personal computer").
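The merge test above can be sketched in a few lines; the probabilities and the threshold are invented for illustration:

```python
import math

# MI(x, y) = log( P(x, y) / (P(x) * P(y)) )
def mutual_information(p_xy, p_x, p_y):
    return math.log(p_xy / (p_x * p_y))

# e.g. 個人 ("personal") and 電腦 ("computer") occurring adjacently far
# more often than chance predicts (hypothetical corpus probabilities):
mi = mutual_information(p_xy=0.01, p_x=0.02, p_y=0.03)
merge = mi > math.log(2)   # hypothetical threshold
```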

SLIDE 13

Train the Score Function by the Genetic Algorithm

The GA helps find a suitable combination of feature weights. Regard (w1, w2, w3, w4, w5) as a genome, and perform the genetic algorithm (GA) to determine the value of each wi.

Fitness: the average F-measure obtained with the genome when applied to the training corpus.

100 generations, each with 1000 genomes.
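A toy sketch of this GA loop follows; the population size, generation count, crossover/mutation operators, and the stand-in fitness function are all illustrative, not the 1000-genome, 100-generation setup above (whose fitness is the average F-measure on the training corpus):

```python
import random

def evolve(fitness, pop_size=20, generations=30, seed=0):
    """Evolve 5-element weight genomes (w1..w5) toward higher fitness."""
    rng = random.Random(seed)
    pop = [[rng.random() for _ in range(5)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[:pop_size // 2]            # elitist selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, 5)            # one-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < 0.1:               # mutation
                child[rng.randrange(5)] = rng.random()
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

# Stand-in fitness: prefer weights near (1, 1, 0, 1, 1), i.e. NK near zero.
best = evolve(lambda w: -sum((wi - t) ** 2 for wi, t in zip(w, [1, 1, 0, 1, 1])))
```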

SLIDE 14

Summary of Modified Corpus-based Approach

  • Use a weighted score function to measure the importance of a sentence.
  • Employ ranked positions to emphasize the significance of sentence positions.
  • Train the score function by the genetic algorithm to find a suitable combination of feature weights.

SLIDE 15

LSA-based T.R.M. Approach

Combine T.R.M. [Salton et al., 1997] and semantic representations derived by LSA to promote summarization to the semantic level.

[Pipeline figure: Chinese document → preprocessing (sentence identification; word segmentation & keyword-frequency calculation) → word-by-sentence matrix construction → singular value decomposition → dimension reduction → semantic matrix reconstruction → semantic sentence/word representations → sentence relationship analysis → semantically related sentence links → text relationship map construction → global bushy path construction → sentence selection → document summary]

The approach couples latent semantic analysis with summarization based on a text relationship map [Salton et al., 1997].

SLIDE 16

Semantic Representations

Represent a document D as a word-by-sentence matrix A, and apply SVD to A to derive the latent semantic structures of D.

a_ij = G_i · L_ij

L_ij = log(1 + c_ij / n_j)

G_i = 1 − E_i,  E_i = −(1 / log N) Σ_{j=1..N} (c_ij / t_i) log(c_ij / t_i),  t_i = Σ_{j=1..N} c_ij

[Bellegarda et al., 1996]

where c_ij is the frequency of W_i in S_j, n_j is the number of words in S_j, and E_i is the normalized entropy of W_i over the N sentences.

A = [a_ij] is the M×N word-by-sentence matrix, with rows W_1, …, W_M and columns S_1, …, S_N.

Sentence Keyword: Nouns & Verbs
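The entropy-weighted matrix construction above can be sketched as follows (the count matrix and sentence lengths are toy values; the local weight is log(1 + c_ij/n_j) and the global weight is 1 − E_i):

```python
import math

def build_matrix(counts, sentence_lengths):
    """counts[i][j] = c_ij, frequency of word W_i in sentence S_j."""
    N = len(sentence_lengths)
    A = []
    for row in counts:                       # one row per word W_i
        t = sum(row)                         # t_i = total frequency of W_i
        # Normalized entropy E_i of W_i over the N sentences.
        E = -sum((c / t) * math.log(c / t) for c in row if c) / math.log(N)
        G = 1 - E                            # global weight G_i
        A.append([G * math.log(1 + c / n)    # local weight L_ij
                  for c, n in zip(row, sentence_lengths)])
    return A

# Word 0 is concentrated in one sentence; word 1 is spread evenly
# (entropy ~1, so its global weight, and hence its whole row, is ~0).
A = build_matrix([[2, 0, 1], [1, 1, 1]], [10, 8, 12])
```

The zeroed-out second row shows why the entropy weight helps: words distributed evenly across all sentences carry little discriminative information.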

SLIDE 17

Example of How LSA Works [Landauer et al., 1998]

[Example matrices omitted: a 12-term × 9-document matrix A (terms human, interface, computer, user, system, response, time, EPS, survey, trees, graph, minors; documents c1–c5, m1–m4) is factored by SVD as A = U S V^T. Keeping only the two largest singular values (dimension reduction = 2) yields the rank-2 reconstruction A' = U' S' V'^T, which provides the semantic word representations (rows) and semantic sentence representations (columns).]
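A sketch of the same SVD-plus-truncation step on a small made-up matrix (not the original Landauer data), assuming NumPy is available:

```python
import numpy as np

# Toy term-by-document matrix.
A = np.array([[1., 1., 0., 0.],
              [1., 0., 1., 0.],
              [0., 1., 1., 1.],
              [0., 0., 0., 1.]])
U, s, Vt = np.linalg.svd(A, full_matrices=False)   # A = U S V^T
k = 2                                              # keep 2 dimensions
A2 = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]         # rank-2 reconstruction A'
# Columns of np.diag(s[:k]) @ Vt[:k, :] give 2-D semantic sentence vectors;
# rows of U[:, :k] @ np.diag(s[:k]) give 2-D semantic word vectors.
```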

SLIDE 18

Summary Generation

Node representation ([map figure over sentences S1–S11]):

  • [Salton97]: Si = (k1, k2, …, kn)
  • Ours: semantic sentence representation derived by LSA

Sim(Pi, Pj) = (Pi · Pj) / (|Pi| |Pj|)

Summary generation:

  • Global bushy path [Salton et al., 1997]

Compared to [Salton97]:

  • Ours: semantic sentence representations.
  • [Salton97]: keyword vector representations.

A problem of T.R.M. [Salton97] is the lack of the type or the context of a link.

SLIDE 19

Summary of LSA-based T.R.M. Approach

  • Employ LSA to derive the semantic representations of a document (semantic sentence/word vector representations).
  • Combine T.R.M. [Salton97] and the semantic representations to promote summarization from keyword level to semantic level.

SLIDE 20

Data Corpus

  • 100 articles about politics collected from New Taiwan Weekly.

Document statistics:

| | Set 1 | Set 2 | Set 3 | Set 4 | Set 5 |
| Documents per collection | 20 | 20 | 20 | 20 | 20 |
| Sentences per document | 27.5 | 27.9 | 30.8 | 25.5 | 27.4 |
| Sentences per manual summary | 8.7 | 9.0 | 10.0 | 8.0 | 8.7 |
| Manual compression ratio per document | 30% | 30% | 30% | 30% | 30% |
| Average rank of manual summary sentence | 4.1 | 3.8 | 4.1 | 3.8 | 4.0 |

SLIDE 21

Evaluation Method

We use recall (R), precision (P), and F-measure (F) to judge the coverage between manual and machine-generated summaries.

T: manual summary of D; S: machine-generated summary of D

P = |S ∩ T| / |S|,  R = |S ∩ T| / |T|,  F = 2PR / (P + R)
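Treating each summary as a set of sentence IDs, the three metrics can be sketched as follows (the ID sets below are toy data):

```python
# P = |S ∩ T| / |S|,  R = |S ∩ T| / |T|,  F = 2PR / (P + R)
def prf(machine, manual):
    inter = len(machine & manual)
    P = inter / len(machine)
    R = inter / len(manual)
    F = 2 * P * R / (P + R) if P + R else 0.0
    return P, R, F

P, R, F = prf(machine={1, 2, 3, 4}, manual={2, 3, 5})
# 2 shared sentences: P = 2/4, R = 2/3, F = 4/7
```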

SLIDE 22

Modified Corpus-based Approach:

Effects of f1, f2, f3, f4, and f5

Individual features (F-measure):

| Feature | CR=10% CBA | CR=10% MCBA | CR=20% CBA | CR=20% MCBA | CR=30% CBA | CR=30% MCBA |
| POS | 0.2282 | 0.2327 | 0.2988 | 0.3092 | 0.3563 | 0.3671 |
| PK | 0.1785 | 0.1687 | 0.2813 | 0.2919 | 0.3547 | 0.3548 |
| NK | 0.1439 | 0.1293 | 0.2468 | 0.2259 | 0.3307 | 0.3031 |
| C | 0.3039 | 0.2971 | 0.3913 | 0.3935 | 0.4478 | 0.4476 |
| TR | 0.2345 | 0.2515 | 0.3313 | 0.3382 | 0.3920 | 0.4056 |

Feature ablation (F-measure; +/− marks change versus all features):

| Features | CR=10% CBA | CR=10% MCBA | CR=20% CBA | CR=20% MCBA | CR=30% CBA | CR=30% MCBA |
| All features | 0.3046 | 0.3029 | 0.4122 | 0.4136 | 0.4688 | 0.4839 |
| Without POS | 0.3051 (+) | 0.3109 (+) | 0.4003 (−) | 0.4129 (−) | 0.4686 (−) | 0.4824 (−) |
| Without PK | 0.3071 (+) | 0.3025 (−) | 0.4048 (−) | 0.3989 (−) | 0.4583 (−) | 0.4619 (−) |
| Without NK | 0.3083 (+) | 0.3166 (+) | 0.4045 (−) | 0.4294 (+) | 0.4736 (+) | 0.4908 (+) |
| Without C | 0.2585 (−) | 0.2234 (−) | 0.3346 (−) | 0.3297 (−) | 0.4029 (−) | 0.4024 (−) |
| Without TR | 0.2745 (−) | 0.2974 (−) | 0.3829 (−) | 0.3773 (−) | 0.4418 (−) | 0.4599 (−) |

SLIDE 23

Modified Corpus-based Approach:

Effects of the Score Function

POS+PK+C+TR (F-measure):

| | CR=10% CBA | CR=10% MCBA | CR=20% CBA | CR=20% MCBA | CR=30% CBA | CR=30% MCBA |
| Set 1 | 0.2789 | 0.3202 | 0.4319 | 0.4438 | 0.5264 | 0.5330 |
| Set 2 | 0.2635 | 0.2703 | 0.3921 | 0.4434 | 0.4456 | 0.4989 |
| Set 3 | 0.3421 | 0.3305 | 0.3746 | 0.3848 | 0.4578 | 0.4667 |
| Set 4 | 0.3097 | 0.2964 | 0.3709 | 0.3819 | 0.4143 | 0.4210 |
| Set 5 | 0.3473 | 0.3654 | 0.4530 | 0.4929 | 0.5237 | 0.5342 |
| Avg. | 0.3083 | 0.3166 | 0.4045 | 0.4294 | 0.4736 | 0.4908 |

SLIDE 24

Modified Corpus-based Approach:

GA-learned Feature Weights

GA-learned feature weights (CR=30%):

| | POS | PK | C | TR | Fitness (F-measure) |
| Combination 1 | 0.140 | 0.857 | 0.281 | 0.278 | 0.9264 |
| Combination 2 | 0.225 | 0.995 | 0.299 | 0.336 | 0.9358 |
| Combination 3 | 0.066 | 0.909 | 0.196 | 0.169 | 0.9626 |
| Combination 4 | 0.173 | 0.961 | 0.333 | 0.039 | 0.9554 |
| Combination 5 | 0.172 | 0.794 | 0.248 | 0.085 | 0.9587 |

POS+PK+C+TR (F-measure):

| | CR=10% MCBA | CR=10% MCBA+GA | CR=20% MCBA | CR=20% MCBA+GA | CR=30% MCBA | CR=30% MCBA+GA |
| Set 1 | 0.3202 | 0.3653 | 0.4438 | 0.4935 | 0.5330 | 0.5446 |
| Set 2 | 0.2703 | 0.3109 | 0.4434 | 0.4418 | 0.4989 | 0.5244 |
| Set 3 | 0.3305 | 0.3733 | 0.3848 | 0.4787 | 0.4667 | 0.5108 |
| Set 4 | 0.2964 | 0.3609 | 0.3819 | 0.4239 | 0.4210 | 0.4668 |
| Set 5 | 0.3654 | 0.3682 | 0.4929 | 0.5145 | 0.5342 | 0.5289 |
| Avg. | 0.3166 | 0.3557 | 0.4294 | 0.4705 | 0.4908 | 0.5151 |

SLIDE 25

LSA+T.R.M. Approach in Single-Document Level

Dimension Reduction Ratio

Effect of the dimension-reduction ratio (F-measure, CR=30%):

| | 0.1 | 0.2 | 0.3 | 0.4 | 0.5 | 0.6 | 0.7 | 0.8 | 0.9 |
| Set 1 | 0.2901 | 0.3500 | 0.4304 | 0.3430 | 0.3859 | 0.4282 | 0.4400 | 0.4268 | 0.3859 |
| Set 2 | 0.3451 | 0.4095 | 0.4079 | 0.3812 | 0.3955 | 0.3860 | 0.4297 | 0.4190 | 0.3750 |
| Set 3 | 0.3407 | 0.3557 | 0.3311 | 0.3360 | 0.3973 | 0.3985 | 0.4107 | 0.4272 | 0.3593 |
| Set 4 | 0.3629 | 0.3179 | 0.3672 | 0.4311 | 0.3674 | 0.3826 | 0.4132 | 0.4167 | 0.3284 |
| Set 5 | 0.2779 | 0.3823 | 0.3547 | 0.4569 | 0.4744 | 0.4932 | 0.4614 | 0.4288 | 0.4149 |
| Avg. | 0.3059 | 0.3446 | 0.3989 | 0.4431 | 0.4420 | 0.4584 | 0.4583 | 0.4379 | 0.4090 |

LSA+T.R.M. (F-measure):

| | CR=10% | CR=20% | CR=30% |
| Set 1 | 0.4245 | 0.5373 | 0.5773 |
| Set 2 | 0.4192 | 0.4944 | 0.5315 |
| Set 3 | 0.4183 | 0.4925 | 0.5169 |
| Set 4 | 0.4223 | 0.4841 | 0.5228 |
| Set 5 | 0.4394 | 0.5114 | 0.5901 |
| Avg. | 0.4247 | 0.5039 | 0.5477 |

SLIDE 26

LSA+T.R.M. Approach in Single-Document Level

Compared with Keyword-based T.R.M. [Salton et al., 1997]

T.R.M. (F-measure; dimension-reduction ratio in parentheses):

| | CR=10% LSA | CR=10% Keyword | CR=20% LSA | CR=20% Keyword | CR=30% LSA | CR=30% Keyword |
| Set 1 | 0.3076 (0.7) | 0.2127 | 0.4016 (0.7) | 0.3472 | 0.4400 (0.7) | 0.3818 |
| Set 2 | 0.2920 (0.8) | 0.2433 | 0.3858 (0.8) | 0.3276 | 0.4297 (0.7) | 0.3834 |
| Set 3 | 0.2689 (0.9) | 0.2180 | 0.3578 (0.7) | 0.3270 | 0.4272 (0.8) | 0.3399 |
| Set 4 | 0.2740 (0.7) | 0.1922 | 0.3718 (0.4) | 0.2663 | 0.4311 (0.4) | 0.3296 |
| Set 5 | 0.2928 (0.8) | 0.2457 | 0.4013 (0.8) | 0.3456 | 0.4932 (0.6) | 0.4085 |
| Avg. | 0.2870 (0.8) | 0.2224 | 0.3837 (0.7) | 0.3227 | 0.4442 (0.6) | 0.3686 |

SLIDE 27

LSA+T.R.M. Approach in Corpus Level

Compared with Keyword-based T.R.M. [Salton et al., 1997]

LSA+T.R.M. (F-measure):

| | CR=10% | CR=20% | CR=30% |
| Set 1 | 0.3067 | 0.4026 | 0.4570 |
| Set 2 | 0.3225 | 0.4053 | 0.4618 |
| Set 3 | 0.2673 | 0.4051 | 0.4393 |
| Set 4 | 0.3060 | 0.3728 | 0.4266 |
| Set 5 | 0.3294 | 0.4377 | 0.4812 |
| Avg. | 0.3064 | 0.4047 | 0.4532 |

T.R.M. (F-measure; dimension-reduction ratio in parentheses):

| | CR=10% LSA | CR=10% Keyword | CR=20% LSA | CR=20% Keyword | CR=30% LSA | CR=30% Keyword |
| Set 1 | 0.2450 (0.4) | 0.2117 | 0.3328 (0.5) | 0.2920 | 0.3986 (0.2) | 0.3340 |
| Set 2 | 0.2236 (0.3) | 0.2091 | 0.3320 (0.6) | 0.2772 | 0.4007 (0.3) | 0.3394 |
| Set 3 | 0.2273 (0.6) | 0.1855 | 0.3456 (0.6) | 0.3099 | 0.3956 (0.4) | 0.3473 |
| Set 4 | 0.2669 (0.4) | 0.2015 | 0.3244 (0.4) | 0.2661 | 0.3530 (0.4) | 0.3426 |
| Set 5 | 0.2748 (0.6) | 0.2274 | 0.3823 (0.4) | 0.3265 | 0.4300 (0.6) | 0.3806 |
| Avg. | 0.2475 (0.5) | 0.2070 | 0.3434 (0.5) | 0.2943 | 0.3956 (0.4) | 0.3488 |

SLIDE 28

Conclusion

Modified Corpus-based Approach

  • Employ ranked positions to emphasize the significance of sentence positions.
  • Train the score function by the genetic algorithm to find a suitable combination of feature weights.

LSA-based T.R.M. Approach

  • Employ LSA to derive semantic representations of a document.
  • Combine T.R.M. [Salton97] and semantic representations to promote summarization from keyword level to semantic level.

When the compression ratio was 30%, we obtained average recalls of 49% for MCBA, 52% for MCBA+GA, 44% for LSA+T.R.M. at the single-document level, and 40% for LSA+T.R.M. at the corpus level.

SLIDE 29

Multilingual Composite Summarization System

Jen-Yuan Yeh [1], Hao-Ren Ke [2], and Wei-Pang Yang [1]

[1] Department of Computer & Information Science, National Chiao-Tung University, Taiwan, R.O.C.

[2] Digital Library & Information Section of Library, National Chiao-Tung University, Taiwan, R.O.C.

SLIDE 30

Scenario

[Diagram: unified search interface (Google, AltaVista) → search results → value-added services (search-result clustering, …, digital archives, document summarization, personalized summaries) → presentation]

SLIDE 31

Scenario (cont.)

SLIDE 32

Architecture of the Composite Summarization System

[Architecture diagram: multimedia sources (text data, image data) → document conceptual modeling; concept detection & fusion; cross-language translation; multimedia association → cross-language multimedia summarization → personalized summarization & presentation (user model, layout design) → evaluation]

SLIDE 33

Research Topic 1

Multi-document summarization techniques

  • Document model construction and representation
  • Topic detection and classification
  • Topic relation construction and information compression
  • Topic ordering and organization
  • Sentence planning and summary generation

SLIDE 34

Research Topic 2

Cross-language summarization techniques

  • Clustering of document concepts
  • Chinese-English translation and synonym handling
  • Linking Chinese and English concept clusters

Multimedia association

  • Associating document models with an integrated semantic-feature network
  • Composite summarization model construction
  • Composite document model association and information compression

SLIDE 35

Research Topic 3

Personalized summary presentation and evaluation

  • Design of a personalized user environment
  • Personalized summary generation and presentation
  • Evaluation

SLIDE 36

Q&A