Summing up – Mark Sanderson
SLIDE 1

Summing up

Mark Sanderson

SLIDE 2

Summing up?


  • What did we talk about?
  • Where to next?

24/12/2008

SLIDE 3

Content


  • EVIA
  • Main conference


SLIDE 4

EVIA – highlights


  • First Mongolian test collection
  • New patent collection

  • CLEF
  • Examination of evaluation orthodoxies
  • Sakai and Robertson
  • Scholer et al
  • Karlgren


SLIDE 5

NTCIR Clusters


  • Advanced CLIA
  • CCLQA

  • IR4QA
  • Focused domain
  • Patent translation

  • Patent mining
  • MOAT
  • MUST


SLIDE 6

IR4QA and CCLQA


  • Great collaboration
  • Not the first
  • TREC – SDR, Blog
  • Not the last
  • GRID CLEF


SLIDE 7

Patent


  • Mining and translation
  • Patent processing long worked on

  • CLIR finally important?


SLIDE 8

CLIR – finally


  • Yahoo image search
  • Google

  • Domains

  • Patent
  • Legal


SLIDE 9

Knowledge management


  • MOAT
  • Novel text analysis

  • MUST
  • Good to include other types of researchers at an IR forum
  • Is there a way to integrate more research groups?


SLIDE 10

Why have campaigns?


  • Research together
  • Build community

  • Learn how to evaluate
  • Build collections


SLIDE 11

Where to next?


  • Diversity
  • Diversify use

  • Users
  • Look out for plateaus
  • Cross campaign evaluation?


SLIDE 12

Diversity


  • Different people want different relevant documents


SLIDE 13

SIGIR



SLIDE 14

Diversity collections


  • CLEF
  • imageCLEF

  • New TREC web track
  • NTCIR?


SLIDE 15

Diversify use


  • Hundreds of test collections
  • Very few used in big conferences.
  • Problem with reviewing?
  • Problem with researchers?


SLIDE 16

Smith & Kantor 2008


  • Show some users
  • Google 1-10

  • Google 301-310
  • Users equally effective

  • 301-310 more searches


SLIDE 17

Lessons?


  • What are test collections predicting?
  • Consider interaction more


SLIDE 18

Plateaus



SLIDE 19

Crazy idea


  • Cross campaign evaluation?

  • Cooperation within NTCIR, TREC
  • How about across NTCIR, CLEF and TREC?


SLIDE 20

Thank you


  • Noriko Kando
  • Tetsuya Sakai


  • NII
