Self-Organizing Search in the Web of Things
Kay Römer, University of Lübeck (PowerPoint PPT Presentation)

SLIDE 1

Self-Organizing Search in the Web of Things

Kay Römer, University of Lübeck

SLIDE 2

Networked Embedded Sensing Research

  • Protocols
    • Adaptation to interference
  • Programming Models
    • Role assignment
  • Services
    • Content-based Sensor Search
    • Minimally-Invasive Management
  • Systems
    • Body sensor networks

SLIDE 3

Motivation

  • Mobile phones equipped with sensors and connected to the Internet
  • Sensors published on the Web: the state of the real world is available in real-time
  • Search the real world by its current state!

(Figure: phone sensors: GPS, Microphone, Camera, Proximity, WLAN, Bluetooth, Magnetometer, Ambient Light, Accelerometer, GSM)

SLIDE 4

A Web of Real-World Places

(Figure: Station: full, hectic; Mensa: empty, quiet; Supermarket: full, quiet)

SLIDE 5

Web of Things

  • Web presence of things, people, and places with

real-time state information

  • Web of real-world entities, not Web of sensors
  • High-level states, not raw sensor data
  • Searching the Web of Things
  • Search for real-world entities: places, people, things, …
  • by their current state: empty, hot, broken, …
  • in real-time

SLIDE 6

Searching the Real World: Examples

  • Quiet picnic places at waterfront?
  • Route through city avoiding traffic jams?
  • Which rental station has bicycles available?
  • Where are many people who share my interests?
  • Which trains from A to B are not crowded?
  • Where to enter train to get free seat?
  • Supermarkets with short waiting queues?

SLIDE 7

Problem: Content-based Sensor Search

  • Find sensors reading a given state in real time
  • Potentially huge, distributed set of candidate sensors
    • More state updates than queries: push is not a good idea!
  • Sensor output is highly dynamic
    • Indexing sensor output is not a good idea!
  • We need only a limited number of results at a time
    • Heuristics to select good candidates!

SLIDE 8

Approach: Sensor Ranking

  • Sensors create prediction model using past readings
  • Prediction models are published on the Web
  • Search engine periodically indexes prediction models
  • Prediction models are used to rank candidate sensors
  • Highest ranking sensors are read first
  • Goal: Minimize the number of read sensors

(Figure: indexing-time phase, where prediction models are built and indexed, and query-time phase, where sensors are ranked and read)

SLIDE 9

System Model

  • Sensor maps discrete time to a finite discrete set of states: s : T → V
  • Sensor output time series: s(tᵢ) = vᵢ

(Figure: sensor output v₁, v₂, v₃, v₄, … over time t₁, t₂, t₃, t₄, … up to the present)

SLIDE 10

System Model (Continued)

  • Prediction model maps query time and query value to a probability estimate: P : T × V → [0, 1]
  • P(t, v): probability that s(t) = v

(Figure: a time window TW of past sensor output is used for model construction; the model then forecasts P(t, v) over a forecasting horizon up to the query time t)
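As a sketch, the system model can be written directly in code. Everything here (the dict representation of the time series and the trivial frequency model) is illustrative, not from the talk:

```python
# System model sketch: a sensor output is a time series s : T -> V over a
# finite set of states, and a prediction model is a function P(t, v)
# estimating the probability that s(t) = v. All names are illustrative.

V = {"free", "occupied"}                               # finite state set
s = {1: "free", 2: "occupied", 3: "free", 4: "free"}   # s(t_i) = v_i

def empirical_model(series):
    """Simplest possible P(t, v): ignore t and return the overall
    relative frequency of v in the observed series."""
    n = len(series)
    def P(t, v):
        return sum(1 for val in series.values() if val == v) / n
    return P

P = empirical_model(s)
P(5, "free")  # 3 of the 4 observed readings are "free" -> 0.75
```

The prediction models on the following slides refine this idea by making the estimate depend on the query time t.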

SLIDE 11

Query Resolution

Example: Quiet places at the waterfront

  1. Filter by static metadata (location: waterfront; sensor type: occupancy)
  2. Predict P(t, quiet) for each candidate
  3. Rank candidates by predicted probability
  4. Read the highest-ranked sensors
  5. Return matching sensors

(Figure: candidate sensors with predicted probabilities 0.1, 0.5, 0.7, 0.2, 0.2, 0.6, 0.9)
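The five steps can be sketched as a pipeline. The Sensor class, the models mapping, and all names below are illustrative assumptions, not the talk's implementation:

```python
class Sensor:
    def __init__(self, sid, meta, value):
        self.id, self.meta, self.value = sid, meta, value

    def read(self):
        # In a real deployment this is an expensive network request.
        return self.value

def resolve(sensors, models, static_pred, t, v, k):
    """Resolve a query for value v at time t, returning up to k matches."""
    candidates = [s for s in sensors if static_pred(s)]       # 1. filter static
    scored = [(models[s.id](t, v), s) for s in candidates]    # 2. predict
    scored.sort(key=lambda pair: pair[0], reverse=True)       # 3. rank
    results = []
    for _, s in scored:                                       # 4. read best first
        if s.read() == v:
            results.append(s)
        if len(results) == k:
            break
    return results                                            # 5. return

sensors = [Sensor("A", "waterfront", "quiet"),
           Sensor("B", "waterfront", "noisy"),
           Sensor("C", "inland", "quiet")]
models = {"A": lambda t, v: 0.2, "B": lambda t, v: 0.9, "C": lambda t, v: 0.5}
resolve(sensors, models, lambda s: s.meta == "waterfront", 0, "quiet", 1)
# C is filtered out statically; B ranks first but reads "noisy", so A is returned.
```

Note how a misranked sensor (B) costs one extra read: this is exactly the overhead the ranking metrics on the next slides quantify.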

SLIDE 12
Ranking Metrics

  • Ranking error e(t, v): normalized overhead for reading non-matching sensors
    • = number of non-matching sensors above the last matching sensor / rank of the last matching sensor
  • Top-m ranking error e_top(t, v, m)
    • Ditto, but only the first m sensors are considered

SLIDE 13

Ranking Metrics: Examples

(Figure: nine sensors S1…S9 in ranked order; some sensors match the query, some do not; m = 5 in both cases)

Optimal ranking (all matching sensors ranked first):

  • e = 0/6
  • e_top = 0/5

Suboptimal ranking:

  • e = 4/8
  • e_top = 2/5
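Both metrics can be computed from a ranked list of match/non-match flags. A sketch under one reading of the slide's definitions (function names are mine):

```python
def ranking_error(ranked):
    """e(t, v): non-matching sensors ranked above the last matching
    sensor, normalized by the rank of the last matching sensor.
    ranked: list of booleans, True = sensor matched the query."""
    if True not in ranked:
        return 0.0
    last = max(i for i, m in enumerate(ranked) if m) + 1  # 1-based rank
    return sum(1 for m in ranked[:last] if not m) / last

def top_m_ranking_error(ranked, m):
    """e_top(t, v, m): fraction of non-matching sensors among the
    first m ranked sensors."""
    return sum(1 for flag in ranked[:m] if not flag) / m

ranking_error([True] * 6 + [False] * 3)          # optimal ranking -> 0.0
ranking_error([True, False, True, False, True])  # last match at rank 5 -> 0.4
```

An optimal ranking places every matching sensor above every non-matching one, so e = 0 regardless of how many sensors match.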

SLIDE 14

Prediction Models

  • Focus on people-centric sensors
  • Tend to show periodic behaviour
  • Requirements
  • Accurate predictions for forecasting horizons that match indexing frequencies (days to weeks)
  • Deal with imperfect periodic behaviour

SLIDE 15

Considered Prediction Models

  • Single-period prediction model (SPPM)
    • Assumes a single dominant period of known length (e.g., 1 week)
  • Multi-period prediction model (MPPM)
    • Assumes multiple periodic processes of unknown length (e.g., 1 week, 4 weeks)
  • Select appropriate models at runtime

SLIDE 16

Single-Period Prediction Model (SPPM)

  • Assumption: single dominant period with length p
  • P(t, v) is estimated from past readings within the time window TW at the same offset within the period

(Figure: with four past periods observed at the query offset, P(t, green) = 2/4 and P(t, red) = 1/4)
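Under the single-period assumption this estimate is a relative frequency over same-phase readings. A sketch (the history format is an assumption):

```python
def sppm_estimate(history, p, t, v):
    """SPPM sketch: collect past readings whose offset within the
    period p equals t mod p, and return the relative frequency of v
    among them. history: list of (time, value) pairs in the window TW."""
    phase = t % p
    same_phase = [val for (ti, val) in history if ti % p == phase]
    if not same_phase:
        return 0.0
    return same_phase.count(v) / len(same_phase)

# Four past periods observed at the query offset, as in the figure:
history = [(0, "green"), (2, "red"), (4, "green"), (6, "blue")]
sppm_estimate(history, 2, 8, "green")  # 2 of 4 -> 0.5
sppm_estimate(history, 2, 8, "red")    # 1 of 4 -> 0.25
```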

SLIDE 17
Multi-Period Prediction Model (MPPM)

  • Periodic symbol (α, p, l, σ)
    • α: symbol value; p: period; l: offset; σ: support
    • Support σ: number of consecutive appearances of symbol α in period p at offset l, normalized by the maximum possible number of occurrences
  • Example: α = blue, p = 2, l = 1: ps = (blue, 2, 1, 2/3)

SLIDE 18

Inferring Prediction Estimates

Query for value v = b at time t = 6:

  1. Filter periodic symbols:
    • Same value: α = v
    • Same phase: l ≡ t mod p
  2. P(v, t) = max σ over the remaining symbols

Example symbol table:

  α | p | l | σ
  b | 2 | 0 | 0.7
  g | 3 | 1 | 0.1
  b | 4 | 2 | 0.9

Both b-symbols match the phase of t = 6 (6 mod 2 = 0 and 6 mod 4 = 2), so P(b, 6) = max(0.7, 0.9) = 0.9.
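The filter-then-max lookup can be sketched directly; the tuple layout for a periodic symbol is an assumption:

```python
def mppm_estimate(symbols, t, v):
    """MPPM lookup sketch: each periodic symbol is (alpha, p, l, sigma).
    Keep symbols with matching value (alpha == v) and matching phase
    (l == t mod p); the estimate is the maximum remaining support."""
    supports = [sigma for (alpha, p, l, sigma) in symbols
                if alpha == v and l == t % p]
    return max(supports, default=0.0)

# The symbol table from the example:
symbols = [("b", 2, 0, 0.7), ("g", 3, 1, 0.1), ("b", 4, 2, 0.9)]
mppm_estimate(symbols, 6, "b")  # both "b" symbols match t = 6 -> 0.9
```

Returning 0.0 when no symbol matches encodes "no periodic evidence for v at this time".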

SLIDE 19

Adjustment Process

  • Faulty/malicious sensors or inaccurate predictions may result in persistent misranking
  • Individual ranking error for each sensor
    • S8 ranked too low: increase its prediction value
    • S7 ranked too high: decrease its prediction value
  • Idea: adjustment term for each sensor
    • Updated after each query using the ranking error

(Figure: sensors S1…S9 with per-sensor errors E8 = +6/9 and E7 = −2/9)
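The slides do not give the exact update rule; one plausible additive realization (the learning rate and the additive form are my assumptions) looks like this:

```python
def adjusted_score(P, A, sensor, t, v):
    """Rank sensors by prediction value plus per-sensor adjustment term."""
    return P[sensor](t, v) + A.get(sensor, 0.0)

def update_adjustments(A, per_sensor_errors, rate=0.5):
    """After each query, move each sensor's adjustment term by a
    fraction of its individual ranking error: a positive error (ranked
    too low) raises it, a negative error (ranked too high) lowers it."""
    for sensor, err in per_sensor_errors.items():
        A[sensor] = A.get(sensor, 0.0) + rate * err
    return A

A = update_adjustments({}, {"S8": 6/9, "S7": -2/9})
# A["S8"] is now positive (S8 will rank higher), A["S7"] negative.
```

A rate below 1 damps the feedback loop, so a single noisy query cannot swing a sensor's rank too far.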

SLIDE 20

Adjustment Process: Feedback Loop

(Figure: feedback loop in which each sensor's prediction value Pᵢ plus its adjustment term determines the ranking, and the per-sensor ranking error Eᵢ feeds back to update that term after each query)

SLIDE 21

Evaluation

  • Simulation of a realistic search engine
  • Periodic rebuild and indexing of models (1 week)
  • Periodic queries for possible values
  • Measure average ranking error
  • Prediction models: Random, SPPM, MPPM
  • With / without adjustment

SLIDE 22

Evaluation: Data Sets

  • MERL motion detector dataset
    • 50 PIR sensors in an office building
    • PIR output mapped to “free” and “occupied”
    • With and without a “faulty” sensor
  • ETH room reservation system
    • 7 “sensors”
    • Room occupancy: “free” or “occupied”
    • With and without a “synthetic” multi-period sensor
  • Bicing data set (in progress)
    • 350 bicycle rental stations in Barcelona
    • Number of available bicycles: “no”, “few”, “many”

SLIDE 23

Average Ranking Error: MERL

(Bar chart: average ranking error for RAND, SPPM, MPPM, SPPM+AP, and MPPM+AP, with and without the faulty sensor)

SLIDE 24

Average Ranking Error vs. Top m

(Plot: average ranking error against the number of top entries m, with annotated factors ×5.5 and ×10)

SLIDE 25

Summary

  • Ubiquitous sensors connected to Internet
  • Search for real-world entities by current state
  • Sensor Ranking: a primitive for content-based sensor search utilizing prediction models
  • Adjustment process to alleviate persistently inaccurate rankings

  • Promising results on real-world data sets
  • Ongoing work
  • Improved ranking based on correlations
  • Building a search engine

SLIDE 26

Ads

  • Act-Control-Move: Beyond Networked Sensors
    • Summer School, Schloss Dagstuhl, August 15-21, 2010
    • www.cooperating-objects.eu/school
  • IEEE SUTC (Sensor Networks, Ubiquitous & Trustworthy Computing)
    • Conference, Newport Beach, California, June 7-9, 2010
    • sutc2010.eecs.uci.eu
  • SESENA (Software Engineering for Sensor Nets)
    • ICSE Workshop, Cape Town, South Africa, May 3, 2010
    • www.sesena.info