
Performance Comparison of Caching Strategies for Information-Centric IoT

Jakob Pfender, Alvin Valera, Winston Seah

School of Engineering and Computer Science Victoria University of Wellington, New Zealand

September 22, 2018

Traditional Caching Strategies

▶ Caching Decision
▶ Cache Replacement

Caching Decision — Traditional Approaches

Cache Everything Everywhere (CEE)
▶ Feasible in traditional ICN thanks to large caches
▶ Fastest possible propagation of content through the network → rapid replication
▶ High redundancy
▶ Non-optimal resource utilisation

Probabilistic Caching (Prob(P))
▶ Increases cache diversity across the network
▶ More popular content is likelier to be stored
▶ Desirability of diversity depends on the application scenario
▶ If the request pattern has a strong skew, diversity hurts performance by wasting resources on unpopular content
▶ The more uniform the distribution, the more beneficial diversity becomes
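The two decision policies differ only in when a node stores an arriving chunk. A minimal sketch of that decision; the function name is an illustrative helper, not CCN-lite's API:

```python
import random

def should_cache(strategy: str, p: float = 0.5) -> bool:
    """Caching decision for a chunk arriving at a node.

    'CEE' caches everything at every node; 'Prob(p)' caches each
    chunk independently with a fixed probability p.
    """
    if strategy == "CEE":
        return True                  # cache every chunk, everywhere
    if strategy == "Prob(p)":
        return random.random() < p   # cache with probability p
    raise ValueError(f"unknown strategy: {strategy}")
```

With p = 0.5 roughly half the chunks are cached at each hop, which is exactly what trades replication speed for cache diversity.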


Cache Replacement Decision — Traditional Approaches

▶ Least Recently Used (LRU):
▶ Unpopular / outdated content more likely to be removed
▶ Generally effective, but danger of thrashing
▶ Alternative: Least Frequently Used (LFU) → avoids thrashing, but performs poorly with variable access patterns & request spikes

▶ Random Replacement (RR):
▶ Evict a randomly chosen content chunk
▶ Simple, fast, no overhead
▶ Some argue that cache replacement should be performed as fast as possible
▶ Simple and fast is more desirable than effective but complex
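The trade-off between LRU's recency tracking and RR's statelessness can be shown with a minimal content store. This class is an illustrative sketch, not the CCN-lite implementation:

```python
import random
from collections import OrderedDict

class Cache:
    """Minimal content store illustrating LRU vs. Random Replacement (RR)."""

    def __init__(self, capacity: int, policy: str = "LRU"):
        self.capacity, self.policy = capacity, policy
        self.store = OrderedDict()  # name -> chunk, least recently used first

    def get(self, name):
        if name in self.store and self.policy == "LRU":
            self.store.move_to_end(name)   # mark as most recently used
        return self.store.get(name)

    def put(self, name, chunk):
        if name not in self.store and len(self.store) >= self.capacity:
            if self.policy == "RR":
                victim = random.choice(list(self.store))  # no bookkeeping
            else:
                victim = next(iter(self.store))           # LRU victim
            del self.store[victim]
        self.store[name] = chunk
        if self.policy == "LRU":
            self.store.move_to_end(name)
```

Note that RR needs no per-access bookkeeping at all, which is why it suits constrained nodes: `get` is a plain dictionary lookup.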

Traditional ICN caching vs. IoT

Lessons from traditional ICN caching
▶ CEE is inefficient (high redundancy, low diversity, poor utilisation of resources)
▶ Caching less → better performance
▶ Cache diversity generally desirable
▶ Cache replacement policies should be as fast & simple as possible

Key differences in the IoT
▶ Limited memory and processing power
▶ Available cache space is extremely valuable
▶ Unreliable links
▶ Small, transient data
▶ Request distributions tend to be uniform

Can we apply the lessons from traditional ICN caching to the IoT?

Advanced Caching Strategies (for the IoT)

Advanced Caching Strategies

Dynamic Caching Probability
▶ Dynamically compute a caching probability for each node and/or each content chunk, based on available information
▶ Caching behaviour adapts to the state of the network
▶ Example: pCASTING (Hail et al. 2015)
▶ Considers content age, node battery, cache occupancy
▶ Values normalised & weighted by relative importance
▶ Fully distributed, no communication overhead
▶ Uses purely local information

Max Diversity Most Recent (MDMR)
▶ Hahm et al. 2017
▶ Cache replacement strategy developed specifically for information-centric IoT
▶ Aim: Maximise diversity while maintaining freshness
▶ Always attempt to replace older content from the same producer as the new content; otherwise the oldest chunk from a producer with more than one chunk in the cache; else the oldest chunk
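Both strategies can be sketched from the descriptions above. The MDMR victim selection follows the three-step rule on this slide; for pCASTING, the equal weights and the [0, 1] normalisation of inputs are illustrative assumptions, not the values from Hail et al. 2015:

```python
def pcasting_probability(freshness, battery, cache_free,
                         weights=(1/3, 1/3, 1/3)):
    """pCASTING-style caching probability (sketch).

    Each input is assumed pre-normalised to [0, 1]: content freshness,
    remaining battery, and free cache space. The equal weights are an
    illustrative assumption; purely local information, no signalling.
    """
    w1, w2, w3 = weights
    return w1 * freshness + w2 * battery + w3 * cache_free

def mdmr_victim(cache, new_producer):
    """MDMR eviction: pick the chunk to replace.

    cache is a list of (producer, timestamp) entries. Per the slide:
    1. oldest chunk from the same producer as the new content,
    2. else oldest chunk from a producer with >1 chunk in the cache,
    3. else the oldest chunk overall.
    """
    oldest = lambda entries: min(entries, key=lambda e: e[1])
    same = [e for e in cache if e[0] == new_producer]
    if same:
        return oldest(same)
    counts = {}
    for producer, _ in cache:
        counts[producer] = counts.get(producer, 0) + 1
    multi = [e for e in cache if counts[e[0]] > 1]
    if multi:
        return oldest(multi)
    return oldest(cache)
```

The rule order is what preserves diversity: a producer only loses its last cached chunk when every producer in the cache is down to one chunk.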

Performance Metrics

▶ Server load / cache hit ratio
▶ Indicates how well popular content is distributed across the network
▶ Data retrieval delay
▶ Also affected by network congestion, density, etc.
▶ Interest retransmission ratio
▶ On-path caching may reduce the hop count for retransmissions
▶ Total cache evictions
▶ How well does the strategy adapt to content popularity & propagation?
▶ Diversity Metric (DM)
▶ D = |C_disj| / |S|
▶ |S|: number of content producers
▶ |C_disj|: number of disjoint name prefixes in all caches
▶ Cache Retention Ratio (CRR)
▶ Measures the ratio of distinct objects in caches to all generated objects: C = |D_q| / |D_p|
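Both metrics reduce to set arithmetic over cache snapshots. A sketch, assuming caches are represented as sets of name prefixes (for DM) or object identifiers (for CRR); the function names are illustrative:

```python
def diversity_metric(caches, producers):
    """DM: D = |C_disj| / |S|.

    caches: iterable of per-node sets of cached name prefixes.
    producers: set S of all content producer prefixes.
    C_disj is the set of disjoint (distinct) prefixes across all caches.
    """
    c_disj = set().union(*caches) & set(producers)
    return len(c_disj) / len(producers)

def cache_retention_ratio(caches, generated):
    """CRR: C = |D_q| / |D_p|.

    D_q: distinct objects still held in any cache.
    D_p: all objects generated so far.
    """
    d_q = set().union(*caches) & set(generated)
    return len(d_q) / len(generated)
```

DM asks "how many producers are represented somewhere right now?", while CRR asks "how much of everything ever produced is still retrievable from a cache?", which is why CRR decays over an experiment run while DM can stay high.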

Evaluation

Evaluation — Experiment Setup

▶ All experiments conducted on the FIT IoT-LAB open testbed using M3 nodes
▶ STM32 (ARM Cortex-M3), 512 kB ROM, 64 kB RAM, Atmel AT86RF231 2.4 GHz transceiver on IEEE 802.15.4
▶ Simple RIOT application using CCN-lite as the ICN implementation, modified to support the different caching strategies
▶ 60 M3 nodes distributed evenly in a single building
▶ Multihop setup, average path length 2–3 hops
▶ Prefix announcements recorded with hop count and rebroadcast with an increased hop count
▶ Interests forwarded according to the lowest hop count, with broadcast fallback
▶ Nodes produce random content chunks prefixed with their ID every 1–5 seconds
▶ Nodes request random existing content every 0.5–1.5 seconds, using a uniform or Zipfian pattern
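The two request workloads can be sketched as follows. The Zipf exponent `s` and the rank-by-list-position weighting are illustrative assumptions, since the setup only specifies "uniform or Zipfian":

```python
import random

def pick_request(contents, pattern="uniform", s=1.0, rng=random):
    """Choose which existing content a node requests next.

    'uniform': every chunk equally likely.
    'zipf': the chunk at popularity rank k is weighted 1 / (k+1)**s,
    so a few chunks at the head of the list dominate the requests.
    """
    if pattern == "uniform":
        return rng.choice(contents)
    weights = [1.0 / (k + 1) ** s for k in range(len(contents))]
    return rng.choices(contents, weights=weights, k=1)[0]
```

This skew is what drives the later results: under the Zipfian pattern the head-of-list chunks are requested an order of magnitude more often than the tail, so replicating popular content (CEE) pays off, while under the uniform pattern diversity matters more.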

Evaluation — Caching decision policies

Evaluation — Caching decision policies

Server load
▶ Probabilistic caching results in a lower probability that given content can be found in a given cache
▶ pCASTING seems to cache with a higher average probability than p = 0.5
▶ CEE is better for a skewed request pattern because it increases replication of popular content

[Figure: server load (in %) for CEE, pCASTING, and Prob(0.5) under Zipf and uniform request patterns]

Evaluation — Caching decision policies

Interest retransmission ratio
▶ No statistically significant effect
▶ Influence of network factors (topology, congestion) stronger than caching policy
▶ Previous work (Hail et al.) found more significant differences using simulation

[Figure: average Interest retransmissions (in %) for CEE, pCASTING, and Prob(0.5) under Zipf and uniform request patterns]

Evaluation — Caching decision policies

Cache evictions
▶ Correlated with the rate at which caches are filled
▶ Skewed distribution → reduced thrashing
▶ Total number of transmissions during an experiment run: 40,000–75,000
▶ Uniform requests: 5%–12% evictions
▶ Zipfian requests: 1%–4%
▶ Strong dependence on cache size

[Figure: number of cache evictions for CEE, pCASTING, and Prob(0.5) under Zipf and uniform request patterns]

Evaluation — Caching decision policies

Data retrieval delay
▶ Large initial latency because content takes time to propagate
▶ Cache hit probability is lower for probabilistic policies → higher peak and slower decline as caches take longer to fill
▶ All strategies reach minimal delay very quickly

[Figures: data retrieval delay (in s) over time and per strategy (CEE, Prob(0.5), pCASTING) under Zipf and uniform request patterns]

Evaluation — Caching decision policies

Diversity metric
▶ The majority of content producers are represented in the network at any given time
▶ CEE caches all contents from the start → caches fill more quickly
▶ But since everything is cached, cache contents are highly redundant
▶ Probabilistic methods have a higher probability of representing all content producers

[Figures: diversity metric (in %) over time and per strategy (CEE, Prob(0.5), pCASTING) under Zipf and uniform request patterns]

Evaluation — Caching decision policies

Cache Retention Ratio
▶ Content inevitably fades from the network after a while
▶ The content creation rate is constant, but probabilistic approaches take longer to fill caches
▶ More diverse caches → delayed decline
▶ Zipfian skew means less popular content fades much quicker → caches become more homogeneous over time
▶ A uniform distribution ensures similar lifespans for all content

[Figures: cache retention ratio (in %) over time and per strategy (CEE, Prob(0.5), pCASTING) under Zipf and uniform request patterns]

Evaluation — Cache replacement policies

Evaluation — Cache replacement policies

Server load
▶ The cache replacement strategy has a negligible effect on server load
▶ RR performs on par with much more complex strategies

[Figure: server load (in %) for LRU, RR, and MDMR under Zipf and uniform request patterns]

Evaluation — Cache replacement policies

Interest retransmission ratio
▶ No statistically significant effect
▶ Thus: any observed variations in retransmission rates have causes independent of the caching strategy

[Figure: average Interest retransmissions (in %) for LRU, RR, and MDMR under Zipf and uniform request patterns]

Evaluation — Cache replacement policies

Cache evictions
▶ Significant effect of the distribution
▶ MDMR is explicitly designed to maximise diversity for uniform patterns, but cannot outperform LRU given a skewed distribution
▶ A Zipfian distribution already favours popular content, reducing the impact of cache-shaping strategies
▶ Surprising: RR outperforms the other approaches given a Zipfian distribution
▶ Thrashing at the popularity tail end counteracted by RR?

[Figure: number of cache evictions for LRU, RR, and MDMR under Zipf and uniform request patterns]

Evaluation — Cache replacement policies

Data retrieval delay
▶ MDMR prioritises content freshness
▶ Freshness only becomes a significant factor after some time has elapsed ⇒ MDMR requires a minimum time to become effective
▶ The significance of early spikes is mostly in terms of strain on individual nodes, especially if devices are battery-powered

[Figures: data retrieval delay (in s) over time and per strategy (LRU, RR, MDMR) under Zipf and uniform request patterns]

Evaluation — Cache replacement policies

Diversity Metric
▶ Why no impact?
▶ Note: MDMR is intended for a scenario with periodic sleeping, which is not the case here

[Figures: diversity metric (in %) over time and per strategy (LRU, RR, MDMR) under Zipf and uniform request patterns]

Evaluation — Cache replacement policies

Cache Retention Ratio
▶ Content lifetime is influenced solely by the caching decision, not by replacement
▶ Faster fading given a skewed distribution is unsurprising

[Figures: cache retention ratio (in %) over time and per strategy (LRU, RR, MDMR) under Zipf and uniform request patterns]

Revisiting lessons from traditional ICN

Revisiting lessons from traditional ICN

▶ Caching less → better performance — Yes, at least in terms of diversity
▶ Cache diversity generally desirable — Yes, if request patterns are uniform
▶ CEE is inefficient — Depends on the scenario
▶ Stateless cache replacement policies are sufficient — Yes

Further conclusions

Further conclusions

▶ The performance of simple stateless strategies is encouraging, because it means effective caching can be achieved even on resource-constrained IoT devices
▶ Identifying the ideal caching decision strategy depends on the application & resources
▶ Is the data homogeneous and does it need to be distributed as rapidly as possible, or is cache diversity more important than response time?
▶ How much strain can we afford to place on individual nodes?
▶ The results presented here offer no universal solution

Future work

▶ One experimental setup cannot reflect the diversity of IoT applications
▶ Many more proposed caching decision strategies
▶ Take into account topological factors, content popularity, other aspects
▶ A more comprehensive survey/evaluation is desirable


Thank you!