Reconciling Fitts' Law with Shannon's Information Theory (EMPG 2015) - PowerPoint PPT Presentation


SLIDE 1

Reconciling Fitts’ Law with Shannon’s Information Theory

EMPG 2015 University of Padua, Sept 1-3, 2015 Julien Gori* Olivier Rioul** Yves Guiard***

*ENS Cachan **Telecom ParisTech ***CNRS LTCI Paris, France

SLIDE 2

2/34 Institut Mines-Télécom Reconciling Fitts’ Law with Shannon’s Information Theory

Table of Contents

- Information Theory & Psychology
  - Historical Perspective
  - Channel Capacity
- Fitts' Law
  - What is Fitts' Law?
  - Multiple formulas
- A Geometric Framework
  - Partitioning the Space with Targets
  - 3 New Derivations
- A Coherent Information Theoretic Model
  - A Communication Channel
  - A Capacity Formula

SLIDE 3

Presentation outline

SLIDE 4

Annus Mirabilis: 1948

Claude Shannon's A Mathematical Theory of Communication:
- Information
- Uncertainty
- Communication system
- Capacity

SLIDE 5

Two Telling Quotes

Information is quantifiable and measurable! A tremendous impact on psychologists:

"We now call them experiments on the capacity of people to transmit information." (G. A. Miller, 1956, The Magical Number Seven, Plus or Minus Two)

"Presented with a shiny new tool kit [information theory] and a somewhat esoteric new vocabulary to go with it, more than a few psychologists reacted with an excess of enthusiasm." (F. Attneave, 1959, Applications of Information Theory to Psychology)

SLIDE 6

A Strong Reaction from Shannon and Colleagues

The first paper has the generic title "Information Theory, Photosynthesis and Religion" [Elias, 1958].

"[...] the basic results of the subject are aimed in a very specific direction, a direction that is not necessarily relevant to such fields as psychology, economics, and other social sciences." [Shannon, 1956]

SLIDES 7-10

The Channel Capacity

Maximum amount of information transmittable over a noisy communication link (channel).

Additive White Gaussian Noise (AWGN) channel: y = s + n, with signal s and noise n.

Shannon's famous Theorem 17 (1948):

C = (1/2) log2(1 + S/N) = (1/2) log2(1 + SNR) bits per channel use,

where S = E(s^2) and N = E(n^2). Any achievable rate R (= reliable communication) satisfies R ≤ C.
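The slides have no code, but Theorem 17 is easy to evaluate numerically; the sketch below (function and variable names are mine, not from the talk) computes the AWGN capacity in bits per channel use:

```python
import math

def awgn_capacity(signal_power: float, noise_power: float) -> float:
    """Shannon's Theorem 17: C = 1/2 * log2(1 + S/N) bits per channel use."""
    return 0.5 * math.log2(1.0 + signal_power / noise_power)

# SNR = 15 gives C = 0.5 * log2(16) = 2 bits per channel use;
# SNR = 255 gives C = 0.5 * log2(256) = 4 bits per channel use.
print(awgn_capacity(15.0, 1.0))   # -> 2.0
print(awgn_capacity(255.0, 1.0))  # -> 4.0
```

Any reliable transmission rate over this channel is bounded by the returned value, which is the content of the R ≤ C statement above.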

SLIDE 11

Whatever Happened to Information Theory in Psychology?

Information theory became discredited in psychology: "One rarely sees Shannon's information theory in contemporary psychology articles" (R. Luce, 2003, Whatever Happened to Information Theory in Psychology?). There is one notable exception: Fitts' law, since 1954, and more generally the speed-accuracy trade-off for rapid aimed movement [Soukoreff and MacKenzie, 2009]. It is part of ISO 9241-9, and is used for device assessment and movement time prediction in HCI.

SLIDE 12

Presentation outline

SLIDE 13

The Paradigm

Aiming at a target of size W from a distance D.

[Figure: start point, distance D, target of width W.]

SLIDE 14

How Do D and W Affect Target Acquisition Time?

Fitts' definition of an Index of Difficulty (ID), by analogy with Shannon's Theorem 17:

ID = log2(2D/W) (bits)

Movement time: MT = a + b · ID, a speed-accuracy trade-off, with a and b determined through linear regression on experimental data.
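A minimal sketch of the index of difficulty and the linear movement-time model; the coefficients a and b below are hypothetical placeholders, since in practice they come from regression on experimental data:

```python
import math

def index_of_difficulty(D: float, W: float) -> float:
    """Fitts' ID = log2(2D / W), in bits."""
    return math.log2(2.0 * D / W)

def movement_time(D: float, W: float, a: float, b: float) -> float:
    """Fitts' law: MT = a + b * ID (a in seconds, b in seconds per bit)."""
    return a + b * index_of_difficulty(D, W)

# Hypothetical coefficients: a = 0.1 s intercept, b = 0.1 s/bit slope.
ID = index_of_difficulty(D=256.0, W=16.0)  # log2(512/16) = log2(32) = 5 bits
MT = movement_time(256.0, 16.0, a=0.1, b=0.1)
print(ID, MT)
```

Halving W (a smaller target) adds exactly one bit to ID and therefore a fixed increment b to the predicted movement time.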

SLIDE 15

Other Formulations for ID

Fitts' original formulation [Fitts, 1953]: ID = log2(2D/W)

Welford's formulation [Welford, 1960]: ID = log2(0.5 + D/W)

MacKenzie's formulation [MacKenzie, 1989]: ID = log2(1 + D/W)

Non-logarithmic alternatives have also been proposed, e.g. MT = a(D/W)^b, MT = a + b√A, MT = a + b log(A/W), MT = −a + b(c + D) log(2A/W).

Many more formulations!
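To see how the three logarithmic formulations compare at typical difficulties, a quick numerical illustration (my own, not from the slides):

```python
import math

def id_fitts(D, W):      # Fitts (1953): log2(2D/W)
    return math.log2(2 * D / W)

def id_welford(D, W):    # Welford (1960): log2(0.5 + D/W)
    return math.log2(0.5 + D / W)

def id_mackenzie(D, W):  # MacKenzie (1989): log2(1 + D/W)
    return math.log2(1 + D / W)

for D, W in [(64, 32), (128, 16), (512, 8)]:
    print(D, W, round(id_fitts(D, W), 3),
          round(id_welford(D, W), 3), round(id_mackenzie(D, W), 3))
# Welford's and MacKenzie's IDs converge as D/W grows, since the +0.5
# and +1 offsets become negligible; Fitts' ID stays about one bit
# higher because of the factor 2 inside the logarithm.
```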

SLIDE 16

MacKenzie’s Formulation

An analogy with Shannon's capacity:

ID = log2(1 + D/W)   vs.   C = (1/2) log2(1 + S/N)

where D, W are the target distance and size, and S, N the powers of the signal and the noise.

Is D/W an amplitude SNR? What is the communication model? What are the input, output and noise? What about the 1/2 factor?

SLIDE 17

MacKenzie's Formulation (cont'd)

Capacity for a system → communication model: y = s + n, with signal s and noise n. C = (1/2) log2(1 + S/N) is an achievable rate (vanishing error probability).

MacKenzie formulation → speed-accuracy trade-off: ID = log2(1 + D/W).

SLIDE 18

Presentation outline

SLIDE 19

Geometric Framework

Idea: aiming = choosing!

SLIDES 20-30

Geometric Framework

[Figure, built up over several slides: start point, distance D to a target of width W; the whole distance is tiled with N targets of width W.]

Aiming at a target is equivalent to choosing one target among N.

SLIDE 31

Rederiving the Fitts Formulation

An analogy with Hick's law (Fitts 1953); an analogy with Shannon's capacity (Fitts 1954). The movement terminates somewhere between 0 and 2D:

[Figure: starting point, aiming point at distance D, maximum stopping point at 2D, target width W.]

SLIDE 32

Rederiving the Fitts Formulation (cont’d)

Number of targets: n = 2D/W (n an integer).

Corresponding entropy, assuming a uniform distribution: H = log2(n) = log2(2D/W) = ID_F

SLIDE 33

Rederiving the Welford Formulation

"To put it in another way, he is called to choose a distance W out of a total distance extending from his starting point to the far edge of the target." (A. T. Welford, 1960)

Choose a distance W out of a total of D + W/2. Number of possible targets: n = (D − W/2)/W + 1 = 1/2 + D/W (n an integer).

H = log2(n) = log2(1/2 + D/W) = ID_W
SLIDE 34

Rederiving the MacKenzie Formulation

[Figure: distance D, target width W.]

Number of possible targets: n = 1 + D/W, if D/W is an integer.

Entropy: H = log2(1 + D/W) = ID_McK
SLIDE 35

Where Are We?

Capacity for a system → problem stated → communication model: y = s + n, with signal s and noise n; C = (1/2) log2(1 + S/N).

MacKenzie formulation → speed-accuracy trade-off: ID = log2(1 + D/W). What model?
SLIDE 36

Presentation outline

SLIDE 37

The Human-motor System as a Communication System

Source: user intention → Encoder: movement (the human participant) → Channel: noise (neural noise, tremor, ...) → Decoder: target recognition → Destination: target hit (the receiving device).

Adapted from [Zhai et al., 2012]

SLIDE 38

Communication Model

Message: choosing a target = aiming at its center. Set of messages: {−D/2, −D/2 + W, ..., D/2 − W, D/2}, uniformly distributed.

Noise: to ensure reliable (= error-free) communication, the noise has absolute amplitude less than W/2, with a uniform distribution.

Output: choosing a target = aiming at its center = hitting the target.
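The claim that noise bounded by W/2 yields error-free communication can be checked with a small simulation under the slide's assumptions (all names are mine):

```python
import random

D, W = 8.0, 2.0
# Target centers from -D/2 to D/2 in steps of W (the slide's message set).
centers = [-D / 2 + k * W for k in range(int(D / W) + 1)]

def decode(y: float) -> float:
    """Nearest-center decoding: landing inside a target identifies it."""
    return min(centers, key=lambda c: abs(y - c))

random.seed(0)
errors = 0
for _ in range(10_000):
    s = random.choice(centers)         # intended target center (the message)
    n = random.uniform(-W / 2, W / 2)  # bounded, uniform endpoint noise
    if decode(s + n) != s:
        errors += 1
print(errors)  # 0: noise of amplitude below W/2 never leaves the target
```

With Gaussian noise the error count would be nonzero for any finite W, which is why the uniform, amplitude-limited channel is the one that matches the error-free reading of Fitts' task.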

SLIDE 39

Gaussian versus Uniform Channel

Shannon capacity for Gaussian noise: signal power limited to S, noise power limited to N, Gaussian distribution for the noise; y = s + n; C = (1/2) log2(1 + S/N).

MacKenzie formulation: signal amplitude limited to D/2 (|s| ≤ D/2), noise amplitude limited to W/2 (|n| ≤ W/2), uniform distribution for the noise; y = s + n; C′ = ?

SLIDE 40

Capacity Formula for the Uniform Channel [Rioul and Magossi, 2014]

Theorem 1: C′ = log2(1 + D/W)

Proof: C′ = max I(x; y) over inputs x with |x| ≤ D/2.

I(x; y) = h(y) − h(y|x) = h(y) − h(n + x | x) = h(y) − h(n) = h(y) − log2(W)

Thus maximizing the mutual information between x and y amounts to maximizing h(y). Since |y| ≤ |x| + |n| ≤ (D + W)/2, and since for a continuous random variable under an amplitude constraint the uniform density maximizes differential entropy, taking x discrete and uniform makes y uniform, so that

C′ = h(y) − log2(W) = log2(D + W) − log2(W) = log2(1 + D/W)
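The last step of the proof is a plain logarithm identity that can be sanity-checked numerically (my own check, not from the slides):

```python
import math

for D, W in [(8.0, 2.0), (100.0, 7.0), (3.0, 3.0)]:
    lhs = math.log2(D + W) - math.log2(W)  # h(y) - log2(W), y uniform on length D+W
    rhs = math.log2(1 + D / W)             # Theorem 1's closed form
    assert abs(lhs - rhs) < 1e-12
print("log2(D + W) - log2(W) == log2(1 + D/W) for all tested (D, W)")
```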

SLIDE 41

Gaussian versus Uniform Channel

Shannon's capacity formula: signal power limited to S, noise power limited to N, Gaussian distribution for the noise; y = s + n; C = (1/2) log2(1 + S/N).

MacKenzie formulation: signal amplitude limited to D/2 (|s| ≤ D/2), noise amplitude limited to W/2 (|n| ≤ W/2), uniform distribution for the noise; y = s + n; C′ = log2(1 + D/W).

SLIDE 42

More than an Analogy

Theorem 2: C = C′

Proof: C = (1/2) log2(1 + SNR). With uniform noise and uniform output y, S + N = power of y ∝ (D + W)^2 and N = power of n ∝ W^2, so

C = (1/2) log2((S + N)/N) = (1/2) log2(power of y / power of n) = (1/2) log2((D + W)^2 / W^2) = log2((D + W)/W) = C′

A true identity!
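Theorem 2 can be verified directly: plugging the uniform channel's powers into the Gaussian-channel formula collapses to C′ (a numerical sketch, names mine):

```python
import math

def c_gaussian_form(D: float, W: float) -> float:
    """C = 1/2 * log2((S+N)/N), with power of y ∝ (D+W)^2 and power of n ∝ W^2."""
    return 0.5 * math.log2((D + W) ** 2 / W ** 2)

def c_uniform(D: float, W: float) -> float:
    """Theorem 1: C' = log2(1 + D/W)."""
    return math.log2(1 + D / W)

for D, W in [(8.0, 2.0), (50.0, 5.0), (1.0, 4.0)]:
    assert abs(c_gaussian_form(D, W) - c_uniform(D, W)) < 1e-12
print("C == C': the two capacity formulas agree")
```

The squaring of amplitudes into powers is exactly what absorbs the 1/2 factor that MacKenzie's analogy left unexplained.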

SLIDE 43

Pending Issues

- A more realistic model? With feedback?
- What is the interpretation of throughput?
- Can we take non-zero error into account?

SLIDE 44

Thank You!

Any questions?

SLIDE 45

Bibliography I

Elias, P. (1958). Two famous papers. IRE Transactions on Information Theory, 4(3):99.

Fitts, P. (1953). The influence of response coding on performance in motor tasks. In Current Trends in Information Theory, pages 47–75. University of Pittsburgh Press, Pittsburgh, PA.

Luce, R. D. (2003). Whatever happened to information theory in psychology? Review of General Psychology, 7(2):183–188.

SLIDE 46

Bibliography II

MacKenzie, I. S. (1989). A note on the information-theoretic basis for Fitts' law. Journal of Motor Behavior, 21(3):323–330.

Rioul, O. and Magossi, J. C. (2014). On Shannon's formula and Hartley's rule: Beyond the mathematical coincidence. Entropy, 16(9):4892–4910.

Shannon, C. E. (1956). The bandwagon. IRE Transactions on Information Theory, 2(1):3.

SLIDE 47

Bibliography III

Soukoreff, R. and MacKenzie, I. (2009). An informatic rationale for the speed-accuracy trade-off. In IEEE International Conference on Systems, Man and Cybernetics (SMC 2009), pages 2890–2896.

Welford, A. T. (1960). The measurement of sensory-motor performance: Survey and reappraisal of twelve years' progress. Ergonomics, 3(3):189–230.

SLIDE 48

Bibliography IV

Zhai, S., Kristensson, P. O., Appert, C., Andersen, T. H., and Cao, X. (2012). Foundational issues in touch-screen stroke gesture design: An integrative review. Foundations and Trends in Human-Computer Interaction, 5(2):97–205.