Testing Real-Time Embedded Systems Using UppAal-TRON: Tool and Application (PowerPoint PPT Presentation)


SLIDE 1

Testing Real-Time Embedded Systems Using UppAal-TRON

  • Tool and Application

Kim G. Larsen, Marius Mikucionis, Brian Nielsen, Arne Skou {kgl | marius | bnielsen | ask}@cs.aau.dk Aalborg University, DK

SLIDE 2

Agenda

  • Automated Model-based Testing
  • Testing Framework
  • Timed Automata Environment Modeling
  • Relativized I/O conformance
  • Online Testing Algorithm
  • Danfoss EKC
  • Other Issues
  • Monitoring and Environment Emulation
  • Coverage Measurement
  • Demo
  • Conclusions & Future Work

SLIDE 3

Testing Embedded Software

Testing: execute the actual software (system) with controlled inputs and check its responses.

  • To find errors
  • To determine the risk of release
  • 10-20 errors per 1000 LOC
  • 30-50 % of development time and cost
  • Software size and complexity keep increasing

SLIDE 4

Automated Model-Based Testing

Diagram: a Model (e.g. click? x:=0 … click? x<2 … x>=2 DBLclick!) feeds a Test Generator tool (Selection & Optimization, Implementation Relation), which produces a Test suite; a Test execution tool (Event mapping, Driver) runs the suite against the Implementation Under Test and issues pass/fail verdicts.

Does the behavior of the (black-box) implementation comply with that of the specification?

SLIDE 5

Online Testing

Diagram: as before, but the Test Generator and Test execution tool are merged into a single online tester that exchanges inputs and outputs with the Implementation Under Test directly, with no pre-computed test suite.

  • Test generated and executed event-by-event (randomly), reactively
  • Long-running, deep testing, imaginative
SLIDE 6

Real-Time Systems

A real-time system is a system where correctness depends not only on the logical order of events but also on their timing.

Diagram: the Environment interacts with the Controller (a set of tasks) through sensors and actuators; via modelling and abstraction this becomes an Environment Model and a System Model exchanging Input and Output over an alphabet Σ.

SLIDE 7

Our Framework

  • UppAal-TRON: Testing Real-Time Systems Online
  • Release 1.3: http://www.cs.aau.dk/~marius/tron/
  • Complete and sound algorithm
  • Efficient symbolic reachability algorithms
  • Model: an UppAal Timed Automata Network, Env || IUT
  • The IUT model specifies correct system behavior (test oracle, monitor)
  • The Env model specifies relevant input event sequences (load model)
  • Correctness defined by the formal ”relativized i/o conformance” relation
SLIDE 8

Related Work

  • Formal Testing Frameworks [Brinksma, Tretmans]
  • Real-Time Implementation Relations [Khoumsi’03, Briones’04]
  • Symbolic Reachability Analysis of Timed Automata [Dill’89, Larsen’97, …]
  • Online state-set computation [Tripakis’02]
  • Online Testing [Tretmans’99, Peleska’02, Krichen’04]

SLIDE 9

Sample Test Runs

Infinitely many sequences!

highTemp!·3·compressorOn? ⇒ PASS
highTemp!·3·compressorOn?·123·lowTemp!·3·compressorOff? ⇒ PASS
highTemp!·3·compressorOff? ⇒ FAIL
highTemp!·13·compressorOn? ⇒ FAIL
highTemp!·3·compressorOn?·17·lowTemp!·3·compressorOff?·3.14·highTemp!·5·compressorOn?·177·lowTemp!·3·compressorOff? ⇒ PASS

SLIDE 10

Sample Cooling Controller

Diagram: IUT-model Cr and Env-model communicating through the actions On!, Off! (outputs) and Low?, Med?, High? (inputs).

SLIDE 11

Environment Modeling: Realism and Guiding

Diagram: temperature-over-time curves with High!, Med!, Low! actions for the environment models EL, E2, E1, EM.

  • EM: any action possible at any time
  • E1: only realistic temperature variations
  • E2: temperature never increases when cooling
  • EL: no inputs (completely passive)

SLIDE 12

Sample Cooling Controller

Diagram: IUT models Cr, C’r and Env-model EM, with actions On!, Off!, Low?, Med?, High?.

C’r rt-ioco_EM Cr

SLIDE 13

Sample Cooling Controller

Diagram: IUT C’r and Env-model E1, with actions On!, Off!, Low?, Med?, High?.

C’r rt-ioco_E1 Cr, iff 3d < r

(trace: d·Med?·d·High?·d·Med?·d·Low?·ε·On, ε ≤ r)

SLIDE 14

Sample Cooling Controller

Diagram: IUT C’r and Env-model E2, with actions On!, Off!, Low?, Med?, High?.

C’r rt-ioco_E2 Cr

SLIDE 15

Non-Determinism

Diagram: a temperature trace T over time crossing a threshold ± err band, with switchOn! and switchOff! events.

  • Modeling action uncertainty: a controller switches a relay when a control variable crosses a value ‘around’ the threshold (threshold ± err)
  • Modeling timing uncertainty: a controller switches a relay between 2 and 10 time units
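Both kinds of uncertainty amount to acceptance bands rather than exact values; a minimal sketch (function names and the error/bound values are illustrative, not from the slides):

```python
def action_ok(temp_at_switch, threshold, err):
    # Action uncertainty: the relay may switch anywhere in threshold +/- err.
    return threshold - err <= temp_at_switch <= threshold + err

def timing_ok(switch_delay, lo=2, hi=10):
    # Timing uncertainty: the relay must switch between lo and hi time units.
    return lo <= switch_delay <= hi
```

A non-deterministic model accepts any IUT behavior inside these bands instead of demanding one exact switching point.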
SLIDE 16

Implementation relation

Relativized real-time io-conformance:

  • i rt-ioco_e s =def ∀σ ∈ TTr(e): Out((e,i) after σ) ⊆ Out((e,s) after σ)
  • Equivalently: i rt-ioco_e s iff TTr(i) ∩ TTr(e) ⊆ TTr(s) ∩ TTr(e)
  • Intuition: for all relevant environment behaviors, the implementation
  • never produces illegal output, and
  • always produces required output in time
  • ~ timed trace inclusion
  • Let P be a set of states:
  • TTr(P): the set of timed traces from states in P
  • P after σ: the set of states reachable after timed trace σ
  • Out(P): the possible outputs and delays in P

Diagram: the environment model e (environment assumptions) exchanges inputs ε0,i0,ε1,i1,… and outputs ε0’,o0,ε1’,o1,… with the IUT i and the System Model s.
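For finite sets of (discretized) timed traces, the trace-inclusion characterization of rt-ioco can be checked directly; a toy sketch, with traces represented as tuples of alternating actions and delays (an illustrative encoding, not TRON's):

```python
def rt_ioco(imp_traces, spec_traces, env_traces):
    # i rt-ioco_e s  iff  TTr(i) ∩ TTr(e)  ⊆  TTr(s) ∩ TTr(e)
    # i.e. within the behaviors the environment can exercise, every
    # implementation trace must also be a specification trace.
    return (imp_traces & env_traces) <= (spec_traces & env_traces)

good = ("highTemp!", 3, "compressorOn?")
bad = ("highTemp!", 3, "compressorOff?")
```

Note the relativization: an implementation trace outside TTr(e) is simply never compared, so the same IUT can conform under a restrictive environment and fail under a permissive one.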

SLIDE 17

Randomized Online Algorithm

Algorithm TestGenExec(TestSpec) returns {pass, fail}
  Z := {〈l0, 0〉}
  while Z ≠ ∅ and #iterations ≤ T do choose randomly:
    1. if EnvOutput(Z) ≠ ∅               // offer an input
         choose randomly a ∈ EnvOutput(Z)
         send a to the IUT
         Z := Z after a
    2. choose randomly δ ∈ Delays(Z)     // delay and wait for output
       wait(δ)
       if output o occurred after δ’ ≤ δ then
         Z := Z after δ’
         if o ∉ ImpOutput(Z) then return fail
         Z := Z after o
       else                              // no output within δ time
         Z := Z after δ
    3. reset the IUT; Z := {〈l0, 0〉}
  if Z = ∅ then return fail else return pass

  • Sound
  • Complete as T → ∞
SLIDE 18

Sound & Complete

TestGenExec is:

  • sound: a Fail verdict ⇒ ¬(I rt-ioco_e S)
  • complete: ¬(I rt-ioco_e S) ⇒ Prob(Fail) → 1 as T → ∞ (using only unit delays)

Assuming:

  • the IUT can be modeled by an input-enabled, deterministic, non-blocking IO-TLOTS with isolated outputs
  • the time unit of the IUT is known
  • TTr(IUT) and TTr(E) are closed under digitization (LTS induced by TA with only non-strict guards)
  • TTr(S) is closed under inverse digitization (LTS induced by TA with only strict guards)

SLIDE 19

State-set computation

Compute all potential states the model can occupy after the timed trace ε0,i0,ε1,o1,ε2,i2,o2,…

Let Z be a set of states:

  • Z after a: the possible states after executing a (and τ*)
  • Z after ε: the possible states after τ* and ε_i, totaling a delay of ε
  • o is a legal output from the SUT iff o ∈ ImpOutput(Z)
  • i is a relevant input in Env iff i ∈ EnvOutput(Z)
  • ε is a permitted delay iff Z after ε ≠ ∅
  • ε is a relevant delay iff ε ∈ Delays(Z)

SLIDE 20

State-set Computation

Compute all potential states the model can occupy after the timed trace ε0,i0,ε1,o1,ε2,i2,o2,…

Let Z be a set of states:

  • Z after a: the possible states after executing a (and τ*)
  • Z after ε: the possible states after τ* and ε_i, totaling a delay of ε

Diagram: a small timed automaton over locations l0…l4, with a-edges from l0 (one guarded by x≤7), an internal τ-edge resetting x:=0, and a τ, x:=0 edge to l1, illustrating:

{〈l0, x=3〉} after a = {〈l2, x=3〉, 〈l4, x=3〉, 〈l3, x=0〉}
{〈l0, x=0〉} after 4 = {〈l0, x=4〉, 〈l1, 0 ≤ x ≤ 4〉}

Represent state sets as sets of symbolic states and use symbolic reachability (similar to model checkers like UppAal).
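The two "after" operators can be sketched with integer-valued clocks (per the digitization assumption); symbolic states are (location, clock) pairs, and the edge set below is an assumption reconstructed from the slide's two example results (invariants and zones/DBMs are omitted):

```python
# (source, action) -> list of (target, reset-clock?, guard)
EDGES = {
    ("l0", "a"):   [("l2", False, lambda x: x <= 7),   # guard x<=7
                    ("l4", False, lambda x: True)],
    ("l4", "tau"): [("l3", True, lambda x: True)],     # tau, x:=0
    ("l0", "tau"): [("l1", True, lambda x: True)],     # tau, x:=0
}

def tau_closure(Z):
    # Saturate Z under internal tau transitions.
    Z, frontier = set(Z), set(Z)
    while frontier:
        new = set()
        for loc, x in frontier:
            for dst, reset, guard in EDGES.get((loc, "tau"), []):
                if guard(x):
                    s = (dst, 0 if reset else x)
                    if s not in Z:
                        new.add(s)
        Z |= new
        frontier = new
    return Z

def after_action(Z, a):
    # Z after a: states after executing a, closed under tau*.
    out = set()
    for loc, x in Z:
        for dst, reset, guard in EDGES.get((loc, a), []):
            if guard(x):
                out.add((dst, 0 if reset else x))
    return tau_closure(out)

def after_delay(Z, eps):
    # Z after eps: unit delays interleaved with tau*, totaling eps.
    Z = tau_closure(Z)
    for _ in range(eps):
        Z = tau_closure({(loc, x + 1) for loc, x in Z})
    return Z
```

With these edges, `after_action({("l0", 3)}, "a")` reproduces the slide's first result and `after_delay({("l0", 0)}, 4)` the second (with the zone 0 ≤ x ≤ 4 enumerated pointwise).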

SLIDE 21

Symbolic Reachability

SLIDE 22

Real-time Online Tester

  • Compute all states reachable after the timed trace
  • Maintain a set of symbolic states in real time!

Diagram: the online tester maintains symbolic state sets Z0…Z18 of the specification TA-network while exchanging timed events (e.g. i! at 2.75, o?) with the System Under Test.

[Tripakis’02, Krichen’04]

SLIDE 23

Danfoss EKC Case

Electronic Cooling Controller

Output Relays

  • compressor relay
  • defrost relay
  • alarm relay
  • (fan relay)

Display Output

  • alarm / error indication
  • mode indication
  • current calculated temperature

Sensor Input

  • air temperature sensor
  • defrost temperature sensor
  • (door open sensor)

Keypad Input

  • 2 buttons (~40 user-settable parameters)

  • Optional real-time clock or LON network module
SLIDE 24

Industrial Cooling Plants

SLIDE 25

Project Goals

  • Can we model the significant aspects and time constraints?
  • Can we test in real time? Is the tool fast enough?
  • How do we control and observe the target?

Existing product and documentation:

  • requirements specification
  • user manuals
  • equipment and software for real test execution
  • meetings and e-mail with Danfoss engineers

Continued collaboration:

  • test of the new generation of controllers being developed
  • improved test interface

SLIDE 26

Basic Refrigeration Control

Diagram: temperature-vs-time control chart with the levels setpoint, setpoint + differential, lowAlarm/highAlarm limits and deviations, and the events start/stop compressor and start alarm, annotated with ”min restart time not elapsed”, ”min cooling time not elapsed”, ”alarm delay” and ”normal”.

SLIDE 27

EKC Adaptation 1

EKC software layering: hardware + physical I/O, device drivers + kernel, parameter DB (shared variables), control software, test interface.

  • AK-Online (PC SW): configuration, supervision, logging
  • Reads and writes the parameter “database” (47 parameters)
  • Connected via a LON gateway and RS232 (win32 + OLE + VB)

SLIDE 28

EKC Adaptation 2

Diagram: TRON engine (Solaris/Linux, C++) ↔ tcp/ip ↔ adaptor (win32 + OLE + VB) ↔ LON + RS232 ↔ EKC. Inputs such as setTemp(20) are translated to parameter writes (“par#4=20.0”); outputs such as compressorOn are detected by comparing an old and a new copy of a “continuous” parameter readout (2 readouts/s).

Need a better test interface!

  • Read-only parameters
  • Delay and synchronization
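The adaptation layer described above can be sketched as follows; this is a hypothetical illustration (the class, the `write(str)` link API, and the parameter-number mapping are invented, not Danfoss's interface):

```python
class EkcAdapter:
    INPUT_PARAMS = {"setTemp": 4}           # model input -> parameter number (assumed)

    def __init__(self, link):
        self.link = link                    # object with a write(str) method
        self.last = {}                      # previous readout (the "old copy")

    def send_input(self, event, value):
        # e.g. setTemp(20) -> "par#4=20.0"
        self.link.write(f"par#{self.INPUT_PARAMS[event]}={value:.1f}")

    def poll_outputs(self, readout):
        # Called ~2x/s with a fresh readout (the "new copy"); output events
        # are the parameters whose value changed since the old copy.
        events = [(name, val) for name, val in readout.items()
                  if self.last.get(name) != val]
        self.last.update(readout)
        return events

class Recorder:
    """Stand-in for the gateway link, recording written commands."""
    def __init__(self):
        self.sent = []
    def write(self, msg):
        self.sent.append(msg)
```

Edge-detecting on a polled parameter snapshot is exactly why read-only parameters and delay/synchronization are pain points: an output is only observable up to the polling period.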
SLIDE 29

Modeling Principles

Model a significant subset:

  • temperature regulation
  • alarm monitoring
  • manual and periodic timer-based defrosting

  • Modular model
  • Compute ”calculatedTemperature” in the model and derive from it the output events that can be monitored in the adaptor
  • Environment temperature generators
  • Use non-determinism: timing tolerances; model adapter delay and timing uncertainty

SLIDE 30

Temperature Tracking

”Periodic” weighted average:

T_n = (4 · T_{n-1} + T_sampled) / 5

Diagram: EKC calculated temperature vs. model calculated temperature over time, with an error/uncertainty envelope (tolerance in sampling time, tolerance in value computation) and a compressorOn! event.
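The weighted average T_n = (4·T_{n-1} + T_sampled) / 5 and its uncertainty envelope can be sketched directly; the tolerance value used here is an illustrative assumption:

```python
def calc_temp(prev, sampled):
    # "Periodic" weighted average: T_n = (4 * T_(n-1) + T_sampled) / 5
    return (4 * prev + sampled) / 5

def envelope(prev_lo, prev_hi, sampled, err=0.5):
    # Track a lower/upper bound instead of an exact value, so the model
    # accepts any IUT readout inside the uncertainty band.
    return calc_temp(prev_lo, sampled) - err, calc_temp(prev_hi, sampled) + err
```

Widening the tracked value into a band is what lets the non-deterministic model tolerate the EKC's sampling-time and value-computation tolerances without false FAIL verdicts.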

SLIDE 31

Main Model Components

18 concurrent timed automata; 14 clocks, 14 integers.

Diagram: the Environment part (TemperatureGenerator, defrostEventGen) supplies the inputs newTemp and defrost on/off to the IUT model (tempMeasurement, compressor, autoDefrost, highTempAlarm), whose outputs are compressorRelay, defrostRelay, alarmRelay and alarmDisplay on/off.

SLIDE 32

Reverse Engineering

  • Unclear and incomplete specifications
  • Method of Working
  • 1. Formulate hypothesis model
  • 2. Test
  • 3. FAIL-verdict ⇒ Refine model
  • 4. (PASS) ⇒ Confirm with Danfoss
  • Detects differences between actual and modeled behavior
  • Indicates promising error-detection capability
  • 4 examples
SLIDE 33

Ex1: Control Period

Control actions are issued when ”calculatedTemp” crosses thresholds.

  • No requirements on the period given
  • Tested to be 1.2 seconds

”Periodic” weighted average:

T_n = (4 · T_{n-1} + T_sampled) / 5

SLIDE 34

Ex2: High Alarm Monitor v1

Clearing the alarm does not switch off the alarm state, only the alarm relay.

SLIDE 35

Ex2: High Alarm Monitor v2

  • Add HighAlarmDisplay action
  • Add location for “noSound, but alarmDisplaying”
  • (Postpone alarms after defrosting)
SLIDE 36

Ex3: Defrosting and Alarms

  • When defrosting, the temperature rises
  • Postpone high-temperature alarms during defrost
  • System parameter alarmDelayAfterDefrost
  • Several interpretations:

  1. Postpone alarmDelayAfterDefrost + alarmDelay after defrost?
  2. Postpone alarmDelayAfterDefrost + alarmDelay after highTemp is detected?
  3. Postpone alarmDelayAfterDefrost until the temperature becomes low; then use alarmDelay

  • Option 3 applies!
SLIDE 37

Ex4: Defrost Time Tolerance

  • Defrost relays were engaged earlier and disengaged later than expected
  • We assumed a 2-second tolerance
  • Defrosting takes a long time
  • The implementation uses a low-resolution timer (10 seconds)

SLIDE 38

Example Test Run (log visualization)

Plot: setTemp, modelTemp and ekcTemp over time, together with the events CON/COFF (compressor relay), AON/AOFF and alarmRst (alarm), HADOn/HADOff (high-alarm display), DON/DOFF and manDefrostOn/manDefrostOff (defrost).

Sample event sequence: defrostOff? alarmOn! alarmDisplayOn! resetAlarm? AOFF! HighAlarmDisplayOff! manualDefrostOn? COFF! DON! compressorOn! //defrost complete DOFF! CON!

SLIDE 39

State-set Evolution

Plot: the number of states in the state-set (up to ~1000) over time (sec), against the measured temperature (degrees) and the high-temperature alarm limit, showing the correlation between state-set size and model behavior.

SLIDE 40

Cost of State-set Update

Plot: state-set update cost (µs) as a function of the number of symbolic states.

SLIDE 41

Testing = Environment emulation + monitoring

SLIDE 42

Testing

  • Replace the system’s real environment by the tester
  • The tester provides inputs (relevant input event sequences, load model)
  • The tester observes outputs and checks them against correct system behavior (test oracle, monitor)
  • Correctness defined by the formal ”relativized i/o conformance” relation

SLIDE 43

Environment Emulation

  • Compute inputs from the environment model (relevant input event sequences, load model)
  • Feedback or one-way: outputs may go to the real system

SLIDE 44

Monitoring

Passively check the communication between the system and its real environment; check the system behavior.

  • Passive testing
SLIDE 45

Measuring Coverage

SLIDE 46

Coverage Measurements

How thorough has testing been? Idea:

  • Use a model checker, e.g. UppAal
  • Convert the timed trace observed during a test run into a timed automaton (trace automaton)
  • Replace the environment by the trace automaton
  • Perform reachability analysis on the annotated model (according to the coverage criterion)

SLIDE 47

Location Coverage

Goal: a test sequence traversing all locations. Encoding:

  • Enumerate the locations l0,…,ln
  • Add an auxiliary boolean variable li for each location
  • Label each edge entering location i with li := true
  • Mark the initial location visited: l0 := true
  • Check: EF( l0 = true ∧ … ∧ ln = true )
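The li := true annotations turn location coverage into plain reachability of a state where all flags hold; a minimal sketch that explores the annotated state space directly over (location, visited-flags) pairs (an illustration of the encoding idea, not UppAal itself, and ignoring clocks):

```python
from collections import deque

def all_locations_coverable(edges, init, locations):
    # State = (current location, set of location flags set so far).
    start = (init, frozenset({init}))          # l0 marked visited initially
    seen, queue = {start}, deque([start])
    while queue:
        loc, visited = queue.popleft()
        if visited == frozenset(locations):
            return True                        # EF(l0 and ... and ln) holds
        for src, dst in edges:
            if src == loc:
                nxt = (dst, visited | {dst})   # entering dst sets its flag
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append(nxt)
    return False
```

A witness trace for the EF property is exactly a test sequence achieving full location coverage.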

SLIDE 48

Edge Coverage

Goal: a test sequence traversing all edges. Encoding:

  • Enumerate the edges e0,…,en
  • Add an auxiliary boolean variable ei for each edge
  • Label each edge with ei := true
  • Check: EF( e0 = true ∧ … ∧ en = true )

Diagram: a four-location automaton whose edges a? x:=0, x≥2 a?, x<2 b! and c! are annotated with e0:=1, e2:=1, e1:=1, e3:=1 and e4:=1.

SLIDE 49

Coverage of Non-deterministic Models

For an observed trace, e.g. 10·a!·5·b?:

  • Edge i possibly covered (in some run): check EF( ei = true ∧ t.end )
  • Edge i definitely covered (in all runs): check AF( t.end ⇒ ei = true )
  • Edge i definitely not covered (in no run): check AF( t.end ⇒ ei = false )

SLIDE 50

Demo

SLIDE 51

Touch-Sensitive Light Controller

  • Patient user: Wait=∞
  • Impatient: Wait=15
SLIDE 52

Touch-Sensitive Light Controller Model

Diagram: User/ENV ↔ Interface (grasp!, release!, touch!, hold!, endhold!) ↔ light controller model (switch, Dimmer) → Level!

SLIDE 53

Mutants

  • M1 incorrectly implements the switch:

    synchronized public void handleTouch() {
        if (lightState == lightOff) {
            setLevel(oldLevel);
            lightState = lightOn;
        }
        else {                      // was missing
            if (lightState == lightOn) {
                oldLevel = level;
                setLevel(0);
                lightState = lightOff;
            }
        }
    }

  • M2: incorrect additional delay in the dimmer, as if x:=0 were on the ActiveUP ↔ ActiveDN transitions
SLIDE 54

Conclusions

  • Can accurately model EKC-like devices
  • Can create models suitable for online testing
  • A complete and detailed model is not required: select aspects, use abstraction
  • MBT is feasible even if the specification is incomplete or unclear
  • Promising error-detection capabilities: differences between actual and specified behavior in the industrial case; academic mutation studies
  • Excellent performance
  • Very non-deterministic models cause very large state-sets, which can become a computational bottleneck
  • Real-time synchronization of IUT and tester is problematic

SLIDE 55

Future Work

Tool improvements:

  • Import test traces into UppAal
  • Edge- and location-coverage measurements
  • Graphical user interface
  • Separate environment simulation and monitoring

Further Danfoss collaboration:

  • Better test interface
  • Test the newly developed product

  • Coverage guiding & real-time criteria
  • Automatic test adaptation abstraction
  • Testing hybrid and stochastic systems

SLIDE 56

Research Challenges

Modelling:

  • How to model real-time systems easily, quickly, precisely, expressively, …
  • What is a good formal notion of correctness?

Tool implementation:

  • How to analyze these models?
  • How to compute the state-set estimation in real time?
  • Real-time execution and clock synchronization with the IUT!?

Robustness:

  • Partial observability and uncertainty

Guiding:

  • Can we improve the obtained coverage of the model?
  • Real-time coverage criteria?
  • Is it efficient in finding errors?
  • How to apply it in industrial practice?

Extensions:

  • Probabilistic performance testing?
  • Hybrid systems

SLIDE 57

END