SLIDE 1

Testing Concepts

Brian Nielsen {bnielsen@cs.aau.dk}

SLIDE 2

Complex Systems

SLIDE 3

A very complex system

SLIDE 4

Spectacular Software Bugs: Ariane 5

The first Ariane 5 rocket was launched in June 1996. It used software developed for the successful Ariane 4. The rocket carried two computers, providing a backup in case one computer failed during launch. Forty seconds into its maiden flight, the rocket veered off course and exploded. The rocket, along with $500 million worth of satellites, was destroyed.

Ariane 5 was a much more powerful rocket and generated forces that were larger than the computer could handle. Shortly after launch, it received an input value that was too large. The main and backup computers shut down, causing the rocket to veer off course.

SLIDE 5

Spectacular Software Bugs: U.S.S. Yorktown, U.S. Navy

In 1998, the USS Yorktown became the first ship to test the US Navy's Smart Ship program. The Navy planned to use off-the-shelf computers and software instead of expensive custom-made machines. A sailor mistakenly entered a zero for a data value on a computer. Within minutes, Yorktown was dead in the water. It was several hours before the ship could move again.

When the sailor entered the mistaken number, the computer tried to divide by zero, which isn't possible. The software didn't check to see if the inputs were valid before computing and generated an invalid answer that was used by another computer. The error cascaded through several computers and eventually shut down the ship's engines.

[Photo: U.S.S. Yorktown, courtesy of U.S. Navy]

SLIDE 6

Spectacular Software Bugs: Moon or Missiles

The United States established the Ballistic Missile Early Warning System (BMEWS) during the Cold War to detect a Soviet missile attack. On October 5, 1960, the BMEWS radar at Thule, Greenland detected something. Its computer control system decided the signal was made by hundreds of missiles coming toward the US.

The radar had actually detected the Moon rising over the horizon. Unfortunately, the BMEWS computer had not been programmed to understand what the Moon looked like as it rose in the eastern sky, so it interpreted the huge signal as Soviet missiles. Luckily for all of us, the mistake was realized in time.

SLIDE 7

Spectacular Software Bugs: Therac-25

The Therac-25 radiation therapy machine was a medical device that used beams of electrons or photons to kill cancer cells. Between 1985 and 1987, at least six people got very sick after Therac-25 treatments. Four of them died. The manufacturer was confident that their software made it impossible for the machine to harm patients.

The Therac-25 was withdrawn from use after it was determined that it could deliver fatal overdoses under certain conditions. The software would shut down the machine before delivering an overdose, but the error messages it displayed were so unhelpful that operators couldn't tell what the error was, or how serious it was. In some cases, operators ignored the message completely.

Reference: IEEE Computer, Vol. 26, No. 7, July 1993, pp. 18-41.

SLIDE 8

Spectacular Software Bugs …. continued

 Intel Pentium floating-point division bug: 470 million US $
 Baggage handling system, Denver: 1.1 million US $ per day, for 9 months
 Mars Pathfinder
 …

SLIDE 9

Ordinary Software Bugs

BMW 745i software defect: "On certain passenger vehicles, due to a software error, a desynchronization of the valvetronic motors for engine banks I and II may occur. If this occurs, the engine could stall. In those cases, the driver may not be able to restart the engine. Depending on the level of engine roughness, or stalling, as well as traffic conditions and the driver's reactions, this could lead to a crash."

 15,000 vehicles recalled
 70-100 ECUs in modern cars
 SW is a major part of development cost

SLIDE 10

Component-Based Development

 “A component is a reusable unit of composition with explicitly specified, provided, and required interfaces and quality attributes, that denotes a single abstraction and can be composed without modifications” [Gross’05]

 How to check that a component or a component-based system has sufficient quality?

 Today’s component specs are weak (a list of public methods)
  • pre- and post-conditions only exceptionally
  • behavior not specified (see the contract sketch below)
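
To make "behavior not specified" concrete, here is a minimal sketch of what a contract-style component spec could look like, using C asserts; the withdraw operation and its conditions are hypothetical examples, not from the slides:

    #include <assert.h>

    /* Hypothetical component operation with an explicit, checkable contract:
       the pre- and post-conditions that a bare list of public methods omits. */
    int withdraw(int balance, int amount) {
        assert(amount > 0);          /* pre: amount must be positive     */
        assert(amount <= balance);   /* pre: must not overdraw           */
        int result = balance - amount;
        assert(result >= 0);         /* post: balance stays non-negative */
        return result;
    }

    int main(void) { return withdraw(100, 30) == 70 ? 0 : 1; }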
SLIDE 11

Component Testing

 Stakeholders
  Provider
  Users

 Challenges
  Testing the component in a new context
  Lack of access to the internal workings of a component (observability and controllability)
  Adequate component testing: adaptation testing

SLIDE 12

Testing

Testing:
to check the quality (functionality, reliability, performance, …)
  • of a (software) object
  • by performing experiments
  • in a controlled way

  • On average, 10-20 errors per 1000 LOC
  • 30-50% of development time and cost in embedded software

 To find errors
 To determine risk of release

SLIDE 13

Testing

 Dynamic testing is the process of executing a program or system with the intent of finding errors (Glenford Myers' definition)

 Static testing is any activity that aims at finding defects by inspecting, reviewing, walking through, and analyzing any static component of the software (code, documents, and models)

 Debugging is an ad hoc activity performed by individual developers to find and remove bugs from a program.

 Testing is a planned activity

SLIDE 14

What is a Test?

[Diagram: test cases consist of test data fed to the software under test; an oracle compares the resulting output against the expected output to decide: correct result?]
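
To make the diagram concrete, here is a minimal sketch in C of the structure it depicts: test data paired with expected output, fed to the software under test, with an oracle comparing actual and expected results. The add function and its test values are hypothetical:

    #include <stdio.h>

    /* Hypothetical software under test */
    static int add(int a, int b) { return a + b; }

    /* A test case bundles test data with the expected output */
    struct test_case { int a, b; int expected; };

    int main(void) {
        struct test_case tests[] = { {2, 3, 5}, {-1, 1, 0} };
        for (int i = 0; i < 2; i++) {
            int actual = add(tests[i].a, tests[i].b);
            /* The oracle: compare actual output to expected output */
            printf("test %d: %s\n", i,
                   actual == tests[i].expected ? "PASS" : "FAIL");
        }
        return 0;
    }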

SLIDE 15

Testing process

[Diagram: the test process phases and their work products]

 Planning & Control: test plan
 Analysis and test design: abstract/logical test cases (specification)
 Test implementation: concrete/executable test cases
 Test execution: verdict/logs
 Evaluation and reporting: quality report

The process draws on the requirements, specification, and implementation, and rests on:
 L: Life-cycle for testing
 O: Organization
 I: Infrastructure and tools
 T: Techniques

SLIDE 16

Types of Testing

[Diagram: types of testing arranged along two axes]

 Level: unit, integration, system
 Aspect: functionality, reliability, usability, efficiency, accessibility
 Approach: white box, black box

SLIDE 17

Quality Characteristics (ISO 9126)

 Functionality: suitability, accuracy, security, compliance, interoperability ⇒ functional testing
 Reliability: maturity, fault tolerance, recoverability ⇒ reliability testing
 Usability: understandability, learnability, operability ⇒ usability testing
 Efficiency: time behaviour, resource utilization ⇒ performance testing
 Maintainability: analysability, changeability, stability, testability ⇒ maintainability testing ??
 Portability: adaptability, installability, conformance, replaceability ⇒ portability testing ?

SLIDE 18

Whitebox Testing

    int invoice(int x, int y) {
        int d1, d2, s;
        if (x <= 30) d2 = 100; else d2 = 90;
        s = 5*x + 10*y;
        if (s <= 200) d1 = 100;
        else if (s <= 1000) d1 = 95;
        else d1 = 80;
        return (s * d1 * d2 / 10000);
    }

[Diagram: control-flow graph of invoice with branches x > 30, s > 200, and s > 1000]

Test Cases:

 Test Data        Expected Output
 X=5,  Y=5        75
 X=31, Y=10       229.5
 X=30, Y=100      977.5
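
As a follow-up, here is a minimal sketch of a test driver for invoice, using the table above as the oracle; the harness structure is an assumption, not from the slides. Note that invoice returns an int, so it cannot produce fractional values such as 229.5; running the driver surfaces exactly that kind of discrepancy between code and expected output, which is what the tester must then diagnose.

    #include <stdio.h>

    /* The invoice function from the slide */
    int invoice(int x, int y) {
        int d1, d2, s;
        if (x <= 30) d2 = 100; else d2 = 90;
        s = 5*x + 10*y;
        if (s <= 200) d1 = 100;
        else if (s <= 1000) d1 = 95;
        else d1 = 80;
        return (s * d1 * d2 / 10000);
    }

    int main(void) {
        /* Test data and expected outputs taken from the table above */
        int xs[] = {5, 31, 30}, ys[] = {5, 10, 100};
        double expected[] = {75.0, 229.5, 977.5};
        for (int i = 0; i < 3; i++) {
            int actual = invoice(xs[i], ys[i]);
            /* Oracle step: report actual vs. expected so a verdict can be assigned */
            printf("invoice(%d,%d) = %d (expected %.1f)\n",
                   xs[i], ys[i], actual, expected[i]);
        }
        return 0;
    }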

SLIDE 19

Blackbox Testing

[Diagram: requirements drive test selection; input events are applied to the SUT and its output is observed. Domain testing partitions the x/y input space.]
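
To illustrate domain testing on a familiar example, here is a sketch that probes the boundaries of the invoice function from the previous slide; treating its thresholds (x <= 30, s <= 200, s <= 1000) as the domain boundaries is an assumption made for concreteness:

    #include <stdio.h>

    /* invoice() from the Whitebox Testing slide, repeated for self-containment */
    int invoice(int x, int y) {
        int d1, d2, s = 5*x + 10*y;
        d2 = (x <= 30) ? 100 : 90;
        d1 = (s <= 200) ? 100 : (s <= 1000) ? 95 : 80;
        return s * d1 * d2 / 10000;
    }

    int main(void) {
        /* Domain testing: probe each side of every boundary instead of
           sampling the astronomically many (x, y) combinations. */
        int pts[][2] = { {30, 0}, {31, 0},     /* around the x <= 30 boundary   */
                         {0, 20}, {0, 21},     /* s = 200 and 210, around 200   */
                         {0, 100}, {0, 101} }; /* s = 1000 and 1010, around 1000 */
        for (int i = 0; i < 6; i++)
            printf("invoice(%d,%d) = %d\n", pts[i][0], pts[i][1],
                   invoice(pts[i][0], pts[i][1]));
        return 0;
    }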

SLIDE 20

V-Model

[Diagram: each development phase on the left arm of the V pairs with a test level on the right arm, each with its own test spec]

 user requirements ↔ acceptance test (acceptance test spec)
 specification ↔ system test (system test spec)
 architecture spec ↔ integration test (integration test spec)
 detailed design ↔ component/module test (module test spec)
 implementation/code ↔ unit test (unit test spec)

SLIDE 21

Module/Component Test

[Diagram: a test driver calls the component under test, whose calls to missing dependencies are answered by test stubs]
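
A minimal sketch of the driver/stub pattern in C; the sensor component, threshold logic, and canned stub value are hypothetical illustrations, not from the slides:

    #include <stdio.h>

    /* Test stub: stands in for a dependency that is unavailable or hard
       to control (here, a sensor), returning a canned value. */
    int read_sensor_stub(void) { return 42; }

    /* Component under test: computes an alarm flag from the sensor reading. */
    int check_alarm(int (*read_sensor)(void), int threshold) {
        return read_sensor() > threshold;
    }

    /* Test driver: exercises the component with the stub plugged in. */
    int main(void) {
        printf("alarm(threshold=40): %d (expect 1)\n",
               check_alarm(read_sensor_stub, 40));
        printf("alarm(threshold=50): %d (expect 0)\n",
               check_alarm(read_sensor_stub, 50));
        return 0;
    }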

SLIDE 22

Integration Test

[Diagram: design hierarchy of modules M1 through M12, shown once under the "calls/uses" relation and once under the "consists-of" relation]

SLIDE 23

Integration Test

[Diagram: module hierarchy M1 through M12 with "calls/uses" and "consists-of" relations]

 Top-Down Integration
 Bottom-Up Integration
 Sandwich Integration
 Depth-first vs. breadth-first

SLIDE 24

System Test

 2 × CRTG (4 channels): 2 × 200 k€

SLIDE 25

Test Equipment

 Complete Type Approval Test System (3 M€)

SLIDE 26

Acceptance Test

 Customer's requirements
 At the customer's site, by customers

SLIDE 27

Challenges of Testing

 Infinity of testing:
  too many possible input combinations: infinite breadth
  too many possible input sequences: infinite depth
  too many invalid and unexpected inputs

void F(int x, int y) already has 2^32 × 2^32 different inputs (see the arithmetic below)

 Exhaustive testing is never possible:
  how to invent effective and efficient test cases with a high probability of detecting errors?
  when to stop testing?
  what is an effective method to measure coverage?

 Optimization problem of testing yield versus invested effort
  usually stop when time is over ......
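
To put a number on the infinity claim (an illustrative calculation, not from the slides): 2^32 × 2^32 = 2^64 ≈ 1.8 × 10^19 distinct inputs for F. Even at a billion test executions per second, exhaustive testing would take about 1.8 × 10^10 seconds, roughly 585 years.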

SLIDE 28

Risk

 Make the best possible use of resources by identifying and prioritizing quality aspects and subsystems

 Higher risk ⇒ more testing
 No risk ⇒ no testing

 Risk = chance of failure × damage (a small worked example follows the lists below)

Chance of failure:
  • use frequency
  • chance of an error being present
  • complexity
  • new tools/techniques
  • inexperienced developers

Damage:
  • cost of repair
  • loss of market share
  • legal claim
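
A worked example with hypothetical numbers: suppose a frequently used, complex subsystem has an estimated 20% chance of containing a failure-triggering error, and a failure would cost about 100 k€ in repair and lost market share; its risk is 0.2 × 100 k€ = 20 k€. A rarely used utility with a 5% chance and 10 k€ damage scores 0.05 × 10 k€ = 0.5 k€, so the first subsystem deserves far more of the test budget.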

SLIDE 29

Challenges of Testing

 Many operating environments and contexts
  impact of platform capabilities: OS, HW, remote systems
  typical and rare use patterns
  implicit requirements
  domain knowledge

 How can software fail?
  typical programming errors
  typical wrongly implemented features
  exceptional cases
  no realistic reliability models for software

 How to translate this into effective tests?

SLIDE 30

Challenges of Testing

 Regression testing:
  very important
  very boring and expensive
  must be automated (a minimal sketch follows below)

 Test oracle problem:
  bad specification, or no specification at all
  requirements change
  requirements elucidation is a process
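
As an illustration of the "must be automated" point, here is a minimal sketch of an automated regression suite in C; the clamp function and its cases are hypothetical:

    #include <assert.h>
    #include <stdio.h>

    /* Hypothetical function under test */
    static int clamp(int v, int lo, int hi) {
        return v < lo ? lo : (v > hi ? hi : v);
    }

    int main(void) {
        /* Regression suite: re-run automatically after every change;
           an assert failure pinpoints the regressed behavior. */
        assert(clamp(5, 0, 10) == 5);
        assert(clamp(-3, 0, 10) == 0);
        assert(clamp(99, 0, 10) == 10);
        printf("all regression tests passed\n");
        return 0;
    }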

SLIDE 31

Challenges: Who Should Test?

 Developer
  understands the system
  but will test gently
  and is driven by deadlines

 Independent tester
  must learn the system
  but will attempt to break it
  and is driven by "quality"
  • seen as destructive, unprestigious??
SLIDE 32

Challenges of Testing

Moving implementation deadlines ...... but fixed delivery deadlines

[Diagram: project timeline from project start to system delivery; code delivery marks the start of test execution, so a late code delivery compresses the time left for testing]

SLIDE 33

Challenges of Testing

 Lack of appropriate tools:
  diversified fields
  experts are dispersed and don't talk across application domains
  tools are specialized and sell in low volume
  tools are expensive
  tools are immature
  no money available for test tools

SLIDE 34

Challenges of Testing

 New embedded systems:
  more functionality
  increasingly advanced
  faster time-to-market
  higher quality

 Testing:
  more to be tested
  more complicated
  in less time
  more thorough

This requires:
  • skilled developers and testers
  • advanced testing tools and techniques
  • well organized testing
  • using a solid development method

SLIDE 35

END