The Non-Virtual Reality of Testing or What's Feasible in Real World Testing - PowerPoint PPT Presentation


SLIDE 1

TER-10

The Non-Virtual Reality of Testing or What's Feasible in Real World Testing

Contents

  • 1. Introduction
  • 2. Seven myths about testing and their demystification
  • 3. A kind of conclusion

Karol Frühauf, INFOGEM AG, CH-5400 Baden, Karol.Fruehauf@infogem.ch

SLIDE 2

TER-20

Seven myths about testing

  • I Testing is a hobby of quality people
  • II The quickest way to deployment is ping-pong testing
  • III Test automation is cheap
  • IV You don't need to see what you test
  • V Integration testing is interface testing
  • VI Test coverage is a glass box test concept
  • VII Test planning is an easy task

SLIDE 3

ITV-30

I Testing is a hobby of quality people (1)

[Diagram: project goals and requirements on one side, project state and work results on the other, connected by review or test; actual versus planned dates and cost are compared.]

Without review and test there is no real progress control.

SLIDE 4

ITV-40

I Testing is a hobby of quality people (2)

[Diagram: product development, product testing, and project management as cooperating roles.]

Don't throw defects over the wall to the developer.

SLIDE 5

ITV-50

II The quickest way to release is ping-pong testing

Ping-pong testing: as soon as the tester detects a defect, he returns the software to the developer ("we have one defect to fix"). The price is expensive regression tests. Instead: if a special condition occurs, put it in the rucksack; execute all specified test cases, then switch to repair mode.
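A minimal sketch of the rucksack idea, assuming a hypothetical TestCase interface: defects are noted while the whole suite keeps running, and repair mode starts only after all specified test cases have been executed.

    import java.util.ArrayList;
    import java.util.List;

    // Hypothetical test case abstraction, for illustration only.
    interface TestCase {
        String name();
        boolean execute();      // true = passed, false = defect observed
    }

    final class RucksackRunner {
        // Runs every specified test case and collects the failures
        // instead of returning the build after the first defect.
        static List<String> runAll(List<TestCase> suite) {
            List<String> rucksack = new ArrayList<>();
            for (TestCase tc : suite) {
                if (!tc.execute()) {
                    rucksack.add(tc.name());   // note the defect, keep testing
                }
            }
            return rucksack;                   // hand over to repair mode once
        }
    }

Compared with ping-pong testing, one build goes back to the developer with the whole rucksack instead of one build per defect.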

SLIDE 6

ITV-60

III Test automation is cheap (1)

[Diagram: a test harness feeds test input to the test object and compares the test output against the expected output.]

System level:
  • capture/replay tools (mouse input etc.): + capture is cheap, – replay is expensive
  • test case managers: – require strong update discipline

Unit level:
  • + JUnit etc.
  • + test coverage profiler
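At unit level automation is comparatively cheap: a test is little more than test input, test object, and a comparison with the expected output. A minimal sketch in JUnit 5 style; the ClockValue class is a made-up test object, chosen to fit the clock example used later in the talk.

    import org.junit.jupiter.api.Test;
    import static org.junit.jupiter.api.Assertions.assertEquals;

    // Hypothetical test object: a trivial clock value.
    class ClockValue {
        private final int minutesSinceMidnight;
        ClockValue(int minutesSinceMidnight) {
            this.minutesSinceMidnight = ((minutesSinceMidnight % 1440) + 1440) % 1440;
        }
        ClockValue plusMinutes(int m) { return new ClockValue(minutesSinceMidnight + m); }
        @Override public String toString() {
            return String.format("%02d:%02d", minutesSinceMidnight / 60, minutesSinceMidnight % 60);
        }
    }

    class ClockValueTest {
        @Test
        void wrapsAroundMidnight() {
            // test input -> test object -> compare test output with expected output
            assertEquals("00:30", new ClockValue(23 * 60 + 45).plusMinutes(45).toString());
        }
    }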

SLIDE 7

ITV-70

III Test automation is cheap (2)

[Diagram: a home-made test harness around components A and B: a driver injects test inputs taken from a test input database, a stub replaces missing neighbours, an observer records the test output, and a comparator checks it against a benchmark to produce the test result; the harness is itself a configuration item.]

Use home-made dedicated tools ... and keep them up to date.
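A minimal sketch of such a home-made harness, with all names hypothetical: a stub replaces the real collaborator (here a lookup), a driver injects the test inputs, an observer collects the test output, and a comparator checks it against the benchmark.

    import java.util.ArrayList;
    import java.util.List;

    // Collaborator the component under test normally calls (e.g. a database).
    interface Lookup { String find(String key); }

    // Component under test; interface invented for illustration only.
    final class ComponentA {
        private final Lookup lookup;
        ComponentA(Lookup lookup) { this.lookup = lookup; }
        String process(String input) { return input + "->" + lookup.find(input); }
    }

    final class MiniHarness {
        public static void main(String[] args) {
            Lookup stub = key -> "stubbed(" + key + ")";       // stub replaces the real database
            ComponentA objectUnderTest = new ComponentA(stub);

            List<String> testInput = List.of("a", "b");        // would come from the test input database
            List<String> benchmark = List.of("a->stubbed(a)", "b->stubbed(b)");

            List<String> testOutput = new ArrayList<>();       // observer: record what the component produced
            for (String in : testInput) {                      // driver: inject the inputs
                testOutput.add(objectUnderTest.process(in));
            }

            // comparator: test result = output matches the benchmark
            System.out.println(testOutput.equals(benchmark) ? "PASS" : "FAIL");
        }
    }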

SLIDE 8

ITV-80

IV You don't need to see what you test (1)

[System overview diagram ("Marble Arch"): a CMS maintained by content editors and fed with content packages from suppliers of real-time content (e.g. SF DRS) and scheduled content (e.g. SDA); an application database, iServer and PSB inside the system boundary; subsystems S-CMS, S-WN, S-ISV, S-PSB; delivery channels WWW (HTML, HTML Light), WAP (WAP-now), SMS (via SMSC), email (SMTP), and location services (GPRS, LBS); actors and interfaces: operator, customer care, service manager, content editor, customer, email provider, P-SM, P-CC, P-OP, I-WWW.]

SLIDE 9

ITV-90

IV You don't need to see what you test (2)

[Interface diagram of an MMS platform and its environment: MMS platform (MMSC, transcoder, MM client proxy), SMSC, WAP gateways, subscriber database, provisioning (EMA, OTA server, autoprovisioning), billing and rating chain (BGW, BSCS, PPB, SBS, PSB, CBS, MDB, TAP3 mediation), fault reporting ("Big Brother"), statistics, CDR repository, 3rd-party content providers, partner operators' MMSCs (Sunrise, Orange), portals, terminal testing and 2nd-level problem analysis; dozens of named interfaces (MM1 to MM8, MMS-IEN*, MMS-PRB*, CPR-*, PRB-IEN*, HHD-*) over HTTP/HTTPS, SMTP/ESMTP, SOAP/XML, CORBA, LDAP, UCP, SS7/MAP, FTP, SQL and more.]

Internal environment: SIMONA, UFIH, CuC; actors: operator, customer, customer administrator, terminal administrator.

SLIDE 10

ITV-100

V Integration testing is interface testing (1)

  • Integration testing: Testing in which software components, hardware components, or both are combined and tested to evaluate the interaction between them. [IEEE 610.12]
  • Integration testing: Testing performed to expose faults in the interfaces and in the interaction between integrated components.
  • Interface testing: Integration testing where the interfaces between system components are tested. [BS7925-1]
  • Integration testing is the process of verifying the interaction between system components (possibly and hopefully tested already in isolation). [SWEBOK 1.0]

SLIDE 11

ITV-110

V Integration testing is interface testing (2)

Implementation testing → testing in which aggregates are tested with the aim to detect defects caused by errors made during implementation; the concern is the functionality of the aggregate (unit testing) or the interaction of its parts (interface testing).

Integration testing → testing in which aggregates are tested with the aim to detect defects caused by errors made during integration, e.g.

  • building, writing scripts (function test of scripts)
  • integration of components to tiers and these to the system
  • integration of components to subsystems and these to the system
  • configuration of the system
  • installation of the system in the target environment

SLIDE 12

ITV-120

V Integration testing is interface testing (3)

Types of errors integration testing is looking for:

  • wrong address
  • wrong name used
  • queue is not set up
  • queue is too small
  • file is missing or is in the wrong location
  • processes are started in a wrong sequence
  • a process is not started at all
  • wrong setting of configuration parameters, or no setting at all
  • etc.
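Many of these error types can be caught by a small, home-made integration smoke check run right after installation. A minimal sketch; every path, property name and limit below is a made-up example, not taken from the talk.

    import java.nio.file.Files;
    import java.nio.file.Path;

    // Integration smoke checks for the kinds of errors listed above.
    final class IntegrationSmokeTest {
        public static void main(String[] args) {
            check(Files.exists(Path.of("/opt/app/conf/app.properties")),
                  "configuration file missing or in the wrong location");
            check(Integer.getInteger("queue.capacity", 0) >= 1000,
                  "queue is too small or not set up");
            check(System.getProperty("service.address") != null,
                  "service address not configured (wrong or missing setting)");
        }

        static void check(boolean ok, String problem) {
            System.out.println((ok ? "OK   " : "FAIL ") + problem);
        }
    }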

SLIDE 13

TER-130

VI Test coverage is a glass box test concept (1)

a quite usual conversation ...


SLIDE 14

TER-131*

A quite usual conversation ...

A: How do you test your programs?
E: In the usual way, like anybody else.
A: I mean, how do you select the test cases that you intend to execute in order to torture your program?
E: Simple, that's easy.
A: Good. Which method do you apply?
E: Method? I know what I need to test.
A: Of course you know it. I am interested to learn, when do you stop test case selection and specification?
E: When I have enough test cases.
A: Exactly, that is what I want to know, when do you have enough?
E: As soon as I don't need any more.
A: Yes, of course, but how do you decide that you don't need any more, that your set of test cases is complete?
E: Man, everybody knows that there is nothing like complete testing.
A: I am convinced there is.
E: Even if it were, it's too expensive, nobody can afford it ... and it doesn't work anyway.
A: Would you agree, then, that you test intuitively?
E: Yes, I do, and I am proud of it.

SLIDE 15

ITV-140

Example: Black-box test of the Windows clock

SLIDE 16

ITV-150

Example: A complete set of test cases (1)

[Table: possible outputs as rows, test cases 1 to 3 as columns; an X marks the outputs each test case produces. Outputs: analogue time display, digital time display, font (28 types, e.g. Arial, TnR), display of the Greenwich time, display of the system time, display of the title bar, no display of the title bar, display of seconds, no display of seconds, display of the date, no display of the date, display of information.]

SLIDE 17

ITV-160

Example: A complete set of test cases (2)

Analogue display of time: 8 test cases

test case:          1    2    3    4    5    6    7    8
time display        gch  gch  gch  gch  sys  sys  sys  sys
title bar display   yes  yes  no   no   yes  yes  no   no
seconds display     yes  no   yes  no   yes  no   yes  no
date display        no   no   no   no   no   no   no   no

Digital display of time: 448 test cases
  • date display is possible: doubles the analogue test cases = 16
  • 28 font types available: 16 x 28 = 448

Total: analogue display + digital display + info = 8 + 448 + 1 = 457 test cases
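The 457 can be reproduced mechanically from the output dimensions of the clock; a small sketch of the arithmetic:

    // Combinatorics behind the 457 test cases for the Windows clock example.
    final class ClockTestCaseCount {
        public static void main(String[] args) {
            int timeDisplay = 2;                 // Greenwich time or system time
            int titleBar   = 2;                  // shown or hidden
            int seconds    = 2;                  // shown or hidden
            int fonts      = 28;                 // font types for the digital display

            int analogue = timeDisplay * titleBar * seconds;          // date not possible: 8
            int digital  = analogue * 2 /* date yes/no */ * fonts;    // 16 * 28 = 448
            int info     = 1;                                         // display of information

            System.out.println(analogue + digital + info);            // prints 457
        }
    }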

SLIDE 18

ITV-170

VI Test coverage is a glass box test concept (2)

  • first criterion (3 test cases): for all possible types of display, at least one of the possible outputs is produced by at least one test case
  • second criterion (457 test cases): all possible combinations of outputs are produced by at least one test case
  • a possible criterion in between (30 test cases): all possible outputs are produced by at least one test case
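The criterion in between is essentially "each output value at least once", so its size is driven by the widest dimension, the 28 fonts; one analogue case and one info case on top give the 30. A small, hypothetical greedy selection for the digital display illustrates this:

    import java.util.ArrayList;
    import java.util.List;

    // "Each output value at least once" for the digital clock display.
    // Dimensions come from the clock example; the selection strategy is an illustration.
    final class EachOutputOnce {
        public static void main(String[] args) {
            List<List<String>> dimensions = List.of(
                List.of("gch", "sys"),                       // time display
                List.of("title", "noTitle"),                 // title bar
                List.of("sec", "noSec"),                     // seconds
                List.of("date", "noDate"),                   // date
                fonts(28));                                  // 28 font types

            List<List<String>> cases = new ArrayList<>();
            int widest = dimensions.stream().mapToInt(List::size).max().orElse(0);
            for (int i = 0; i < widest; i++) {
                List<String> tc = new ArrayList<>();
                for (List<String> dim : dimensions) {
                    // reuse the last value once a narrow dimension is exhausted
                    tc.add(dim.get(Math.min(i, dim.size() - 1)));
                }
                cases.add(tc);
            }
            System.out.println(cases.size() + " digital test cases cover every output value once");
        }

        static List<String> fonts(int n) {
            List<String> f = new ArrayList<>();
            for (int i = 1; i <= n; i++) f.add("font" + i);
            return f;
        }
    }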

SLIDE 19

ITV-180

VI Test coverage is a glass box test concept (3)

  • testing is a sampling procedure
  • the sample content depends on risks
  • the sample size is defined by the envisaged "confidence level" of the test
  • coverage defines the sample
  • coverage is a target for the test designer
  • coverage makes systematic test case selection possible
  • coverage determines the extent, thus also the cost of testing
  • coverage enables the project leader / software manager to (better) assess the state of affairs

SLIDE 20

ITV-190

VII Test planning is an easy task

– "we do unit testing, integration testing, system testing"

Test planning involves:

  • identify system boundaries and system structure
  • define the strategy for reviewing, integration, and testing
  • analyse risks
  • define test objects
  • for all test objects, define the characteristics of the test
  • design the test infrastructure and specify the test harness
  • identify all testing activities and estimate the effort
  • trade off cost and benefit of the tests
  • schedule test activities and assign resources

Not all of this can be done at the beginning, and not all of what can be done can be defined with the same level of detail.

SLIDE 21

ITV-200

Characteristics of a single test

  • test object: one executable unit (or many)
  • test level: unit or component or system or an aggregate in between
  • test environment: development or integration or test or production
  • error types to look for: logic, data entry, navigation, fault tolerance, connection, communication, response time, size, etc.
  • basis for test case specification: artefact used to gather information about possible test inputs and expected output
  • basis for test case selection: artefact used to define the test coverage criteria used to assess the completeness of the selected test case set
  • test dimensions: configuration parameters of the run-time environment
  • test goal: extent of error type and test dimension coverage
  • test execution: manually using a checklist, using test procedures, with automatic test logging, completely automated, etc.
  • tester: user, test engineer, ignorant, expert, etc.
  • test evaluation: compare with specification (basis), compare with assured results, etc.
  • test record: completed checklist, manual test log, automatic test log, etc.
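In a home-made test management tool these characteristics can be carried around as one small data structure per planned test. A minimal sketch; the field names and example values are hypothetical, not part of the presentation.

    import java.util.List;

    // One entry per planned test, mirroring the characteristics listed above.
    record TestCharacteristics(
            String testObject,            // e.g. "order service, build 4.2"
            String testLevel,             // unit, component, system, or an aggregate in between
            String testEnvironment,       // development, integration, test, production
            List<String> errorTypes,      // logic, data entry, navigation, response time, ...
            String basisForSpecification, // artefact the inputs / expected outputs come from
            String basisForSelection,     // artefact the coverage criteria come from
            List<String> testDimensions,  // configuration parameters of the run-time environment
            String testExecution,         // checklist, test procedures, fully automated, ...
            String tester,                // user, test engineer, expert, ...
            String testEvaluation,        // compare with specification, with assured results, ...
            String testRecord) {          // completed checklist, manual or automatic test log
    }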

SLIDE 22

ITV-210

Example: System test planning with variations

WEB

dimension         possible values                     cardinality
OS                NT, 2000, XP                        3
browser           Netscape, IE, Firefox               3
registered user   no, yes                             2
locked user       no, yes                             2
user language     German, French, Italian, English    4

WAP

dimension         possible values                     cardinality
operator          we, foreign                         2
device            brands: 5, new: 15, legacy: 20      20
registered user   no, yes                             2
locked user       no, yes                             2
user language     German, French, Italian, English    4

                                             WEB    WAP
minimal number of variations                 4      20
theoretically maximal number of variations   144    640
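Assuming the dimensions are independent, both rows of the last table follow directly from the cardinalities: the theoretical maximum is their product, and the minimum (each value used at least once) is the largest single cardinality. A small sketch:

    import java.util.stream.IntStream;

    // Reproduces the variation counts on the slide from the cardinalities alone.
    final class VariationCounts {
        public static void main(String[] args) {
            int[] web = {3, 3, 2, 2, 4};      // OS, browser, registered, locked, language
            int[] wap = {2, 20, 2, 2, 4};     // operator, device, registered, locked, language

            report("WEB", web);               // minimal 4, maximal 144
            report("WAP", wap);               // minimal 20, maximal 640
        }

        static void report(String name, int[] cardinalities) {
            int minimal = IntStream.of(cardinalities).max().orElse(0);
            long maximal = IntStream.of(cardinalities).asLongStream().reduce(1, (a, b) -> a * b);
            System.out.println(name + ": minimal " + minimal + ", maximal " + maximal);
        }
    }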

SLIDE 23

ITV-220

A kind of conclusion ...

I have expounded on testing, mixed it and twisted it; I hope I have appraised this noble activity sensibly enough. If I failed to take in the valuable views of those seated in this pleasant room, I ask forgiveness for the suffering this deed has caused. Esteemed testers, I wish you a pleasant, relentless hunt for irregularities.

The program did not crash, Mister Byte!