Chapters 14 - 17 Summary & Highlights - PowerPoint PPT Presentation



SLIDE 1

Chapters 14 - 17

Summary & Highlights

SLIDE 2

These slides are designed to accompany Software Engineering: A Practitioner's Approach, 7/e (McGraw-Hill, 2009). Slides copyright 2009 by Roger Pressman.

Software Quality

• In 2005, ComputerWorld [Hil05] lamented that

  – "bad software plagues nearly every organization that uses computers, causing lost work hours during computer downtime, lost or corrupted data, missed sales opportunities, high IT support and maintenance costs, and low customer satisfaction."

• A year later, InfoWorld [Fos06] wrote about

  – "the sorry state of software quality," reporting that the quality problem had not gotten any better.

• Today, software quality remains an issue, but who is to blame?

  – Customers blame developers, arguing that sloppy practices lead to low-quality software.

  – Developers blame customers (and other stakeholders), arguing that irrational delivery dates and a continuing stream of changes force them to deliver software before it has been fully validated.

SLIDE 3

Quality

• The American Heritage Dictionary defines quality as

  – "a characteristic or attribute of something."

• For software, two kinds of quality may be encountered:

  – Quality of design encompasses requirements, specifications, and the design of the system.
  – Quality of conformance is an issue focused primarily on implementation.
  – User satisfaction = compliant product + good quality + delivery within budget and schedule

SLIDE 4

The Software Quality Dilemma

• If you produce a software system that has terrible quality, you lose because no one will want to buy it.

• If on the other hand you spend infinite time, extremely large effort, and huge sums of money to build the absolutely perfect piece of software, then it's going to take so long to complete and it will be so expensive to produce that you'll be out of business anyway.

• Either you missed the market window, or you simply exhausted all your resources.

• So people in industry try to get to that magical middle ground where the product is good enough not to be rejected right away, such as during evaluation, but also not the object of so much perfectionism and so much work that it would take too long or cost too much to complete. [Ven03]

SLIDE 5

Quality Issues

• Cost: A balancing act

  – Loss of market share & eroded reputation due to bad software
  – Loss of market share due to late-arriving software
  – Recoverability of costs to improve/fix the software

• Risk

  – End use of software may have literal life-death consequences
  – Negligence/Liability: in calculating taxes, e.g., or estimating bridge cable tensile strength

• Security

  – Exploitation of vulnerabilities
  – Consequences range from benign (defaced website) to catastrophic (atomic weapon launch codes)

• A stitch in time!

  – Fixing an error ahead of time is relatively cheap
  – Releasing a product revision due to defects is expensive

SLIDE 6

Errors vs Defects

• Errors and defects

  – Error: a quality problem found before the software is released to end users
  – Defect: a quality problem found only after the software has been released to end users

• We make this distinction because errors and defects have very different economic, business, psychological, and human impact

• However, the temporal distinction made between errors and defects in this book is not mainstream thinking

SLIDE 7

Improving SW Quality

• Engineering practices

  – Requirements Planning & Management
  – Architectural & Software Reviews
  – Sustainable Process Model

• Testing

  – Unit Level Testing
  – Functional Testing
  – Exploitation Testing/Challenge
  – Regression Testing with each "release"
  – Sustainable QA Process Model

SLIDE 8

(Formal Technical) Reviews

• Requirements Review

  – Code walk-throughs
  – Functional walk-throughs
  – Test-Design planning (TDD)

• Technical Focus

  – Should be ego-neutral
  – Should be detailed exam of code, function, design
  – Usually only a few developers and/or testers

• Short & Sweet

  – 2 hours prep, 2 hours duration
  – 3-5 people total
  – Keep detailed meeting minutes

• Develop METRICS based on the FTRs

  – Errors and type (major (e.g., bad calc); minor (typos))

SLIDE 9

SLIDE 10

Defect Statistics

• According to the text:

  – a software process that does NOT include reviews
    • yields 94 errors at the beginning of testing and
    • releases 12 latent defects to the field

  – a software process that does include reviews
    • yields 24 errors at the beginning of testing and
    • releases 3 latent defects to the field

  – A cost analysis indicates that the process with NO reviews costs approximately 3 times more than the process with reviews, taking the cost of correcting the latent defects into account
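The comparison above can be turned into a small cost model. The error/defect counts (94/12 without reviews, 24/3 with reviews) come from the slide; the per-problem repair costs below are illustrative assumptions, not from the text, so the exact ratio depends on them.

```python
# Cost model for the review/no-review comparison on this slide.
HOURS_PER_ERROR_IN_TESTING = 45   # assumed person-hours per error caught in testing
HOURS_PER_LATENT_DEFECT = 135     # assumed person-hours per defect fixed in the field

def process_cost(errors_at_testing, latent_defects):
    """Total correction effort (person-hours) for one development process."""
    return (errors_at_testing * HOURS_PER_ERROR_IN_TESTING
            + latent_defects * HOURS_PER_LATENT_DEFECT)

no_reviews = process_cost(errors_at_testing=94, latent_defects=12)   # 5850
with_reviews = process_cost(errors_at_testing=24, latent_defects=3)  # 1485
print(no_reviews / with_reviews)  # roughly 3.9x under these assumptions
```

Under these assumed costs the no-review process comes out 3-4 times more expensive, in line with the slide's factor-of-3 claim.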

SLIDE 11

Textbook Example

• The effort required to correct a minor model error (immediately after the review) was found to require 4 person-hours.

• The effort required for a major requirement error was found to be 18 person-hours.

• Examining the review data collected, you find that minor errors occur about 6 times more frequently than major errors. Therefore, you can estimate that the average effort to find and correct a requirements error during review is about 6 person-hours.

• Requirements-related errors uncovered during testing require an average of 45 person-hours to find and correct. Using the averages noted, we get:

• Effort saved per error = Etesting – Ereviews

    45 – 6 = 39 person-hours/error

• Since 22 errors were found during the review of the requirements model, a saving of about 858 person-hours of testing effort would be achieved. And that's just for requirements-related errors.
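The arithmetic of this estimate can be redone directly; the weighted average falls out of the 6:1 minor-to-major frequency ratio:

```python
# Redoing the review-savings arithmetic from the slide.
MINOR_FIX = 4    # person-hours per minor error corrected in review
MAJOR_FIX = 18   # person-hours per major error corrected in review

# Minor errors are about 6x as frequent as major ones, so the weighted
# average review effort per error is (6*4 + 1*18) / 7 = 6 person-hours.
avg_review = (6 * MINOR_FIX + 1 * MAJOR_FIX) / 7

avg_testing = 45                                # person-hours per error found in testing
saved_per_error = avg_testing - avg_review      # 39 person-hours per error
total_saved = 22 * saved_per_error              # 22 errors found during review
print(avg_review, saved_per_error, total_saved)
```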

SLIDE 12

Review/No Review Comparison

• Effort expended with and without reviews

• Higher upfront manpower expenditure, which is why managers don't want to spend time on testing

SLIDE 13

Software Quality Assurance

• Everybody wants "quality" software
• Nobody wants to pay for it!
• Cultural tradition of very sloppy software

  – Banks with poor/little software security
  – Microsoft products with rampant security holes and bugs
  – Fuzzy concept of causes, consequences by general public
  – No clear incentives for high quality SW products

• Litigation?
• Damages/Awards?
• User vs Vendor responsibilities
• ISO 9001:2000 Standard

  – 20 requirements for effective QA
  – Cover all aspects from management responsibilities to statistical techniques

SLIDE 14

Elements of SQA

• Standards
• Reviews and Audits
• Testing
• Error/defect collection and analysis
• Change management
• Education
• Vendor management
• Security management
• Safety
• Risk management
SLIDE 15

SQA Goals (see Figure 16.1)

• Requirements quality. The correctness, completeness, and consistency of the requirements model will have a strong influence on the quality of all work products that follow.

• Design quality. Every element of the design model should be assessed by the software team to ensure that it exhibits high quality and that the design itself conforms to requirements.

• Code quality. Source code and related work products (e.g., other descriptive information) must conform to local coding standards and exhibit characteristics that will facilitate maintainability.

• Quality control effectiveness. A software team should apply limited resources in a way that has the highest likelihood of achieving a high quality result.

SLIDE 16

SLIDE 17

Software Reliability

• A simple measure of reliability is mean-time-between-failure (MTBF), where

    MTBF = MTTF + MTTR

• The acronyms MTTF and MTTR are mean-time-to-failure and mean-time-to-repair, respectively.

• Software availability is the probability that a program is operating according to requirements at a given point in time and is defined as

    Availability = [MTTF/(MTTF + MTTR)] x 100%
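The availability definition above is straightforward to compute; a minimal sketch, with illustrative figures (MTTF and MTTR must share the same time unit):

```python
# Availability computed from the slide's definition.
def availability(mttf, mttr):
    """Availability = [MTTF / (MTTF + MTTR)] x 100%."""
    return mttf / (mttf + mttr) * 100.0

# A system averaging 990 hours of up-time between failures and
# 10 hours to repair is available 99% of the time:
print(availability(mttf=990.0, mttr=10.0))  # 99.0
```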

SLIDE 18

Software Testing

• Correct errors before they become defects
• Look at performance, requirements of total system
• Testers should be independent of developers

  – Not part of supervisory chain
  – Get SW deliveries just like customer does
  – Schedule adherence is important
  – Testers can be coders, but should not be doing acceptance testing of their own code
  – Developers should do unit testing of their own code

• Developers tend to use "expected" tests
• Independent testers tend to use "unexpected" tests
SLIDE 19

Verification & Validation

• Verification refers to the set of tasks that ensure that software correctly implements a specific function.

• Validation refers to a different set of tasks that ensure that the software that has been built is traceable to customer requirements. Boehm [Boe81] states this another way:

  – Verification: "Are we building the product right?"
  – Validation: "Are we building the right product?"

SLIDE 20

SLIDE 21

Strategic Issues

• Specify product requirements in a quantifiable manner long before testing commences.

• State testing objectives explicitly.

• Understand the users of the software and develop a profile for each user category.

• Develop a testing plan that emphasizes "rapid cycle testing."

• Build "robust" software that is designed to test itself.

• Use effective technical reviews as a filter prior to testing.

• Conduct technical reviews to assess the test strategy and test cases themselves.

• Develop a continuous improvement approach for the testing process.

SLIDE 22

SLIDE 23

Regression Testing

• Regression testing is the re-execution of some subset of tests that have already been conducted to ensure that changes have not propagated unintended side effects.

• Whenever software is corrected, some aspect of the software configuration (the program, its documentation, or the data that support it) is changed.

• Regression testing helps to ensure that changes (due to testing or for other reasons) do not introduce unintended behavior or additional errors.

• Regression testing may be conducted manually, by re-executing a subset of all test cases, or using automated capture/playback tools.
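A minimal sketch of the manual flavor: previously passing checks are kept as a suite and a chosen subset is re-executed after each change. The function under test and the test names are invented for illustration.

```python
# Regression testing sketch: a suite of previously passing checks is
# re-executed (wholly or as a subset) after each change to the code.
def discount(price, pct):
    """Function under test (illustrative)."""
    return round(price * (1 - pct / 100.0), 2)

# Each entry: (test name, zero-argument check returning True on pass).
REGRESSION_SUITE = [
    ("no discount",     lambda: discount(100.00, 0) == 100.00),
    ("ten percent off", lambda: discount(100.00, 10) == 90.00),
    ("rounds to cents", lambda: discount(19.99, 15) == 16.99),
]

def run_regression(subset=None):
    """Re-run the named subset of the suite (all tests by default)."""
    return {name: check()
            for name, check in REGRESSION_SUITE
            if subset is None or name in subset}
```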

SLIDE 24

Smoke Testing

• A common approach for creating "daily builds" for product software

• Smoke testing steps:

  – Software components that have been translated into code are integrated into a "build."
    • A build includes all data files, libraries, reusable modules, and engineered components that are required to implement one or more product functions.
  – A series of tests is designed to expose errors that will keep the build from properly performing its function.
    • The intent should be to uncover "show stopper" errors that have the highest likelihood of throwing the software project behind schedule.
  – The build is integrated with other builds and the entire product (in its current form) is smoke tested daily.
    • The integration approach may be top down or bottom up.
SLIDE 25

WebApp Testing - I

• The content model for the WebApp is reviewed to uncover errors.

• The interface model is reviewed to ensure that all use cases can be accommodated.

• The design model for the WebApp is reviewed to uncover navigation errors.

• The user interface is tested to uncover errors in presentation and/or navigation mechanics.

• Each functional component is unit tested.
SLIDE 26

WebApp Testing - II

• Navigation throughout the architecture is tested.

• The WebApp is implemented in a variety of different environmental configurations and is tested for compatibility with each configuration.

• Security tests are conducted in an attempt to exploit vulnerabilities in the WebApp or within its environment.

• Performance tests are conducted.

• The WebApp is tested by a controlled and monitored population of end users. The results of their interaction with the system are evaluated for content and navigation errors, usability concerns, compatibility concerns, and WebApp reliability and performance.

SLIDE 27

SLIDE 28

Debugging Strategies

• Brute Force

  – Least efficient
  – Memory, run-time trace, examine stack, etc.

• Backtracking

  – Start at location of observed error
  – Work back through functions/modules in code
  – Not practical for large code base

• Cause elimination

  – Hypothesis-based approach
  – Induction/deduction on causes, each tested
  – Must know code very well

• Automated

  – IDEs often have tools to help trace errors
  – Other debugging tools that monitor state of variables, files, etc.

• People

  – An outside point of view can make the biggest difference in the shortest time
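Cause elimination can often be automated as a binary search over an ordered change history (the idea behind tools such as `git bisect`). A sketch, assuming at least one change is bad and every change after the faulty one also shows the bug:

```python
# Cause elimination by bisection: changes 0..k-1 are good, k..n-1 are bad;
# find k, the change that introduced the bug, in O(log n) probes.
def first_bad(is_bad, n):
    lo, hi = 0, n - 1
    while lo < hi:
        mid = (lo + hi) // 2
        if is_bad(mid):
            hi = mid          # bug already present: look earlier
        else:
            lo = mid + 1      # still good: look later
    return lo

# Illustrative history of 10 changes where change 7 introduced the bug:
print(first_bad(lambda i: i >= 7, 10))  # 7
```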

SLIDE 29

Correcting the Error

• Is the cause of the bug reproduced in another part of the program? In many situations, a program defect is caused by an erroneous pattern of logic that may be reproduced elsewhere.

• What "next bug" might be introduced by the fix I'm about to make? Before the correction is made, the source code (or, better, the design) should be evaluated to assess coupling of logic and data structures.

• What could we have done to prevent this bug in the first place? This question is the first step toward establishing a statistical software quality assurance approach. If you correct the process as well as the product, the bug will be removed from the current program and may be eliminated from all future programs.