Rocky Mountain Information Management Association (RMIMA)
February 13, 2001, Lunch Meeting
Pete Dignan, ProtoTest
The business drivers of software quality:
Time-to-market (get it out)
Customer satisfaction (make ’em happy) and
Cost of support (don’t give ’em a reason to call us!)
Business Driver #1: Time-to-market
The “92%-complete syndrome”: design, code, test, code, test, code, test…
The Standish Group on project completion: http://www.standishgroup.com/msie.htm
Shipping software later than planned is costly, whether it’s a product OR an internal application.
Business Driver #2: Customer satisfaction
Customers have more choices than ever before in most application segments.
Make your customers unhappy - even internal customers - and you may lose them.
Customer expectations of software quality are up sharply in the past few years.
Customers now expect:
Easy, quick installation
Usability
Functionality
Reliability
Easy and effective support
Business Driver #3: Cost of support
Supporting a software product or an e-commerce site might have several cost components:
The obvious – a help desk or customer service center
The less obvious - time taken away from development, sales and other functions
Also – the cost of unplanned patches, service packs, custom fixes
The cost to operate a help desk may include:
Staffing costs, including benefits and overhead
Training costs for Help Desk staff
Cost of any training material or promotional material distributed
or made available to Help Desk customers
Cost of Help Desk Software and associated maintenance
Depreciation or leasing fees of Help Desk hardware
Facilities overhead (e.g. rent)
Better product quality means fewer bugs, which means lower support costs. It can also mean more effective delivery of defect information to supported customers.
How can your QA process affect these important business drivers?
Here’s a brief QA and test process primer:
Requirements are documented
Requirements are reviewed for testability
The traceability matrix is based on requirements
Late requirements are incorporated in the process
Design errors are exposed before coding
Design elements are associated with requirements (traceability matrix)
Testers are better able to understand system internals
Code review - typically performed in Development
Code review can be tied to traceability matrix
Code control - a check-out, check-in system for source and object code
Configuration management - includes all elements of a system, including 3rd party components
Master plan: risk assessment, test types, resources, schedules
Begins as soon as Requirements are known
Tests are tied to requirements in the traceability matrix
Tests are created while code is in development
Test cases and scripts are made for re-use (regression testing)
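The idea of tying tests to requirements in a traceability matrix can be sketched as a simple mapping; all requirement and test IDs below are hypothetical examples, not from the talk:

```python
# A minimal traceability matrix: each requirement mapped to the test
# cases that cover it (all IDs and names are hypothetical).
traceability = {
    "REQ-001 Login with valid credentials": ["TC-01", "TC-02"],
    "REQ-002 Lock account after 3 failed attempts": ["TC-03"],
    "REQ-003 Password reset by email": [],  # no coverage yet
}

# Coverage gaps: requirements with no associated tests.
uncovered = [req for req, tests in traceability.items() if not tests]
for req in uncovered:
    print("No test coverage:", req)
```

Even this trivial structure supports the practices above: tests are created against requirements, and gaps are visible before the code ships.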
Defects are recorded, prioritized and tracked
Team must decide which defect sources will be included
Access to defect information can support other functions
Many types of test automation: code checking (such as for memory leaks), code coverage, commercial capture-and-playback, custom test harnesses (beyond standard debugging tools).
Regression, load/stress testing are often well suited to automation
Automation is NOT a silver bullet (people, processes, THEN tools)
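Regression suites are a natural fit for automation because the same scripts are re-run unchanged against every build. A minimal sketch using Python’s standard unittest framework; the function under test is a made-up stand-in:

```python
import unittest

def discount(price, pct):
    """Function under test - a stand-in for real application code."""
    return round(price * (1 - pct / 100), 2)

class RegressionSuite(unittest.TestCase):
    """Scripted once, then re-run automatically on every new build."""

    def test_typical_discount(self):
        self.assertEqual(discount(100.0, 15), 85.0)

    def test_zero_discount(self):
        self.assertEqual(discount(59.99, 0), 59.99)

# Run with: python -m unittest <this_file>
```

The payoff is exactly the point made above: people and process decide what is worth scripting; the tool only makes the re-runs cheap.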
Metrics are useful in predicting schedules, defect density and defect removal.
E.g. - defect origin, defect density, time-to-test per KLOC or function point, etc.
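Defect density per KLOC is straightforward to compute once defects and code size are tracked; a sketch with made-up numbers:

```python
def defect_density(defects_found, lines_of_code):
    """Defects per thousand lines of code (KLOC)."""
    return defects_found / (lines_of_code / 1000)

# Hypothetical release: 46 defects logged against 23,000 lines of code.
print(defect_density(46, 23000))  # 2.0 defects per KLOC
```

Tracked release over release, a number like this is what lets a team predict schedules rather than guess at them.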
How can such processes be applied in this new e-business world? Here are four important trends in software and the implications for QA & Test:
Trend 1: Software is EVERYWHERE.
Years ago, software was mainly found on mainframes and minicomputers.
Today software is in your car’s engine, your cell phone, your PDA; there are even IP-addressable vending machines.
Software must run on more than one device (hardware issues), under more than one operating system (Windows, Unix, embedded), on more than one browser (IE, …)
There is more software to test, and testing becomes MUCH more complex.
There is a greater demand for people with software QA and testing skills than ever before.
I estimate there are around 5000 software QA and test people working in Colorado (plus or minus 20%); we could probably use another 500-1000 immediately.
We need people with both test skills AND domain knowledge.
Trend 2: Internet-based applications have MUCH shorter lifecycles.
Become more focused on informing management of knowns, unknowns and risks
Start the testing effort earlier in the development lifecycle
Preventing or fixing defects earlier costs 50-200 TIMES less than fixing them later.
Trend 3: Customer expectations of software quality are up sharply
in the past few years.
Customers now expect: Easy, quick installation; Usability; Functionality; Reliability; Easy and effective support.
If they don’t get those things they switch suppliers - and fast.
How do companies that create software respond to these higher expectations?
By making much more creative use of test automation (randomizing tests; data-driven testing)
Trend #4: New development methodologies are emerging, in particular
eXtreme Programming, or XP.
Do the simplest thing that could work (don’t over-engineer)
Stories as requirements
Write unit tests BEFORE coding
Refactor extensively: combining objects and methods that are similar
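Test-first means the unit test exists, and fails, before the production code is written. A minimal illustration of the sequence; the function name and behavior are hypothetical examples:

```python
import unittest

# Step 1: write the test first; it fails until slugify() below exists.
class TestSlugify(unittest.TestCase):
    def test_spaces_become_hyphens(self):
        self.assertEqual(slugify("Extreme Programming"),
                         "extreme-programming")

# Step 2: write the simplest thing that could work.
def slugify(title):
    return title.strip().lower().replace(" ", "-")

# Run with: python -m unittest <this_file>
```

The failing test is the specification; the code is only written to make it pass, which is how XP keeps “don’t over-engineer” honest.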
How can the QA profession keep pace with changes in development process? Working in an XP shop may require a tester to:
Assist developers with formal development of unit tests
Help users develop acceptance tests
Be creative in figuring out when and how to develop and perform other types of tests that aren’t yet well defined in the XP methodology, like performance testing
An online resource for software QA and testing articles and links:
Website of the Software Quality Association in Denver: www.geocities.com/denver_squad/index.html
A useful list of software QA and testing FAQs: www.softwareqatest.com
An in-depth look at Extreme Programming:
This site serves the Denver/Boulder Extreme Programming (XP) community:
This website tracks known bugs, plus fixes and workarounds, for PC software: www.bugnet.com/index.html
What do you do with intense competitive pressure, emphasis on immediate results, and frequent resource constraints? Pete Dignan will comment on some of the differences between traditional software testing and quality assurance for Internet applications. He recommends a pragmatic approach to software quality assurance. Come, listen and debate the issues.
Pete Dignan is the founder and President of ProtoTest, a company that specializes in software quality assurance and testing services. He has seventeen years of experience in the information technology field, involving system and application software, systems integration and project management. Pete has managed both product and custom software development organizations for companies such as Peak Technologies and MShow.com. He is a 1979 Phi Beta Kappa graduate of the University of Virginia. Pete serves on the advisory board for the CIS Masters program at University College, University of Denver, and on the board of the QAI-affiliated Software Quality Association in Denver.
Pete Dignan, President, ProtoTest LLC