Common System and Software Testing Pitfalls

Published on May 19, 2015

In spite of many great testing “how-to” books, the people involved with system and software testing (such as testers, requirements engineers, system/software architects, system/software engineers, technical leaders, managers, and customers) continually make many different types of testing-related mistakes. These commonly occurring human errors can be thought of as system and software testing pitfalls. When projects unwittingly fall into them, these pitfalls make testing less effective at uncovering defects, make people less productive at performing testing, and harm project morale.

Donald Firesmith has created a repository of 167 of these testing anti-patterns, analyzed and organized them into a taxonomy consisting of 23 categories, and documented each pitfall in terms of its name, description, potential applicability, characteristic symptoms, potential negative consequences, potential causes, recommendations for avoiding it and mitigating its harm, and related pitfalls.

In this webinar, Donald Firesmith introduces the concept of a testing pitfall, presents an underlying ontology of pitfalls, explains the taxonomy’s goals and potential uses, walks attendees through the taxonomy of testing pitfalls, and provides directions to further information, including his “how-not-to” testing book (Common System and Software Testing Pitfalls, Addison-Wesley, 2014).
