Mistakes We Make in Testing

My dad would always remind me that mistakes are ultimately just learning opportunities, and I've leaned on this supportive and generous advice many times in my career. Recently, an insightful testing newbie asked me an important question: "What are some of the most common mistakes we make as testers?" Here are some things that came to mind.

In the Early Project Stage

  • Not being involved at the start of development to review and provide feedback on artifacts such as user stories, requirements, and so forth
  • Not collaborating and developing a positive relationship with the development team members
  • Invoking processes, standards, or metrics just because "we've always done it that way"
  • Ignoring traceability and not understanding the dependencies and linkages between user stories, software components, and tests
  • Not being an advocate for "good enough" software
  • Not standing up for our goals, estimates, and time requirements
  • Not speaking in business terms: risks to business, customers, clients, etc.
  • Working in isolation
  • Saying, "If only they would do xyz, then everything would be fine!"
  • Not accepting reality; we know our timelines are going to change and slips are going to happen, so we need to accept it, plan for it, and deal with it

During Testing

  • Not building some level of expected rework into our schedule (the rule of thumb is 33 percent) during test planning
  • Not soliciting involvement from other key stakeholders and subject matter experts
  • Failing to take advantage of all the business intelligence that is available to us (user experience information, features used/not used, load profiles, and so forth)
  • Spending too much time on test case implementation and execution and not enough time on test analysis and design, where we get our highest payback on defect yield
  • Not reviewing and revising our test suites regularly
  • Automating the wrong things and not considering the ROI on automation
  • Submitting incidents that are not important, are not well documented, or include inflammatory language directed at individuals
  • Failing to support our opinions with facts and to keep emotion out of our readiness recommendations
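The 33 percent rework rule of thumb mentioned above is easy to make explicit during test planning. Here is a minimal Python sketch; the task names and day estimates are hypothetical placeholders you would replace with your own:

```python
# Rule-of-thumb share of effort to reserve for rework (see the 33 percent
# guideline above). Adjust the factor to match your team's history.
REWORK_FACTOR = 0.33

def with_rework(base_days, factor=REWORK_FACTOR):
    """Pad a base estimate with an explicit rework buffer."""
    return base_days * (1 + factor)

# Hypothetical planning estimates, in days.
tasks = {"test analysis": 5, "test design": 8, "implementation": 6, "execution": 10}
padded_total = sum(with_rework(days) for days in tasks.values())
```

Putting the buffer in the plan as a named line item, rather than quietly inflating each estimate, makes it easier to defend when schedules get challenged.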

When Selecting or Using Tools

  • Not having requirements in place before we select and implement testing tools
  • Not planning or budgeting for the care and maintenance of testing tools
  • Failing to plan for the integration of existing tools with new tools
  • Not having an overall tools architecture in place before we implement automation
  • Failing to train new members in the testing tools
  • Not keeping management informed about the benefits and ROI of our tools and automation
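The ROI conversations in the bullets above often come down to simple break-even arithmetic. A minimal sketch, with hypothetical costs (in hours) that you would replace with your own numbers:

```python
import math

def runs_to_break_even(build_cost, manual_cost_per_run, automated_cost_per_run):
    """Number of runs before automation becomes cheaper than manual execution."""
    savings_per_run = manual_cost_per_run - automated_cost_per_run
    if savings_per_run <= 0:
        return None  # automation never pays back on these numbers
    return math.ceil(build_cost / savings_per_run)
```

For example, a suite that takes 40 hours to automate and saves 1.5 hours per run pays for itself after 27 runs; a real calculation should also fold in ongoing tool care and maintenance, per the bullet above.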

In Our Careers

  • Limiting our marketability by remaining stagnant in our technical or domain knowledge and staying comfortable in our roles; it's important to learn new technologies and testing techniques, gain new perspectives, and refresh our core competencies
  • Not giving back to the testing community through speaking, publishing, or attending conferences
  • Failing to effectively communicate what we do and why it's important

These are just some of the learning experiences I've encountered in my career. Do you have any to add?

 

See Mike Sowers at Better Software West 2016.
