
Top 10 V&V Fails: Don’t Let These Common Mistakes Derail Your Verification and Validation Program

The successful market introduction of a medical device requires design verification and validation (V&V) testing, per 21 CFR 820.30 and ISO 13485 clause 7.3. Even for experienced teams, V&V can be very challenging, requiring tight coordination of multiple activities and correct execution of many preceding activities. I have worked on V&V testing for a wide range of medical devices, in small startups and large enterprises, and I’ve seen many of the same problems crop up again and again. This “Top 10 List” summarizes these recurring mistakes. If your product development team can avoid them, you stand a much better chance of successfully launching your new medical device.

1. Inadequate design inputs

The “design inputs”, the documentation that drives the design of a new product, are the foundation for good V&V testing. Poorly written or incomplete product requirements, system requirements, software requirements, hardware requirements, etc., mean that there is not a clear basis for testing. Specifically, take care to avoid these common mistakes that undermine your ability to do V&V testing:

  • Requirements that cannot be verified as written, for example “product shall be easy to use” or “product is as good as product X.”
  • Requirements that are out of date because the design has changed, but the specifications have not been updated to reflect new features or altered behavior.
  • Poorly written or incomplete design risk analyses mean that the test program may not focus sufficient resources on safety-critical aspects of the product. This can even lead to safety features being omitted entirely from testing!
  • Incomplete list of applicable standards: If you overlook standards that apply to the new product, or use out-of-date versions of those standards, your product testing will be incomplete.

2. Uncontrolled test articles 

A complete product Bill of Materials (BOM) must list the hardware revisions and software versions for the final assembly or sub-assembly being tested, so that the device under test can be accurately identified and connected to the test results. All design documentation must be under change control. If the version and revision levels of the test samples are not defined and documented, you cannot show that test results measured today will apply to the manufactured product in the future (something passed the test, but we don’t know exactly what it was).
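
As a minimal sketch of this bookkeeping (not a substitute for a controlled BOM; all part numbers, field names, and values below are hypothetical), a test record can capture the exact configuration of the device under test alongside the result:

    # Sketch: tie each test result to an exact, identifiable device configuration.
    # All identifiers below are hypothetical examples.
    from dataclasses import dataclass, asdict
    import json

    @dataclass(frozen=True)
    class DeviceUnderTest:
        part_number: str       # top-level assembly part number from the BOM
        hardware_rev: str      # hardware revision of the assembly tested
        software_version: str  # released software/firmware version
        serial_number: str     # unique identifier of the test unit

    dut = DeviceUnderTest("ASSY-1234", "C", "2.1.0", "SN-0042")

    test_record = {
        "protocol_id": "VER-PROT-015",  # approved, change-controlled protocol
        "dut": asdict(dut),
        "result": "PASS",
    }
    print(json.dumps(test_record, indent=2))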

3. Product tested is not the final design

V&V demonstrates that the design of a new product fulfills its requirements (“design outputs conform to design inputs”) and meets the needs of the user. V&V should not be used to investigate a new design or learn about a new technology; exploratory or investigational testing should be complete before proceeding to V&V testing. If it is not, you risk expensive repeats of V&V testing with the final design. This issue is especially common with products involving software: any design change after V&V testing, even if it is “just a software change”, triggers a repeat of all the testing, unless the project team can justify which tests do not need to be repeated. A related problem is the use of validation test samples for clinical trials that were not fabricated using the final manufacturing process.

4. Insufficient test sample size

There is inherent variability in the parameters of all of the components and assemblies in a product. Testing must demonstrate that the final product can meet its specifications in spite of this variability: you must test enough samples to accurately assess the variability in a population of components or set of assemblies. The sample size needed for a particular test is a function of how close the sample mean is to the specification limit, the spread of the measurements (standard deviation), and the desired reliability and confidence thresholds. In turn, these reliability and confidence thresholds depend on the design risk analyses for safety-critical components and functions.
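
For example, for simple attribute (pass/fail) testing with zero allowed failures, a common way to size the test is the success-run relationship n = ln(1 − C) / ln(R), where R is the required reliability and C the required confidence. The short sketch below assumes that attribute-test model; variables data or plans allowing failures need a different calculation:

    # Success-run (zero-failure) sample size, assuming attribute (pass/fail) data.
    import math

    def success_run_sample_size(reliability: float, confidence: float) -> int:
        """Smallest n that, with zero failures, demonstrates `reliability`
        at the given `confidence`: n = ln(1 - C) / ln(R)."""
        return math.ceil(math.log(1.0 - confidence) / math.log(reliability))

    # 95% reliability demonstrated with 95% confidence requires 59 passing samples.
    print(success_run_sample_size(0.95, 0.95))  # -> 59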

5. Test tools not validated

You must ensure and document that all of the test tools used in V&V testing, including test equipment, custom fixtures, test software, etc., are working properly before performing the tests. If one of your test tools has not been calibrated, it calls into question all of the test data derived from that tool. The validation (qualification) of test tools should be commensurate with their complexity and with the risks associated with their use.

6. Test protocol documents are not rigorously managed

You must carefully manage the test protocol documentation that defines the test instructions, or you will undermine confidence in the evidence generated by the testing. Agree on the method of testing before executing the tests; this ensures that the test instructions reflect the true intent of the requirements and avoids the tendency of project teams to “move the goal posts after the ball was kicked.” Review and approve test protocols before testing begins, and manage them under change control once it starts. Define your test protocols with enough rigor to provide clear objective evidence for pass/fail conclusions. Your test personnel must know how to execute these protocols exactly: they need to understand the test methods fully and pass training on the relevant quality system procedures.

7. Poorly developed test methods and test tools

Your test results will be no more reliable than the quality of your test methods. Experienced teams understand that developing test methods that are sensitive, precise, and robust can be very challenging. They don’t leave this critical work to the last minute in a project or have it performed by personnel who are not familiar with the product technology and application. At the start of the design process, make sure that your project has allocated enough resources and time for the development of custom test fixtures and test software to support new test methods.

8. Inadequate software testing

Software can be the most complex part of a medical product, and it is subject to additional regulatory requirements for proper testing and documentation (such as the IEC 62304 standard). Here are some common mistakes in software testing:

  • Failure to test the software thoroughly with the final hardware.
  • Failure to test for fault conditions or other conditions resulting from user error.
  • Failure to perform regression testing following any software change.
  • Poor configuration control (source code control, control of software builds, system configuration control, control of stored software parameters, etc.) will undermine rigorous software testing.
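
One minimal sketch of the configuration-control point above: fingerprint the software build and its stored parameters so that any change is detectable and can trigger regression testing. The file names here are hypothetical examples:

    # Sketch: detect configuration drift between test runs.
    # File names are hypothetical examples.
    import hashlib
    from pathlib import Path

    def build_fingerprint(paths):
        """Return a SHA-256 digest over the build artifacts and stored parameters."""
        digest = hashlib.sha256()
        for p in sorted(paths):
            digest.update(Path(p).read_bytes())
        return digest.hexdigest()

    # Record the fingerprint in the test report; if it changes, repeat regression
    # testing (or document a justification) before release.
    # fingerprint = build_fingerprint(["firmware_v2.1.0.bin", "parameters.cfg"])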

9. Poor test planning

Good V&V preparation can take weeks or even months of work. Inexperienced project teams often underestimate the time and effort needed to perform V&V on a new medical product. Typically, the greater the complexity of a new product, the greater the proportion of project time that needs to be devoted to test preparation and testing relative to design work. Some common symptoms of poor test planning are:

  • Starting preparations for V&V too late
  • Unclear responsibilities for V&V testing
  • No trace matrix to ensure full test coverage
  • Test emphasis that does not correspond to the risk analysis, so critical safety features are not tested thoroughly
  • Testing that covers the “easy” aspects of product functions but not the most important functions
  • Rushing to schedule external testing at a test lab
  • Omitting applicable standards

10. Testing is not comprehensive

V&V testing is often the last chance to catch product flaws before product release: it is expected to cover all aspects of the new product. Developing detailed trace matrices, which trace requirements to tests and risk mitigations to tests, is an important way to ensure comprehensive coverage (see the sketch after the list below). Important parts of the product that may be overlooked in the V&V test program include:

  • Product packaging (primary and secondary)
  • Product labeling
  • Product shipping and shelf-life/stability testing (including accelerated aging)
  • Product sterilization and biocompatibility (if applicable)
  • Product service features (if applicable)
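
As a simple illustration of the trace matrix mentioned above (requirement, risk-control, and test IDs are hypothetical), the basic coverage check that a good test management tool automates looks like this:

    # Sketch of a trace-matrix coverage check; all IDs are hypothetical.
    requirements = {"REQ-001", "REQ-002", "REQ-003"}  # design inputs
    risk_controls = {"RC-01", "RC-02"}                # mitigations from the risk analysis

    trace = {                                         # test -> items it verifies
        "TEST-101": {"REQ-001", "RC-01"},
        "TEST-102": {"REQ-002"},
    }

    covered = set().union(*trace.values())
    print("Uncovered requirements:", requirements - covered)    # {'REQ-003'}
    print("Uncovered risk controls:", risk_controls - covered)  # {'RC-02'}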

Want to give your product development team a real edge in testing? Implement a good requirements management / test management tool to automate all the bookkeeping associated with V&V testing. Good software tools will not only maintain traceability of risks and requirements to tests, but will also speed record keeping during test execution (“paperless V&V”) and efficiently manage multiple re-tests.

Originally posted on 05/25/17 at https://www.mdtmag.com
