As the Good Laboratory Practice (GLP) landscape continues to evolve with new guidance documents, refreshed FAQs, and the return of the Medicines and Healthcare products Regulatory Agency’s (MHRA) Laboratory Symposium, there’s plenty for facilities to stay on top of. In this post, Nicole explores the key areas you should be focusing on and shares practical steps, along with regulatory insights, to help your facility remain inspection-ready.
Test Items
The first component is Test Items. OECD Document 19 came out in 2018, and Test Items and the procedures around them are still very much a hot topic for the regulators. More emphasis than ever is being placed on the information provided to the facility, because facilities (and Sponsors) still seem to be missing the mark on what is expected.
Test Facility Management needs to ensure that all of the information required is available for the Study Director (SD) to use their expertise and judgement to confidently claim that the product they have received is what it purports to be, and that it is fit for use in the study.
The regulators acknowledge that Test Items across GLP can take all sorts of forms, which is why they have highlighted that a one-size-fits-all approach is not appropriate for all facilities. The characteristics that are critical for a medical device will not be the same as those for a viral clearance test, so if your facility is conducting a variety of study types, it’s important to make sure that the forms in use are fit for purpose.
Of equal importance is making sure that the staff involved in Test Item receipt and processing have the appropriate expertise and training to be able to follow the procedures. Also, try thinking outside the box on the forms if an ‘odd’ Test Item arrives.
Equipment
In a similar vein, the equipment within GLP facilities comes in all shapes and sizes, but the bottom line is that it all needs to be fit for purpose, qualified, and maintained appropriately.
Typically, from the MHRA findings, it’s balances, fridges, and freezers that crop up as the most problematic on inspection, not the more ‘complicated’ equipment that you might suspect to be the culprit. Many of the findings stemmed from gaps in procedures guiding routine use of the equipment: no requirement to ensure balances are level before use; a requirement for balances to be appropriately positioned before use but no confirmation that the check was completed; or no procedure for what to do when a calibration fails.
For fridges and freezers, it was primarily temperature calibration where issues arose: procedures may have required mapping of the equipment, but only one point was calibrated; or inbuilt displays were used to record data, but those screens had never been calibrated to confirm their accuracy. To avoid these pitfalls, the MHRA shared a handy flow chart that can be used for simple and complicated equipment alike. At its most basic level, the important points to hit are making sure:
- The User Requirement Specification (URS) you have developed in determining your need for the equipment is consistent with the specification provided by the supplier.
- Any Installation Qualification (IQ) activities performed by the supplier meet your facility requirements (the same for Operational Qualification (OQ)), and
- The Performance Qualification (PQ) activities test your specific requirements, including all the regulatory requirements.
Incorporating a risk-based approach and data flows can help ensure that no aspect of risk to your data safety and reproducibility is missed.
Computer System Validation (CSV)
The seemingly more complicated area is CSV. Many of the risks involved in CSV are the same as for equipment qualification, yet people can tie themselves in knots over how to tackle dreaded CSV processes.
Remember: systems can’t think for themselves (yet), and they still do what you tell them to. A risk-based approach can be used for everything from commercial off-the-shelf software to bespoke systems.
When validating:
- Consider the whole data flow and define what raw data is (first point of capture)
- Plan validation before regulatory use – avoid retrospective validation post-inspection
- Define validation approach and IT security in policies and procedures
Reference OECD 17 and 22 for guidance on what to include in your validation framework.
Once live, periodically review systems to ensure they remain compliant and reflect updated regulations. The review frequency should be based on system risk and must be documented.
Data Integrity (DI)
Closely linked with computer systems and risk is data integrity (DI) – a hot topic with the MHRA!
The agency gave their first DI presentation in 2015 and, to help us, produced the homegrown GxP Data Integrity Guidance in 2018, followed by OECD Document 22 in 2021. It’s no longer acceptable to just add procedural controls around antique systems if they aren’t fit for purpose; the regulators now expect you to have a plan. If a modern solution exists, why has it not been implemented? (And be aware that they are less likely to accept budget as an excuse for not upgrading to compliant systems.) The Data Integrity risk assessments that should have been completed after the release of OECD 22 should not be static documents. The world of data integrity is constantly evolving, so these risk assessments should be periodically reviewed to ensure they remain an accurate reflection of the risks to data integrity in your facility.
Be cautious… you don’t want to make your procedures too prescriptive with so many contingencies that the system becomes burdensome and unmanageable, so it is important to consider people while implementing data integrity controls.
Quality Assurance (QA)
Quality Assurance can sometimes be called ‘The Dark Side,’ but is it a necessary evil or a blessing in disguise?
Properly implemented, an effective QA programme will provide assurance that the activities taking place within your facility comply with the requirements of GLP.
Inspections may be:
- Process-based – for routine/repetitive tasks
- Study-based – for new or novel procedures
- Facility-based – for activities not directly linked to studies
If you choose to structure your QA programme on a risk-based model, ensure that you’ve documented your assessment and your approach, and that you execute it as intended. QA is not immune to deviations. If something unexpected happens and you cannot carry out the inspections as planned, document the excursion and your corrective actions, and make sure the SDs are made aware so that they can assess any impact on their studies.
If QA doesn’t provide appropriate oversight, the SD should not sign the study’s compliance statement.
Vendor and Test Site Oversight
The last (but vital) component is oversight of vendors, suppliers, and test sites.
If using vendors for IQ, OQ, PQ, calibration, or maintenance:
- Ensure contracts are in place
- Ensure their criteria match your facility’s needs
- Review all certificates and documentation for compliance (e.g. ensure “Pass/Fail” criteria are clear)
Pitfalls in this area can be avoided by including vendor qualification with any equipment or system purchase, ensuring that service agreements reflect what you actually need, and having staff review the certificates and activities that have been performed.
If you’re going to be performing a multi-site study:
- Confirm test sites have current GLP certificates
- Perform on-site or desk-based assessments
- Maintain strong communication between Principal Investigator (PI) and SD to resolve issues and maintain oversight
If you have any questions or want to know how Tower Mains can support ensuring your facility hits the mark, don’t hesitate to get in touch with us here.
