Lab Compliance Archives - Astrix

Why Data Integrity is Needed for a Digital Transformation

Digital Transformation Needs Data Integrity to Succeed

To be effective, digital transformation efforts to improve business operations require reliable data. The exponential growth of business data, along with advances in cloud computing, artificial intelligence, and the Internet of Things, has ushered in a new era of digital transformation around the world – making data integrity a critical business need.

In a survey of 300 Chief Data Officers (CDOs) at companies with 2,500 or more employees in the Americas, EMEA, and Asia Pacific, conducted by Precisely and Corinium Global Intelligence, 82% of C-level data executives said that data concerns represent an obstacle to data integration plans. Additionally, 80% stated that it is difficult to ensure data is continually supplemented with the right context at scale, which helps them make better business decisions.1

How Data Integrity Assists in a Digital Transformation

In the Life Sciences industry, data integrity facilitates compliance and increases agility and efficiency. It enables organizations to move more expeditiously with confidence in their data. A digital transformation aims to digitize the organization’s operations by leveraging the right technology throughout the business and across departments. By directing the company’s efforts towards ensuring data integrity, which includes the accuracy, validity, and consistency of the data, the digital transformation can lead to lower costs, improved product quality and patient safety. This in turn leads to improved profitability and growth of the organization.

How to Maintain Data Integrity During a Digital Transformation

Bad data is a liability for the organization. It is therefore imperative that data integrity be treated as equally important during the digital transformation process. Management needs to cultivate an environment that treats data integrity as a key element of success in digitizing the organization.

To ensure data integrity, automation of data entry should be considered of utmost importance. Human error is inevitable when data is entered manually. Automating data entry is a far more efficient and effective way to capture data at scale. Forrester research has found that organizations can cut operating costs by up to 90% by automating data entry.

In the Life Sciences industry, data integrity is not only important for the efficiency and effectiveness of the business; it also helps organizations avoid regulatory fines. Incorporating regulatory compliance into your data integrity strategy helps ensure that you avoid unexpected fines and adhere to industry best practices. It is therefore important that organizations be aware of regulatory requirements and build procedures that conform to them.

It’s essential for the organization to understand the flow of data across systems and processes. Wherever front-end systems are used to add data to the process, the business should make it regular practice to validate inputs. This helps ensure that employees submit correct information.
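
As a simple illustration of front-end input validation, a handful of checks applied before a record is accepted can catch most manual-entry mistakes. This is a minimal sketch; the field names and rules below are hypothetical, not taken from any specific system:

```python
import re

# Hypothetical validation rules for a manual sample-entry form.
RULES = {
    "sample_id": lambda v: bool(re.fullmatch(r"[A-Z]{2}\d{2}-\d{4}", v)),  # e.g. "SF03-0001"
    "ph":        lambda v: 0.0 <= float(v) <= 14.0,                        # pH must be physically plausible
    "analyst":   lambda v: len(v.strip()) > 0,                             # attributable: no blank analyst
}

def validate_record(record: dict) -> list[str]:
    """Return a list of validation errors; an empty list means the record is acceptable."""
    errors = []
    for field, check in RULES.items():
        value = record.get(field)
        try:
            if value is None or not check(value):
                errors.append(f"invalid or missing field: {field}")
        except (TypeError, ValueError):
            errors.append(f"malformed value for field: {field}")
    return errors

print(validate_record({"sample_id": "SF03-0001", "ph": "7.2", "analyst": "J. Doe"}))  # []
print(validate_record({"sample_id": "line3", "ph": "99", "analyst": ""}))             # three errors
```

Checks like these can run in the front-end form itself or in the service that receives the submission; the point is that the rules are written down once and enforced consistently, rather than relying on each analyst to remember them.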

Summary

A Digital Transformation requires data integrity to be successful. A digital transformation aims to digitize the organization’s operations by leveraging the right technology throughout the business and across departments. By directing the company’s efforts towards ensuring data integrity, the digital transformation can lead to lower costs, improved product quality and patient safety along with profitability and future growth.

In order to maintain data integrity as part of the digital transformation, it is imperative that management cultivates an environment that focuses on data integrity. The organization also needs to automate processes to reduce human error, and to follow regulatory requirements so as to avoid regulatory actions.

Why It Matters To You

Given the importance of a digital transformation to a Life Sciences organization, it is imperative that data integrity across the organization is maintained. In this blog post we discuss:

  • Why a Digital Transformation requires data integrity to be successful.
  • How data integrity ensures a successful digital transformation.
  • Key considerations to maintain data integrity with a digital transformation.

About Astrix

For over 25 years, Astrix has been a market leader in delivering innovative solutions through world-class people, process, and technology that fundamentally improve scientific outcomes and quality of life everywhere. Founded by scientists to solve the unique challenges life sciences and other science-based businesses face, Astrix offers a growing array of strategic, technical, and staffing services designed to deliver value to clients across their organizations. To learn the latest about how Astrix is transforming the way science-based businesses succeed today, visit www.astrixinc.com.

 

Agile Regulatory Compliance: How quickly do your current processes allow you to respond to regulatory changes?

Staying current with ever-changing regulatory compliance regulations can be an overwhelming and time-consuming task for life science organizations. With changes released nearly every day, if you’re not referencing the most recent guidance documents, you may be introducing unnecessary compliance risks, including citations, fines, and failed submissions – ultimately wasting valuable time, money and resources.

Throughout the drug development lifecycle, it is critical to have a comprehensive and accurate view of today’s regulatory landscape. Additionally, the emergence of new technologies has expanded the scope and frequency of change to regulatory requirements. Monitoring ongoing daily reports and a vast array of government websites may offer moderate success in obtaining all of the necessary information regarding updated compliance guidance. However, the challenge still lies in translating those changes to the everyday workflows and processes that occur within the laboratory and manufacturing environment.

The Agile Response to Regulatory Change

The key question to address is: how quickly do your current processes allow you to respond to regulatory changes? Effective regulatory change management requires an agile process to administer the consistent influx of new guidelines in a cohesive and timely manner. Global and multi-site operations face even greater complexity in harmonizing regulatory change and integrating policy management across all regions of the organization.

The response to this challenge is agile, real-time compliance. To achieve regulatory efficiency throughout all areas of your organization, real-time compliance management can provide a sustainable solution to this complex process. Clarivate’s Regulatory Intelligence Tracking App (ClaRITA) is a business-specific regulatory monitoring and impact assessment platform that provides pharmacovigilance, safety, clinical, and technical operations teams the information they need to monitor changing legislation and regulations and remain in compliance.

ClaRITA is a web-based regulatory workflow solution that interacts with Cortellis Regulatory Intelligence™ data to effectively manage the regulatory actioning lifecycle. Through continuous daily surveillance of regulatory news worldwide, ClaRITA enables you to streamline and automate regulatory monitoring and impact assessments to facilitate real-time compliance in the following ways:

  • Provides a single source of truth for regulatory updates
  • Issues automated daily alerts regarding new and important regulatory changes and priority classifications
  • Processes daily Cortellis Regulatory Intelligence (CRI) updates to obtain comprehensive, global, and expertly-analyzed information that spans all regulatory functions across the R&D lifecycle
  • Requires user authentication and allows for role-based access control for GxP compliance
  • Informs users of pre-publication Triage and Impact Assessment to determine the potential impacts of regulatory proposals
  • Manages assigned tasks, user access to tasks, and actioning of tasks
  • Reports performance on task resolution and trending by groups and users
  • Provides collaboration and status tracking tools

Streamlining the regulatory alert process allows you to efficiently manage and track responses to regulatory changes, increasing transparency across the organization to ensure that tasks are acted upon by required teams within the needed timeframe.
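
ClaRITA’s internal data model is not public, but the actioning lifecycle described above can be illustrated generically. The sketch below is a hypothetical state machine for moving a regulatory alert from receipt through triage, impact assessment, and task resolution; all names are illustrative and are not ClaRITA’s API:

```python
from enum import Enum, auto

# Hypothetical stages of a regulatory-alert actioning lifecycle (not ClaRITA's actual model).
class AlertStatus(Enum):
    RECEIVED = auto()          # daily regulatory update ingested
    TRIAGED = auto()           # priority classification assigned
    IMPACT_ASSESSED = auto()   # potential impact on SOPs and processes evaluated
    TASKS_ASSIGNED = auto()    # remediation tasks routed to responsible teams
    CLOSED = auto()            # all tasks resolved and verified

# Allowed transitions enforce that no stage of the lifecycle is skipped.
ALLOWED = {
    AlertStatus.RECEIVED: {AlertStatus.TRIAGED},
    AlertStatus.TRIAGED: {AlertStatus.IMPACT_ASSESSED},
    AlertStatus.IMPACT_ASSESSED: {AlertStatus.TASKS_ASSIGNED, AlertStatus.CLOSED},
    AlertStatus.TASKS_ASSIGNED: {AlertStatus.CLOSED},
}

def advance(current: AlertStatus, target: AlertStatus) -> AlertStatus:
    """Move an alert to the next status, rejecting any transition that skips a stage."""
    if target not in ALLOWED.get(current, set()):
        raise ValueError(f"illegal transition: {current.name} -> {target.name}")
    return target

status = AlertStatus.RECEIVED
status = advance(status, AlertStatus.TRIAGED)           # fine
status = advance(status, AlertStatus.IMPACT_ASSESSED)   # fine
# advance(status, AlertStatus.RECEIVED) would raise ValueError
```

Encoding the lifecycle as explicit states and transitions is what makes performance reporting and status tracking possible: every alert is always in exactly one well-defined stage.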

[Figure: Example ClaRITA workflow and document status assignment]


Conclusion

It is of critical importance for regulated industries to comply with the most current regulatory guidelines. Having a real-time compliance strategy in place that allows for automated regulatory compliance monitoring and an agile change management approach to quickly implement and track performance on these tasks will significantly streamline the regulatory actioning lifecycle across your organization.

Why it Matters for You

Throughout the drug development lifecycle, it is critical to have a comprehensive and accurate view of today’s regulatory landscape. Effectively managing ever-changing compliance regulations is a daunting, time-consuming task for life science organizations. An agile, real-time compliance management strategy can provide a sustainable solution to this complex process in the following ways:

  • Adopting an agile regulatory change management strategy will enable you to achieve regulatory efficiency throughout all areas of your organization.
  • Streamlining the regulatory alert process will allow you to efficiently manage and track responses to regulatory changes, increasing transparency across the organization to ensure that tasks are acted upon by required teams within the needed timeframe.
  • Implementing Clarivate’s Regulatory Intelligence Tracking App (ClaRITA) will provide a business-specific regulatory monitoring and impact assessment platform that equips pharmacovigilance, safety, clinical, and technical operations teams with the information they need to monitor changing legislation and regulations and remain in compliance.
  • Continuously monitoring regulatory news across the globe with Cortellis Regulatory Intelligence (CRI) provides automated daily updates and comprehensive, expertly analyzed information that spans all regulatory functions across the R&D lifecycle.

About Astrix

Astrix partners with many of the industry leaders in the informatics space to offer state of the art solutions for all of your laboratory informatics needs. With over 25 years of industry proven experience, Astrix has the informatics specialists and business process analysis tools required to develop and implement the solution that works best for your enterprise. Our domain experts have helped hundreds of companies globally effectively navigate their digital transformation journey, connecting people, processes and systems to accelerate the advancement of science and medicine.

“Lean Methodologies” Revolutionize Pharma Quality Control Labs

The pharmaceutical quality control (QC) laboratory serves one of the most vital functions in pharmaceutical production for both in-process and finished product testing and control. The QC lab is responsible for monitoring the quality of raw materials and supplies used in the manufacturing process as well as batch release process control. A common concern for all QC laboratory environments is the amount of manual processes that exist within the workflow, leading to inefficient utilization of resources, decreased reliability, and ultimately longer time to market.

Lean Methodologies have long been used in industry to improve process flows and to drive increased efficiency. This approach has typically not been applied to the same extent in the laboratory due to the increased complexity and lack of applicable tools. Incorporating lean principles into the QC laboratory workflow will lead to overall process improvement and has the potential to significantly transform your operations into the digital lab of the future.

PlanDomino, recently named one of the 100 Hot Start-Ups by the Sunday Business Post, addresses this complexity by applying the established principles of Lean Management to laboratory workflows and providing a unique and compelling cloud-based interface for the management of resources and tasks.

Lean Management Enables the Digital Lab of the Future

PlanDomino is revolutionizing quality control laboratories with its unique digital lab resource scheduler and capacity management platform, designed to efficiently allocate resources and manage testing workflow. This next-generation laboratory scheduling software solution is an enabler of the Lab of the Future, driving more predictable and efficient job tracking with increased visibility and automated performance metrics.

The Lean concepts applied through the PlanDomino solution are described below:

  • Levelling: Apply levelling techniques to effectively manage the variability of incoming workloads resulting in a more productive and less stressful working environment.
  • Flow: Create standardized workflows to reduce test cycle times and work in progress to achieve an optimized sample and test management system.
  • Standard Work: Gain control over available resources, improve test repeatability and reduce on-boarding time by standardizing routine work in the laboratory, including non-test activity.
  • Visual Management: Improve management of issues, investigations, resources and overall communication through visual representation of the entire workflow.
  • Continuous Improvement: Improve efficiency and reduce waste by balancing the production line through one-piece flow lean methodology.

Adopting these lean management principles within your QC laboratory operations will offer significant improvements in efficiency, quality and cycle time.
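
To make the levelling concept concrete, the sketch below shows one simple way to balance incoming work: greedily assigning each test to the least-loaded analyst, largest jobs first. This is a generic illustration of workload levelling, not PlanDomino’s actual scheduling algorithm, and the test names and durations are hypothetical:

```python
import heapq

def level_workload(tests: list[tuple[str, float]], analysts: list[str]) -> dict[str, list[str]]:
    """Greedily assign each test (name, estimated hours) to the least-loaded analyst."""
    heap = [(0.0, name) for name in analysts]   # (accumulated hours, analyst)
    heapq.heapify(heap)
    schedule = {name: [] for name in analysts}
    # Assign the longest tests first so large jobs don't land on an already-busy analyst.
    for test, hours in sorted(tests, key=lambda t: -t[1]):
        load, analyst = heapq.heappop(heap)
        schedule[analyst].append(test)
        heapq.heappush(heap, (load + hours, analyst))
    return schedule

print(level_workload([("pH", 0.5), ("assay", 3.0), ("dissolution", 2.0)], ["A", "B"]))
# {'A': ['assay'], 'B': ['dissolution', 'pH']}
```

Even this naive greedy rule smooths the ups and downs of incoming workload; a real scheduler would also account for instrument availability, priorities, and due dates.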

Transform Decision Making with New Data

Data driven decision making is the ultimate goal of any successful digital transformation strategy. Ongoing monitoring of key indicators provides early visibility of potential issues within the overall workflow. Through digital transformation initiatives, the automation of your laboratory processes establishes the framework for the systematic monitoring of performance metrics. With PlanDomino, previously unavailable data now becomes accessible for real-time visualization and analysis. This step-change will allow for improved data visualization to gain the following efficiencies throughout your workflow:

Current Procedures and Limitations

  • Difficult & Time Consuming to Measure
  • Static Boards with Limited Information
  • Expert Advice Required to Sustain Operations

Improved Process with PlanDomino Solution

  • Automated Real-time & Historical Metrics
  • Digital Display Harmonized Across Teams
  • System Encourages & Supports Continuous Improvement

The PlanDomino dashboard views provide clear, actionable displays to monitor laboratory performance, while utilizing artificial intelligence (AI) and Data Science tools to facilitate lab automation and real-time monitoring across your operations.

Conclusion

Attaining the lab of the future requires a focus beyond what is prescribed by traditional digital transformation protocols. The elimination of manual processes, while automating laboratory operations and standardizing data repositories, is of fundamental importance in the overall digital strategy. The real step-change for the manufacturing QC laboratory occurs through the adoption of lean management practices and tools layered within this digitally transformed environment. PlanDomino provides the gateway to the digital laboratory of the future by driving more predictable and efficient job tracking, with increased visibility across your operations and automated, real-time performance metrics for quality, science-based decision making.

Why It Matters for You

The quality control laboratory provides one of the most important functions within the pharmaceutical company. The successful digital strategy for the QC environment will gain significant operational efficiency through the incorporation of lean management practices and tools. The benefits of this approach include:

  • Levelling the flow and workloads across teams and labs automatically manages the ups and downs of workload variability. Priorities are automatically updated as the environment changes. Adjusting your schedule is made easy with the PlanDomino user interface and workflow manager.
  • Establishing flowed testing paths throughout the lab offers the quickest path to completion, resulting in shorter cycle times while avoiding disruptions, delays and elevated volumes of work in progress (WIP).
  • Improving data-driven decision making through automated access to a multitude of performance data delivered in real time to customized dashboards.
  • Driving continuous improvement through PlanDomino’s analysis of your workflow and process data, which recommends where to concentrate efforts for the maximum benefit to your operations.


Strategies for Surviving an FDA Audit

As part of its mission to ensure public safety, the FDA inspects manufacturers or processors of FDA-regulated products to verify that they comply with relevant regulations. The FDA also inspects facilities that conduct clinical trials with humans and laboratories that conduct studies with animals or microorganisms when these studies are used to apply for FDA approval of a medical product. A typical inspection can last for 2-3 days and involves a number of key steps that you should be aware of.

In the United States, the FDA is not required to provide advance notice of an inspection. In facilities where violations were noted during a previous inspection, the FDA will likely provide no advance notice. If your last inspection was without violation or if this is a pre-approval visit, however, you will likely be given notice of an impending inspection.

There are few events that engender as much anxiety for company stakeholders as an FDA audit. Are you prepared for when the FDA shows up at your front door? Do your employees know how to interact with FDA inspectors when they are on site? Do you have Standard Operating Procedures (SOPs) in place for handling an FDA audit?

Surviving an FDA audit can be significantly less daunting if you know what to expect and have a strategy in place to ensure the best possible outcome. In this blog, we will provide best practice guidelines to help you prepare for an FDA inspection, as well as understand how to best interact with the inspector – both while they are on site and post-inspection.

Pre-Inspection Best Practices

There are a number of best practices that are important to accomplish prior to any FDA inspection.

Understand the Regulations. The most important preparation is to be knowledgeable about FDA regulations and compliant with them in your daily operations. We have discussed a number of FDA Guidance documents and compliance best practices in previous blog posts.

Know the Types of Audits. The FDA conducts several types of inspections in its efforts to protect consumers from unsafe products:

  • Pre-approval inspections occur after a company submits an application to the FDA to market a new product.
  • Routine inspections of regulated facilities are the most common type of audits performed.
  • “For-cause” inspections can occur when a specific problem has come to the attention of the FDA.

FDA inspectors follow Compliance Policy Guides (CPGs) that detail the steps involved in each type of inspection, although they do not limit the activities of an inspector. Understanding the type of inspection to be conducted and referencing the appropriate CPG will help you know the length of time inspectors will be at your facility and the type of documentation that you will need to provide for their review.

Make a Plan. Best practice is to have inspection procedures in place in the form of SOPs that describe when and how to act in response to an FDA audit. These SOPs should cover every aspect of the inspection – inspection preparation, arrival of the inspector, conduct of personnel during an inspection, handling any disputes that may arise, system for tracking document requests by the inspector, and the final meeting with the inspector to discuss results. If you don’t already have inspection SOPs in place, it may be helpful to consult with a qualified third party to create them.

Designate an Inspection Team. It is likely that the investigator(s) will need to interact with several key personnel in your organization during the inspection. Best practice is to determine who those stakeholders are in advance, assign each team member’s roles and responsibilities, and train them on your organization’s approach to the audit. It is also important to assign someone to be the investigator’s key contact that will escort them to locations throughout your facility. Make sure this person has the knowledge and authority required to speak to any concerns and/or requests that arise.

Conduct Internal Inspections. Best practice is to periodically perform thorough internal inspections and audits of your facility and systems to assess compliance status prior to an FDA visit. One approach is to have internal resources who are knowledgeable about regulatory compliance and experienced with real audits lead an internal audit. This should be treated as seriously as an official regulatory agency audit and executed as thoroughly. Another approach is to engage third-party consultants with the expertise to assess compliance status.

Inspection Best Practices

The first thing to do when the inspector(s) arrive is to check their identification: the inspector is obligated to show you their credentials. Other best practice recommendations:

Notify Employees. If you did not receive advance notice of the inspection, alert all potentially affected employees as quickly as possible that the FDA is on-site and that they should follow company protocol detailed in your SOPs when asked questions.

Designate Appropriate Workspace. Designate an area for the inspector to work. This space should be quiet, private and comfortable – a separate conference room is typically a good choice. It is wise to avoid placing the inspector(s) in an office where there is a lot of traffic and external activity.

Establish Open Communication. It is important to establish open communication between your company representatives and the FDA examiner(s). Cooperate with FDA inspectors and do not impede their progress or engage in idle conversation. It is important that company representatives provide answers specific to what is asked. Remember that you are the host. Be polite, avoid confrontation, and do not challenge the inspector’s understanding of regulations. If challenges are warranted, they are only relevant in writing as responses to the written contents of the audit report. Request a brief update meeting with the FDA inspector at the end of each day to stay informed of inspection progress and any issues that have arisen. It is important to respond to questions and clarify any issues as they arise during the inspection.

Develop an Audit Dossier. Collect copies of files or samples submitted, along with any updates provided by FDA examiners and other important details, in an audit dossier throughout the inspection process. This will identify the specific support material used during the audit and help you track any issues that are identified so you can develop an appropriate response.

Understand FDA Jurisdiction. In order to protect your company’s trade secrets and proprietary information, there are restrictions on the types of files that an FDA inspector can access. Files containing the following are restricted:

  • Business plans
  • Financial records, including most sales data
  • Pricing information
  • Unrelated Research & Development
  • Personnel files

If an FDA investigator asks for something that you believe is on this restricted list, you can request a conversation to obtain clarity on why the documentation is being requested.

Develop a System for Handling Requested Records. Best practice is to develop a back room/front room operation to fulfill and track document requests as quickly as possible. Review a requested record before sending it to the inspector to verify it is complete. Once the inspector reviews a record, remove it from the inspection workplace. Be honest and forthcoming when fulfilling the inspector’s record requests.

Post-Inspection Best Practices

Schedule a Closeout Meeting. A closeout meeting that includes company senior management, members of the inspection team, and all FDA inspectors should be held at the conclusion of the FDA inspection to discuss findings. This meeting should encourage dialog. The inspector will document any compliance deviations observed during the inspection using an FDA Form 483. The contents of the Form 483 should be reviewed and discussed in this meeting to make sure there is full understanding of the observations and what they mean.

Formulate an FDA Form 483 Response. Your company must provide a formal response to the FDA Form 483 observations in writing within 15 days of your closeout meeting. Your response should be clear, concise, and thorough, and include:

  • A compliance commitment statement from company management.
  • A separate response addressing each observation.
  • A root cause analysis and corrective and preventive actions (CAPA) for each observation, including completion dates.
  • A method of verification and/or monitoring for all future corrections and a commitment to a follow-up response.
  • Evidence for corrections that have already been completed.

Conclusion

Enforcement actions by the FDA with respect to regulatory compliance violations can result in serious financial consequences for an organization due to facility shutdown, product recalls, import and/or distribution bans, delayed or denied drug approvals, substantial remediation costs, and loss of customers due to a damaged reputation. These actions also divert worker attention away from their daily activities towards corrective and preventive actions, which can result in significant expenditures of time and money. Additionally, manufacturers who are found in violation of regulations may lose the trust of the FDA and face more frequent and in-depth inspections. Several companies that have been cited for compliance deficiencies by the FDA over the last decade are in fact no longer in business due to the financial hardships that ensued.

One of the most important actions a company can take is to develop an effective compliance strategy to ensure the best possible outcome in the event of an FDA inspection. In this blog, we covered a variety of best practices that can help your company effectively prepare for and survive an FDA inspection. In addition to these recommendations, partnering with an experienced third party consultant who can provide a comprehensive compliance assessment can be very helpful. If you would like to discuss your overall compliance strategy, please don’t hesitate to contact us.

Trends in FDA Data Integrity 483s and Warning Letters for Pharmaceutical Companies

The manufacture of pharmaceutical drugs is a highly complex process that involves advanced scientific analysis and instrumentation at all stages of production and storage. In order to guarantee the safety and efficacy of both human and veterinary drugs, the FDA strives to verify data integrity in all cGMP records used to document and guide the manufacturing process.

The FDA first identified failures in data governance and data integrity in 2000, with a warning letter issued to Schein Pharmaceuticals that cited a lack of proper controls over computerized laboratory systems. In the years since, the FDA’s focus on compliance with data integrity regulations in facility inspections has increased significantly. We have written extensively in previous blog posts about how the FDA defines data integrity and about the current FDA regulations and guidance in this area.

Enforcement actions by the FDA due to failure to comply with data integrity regulations can result in serious financial consequences for an organization due to facility shutdown, product recalls, import and/or distribution bans, delayed or denied drug approvals, substantial remediation costs, and loss of customers due to a damaged reputation. In addition, manufacturers who are found in violation of data integrity regulations may lose the trust of the FDA and face more frequent and in-depth inspections in the future.

As such, it is critical for organizations to take appropriate steps to ensure compliance with applicable data integrity regulations governing the production of pharmaceutical drugs. In this blog, we will discuss recent trends in FDA warning letters and 483s in the pharmaceutical industry with regards to data integrity, and also highlight the importance and benefits of incorporating independent data integrity assessments into your organization’s quality management system (QMS) as part of cGMP audit programs.

FDA Form 483s and Warning Letters

The FDA conducts all inspections of regulated products, research sites and manufacturers through its Office of Regulatory Affairs (ORA). When the FDA inspects a pharmaceutical company’s facility, they can either show up unannounced or alert the company ahead of time. Once the inspection is complete, the inspectors can communicate any observed conditions and/or practices that may be in violation of FDA requirements through a Form 483 or a warning letter. While both of these documents serve to inform sponsors and principal investigators of issues that require corrective action, there are important differences in terms of the seriousness of the infraction(s) being highlighted:

FDA Form 483

An FDA Form 483 is essentially a list of identified regulatory deficiencies that an ORA inspector provides to company management at the end of an inspection. As the FDA expects that these deficiencies will be remediated, it is important to respond thoughtfully to the FDA within 15 days detailing your plan for corrective actions in order to avoid a warning letter. A Form 483 response will usually require input from many different aspects of your organization, so it should be all hands on deck upon receiving the 483 to facilitate a good response detailing a comprehensive plan of action within 15 days.

FDA Warning Letter

An FDA warning letter is issued by ORA inspectors for more serious compliance deficiencies, often involving previous Form 483s that have not been effectively remediated. Warning letters should be taken very seriously and answered in a timely fashion within the required time frame. A comprehensive remediation plan needs to be developed, implemented and adhered to, along with consistent communication with the FDA during the process to avoid further enforcement actions.

Trends in Pharmaceutical Company Form 483s and Warning Letters Citing Data Integrity Violations

The big data and AI analytics firm Govzilla found that, regardless of company size, roughly 50% of all global drug 483s issued over the five-year period from 2014 to 2018 cite data integrity concerns. Data integrity violations are even more prevalent in warning letters, with 79% of global drug warning letters during this period citing data integrity issues. Additionally, the total number of FDA warning letters referencing data integrity deficiencies has increased significantly in recent years.

While 21 CFR Part 11 is known as the data integrity rule, deficiencies in Part 11 are rarely cited in 483s or warning letters – almost all deficiencies cited are failures to comply with cGMP predicate rules (specifically, 21 CFR Parts 210, 211 and 212). Two predicate rules that are frequently cited are Parts 211.68 and 211.194.

Part 211.68 specifies requirements for “Automatic, Mechanical and Electronic Equipment.” Common citations in this area include the following (a sketch of the kind of access control the rule expects appears after the list):

  • The company did not implement effective computer system controls to ensure only authorized individuals had access to the systems.
  • Access was not consistent with appropriate roles and responsibilities. For example, laboratory analysts could delete or modify data, change configuration settings (e.g., disable audit trails), or adjust date and time stamps on electronic data to falsify when the data was initially acquired.
  • Data was not backed up appropriately to allow data reconstruction in the future.
  • Audit trail data did not match data in printed chromatograms.
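
As a minimal illustration of the kind of control Part 211.68 expects, access can be enforced through an explicit role-to-permission map, so that an analyst account simply has no path to deleting data or disabling audit trails. The roles and action names below are hypothetical:

```python
# Hypothetical role-based permission map illustrating the controls 21 CFR 211.68 expects:
# analysts can acquire and view data but cannot delete it or alter system configuration.
PERMISSIONS = {
    "analyst":  {"acquire_data", "view_data"},
    "reviewer": {"view_data", "approve_result"},
    "admin":    {"view_data", "manage_users", "configure_system"},
    # Deliberately, no role carries "delete_data" or "disable_audit_trail".
}

def authorize(role: str, action: str) -> None:
    """Raise PermissionError unless the role is explicitly granted the action."""
    if action not in PERMISSIONS.get(role, set()):
        raise PermissionError(f"role '{role}' may not perform '{action}'")

authorize("analyst", "acquire_data")   # permitted
try:
    authorize("analyst", "disable_audit_trail")
except PermissionError as err:
    print(err)                         # role 'analyst' may not perform 'disable_audit_trail'
```

The design choice that matters is the default-deny posture: anything not explicitly granted is refused, which is far easier to audit than scattered per-user exceptions.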

Part 211.194 is cited when companies do not review and include all relevant data when making lot release decisions. Common citations in this area include:

  • Companies fail to review critical data and/or metadata that would allow them to identify out of specification (OOS) events that require investigation in lot release decisions.
  • The company falsifies test results, destroys data, or does not have the data necessary to support a test result.
  • Laboratory analysts reprocess or manipulate data, or delete OOS data, so the result will meet acceptance criteria.

Other recently cited deficiencies include:

  • Utilizing “pre-injections” of product samples outside of full sample sets to determine if results pass acceptance criteria, and then deleting/ignoring results if they fail.
  • Disabling audit trails intermittently to obscure results
  • Deletions or modifications of results
  • Use of integration suppression settings to minimize data that would likely cause an OOS result
  • Aborting test runs with no justification

One common theme in recent warning letters to pharmaceutical companies has been the clear and consistent encouragement by the FDA to employ “independent” data integrity assessments as part of the strategy for remediating identified issues. A few examples include:

  • On August 10th, 2018, a warning letter was issued to the manufacturing facility of Kyowa Hakko Bio Co., Ltd. in Japan that stated: “We recommend that a qualified third party with specific expertise in the area where potential breaches were identified should evaluate all data integrity lapses.”
  • On March 26th, 2019, a warning letter was issued to the manufacturing facility of Winder Laboratories, LLC in Georgia that stated: “Your quality system does not adequately ensure the accuracy and integrity of data to support the safety, effectiveness, and quality of the drugs you manufacture…. We strongly recommend that you retain a qualified consultant to assist in your remediation.”
  • On June 13th 2019, a warning letter was issued to the manufacturing facility of Akorn Inc. in Illinois that stated: “Your quality system does not adequately ensure the accuracy and integrity of data to support the safety, effectiveness, and quality of the drugs you manufacture…. We acknowledge that you are using a consultant to audit your operation and assist in meeting FDA requirements.”

Conclusion

Data integrity is a growing focus of the FDA, and it is therefore critical for organizations to ensure compliance with applicable data integrity regulations governing the production of pharmaceutical drugs. While this blog specifically highlights FDA trends in Form 483s and warning letters for pharmaceutical drug manufacturers, the recommendations communicated are also applicable to biotech companies, clinical research and CROs, medical device manufacturers, R&D laboratories, etc.

In order to ensure compliance, working with an external consultant that has expertise in data integrity evaluations to audit your laboratory environment is best practice, as an expert with fresh eyes will be able to effectively locate data integrity issues you missed. A data integrity assessment performed by a qualified team of regulatory experts provides a number of important benefits for your organization:

  • Strengthens Your Organization’s Data Integrity Focus – Helps reinforce the fact that your organization is committed to data integrity compliance for all company employees.
  • Provides Peace of Mind – You know that your data integrity issues have been identified and are being addressed.
  • Reduces Costs – The cost of remediating data integrity issues identified by the FDA is generally much more significant than when the issues are proactively identified and corrected internally.
  • Keeps You Focused on Your Core Business – Proactively identifying and correcting data integrity issues allows your organization to spend less time on compliance issues so you can stay focused on your core business.

Astrix Technology Group has an experienced team of expert informatics consultants that bring together technical, strategic, regulatory and content knowledge to provide the most effective solutions to problems faced by scientific organizations. Astrix provides experienced professionals knowledgeable about FDA regulations to conduct a thorough assessment of your laboratory informatics environment to identify data integrity risks. If you have any questions about Astrix Technology Group service offerings, or if you would like to have an initial consultation with someone to explore how to reduce your compliance risk around data integrity, don’t hesitate to contact us.

LIMS Master Data Best Practices Part 3: Maintenance and Scalability

Master data design has very important impacts over the lifecycle of a LIMS, as nearly every piece of functionality in the system revolves around the design of the master data. One of the most important aspects to any LIMS implementation is designing the master data so that it is easy to maintain and scale as the organization grows and business needs change. Some of the key benefits of configuring your master data to be maintainable and scalable include:

  • Easier to add and/or modify master data down the road
  • Increased system efficiency and reliability
  • Future system enhancements are less resource intensive
  • Better management of large volumes of data
  • Increased user acceptance
  • Increased ROI

In short, focusing on maintainability and scalability when configuring your master data really helps improve the lifespan and usability of your LIMS. In this blog, we will provide some best practice tips on how to set up master data so it will be easy to maintain and scale as your organization grows and the system matures.

Configuring Master Data for Maintainability

Once a LIMS is configured, users often must live with the master data rules set during configuration. While change control can be used to update the configuration, sometimes the process is so cumbersome that it isn’t worth the effort, or the changes are so numerous that the system never reaches a finalized state. The tips in this section are some we have found most helpful during the configuration process for making master data easier for users to create and update as the system evolves.

Put system reserved words and system-specific rules where they can be found. Every LIMS has reserved keywords that can’t be used as a data name. These keywords are usually buried in the installation documentation and are hard to find. In addition, every LIMS also has different conventions for using special characters or uppercase and lowercase letters. Instead of just being part of the “tribal knowledge” passed down through system administrators, these reserved words and specific rules should be included in the master data document, where they will be easy to find long after the initial installation has been completed.

Create naming conventions based on business structure. Use naming codes that refer to your business processes or structure such as site, product specification or variation, lab type, or manufacturing line. This creates a uniform system that will change less often and reduces the overall amount of static data.

For example: Let’s say you have a multi-site deployment of a LIMS system and need to perform a pH check of manufacturing lines:

Good assay name: SF03-L1PH (San Francisco site Building 3 – Line 1 pH)

Bad assay name:  LINE3PH (line 3 pH)

If the specification for the product changes, the good assay name will tell you which assay needs to be updated. With the bad assay name, there is no way to tell.
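
A naming convention is only useful if it is enforced. A small parser like the sketch below can validate names at creation time and recover the business context encoded in them; the pattern is assumed from the single example above, and a real convention would be defined in your master data plan:

```python
import re

# Pattern assumed from the example above: site code, building, line, test type
# e.g. "SF03-L1PH" = San Francisco site, Building 3, Line 1, pH test.
ASSAY_NAME = re.compile(r"^(?P<site>[A-Z]{2})(?P<building>\d{2})-L(?P<line>\d+)(?P<test>[A-Z]+)$")

def parse_assay_name(name: str) -> dict:
    """Validate an assay name against the convention and return its encoded context."""
    match = ASSAY_NAME.match(name)
    if not match:
        raise ValueError(f"assay name does not follow the naming convention: {name}")
    return match.groupdict()

print(parse_assay_name("SF03-L1PH"))
# {'site': 'SF', 'building': '03', 'line': '1', 'test': 'PH'}

try:
    parse_assay_name("LINE3PH")        # the "bad" name: no site or building context
except ValueError as err:
    print(err)
```

Because the convention is machine-checkable, it can be enforced at data entry rather than discovered as an inconsistency years later.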

Use references in naming conventions. It is best to avoid naming master data after a previous LIMS or a specific instrument: once that system or instrument no longer exists, the name loses its meaning. Instead, add a field or lookup table to use as a reference.

Good assay name: Chem-8260-GCMS (Chemistry dept – EPA method number – instrument type reference)

Bad assay name: SL8260-MS123 (Sapphire LIMS method 8260 – Mass spec #123)

Use summary analyses. Where instrument interfaces are being used, such as Empower, analyses should be separated into a raw data analysis and an independent reporting or summary analysis. This setup provides a few key benefits:

  • Data integrity will be maintained, because the method can be locked down. Analysts will still have the ability to manipulate the data in the reporting or summary analysis, but the raw data won’t be manipulated.
  • It provides flexibility for method changes. By having a separate method analysis, you only need to change one method instead of several if an update is needed due to an instrument change or a new instrument.
  • Having a separate raw data analysis frees the instrument from being held while the raw data is being analyzed. Additional replicates can be run, the instrument can be taken offline for maintenance, or another run can be set up.

Try not to tie your metadata to a name – instead use a field. When a defined name is used, it must be hard coded into the system. This is time-consuming for developers during initial setup. If any changes need to be made after the system is configured, they must be done by a developer, and the changes must go through change control and possibly re-validation. By creating fields instead, changes can be made in the front end of the system and the change control process is much easier.

Custom tables and fields should all start with a prefix. This separates custom tables and fields from what is pre-defined in your system. A prefix can be used to group related objects based on your master data map. This is very useful for data review or if you have a multi-site deployment strategy. A custom table prefix could be used to designate tables designed for a specific site or business process. Some of the benefits of using a prefix are:

  • If a custom table has a prefix, you don’t need to create a prefix on the fields inside the table.
  • It creates a recognizable and uniform standard that can be re-used.

Work with the database administrator to manage growing data. Over time, your data will grow. It’s easy to let lists grow to a size where users must scroll forever to find what they are looking for, or tables to a size where it becomes difficult to see all the columns on the screen. As data grows, work with the system database administrator to create indices or queries to manage it. This helps to maintain system performance.
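
As a small illustration of that collaboration, an index on the columns users filter by most keeps lookups fast as a results table grows. The table and column names below are hypothetical (note the custom-table prefix convention from earlier), shown here with SQLite purely for portability:

```python
import sqlite3

conn = sqlite3.connect("lims_example.db")  # hypothetical example database
conn.execute("""
    CREATE TABLE IF NOT EXISTS x_results (      -- 'x_' prefix marks a custom table
        sample_id TEXT,
        assay_name TEXT,
        result REAL,
        completed_at TEXT
    )
""")
# Without an index, filtering by assay and completion date scans the whole table;
# with it, lookups stay fast even as the table grows to millions of rows.
conn.execute(
    "CREATE INDEX IF NOT EXISTS idx_results_assay_date "
    "ON x_results (assay_name, completed_at)"
)
conn.commit()
conn.close()
```

The same idea applies to any production database engine; the database administrator will know which columns actually appear in slow queries and whether an index is worth its write-time cost.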

Configuring Master Data for Scalability

The Master Data Plan document will be the guide on how to scale your master data. This document is usually written during the development phase of any software update or data migration project. But often it encompasses only the initial configuration. To make your plan scalable, the Master Data Plan should include instructions on how to deal with data as it grows once the initial configuration is done.

Schedule who and when for all tasks. When putting together the project timeline, adequate time and resources should be given for completing master data tasks. Too often master data is left to the last minute or the time and resources needed are underestimated. Missed entries are rationalized away with the belief that it can be entered as needed. Often this results in a deployment that is never fully completed.

When writing the Master Data Plan for the project, make sure you identify who will be entering data, who will do the testing, and when tasks will be completed. This provides the check that everything is complete for go-live. If data will be entered after go-live, include it in the plan. Then, expand on this baseline to explain the who and when for entering and testing data into the future.

Define how data is transferred into the system (Data Migration plan). The main aspect of any upgrade or implementation is how to migrate master data from the old system (or no system) into the new one. Your Master Data Plan should include the list of tasks needed to put all the master data into the new system. As the configuration is built, tasks will fall into an order required by the system. For example, when migrating an analysis and its components, the component table may need to be migrated into the system before the analysis table.

As this information is recorded, it becomes the Data Migration plan. This provides the ability to import or export large amounts of data, so when you have a new product you can potentially add the master data as groups of tables instead of entering pieces individually. It’s a much faster and cleaner method of adding data that can be verified using scripts instead of manually entering fields one at a time.
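
The ordering constraint described above (components before the analyses that reference them) is a dependency-resolution problem, so a safe load order can be derived rather than maintained by hand. A minimal sketch, with hypothetical table names:

```python
from graphlib import TopologicalSorter  # standard library, Python 3.9+

# Hypothetical master data tables mapped to the tables they depend on.
dependencies = {
    "analysis":        {"component", "instrument_type"},
    "component":       {"unit"},
    "product_spec":    {"analysis", "product"},
    "product":         set(),
    "unit":            set(),
    "instrument_type": set(),
}

# static_order() yields a load order in which every table is migrated
# only after all of the tables it references.
load_order = list(TopologicalSorter(dependencies).static_order())
print(load_order)
# e.g. ['product', 'unit', 'instrument_type', 'component', 'analysis', 'product_spec']
```

Recording dependencies in this explicit form also documents the migration for the next upgrade: adding a new table means adding one entry, not re-deriving the whole sequence.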

Configure the data map for growth and provide rationale. The Master Data Plan will include the data map outlining the business units involved, workflows, how workflows relate to each other, and the list of master data fields from each workflow. Instead of leaving the data plan as just a list of the initial setup, include how to manage data as it grows over time. Be sure to include the rationale behind why the plan was configured as it was so it is easy to understand how to expand it in the future.

Consider how master data is created as your business grows. Some questions to consider are:

  • What are the criteria to determine if master data fields are added or not?
  • Will new tables or sub-tables need to be created?
  • How will new workflows be added?
  • How will lists be handled when they get too big?
  • What happens when data is no longer needed?
  • Why are some third-party systems included (e.g., ERP, manufacturing), while others were not (training, document management)?
  • Will instruments, equipment, or another system be incorporated in the future?

Answers to these and similar questions provide a framework for expansion that is easy to understand.

Define naming convention to be used and the rationale behind it. The rationale for naming conventions should also be included for the same reasons as the rationale for the configuration. This includes the rules and variations behind corporate and site field names.  For a small deployment of only a few labs or sites, there may not be any variations to consider. For a large deployment, however, there could be many site variations. If the naming convention is based on your business structure, the rules can be specific, because the business structure is less likely to change.

Conclusion

When creating master data for a new LIMS, there are many things that should be done to ensure the data is easy to manage and can grow as your system matures. We’ve provided a number of key best practice recommendations in this blog that will help you improve maintainability and scalability in your LIMS when configuring your master data. Following these recommendations will ultimately help you increase the ROI of your LIMS over its full lifespan. Be sure to tune in for part 4 of our Master Data Blog Series, where we will discuss more best practice recommendations for master data quality control and change management.

Astrix is a laboratory informatics consulting firm that has been serving the scientific community since 1995. Our experienced professionals help implement innovative solutions that allow organizations to turn data into knowledge, increase organizational efficiency, improve quality and facilitate regulatory compliance. If you have any questions about our service offerings, or if you would like to have an initial, no-obligations consultation with an Astrix informatics expert to discuss your master data strategy or LIMS implementation project, don’t hesitate to contact us.

Enhancing Data Integrity in Clinical Research with Blockchain Technology

Verification of data integrity is a critical part of the FDA’s mission to ensure the safety, efficacy and quality of human and veterinary drugs, biological products, and medical devices. As these products must undergo extensive testing before being approved for the public, gathering high-quality, reliable and statistically sound data is an important goal for all clinical research. There are, however, converging trends challenging the ability of legacy data management systems to establish and sustain data integrity throughout all phases of clinical studies:

  • Increasing complexity of clinical trials and the supporting bioanalytical studies
  • Enormous amount and variety of data generated
  • Clinical trial networks comprising many parties and sites
  • Increasingly sophisticated devices for gathering clinical data

Every once in a while, a technology comes along that has the potential to act as a significant disruptive force. Blockchain technology may very well be the next big disruptive technology and is likely to find applications in nearly every major industry to reduce cost, increase efficiency and, more significantly, improve trust. Trust is foundational to all businesses, and blockchain enables companies to seamlessly establish both trust and transparency in data-related business processes in a non-centralized, and therefore scalable and cost-effective, way.

Blockchain technology was invented in 2008 with the creation of bitcoin by the pseudonymous Satoshi Nakamoto. It’s important to understand that bitcoin is merely an application of blockchain – a financial asset that utilizes blockchain technology as its ledger. Given the necessity of trust and transparency in financial transactions, finance was a natural first use case for blockchain technology. The ability of blockchain to facilitate trust in a decentralized way at relatively low cost has drawn the attention of researchers in the pharmaceutical industry, and efforts to apply the technology to data integrity problems within clinical trials are increasing.

According to Statista, global blockchain technology revenues will experience massive growth in the coming years, with the market expected to climb to over 23.3 billion U.S. dollars in size by 2023. In this blog, we will discuss ways in which blockchain technology is being applied to enhance data integrity in clinical research.

What is Blockchain Technology?

Ledgers are the foundation of accounting: a secure record of transactions is essential to the role of money in our society. Blockchain technology utilizes a distributed peer-to-peer computer network to create a digital ledger (a form of a database), using cryptographic techniques to store transaction records in a verifiable and unalterable way. Each server or node in the network can independently verify each data entry, and modification of any data in the chain requires altering all following entries in the chain and obtaining consensus of the network. In this manner, with a reasonable time and network distribution of transactions, the entire chain of entries is considered secure by design and fault tolerant due to the distribution of the data.
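
The core mechanism – each record committing to a cryptographic hash of its predecessor, so that altering any earlier entry invalidates everything after it – can be shown in a few lines. This is a minimal single-node sketch of hash chaining, not a production ledger; it omits the distributed consensus described above:

```python
import hashlib
import json
import time

def make_block(data: dict, prev_hash: str) -> dict:
    """Create a block whose hash commits to its data, timestamp, and predecessor."""
    block = {"data": data, "timestamp": time.time(), "prev_hash": prev_hash}
    payload = json.dumps(block, sort_keys=True).encode()
    block["hash"] = hashlib.sha256(payload).hexdigest()
    return block

def verify_chain(chain: list[dict]) -> bool:
    """Recompute every hash; tampering with any block breaks all later links."""
    for i, block in enumerate(chain):
        body = {k: v for k, v in block.items() if k != "hash"}
        payload = json.dumps(body, sort_keys=True).encode()
        if block["hash"] != hashlib.sha256(payload).hexdigest():
            return False
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True

chain = [make_block({"entry": "result recorded"}, prev_hash="0")]
chain.append(make_block({"entry": "result reviewed"}, prev_hash=chain[-1]["hash"]))
print(verify_chain(chain))              # True
chain[0]["data"]["entry"] = "altered"   # tamper with an early record...
print(verify_chain(chain))              # ...and verification fails: False
```

In a real blockchain, many independent nodes run this verification, which is what turns tamper evidence on one machine into tamper resistance across the network.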

When applied effectively, blockchain networks inherently resist modification of the data stored in the transaction record. Transaction data stored in a blockchain can’t be stolen or hacked, as it is not kept in a central repository but instead distributed across dozens or even thousands of geographically dispersed network nodes. The distributed nature of the network, the use of timestamped records, and cumulative cryptographic verification all assure that the stored data remains intact and immutable. Additionally, the application of public key cryptography to the transactional data makes the entries attributable and non-repudiable. Interestingly, these are key characteristics of data integrity as defined by the FDA: attributable, legible, contemporaneous, original, and accurate (ALCOA).

Furthermore, the ability of a blockchain to serve as an immutable audit trail makes blockchain technology well suited for any company focused on data integrity and regulated or audited by third parties like the FDA. In addition, blockchain network participants can store, exchange and view information in the database without needing to establish preexisting trust between the parties (for example, through a negotiated legal contract) – trust is in fact hardcoded into the blockchain protocol.

It is important to note that not all problems require, or would be most effectively addressed by, a blockchain solution. The creation and maintenance of the ledger is currently a costly exercise. There have been efforts to address this cost, both through the design of implementations that facilitate application to more general problems (for example Ethereum, a blockchain application platform based on a distributed virtual machine) and through “blockchain as a service” offerings, with Microsoft and Amazon Web Services recently adding blockchain networks to their cloud platforms.

Blockchain solutions are ideal for data records that are meant to be shared between partners in a network where transparency and collaboration are important. Specifically, blockchain could be considered appropriate when:

  • multiple parties generate transactions that represent shared data;
  • the parties need to be able to independently verify that the transactions and the current state of the shared data are valid;
  • there are no trusted third-party services able to manage the shared data efficiently (e.g. an escrow);
  • enhanced security is needed to ensure integrity of the system.

Private vs. Public Blockchains

Blockchain databases can be categorized into two main types – public and private. Bitcoin is an example of a public blockchain, with thousands of computer nodes distributed worldwide doing the work required to verify the network. These nodes are run by participants who perform the computational work of verifying the network (also known as “mining”) and who receive rewards paid in the cryptocurrency created by the network. Attributes of a public blockchain include public access, a high degree of decentralization, a large distribution of work (which can result in fewer verified transactions per unit time), and transaction verification and overall security via Proof of Stake or Proof of Work (approaches that enhance the stability and security of the network).

Private blockchains, on the other hand, are more common in industry applications of blockchain technology. These types of blockchains have a less randomly distributed network (nodes typically run by stakeholders), restricted access to the network, the potential for higher throughput and scalability, and transaction verification by authorized network participants.

Hybrid blockchains are a more recent approach: a generally public blockchain with the ability to designate some data as public and fully open and other data as private and accessible only to authenticated entities. Some recent private blockchain implementations are also leveraging public blockchains for the cryptographic payload only; this approach has an interesting advantage when considering data storage compliance with geographic regulations such as the European Union’s General Data Protection Regulation (GDPR).
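
As a hedged sketch of that “cryptographic payload only” pattern: the sensitive record stays in private, geographically compliant storage, and only a SHA-256 digest of it is anchored publicly. The record contents and field names below are invented for illustration:

```python
import hashlib
import json

def anchor_digest(private_record: dict) -> str:
    # Only this digest would be written to the public chain; the record
    # itself never leaves private, regionally compliant storage.
    payload = json.dumps(private_record, sort_keys=True).encode("utf-8")
    return hashlib.sha256(payload).hexdigest()

record = {"subject_id": "anon-017", "visit": 3, "adverse_event": False}
public_digest = anchor_digest(record)

# Later, any party holding the private record can prove it is unaltered
# by recomputing the digest and comparing it to the public anchor.
assert anchor_digest(record) == public_digest
```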

Enhancing Data Integrity in Clinical Research with Blockchain Technology

Good quality data from clinical trials requires good security, proper context (metadata) and an immutable audit trail. Blockchain technology is a potential component of ensuring these capabilities. Blockchain brings the strength of cryptographic validation to each interaction or transaction and to the record set as a whole. Any data integrity issue in a blockchain results in an immediate indication of where and when the problem happened, along with the contributor of the offending transaction.
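
A short, self-contained Python sketch (again with invented field names) shows how a verifier can walk such a chain and pinpoint exactly where an integrity break occurred:

```python
import hashlib
import json
import time
from typing import Optional

def block_hash(block: dict) -> str:
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def verify_chain(chain: list) -> Optional[int]:
    # Return the index of the first block whose recorded link is broken,
    # or None if the whole chain is intact.
    for i in range(1, len(chain)):
        if chain[i]["prev_hash"] != block_hash(chain[i - 1]):
            return i
    return None

# Build a two-block chain, then tamper with the first entry.
genesis = {"index": 0, "timestamp": time.time(),
           "data": {"result": 4.2}, "prev_hash": "0" * 64}
second = {"index": 1, "timestamp": time.time(),
          "data": {"result": 3.9}, "prev_hash": block_hash(genesis)}
chain = [genesis, second]

genesis["data"]["result"] = 9.9   # simulated tampering
assert verify_chain(chain) == 1   # the break surfaces at the very next link
```

The timestamp on the flagged block, together with the signature on the offending transaction, supplies the “where, when, and who” described above.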

Blockchain effectively allows proof of the integrity and provenance of the data independently of the production of the data. Blockchain-based, immutable, timestamped records of clinical trial results and protocols could potentially reduce the incidence of fraud and error in clinical trial records by eliminating the potential for outcome switching, selective reporting and data snooping.

Additionally, it is very difficult for researchers in the current environment of siloed informatics systems to share and maintain the provenance of data. Blockchain-based data systems could allow for seamless data sharing between organizations, thereby reducing data integrity issues stemming from errors and incomplete data.

As an example of this, researchers at the University of California San Francisco (UCSF) recently developed a proof-of-concept method for using a private blockchain to make data collected in the clinical trial process immutable, traceable, and more trustworthy. The researchers used data from a completed Phase II clinical trial and tested resilience to data tampering through their proof-of-concept web portal service. They effectively demonstrated that data entry, storage, and adverse event reporting could be performed in a more robust and secure manner, withstand attacks from the network, and remain resilient to corruption of data storage. For this approach to be implemented in the real world, it’s worthwhile to note that a regulator acting as a centralized authority would likely need to instantiate the private blockchain, operate the web portal, register all parties, and keep a ledger of the blockchain’s transactions.

Another example of blockchain technology in action was provided by scientist.com, a leading online marketplace for outsourced research in the life sciences, with the recent launch of their DataSmart™ platform based on a proprietary blockchain implementation. The platform is designed to ensure data integrity in research data from clinical and preclinical stages of drug development that is submitted electronically to regulators. DataSmart™ also allows pharmaceutical and biotechnology companies to demonstrate that critical supplier data has not been tampered with and remains unaltered.

To improve visibility of data quality and analysis, researchers from several different companies recently collaborated to develop TrialChain, a hybrid blockchain-based platform intended for application to biomedical research studies. This approach includes provisions to permit modifications of original data created as a routine part of downstream data analysis to be logged within the blockchain. In addition, the use of a hybrid blockchain structure in TrialChain allows for public validation of results alongside the additional authorization required for full access to data.

Finally, pharmaceutical giants Pfizer, Amgen and Sanofi have teamed up to find the most effective ways to utilize blockchain technology, from ensuring data integrity to speeding up clinical trials and ultimately lowering drug development costs.

Conclusion

With the ability to deliver immutable timestamped records, audit trails and data provenance, in a highly secure and attributable manner, blockchain technologies could indeed have a huge impact on clinical research worldwide. While blockchain obviously can’t prevent errors at the source (e.g., an incorrect recording of a test result), a blockchain-enabled clinical research system could provide a strong mechanism to ensure the integrity of the datasets themselves.

As in other industries, blockchain also holds promise for reducing costs in many aspects of the drug lifecycle – from discovery to manufacturing. From the ability to concisely and securely track data provenance as part of analytics to meeting regulatory serialization requirements in the supply chain, there are many interesting potential applications of blockchain technology.

While the use of blockchain technology in clinical research is still in its infancy, there is much interesting activity in applying blockchain technology to solve clinical data management challenges. Many pharmaceutical companies are forming outside partnerships and creating in-house initiatives to explore blockchain technologies in clinical research areas like patient recruitment, patient data sharing, data privacy, preventing data tampering and improving research reproducibility.

As with the application of any new technology, biopharma companies looking to implement blockchain technology in their clinical research should consider working with a quality informatics consultant well established in the domain, and consider very deliberately how the technology will fit in the overall systems architecture.

Creating an Effective Validation Master Plan
The Food and Drug Administration (FDA) is tasked with protecting the public health by ensuring the safety, efficacy, and security of the nation’s food supply, human and veterinary drugs, biological products, medical devices, vaccines, and other products. In order to fulfill its mission, the FDA requires that all products under its domain meet guidelines outlined in current Good Manufacturing Practice (GMP) regulations.  These guidelines serve to ensure that end products are free from contamination and consistent in manufacture, that the manufacturing process has been well documented, that personnel are well trained, and that the product has been checked for quality throughout the manufacturing process.

In order to establish documentary evidence that a system, procedure, process or activity carried out during product production and testing meets guidelines outlined in GMP regulations, the FDA expects manufacturers to carry out validation activities. Since there are a wide variety of systems, procedures, processes, and activities that need to be validated, industry-leading organizations typically create a Validation Master Plan (VMP) at each manufacturing facility to track, coordinate and standardize validation efforts. While a VMP is not a requirement for an organization, it has become standard practice and FDA inspectors will often want to see a VMP to determine whether a facility has a well thought out validation strategy. Towards this end, let’s discuss best practices for creating an effective Validation Master Plan.

What is a Validation Master Plan?

FDA regulations set the expectation that all aspects of a production process are well defined and controlled in order to ensure that the products produced are consistent in their safety and efficacy. A VMP is a foundational document that helps to achieve this goal by documenting compliance requirements and detailing necessary validation activities across a site or facility.

In this regard, the VMP is different than validation plans written for a single validation project. The VMP is a top layer document that establishes a disciplined and structured approach to all validation projects and describes the overall risk-based strategy for achieving and maintaining a qualified facility with validated processes.

A comprehensive VMP will assign clear responsibilities for authoring, reviewing, approving and executing all validation documentation and tasks. As such, the VMP allows manufacturers to demonstrate that they are focused on quality and in control of their quality system. This document is often one of the first that FDA inspectors will ask to review, as it will provide a concise overview of how the company has

  • delegated responsibilities
  • justified its validation efforts
  • designed production processes
  • planned resource usage
  • established a trained and competent workforce
  • integrated cGMP regulations into every aspect of its operations

The VMP is ultimately a unifying document which helps to create a common understanding of the company’s approach towards validation in all team members to enable effective organizing and execution of validation activities. It also serves to help management understand what the validation program involves and why it is important.

Components of a Good VMP

Best practice for creating a VMP document is to involve a team that includes key stakeholders from each part of the facility that the plan will touch. A team-writing approach will bring together the comprehensive knowledge and differing perspectives necessary to ensure all relevant processes, equipment, utilities and systems are addressed in the VMP. While the document should be written in a concise, to-the-point manner, it should also be comprehensive and as long as needed to convey the necessary information, as an incomplete VMP sets the facility up for compliance challenges.

The VMP should address all quality supporting systems (e.g., equipment, instruments, processes, procedures, computer systems, utilities, etc.) used for manufacturing, processing, testing, labelling or packaging of products. A comprehensive VMP for a manufacturing facility will address validation in the following areas:

  • Equipment/instrument validation/qualification
  • Utilities validation
  • Cleaning validation
  • Process Validation
  • Analytical method validation
  • Computer system validation

In order to create a VMP that will serve as an effective document for company stakeholders and satisfy the expectations of regulatory inspectors, the following elements should be included:

Title Page. Includes title, document number and version.

Authorization. Contains signatures of management at the facility who have initiated, checked, approved and authorized the VMP. Management from all functional areas that the VMP touches should be represented (e.g., production, engineering, IT, QA, QC, GM, etc.).

Table of Contents. Documents the page numbers for each section of the VMP for easy reference.

Introduction. Provides a brief overview of both the facility (company name, location, any applicable division/subsidiary name, the sector/industry it serves) and the VMP document, along with references to the company’s Quality Assurance Policy.

Facility Description and Design. Provides a general description of the facility utilizing reference drawings detailing the areas included in the validation plan (e.g., material storage, manufacturing areas, GMP vs. non-GMP areas, etc.), with a particular focus on anything that can affect product quality. Describes all relevant products produced, equipment, and processes utilized at the facility – this information can be provided in the Appendix of the VMP.

Objectives. Describes how the VMP serves company personnel. Also describes how the VMP helps to ensure validation activities are carried out following company protocols, which are designed to evaluate whether equipment, systems, processes and methods:

  • Meet design specifications
  • Are suitable for intended use
  • Satisfy cGMP regulations
  • Meet applicable safety requirements
  • Effectively ensure that products produced are safe and fit for use

Validation Policy. Describes the organization’s commitment to satisfying regulations by conducting appropriate qualification and validation activities on all applicable areas. Also describes any mandated company policies that serve as guidelines for validation activities.

Validation Approach. Describes the risk management strategy, risk-based assessment criteria, and methodology used to execute the validation program at the facility.

Scope of Validation. Describes the areas that require validation and qualification at the facility, along with the justification for each. The VMP applies to all critical equipment, systems, instruments, procedures, utilities and other quality supporting systems used for manufacturing, labelling, packaging, testing, and processing that may affect product quality either directly or indirectly.

Validation Program. Provides important details, protocols, guidelines (i.e., Validation Plans) for the validation team on the different validation projects which will be conducted at the facility.

Roles and Responsibilities. Lists the members of the validation team, along with their roles and responsibilities in the validation process. Also details any anticipated resource needs to carry out the VMP.

Acceptance Criteria. Defines the requirements that need to be satisfied (i.e., expected results of validation testing) before the equipment, system, instrument, procedure or utility is considered suitable for use in regulated activities.

Validation Deviations. Discusses the procedure for documenting and investigating any deviations encountered during the validation process.

Validation Matrix. Lists the required validations throughout the facility in order of criticality.

Validation Schedule. Provides a schedule and timeline for completion of required validations based on information in the Validation Matrix.
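
As a simple illustration of how the Validation Matrix and Schedule might be kept in machine-readable form – the systems and risk scores below are invented, and a real VMP would document its own scoring methodology – criticality can be computed and used to order the work:

```python
# Hypothetical systems and scores for illustration only.
validation_matrix = [
    {"system": "HPLC data system",    "impact": 5, "likelihood": 3},
    {"system": "Purified water loop", "impact": 4, "likelihood": 4},
    {"system": "Label printer",       "impact": 2, "likelihood": 2},
]

for entry in validation_matrix:
    # One common convention: criticality = impact x likelihood.
    entry["criticality"] = entry["impact"] * entry["likelihood"]

# Schedule the most critical validations first.
ordered = sorted(validation_matrix, key=lambda e: e["criticality"], reverse=True)
for rank, entry in enumerate(ordered, start=1):
    print(f"{rank}. {entry['system']} (criticality {entry['criticality']})")
```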

Change Control. Documents the change control system that must be followed when changing the validation protocol, test reporting formats, or other related documents.

Outsourced Services. Describes selection, management (i.e., vendor validation) and services provided by outside vendors.

Training. Lists competence expectations for validation activities and any planned validation training events.

Ongoing Activities. Outlines details of the ongoing activities which will need to be implemented to maintain validation compliance going forward.

Appendix. Contains a variety of documents referenced in the VMP – product list, process flows, equipment list. Also lists any relevant documents impacting or providing guidance for the execution of validation and qualifications (e.g., SOPs, specifications, etc.).

Abbreviations and Glossary. Defines technical terms, or terms specific to the organization, that may be unfamiliar to readers.

Conclusion

Validation of systems, processes, equipment, utilities and methods is an important part of a company’s Quality Management System. This is especially the case in the Life Sciences industry. A comprehensive VMP is critical to an organization’s success in both producing quality products and maintaining regulatory compliance.

The lack of a VMP, or a poorly developed one, has been the source of many Form 483 observations and warning letters from the FDA. A comprehensive VMP will force the implementation of a structured and standardized approach to validation across the organization, ultimately helping to better serve the public by ensuring the safety and efficacy of the products your company produces.

Data Integrity: The Importance of a Quality Culture
Verification of data integrity is a critical part of the FDA’s mission to ensure the safety, efficacy and quality of human and veterinary drugs, biological products, and medical devices. As such, the FDA’s expectation is that all data generated to support the quality of manufactured products is both reliable and accurate.

Compliance violations involving data integrity have led to numerous regulatory actions by the FDA in recent years, including warning letters, import alerts, and consent decrees. In 2018 alone, the FDA issued 54 warning letters that had references to data integrity and data management deficiencies in pharmaceutical companies, 10 of which were in the United States. An analysis of 2018 warning letters by FDAzilla found that 45% of GMP-related warning letters issued to pharmaceutical companies based in the United States included a data integrity deficiency.

Recently, the FDA has begun to link compliance with data integrity regulations with the specific culture of an organization. The FDA wants to see companies develop a “quality culture” or quality focus that is integrated throughout the organization. The idea is that the more mature an organization’s quality culture, the more reliable the product support data (i.e., data integrity) will be.

The business case for data integrity compliance is clear – lower compliance and financial risk, less rework, fewer supply interruptions to the market, improved productivity and operational performance, etc. In many companies, however, compliance problems persist as a result of reactive rather than proactive thinking regarding efforts to maintain a reliable state of compliance and quality throughout the organization. Given the FDA’s recent interest in quality culture, let’s explore best practices for establishing an organizational quality culture that supports compliance with data integrity regulations.

Quality Culture Best Practices

Data integrity violations can be the result of many factors: employee errors, lack of awareness of regulatory requirements, poor procedures or not following procedures, insufficient training, intentional acts of falsification, software or system malfunction, poor system configuration, etc. In order to avoid risk, companies involved in developing, testing, and manufacturing APIs, intermediates, or pharmaceutical and biological products should work to establish an organizational quality culture that:

  • promotes an organizational culture that encourages ethical conduct
  • demonstrates the company’s commitment to compliance with data integrity regulations
  • requires the prevention and detection of data integrity deficiencies

Key aspects of this culture include:

Management Engagement. In its latest data integrity guidance Data Integrity and Compliance with Drug cGMP Guidance published in December 2018, the FDA states, “It is the role of management with executive responsibility to create a quality culture where employees understand that data integrity is an organizational core value and employees are encouraged to identify and promptly report data integrity issues. In the absence of management support of a quality culture, quality systems can break down and lead to cGMP noncompliance.”

The effect of management engagement and behavior on the success of a company’s data governance efforts should not be underestimated. The company’s data governance policies need to be strongly endorsed at the highest levels of the organization. For an organization to achieve optimal compliance with data integrity regulations, the focus on quality compliance must start at the top. Management must lead by example and set the tone to be emulated by individuals at all levels within the company.

Employee Training. It is critical for companies to establish training programs that properly educate all employees on the fundamental principles of data integrity, company requirements for data integrity, requirements of regulatory agencies, expected employee conduct as a condition of performing GxP functions, and disciplinary consequences for poor conduct. All new employees should go through this training prior to performing GxP activities, and each employee should receive an annual refresher training. At the conclusion of the annual refresher, best practice is to have each employee sign a certification statement confirming that he/she has adhered to company standards around data integrity over the past year, including reporting any violations about which they became aware.

Data Integrity training should also include education on why data integrity is so important to the company. Ultimately, in a successful quality culture, employees adopt a quality mindset, not because they have to, but because they understand the importance of data integrity to the company and the risks of noncompliance.

Open and Transparent Communication. Companies should strive to encourage personnel to be transparent about data integrity deficiencies so management has an accurate understanding of the risks and can provide necessary resources to mitigate them. An essential element of any successful quality culture is transparency in terms of open reporting by employees of any deviations, errors, omissions or aberrant results that impact data integrity. Management must set the proper tone to encourage open communication by working to create a culture where people listen to one another, and by not punishing people for honest mistakes. In addition, employees should be given the option to report issues anonymously if local laws permit.

Computer Access Controls. Computer systems need to have secure access controls in order to assure that changes to records can only be made by authorized personnel, and these controls need to be strictly enforced. Amongst other things, this means that each person accessing the computerized system must be able to be uniquely identified (username and password, or other biometric means), must not share their login information with others, and their actions within the system should be trackable via an audit trail. Additionally, rights to alter files and settings (e.g., system administrator role) in the computer system should not be assigned to those responsible for record content.
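
The sketch below shows these controls in miniature – unique named accounts, an attributable and timestamped audit trail, and separation of duties so that administrator accounts cannot author record content. The account names and roles are hypothetical:

```python
import time

AUDIT_TRAIL = []

# Hypothetical accounts: one unique login per person, never shared.
USERS = {
    "jsmith":  {"roles": {"analyst"}},
    "itadmin": {"roles": {"sysadmin"}},
}

def modify_record(user_id: str, record_id: str, new_value: str) -> None:
    roles = USERS[user_id]["roles"]
    # Separation of duties: admins may alter settings, not record content.
    if "sysadmin" in roles:
        raise PermissionError("admin accounts may not alter record content")
    if "analyst" not in roles:
        raise PermissionError("user is not authorized to modify records")
    # Every change is attributable (who), contemporaneous (when), and
    # reconstructable (what) via the audit trail.
    AUDIT_TRAIL.append({"who": user_id, "when": time.time(),
                        "what": (record_id, new_value)})

modify_record("jsmith", "LOT-042-assay", "98.7%")
```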

Data Integrity Investigations. As discussed in the above-mentioned FDA Guidance document, regardless of how a data integrity violation is discovered (e.g., third party audit, FDA audit, internal tip, etc.), all identified data integrity errors “must be fully investigated under the cGMP quality system to determine the effect of the event on patient safety, product quality, and data reliability.” The investigation should determine the root cause and ensure the necessary corrective actions are taken. Necessary corrective actions may include hiring a third-party auditor, removing individuals responsible from cGMP positions, improvements in quality oversight, enhanced computer systems, creation of mechanisms to prevent recurrences, etc.

Data Integrity Audits. Companies should include data integrity assessments in GxP audit programs. Audits may be conducted by internal staff in the Quality unit, or by an independent third party. If audit functions are outsourced to an external consultant, be sure to verify that auditors have appropriate training in data integrity evaluations. Utilizing a quality external consultant with expertise in data integrity evaluations for your GxP audit is best practice, as an expert with fresh eyes will likely be able to locate any data integrity issues you missed. The periodic review results, along with any gaps and corresponding remediation activities, must be documented.

The following items should be part of any audit of laboratory technology systems:

  • Review of existing Software Validation Lifecycle policies, SOPs, etc.
  • Identification of paper data, electronic data, raw data, and static and dynamic data
  • Intended use of computer systems (e.g., SOPs, workflows, etc.)
  • Use of notebooks and forms associated with computer systems
  • System security and access to data (e.g., user types, groups, roles, accounts, etc.)
  • Electronic signatures and audit trails
  • Data retention policies and availability of data
  • Data backup, archive, restore and recovery processes
  • Training for support and use of computer systems that collect, generate, store, analyze, and/or report regulated data

Third-Party Data Integrity Audits. Manufacturers are responsible for ensuring that quality system elements have been established at each of their third-party suppliers (e.g., outsourced services, purchased raw materials) that provide products or services on behalf of or at the behest of the manufacturer. Manufacturers need to verify that suppliers operating as an extension of the manufacturer have established appropriate policies, standards, procedures, or other documents that define requirements for their employees to ensure compliance with data integrity regulations.

Conclusion

Ensuring data integrity means collecting, documenting, reporting, and retaining data and information in a manner that accurately, truthfully and completely represents what actually occurred. The FDA expects pharmaceutical firms to identify, manage, and minimize data integrity risks associated with their data, products, equipment, technology, processes, and people. It is therefore critical for companies to develop a quality culture that supports data integrity compliance in order to avoid unpleasant financial consequences from enforcement actions.

It is important to note that company executives and management are ultimately responsible for creating a quality culture that will support data integrity compliance. By taking assessment of the maturity of their quality culture and making intentional plans for improvement, senior management can generate significant business value by improving product quality and reducing their organization’s compliance and financial risk.

What the FDA’s New Guidance on Data Integrity Means for Pharmaceutical Companies
Data integrity is an important consideration in today’s pharmaceutical GxP laboratories. Compliance violations involving data integrity have led to numerous regulatory actions by the FDA in recent years, including warning letters, import alerts, and consent decrees.

As part of its mission to ensure the safety, efficacy and quality of products produced by the pharmaceutical industry, the FDA expects that all data submitted to the agency to obtain market approval is both reliable and accurate. The FDA considers the integrity of data, from the moment it is generated through the end of its life cycle, to be a critical component of ensuring that only high-quality and safe drugs are manufactured.

Citing a trend of increasing regulatory compliance violations involving data integrity during current good manufacturing practice (cGMP) inspections, the FDA published an updated version of its Data Integrity and Compliance with cGMP Guidance in December 2018. This guidance had been originally issued as a draft guidance in April 2016.

The purpose of this newly released guidance is to clarify the role of data integrity in cGMP for human and veterinary drugs, medical devices and biological products, as required in 21 CFR parts 210, 211 and 212. Let’s review this newly published FDA guidance on data integrity in order to flesh out the implications for pharmaceutical GxP laboratories.

FDA cGMP Regulations

In this guidance document, the FDA clarifies the role of data integrity in current good manufacturing practice for drugs (finished pharmaceuticals and PET drugs), as documented in 21 CFR parts 210, 211, and 212. The cGMP data integrity requirements emphasized by the FDA in this guidance include:

  • Part 211.68 – Backup data should be “exact and complete” and “secure from alteration, inadvertent erasures, or loss”. Computer output should “be checked for accuracy”.
  • Part 212.110(b) – Data should be “stored to prevent deterioration or loss”.
  • Parts 211.100 and 211.160 – Certain activities should be “documented at the time of performance” and laboratory controls need to be “scientifically sound”.
  • Part 211.180 – Records should be retained as “original records,” or “true copies,” or other “accurate reproductions of the original records”.
  • Parts 211.188, 211.194, and 212.60(g) – Companies should maintain “complete information,” “complete data derived from all tests,” “complete record of all data,” and “complete records of all tests performed”.
  • Parts 211.22, 211.192, and 211.194(a) – Production and control records should be “reviewed” and laboratory records should be “reviewed for accuracy, completeness, and compliance with established standards”.
  • Parts 211.182, 211.186(a), 211.188(b)(11), and 211.194(a)(8) – Records should be “checked,” “verified,” or “reviewed”.

The FDA also lists a series of threshold questions in the Background section of the guidance that may be helpful to ask when considering how to meet these regulatory requirements:

  • Are controls in place to ensure that data is complete?
  • Are activities documented at the time of performance?
  • Are activities attributable to a specific individual?
  • Can only authorized individuals make changes to records?
  • Is there a record of changes to data?
  • Are records reviewed for accuracy, completeness, and compliance with established standards?
  • Are data maintained securely from data creation through disposition after the record’s retention period?

FDA Guidance on Data Integrity and Drug Compliance with cGMP

This new guidance emphasizes the importance of creating a flexible and risk-based company-wide data integrity strategy, and strongly suggests that management should be involved with both the development and implementation of this strategy. Effective strategies “should consider the design, operation, and monitoring of systems and controls based on risk to patient, process, and product.”

The new guidance maintains the same structure of 18 questions and answers used in the original 2016 draft version in an effort to provide clear and concise solutions to common issues in an easy-to-follow Q&A format. The wording of each of the questions in the new guidance is essentially the same as in the draft version, with three notable exceptions:

Question 2: When is it permissible to invalidate a cGMP result and exclude it from the determination of batch conformance?

Here, the FDA reiterates a number of points made in the 2016 draft guidance:

  • All data created as part of a cGMP record must be maintained for cGMP compliance and evaluated by the quality unit for conformance with specifications as part of release criteria.
  • Out-of-Specification (OOS) test results require a “valid, documented, scientifically sound justification” in order to be excluded from quality unit decisions about conformance to a specification.
  • The FDA’s, “Guidance for Industry: Investigating Out-of-Specification (OOS) Test Results for Pharmaceutical Production” provides criteria that can be used to determine when OOS results may be considered invalid.

In the new 2018 guidance, the FDA adds that in the case of an invalidated test result, “the full cGMP batch record provided to the quality unit would include the original (invalidated) data, along with the investigation report that justifies invalidating the result.”

Question 15: Can an internal tip or information regarding a quality issue, such as potential data falsification, be handled informally outside of the documented cGMP quality system?

The new guidance makes it clear that all identified data integrity errors “must be fully investigated under the cGMP quality system to determine the effect of the event on patient safety, product quality, and data reliability.” This expands the scope beyond just errors identified through internal tips and compliance hotlines. The FDA now requires investigations to be conducted for data integrity errors discovered through any source of information, including internal audits and independent third-party assessments.

Question 18: How does FDA recommend data integrity problems be addressed?

In the new 2018 guidance, the FDA provides detailed recommendations on how to address data integrity problems:

  • Determine the problem’s scope and root causes.
  • Conduct a scientifically sound risk assessment of its potential effects (including impact on data used to support submissions to FDA).
  • Implement a management strategy, including a global corrective action plan, that addresses the root causes.

The strategy to address root causes may include:

  • retaining a third-party auditor.
  • removing individuals responsible for data integrity lapses from positions where they can influence cGMP related or drug application data at your firm.
  • improvements in quality oversight.
  • enhanced computer systems.
  • the creation of mechanisms to prevent recurrences and address data integrity breaches (e.g., anonymous reporting system, data governance officials and guidelines).

Other Relevant Changes in the 2018 Guidance

In addition to changes from the original 2016 draft guidance highlighted above, there are a number of other changes that manufacturers would be wise to carefully consider:

Audit Trail Reviews. The 2018 guidance suggests that review frequency for audit trails should mirror review frequency for the data specified in cGMP. In addition to requiring audit trail review before batch release, the 2018 guidance suggests audit trail review after each significant step in manufacture, processing, packing, or holding.

Additionally, the 2018 guidance suggests that if review frequency for the data is not specified in cGMP regulations, you should determine the review frequency for the audit trail using knowledge of your processes and a risk assessment that includes evaluation of data criticality, control mechanisms, and impact on product quality.
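
One way such a risk assessment could be encoded is as a simple lookup from data criticality and control strength to a review interval; the tiers and intervals below are placeholders we invented for illustration, not values prescribed by the guidance:

```python
def audit_trail_review_interval(criticality: str, controls: str) -> str:
    # Placeholder tiers; a real program would document and justify its own
    # risk assessment, criteria, and resulting review frequencies.
    table = {
        ("high", "weak"):   "review with each batch record",
        ("high", "strong"): "weekly review",
        ("low",  "weak"):   "monthly review",
        ("low",  "strong"): "quarterly review",
    }
    return table[(criticality, controls)]

print(audit_trail_review_interval("high", "weak"))
```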

System Suitability Testing. The FDA considers it a regulatory violation to use actual samples in system suitability test, prep, or equilibration runs as a means of disguising testing into compliance. In this guidance, the FDA has clarified its thinking regarding the use of actual samples during system suitability testing. Such samples should be a properly characterized secondary standard from a different batch than sample(s) being tested. cGMP records must provide transparency and be complete. “All data – including obvious errors and failing, passing, and suspect data – must be in the CGMP record.”

Computer System Validation (CSV). In this new guidance, the FDA has expanded its discussion of CSV to emphasize that validation studies on computer systems “should be commensurate with the risk posed by the automated system” and should validate the system for its intended use. Additionally, all non-CGMP functions performed by a system should be assessed for the potential to affect CGMP operations and mitigated appropriately.

Employee Training. The 2018 guidance states that, in addition to receiving training in detecting data integrity issues, personnel must be trained in preventing data integrity issues. The FDA wants firms to train their personnel to develop corrective and preventative actions so that data integrity issues are mitigated and do not recur.

Backup Records. The FDA clarifies that the term “backup” refers to “a true copy of the original record that is maintained securely throughout the record retention period.” Additionally, “backup data must be exact, complete, and secure from alteration, inadvertent erasures, or loss.”
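
A basic way to confirm that a backup remains an exact copy of the original is to compare cryptographic checksums of the two files; this sketch (with placeholder file paths) does exactly that:

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    digest = hashlib.sha256()
    with path.open("rb") as f:
        # Read in chunks so large instrument files don't exhaust memory.
        for chunk in iter(lambda: f.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

def backup_is_true_copy(original: Path, backup: Path) -> bool:
    # A "true copy" must be byte-for-byte identical, so hashes must match.
    return sha256_of(original) == sha256_of(backup)

# Placeholder paths for illustration:
# backup_is_true_copy(Path("hplc_run_0731.dat"), Path("/archive/hplc_run_0731.dat"))
```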

Access to Computer Systems. Rights to alter files and settings (e.g., system administrator role) in the computer system should not be assigned to those responsible for record content. Small companies are no longer excluded from this requirement.

Shared Logins. The FDA requires unique logins for all users that have permission to modify data. Shared login accounts for users accessing the system for read-only data viewing are acceptable. Be aware, however, that these shared login accounts do not “conform with the part 211 and 212 requirements for actions, such as second person review, to be attributable to a specific individual.”

FDA Access to Records. The FDA clarifies that it can review “records generated and maintained on computerized systems, including electronic communications that support cGMP activities.” Relevant email communications (e.g., email to authorize batch release) can be reviewed, for example.

Conclusion

With this new finalized guidance, the FDA has made it clear that it takes data integrity seriously and intends to improve patient safety by enforcing cGMP regulations with a “guilty until proven innocent” approach. Pharmaceutical firms are expected to identify, manage, and minimize data integrity risks associated with their data, products, equipment, technology, processes, and people. It is therefore critical for companies to implement robust systems with effective data integrity controls and oversight in order to avoid unpleasant financial consequences from enforcement actions.

Astrix Technology Group is a laboratory informatics consulting, regulatory advisory, and professional services firm focused on serving the scientific community since 1995. We specialize in data integrity assessments for your laboratory. If you would like to have Astrix conduct a data integrity assessment in your laboratory, or if you would like to have a free, no-obligations consultation with an Astrix data integrity specialist during your project planning and budgeting phase, please complete the form at the bottom of this page: Astrix Data Integrity Assessment

Best Practices for Computer System Validation
Computer system validation (CSV) is a documented process that is required by regulatory agencies around the world to verify that a computerized system does exactly what it is designed to do in a consistent and reproducible manner. These regulatory agencies require CSV processes to confirm the accuracy and integrity of data in computerized systems in order to ensure product safety and effectiveness. In the United States, for example, the FDA requires pharmaceutical companies to perform CSV for systems that support the production of the following products:

  • Pharmaceuticals
  • Biologicals
  • Medical Devices
  • Blood and blood components
  • Human cell and tissue products
  • Infant Formulas

Computer system validation is required when configuring a new system or making a change in a validated system (upgrades, patches, extensions, etc.).

CSV processes should be based on applicable regulations and guidance, best practices for the domain, and the characteristics of the system being validated. In this blog, we will discuss best practice recommendations for efficient and effective risk-based CSV assessment and testing.

Computer System Validation 101

With regard to computer system validation, a “computer system” in an FDA regulated laboratory is not just computer hardware and software. A computer system can also include any equipment and/or instruments connected to the system, as well as the users who operate the system and/or equipment using Standard Operating Procedures (SOPs) and manuals.

Computer system validation helps to ensure that both new and existing computer systems consistently fulfill their intended purpose and produce accurate and reliable results that enable regulatory compliance, fulfillment of user requirements, and the ability to discern invalid and/or altered records. CSV utilizes both static and dynamic testing activities that are conducted throughout the software development lifecycle (SDLC) – from system implementation to retirement.

The FDA defines software validation as “Confirmation by examination and provision of objective evidence that software specifications conform to user needs and intended uses, and that the particular requirements implemented through software can be consistently fulfilled.” Computer systems need to be examined to confirm that the system will work in all situations. Additionally, all validation activities and test results need to be documented.

All CSV activities should be documented with the following:

  • System inventory and assessment – determination of which systems need to be validated
  • User requirement specifications – clearly defines what the system should do, along with operational (regulatory) constraints
  • Functional requirement specifications – clearly defines how the system will look and function for the user to be able to achieve the user requirements.
  • Validation Plan (VP) – defines objectives of the validation and approach for maintaining validation status
  • Validation Risk assessments – analysis of failure scenarios to determine scope of validation efforts
  • Validation Traceability Matrix – cross reference between user and functional requirements and verification that everything has been tested (see the coverage-check sketch after this list)
  • Network and Infrastructure Qualification – documentation showing that the network and infrastructure hardware/software supporting the application system being validated has been installed correctly and is functioning as intended
  • Installation Qualification (IQ) Scripts and Results – test cases for checking that system has been installed correctly in user environment
  • Operational Qualification (OQ) Scripts and Results – test cases for checking that system does what it is intended to do in user environment
  • Performance Qualification (PQ) Scripts and Results – test cases for checking that System does what it is intended to do with trained people following SOPs in the production environment even under worst case conditions
  • Validation Report – a review of all activities and documents against the Validation Plan
  • System Release Documentation – documents that validation activities are complete and the system is available for intended use.
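
A traceability matrix is ultimately a coverage check: every requirement must be exercised by at least one test case. Here is a small sketch with invented requirement and test-case IDs:

```python
# Hypothetical requirement and test-case IDs for illustration.
requirements = {"URS-01", "URS-02", "URS-03", "FRS-10", "FRS-11"}

test_cases = {
    "OQ-001": {"URS-01", "FRS-10"},
    "OQ-002": {"URS-02"},
    "PQ-001": {"URS-02", "FRS-11"},
}

covered = set().union(*test_cases.values())
untested = sorted(requirements - covered)
if untested:
    print("Requirements with no test coverage:", untested)  # ['URS-03']
```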

Best Practices for Computer System Validation

Develop Clear and Precise Functional and User Requirements. One of the biggest mistakes companies make when starting an informatics project is to not do the strategic planning necessary to ensure success. The first step in any laboratory informatics project should always be a thorough workflow and business analysis. This process allows the development of clear and precise functional and user requirements that are tailored to your unique operating environment to a high degree of specificity and defined at a level that can be addressed through the new software. Without clear and precise requirements, CSV will not be able to adequately verify that the system is functioning as intended.

Perform risk-based CSV. CSV takes a lot of time and IT resources to accomplish, so it is wise to follow a flexible GAMP 5 approach that utilizes a risk-based assessment of the system to determine required test cases and the optimal level of testing for each. CSV efforts should concentrate on what is practical and achievable for the critical elements of the system that affect quality assurance and regulatory compliance. Benefits of this risk-based approach to CSV include reduced cost, business risk, and duration of the validation effort.
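
As a rough illustration of how such a risk-based assessment might translate into test depth – the scoring model and thresholds here are our own invention for the example, not taken from GAMP 5 itself:

```python
def test_rigor(impact: int, likelihood: int, detectability: int) -> str:
    # Scores run 1 (low) to 3 (high); thresholds are illustrative only.
    priority = impact * likelihood * detectability
    if priority >= 18:
        return "full challenge testing with documented evidence"
    if priority >= 6:
        return "targeted functional testing"
    return "vendor documentation plus smoke test"

# A high-impact function with poor detectability gets the deepest testing.
print(test_rigor(impact=3, likelihood=2, detectability=3))
```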

Create a Good Validation Plan. Like any technical endeavor, CSV processes should be guided by a good plan that is created before the project starts. This plan should define the objectives of the validation and the approach for maintaining validation status over the full SDLC, and satisfy all regulatory policies and industry best practices (e.g., GAMP 5). The validation plan should be created by people who have good knowledge of the technology involved (i.e., informatics systems, instruments, devices, etc.) and should serve to minimize the impact of the project on day-to-day lab processes.

The validation plan should detail the following:

  • Project Scope – outlines the parts of the system that will be validated, along with deliverables/documentation for the project. Validation activities are only applied to aspects of the system that will be utilized by the company.
  • Testing Approach – Defines the types of data that will be used for testing, along with the kind of scenarios that will be tested.
  • Testing Team and Responsibilities – Lists the members of the validation team, along with their roles and responsibilities in the validation process.
  • Acceptance Criteria – Defines the requirements that need to be satisfied before the system is considered suitable for use in regulated activities.

Create a Good Team. The project team should have CSV experience and knowledge of regulatory guidelines/compliance, validation procedures, laboratory processes, and the technology (e.g., informatics software, laboratory devices and instruments, etc.) being validated. It is important that the team is big enough so that members are not stretched too thin during the project. Outsourcing to a third party to augment the validation team with subject matter expertise may be appropriate in some instances.

Avoid Ambiguous Test Scripts. This mistake is related to the importance of developing clear and precise functional and user requirements for the system in question, as described above. Precise requirements lead to precise validation testing that confirms the system is fulfilling its intended use. Additionally, vendor test scripts typically only validate the base system requirements and will not be sufficient to ensure regulatory compliance.

Create Good Documentation. CSV processes and results need to be clearly documented over the full SDLC to the extent that the documents are sufficient to pass an audit by regulatory agencies. Having project team members with good understanding of regulatory guidelines is an important part of creating the necessary documentation.

Audit Third-Party Providers. In addition to performing CSV on internal systems, an FDA-regulated company needs to be prepared to audit third-party service providers (e.g., CROs), along with vendors of critical applications and cloud-based services (SaaS). The manufacturer of an FDA-regulated product is ultimately responsible for the integrity of the data that supports the product’s efficacy and safety, so if third-party vendors or service providers are used, the manufacturer needs to take appropriate steps to ensure that they are operating under standards that would hold up under an FDA inspection.

A risk-based assessment should be conducted to determine if an audit is necessary. At the minimum, formal agreements that clearly detail responsibilities must exist between the manufacturer and any third parties that are used to provide, install, configure, integrate, validate, maintain or modify a computerized system.

Conclusion

Effective, risk-based validation of computerized systems is an important part of maintaining regulatory compliance and product quality in modern laboratories. Inefficient or ineffective CSV processes prevent projects from being delivered on time and within budget and can also result in regulatory action. With regard to the FDA, for example, regulatory action due to failure to perform adequate CSV can be legally and financially devastating to an organization.

If you have additional questions about computer system validation, or would like to have an initial, no obligations consultation with an Astrix informatics expert to discuss your validation project, please feel free to contact us.
