Common Mistakes in Computer System Validation

Modern scientific laboratories increasingly rely on computerized informatics systems (e.g., LIMS, ELN, CDS) to process and manage data. To ensure product safety and effectiveness, these systems must be validated in a process known as computer system validation (CSV) to confirm the accuracy and integrity of the data they process.

Regulatory agencies like the FDA, in fact, require CSV to verify and document that a computerized system does exactly what it is designed to do in a consistent and reproducible manner. CSV is required by the FDA when implementing a new system or when making a change to an existing system (upgrades, patches, extensions, etc.) that has been previously validated. Some examples of computerized systems that are required to be validated by the FDA include:

  • Laboratory data capture devices
  • Automated laboratory equipment
  • Manufacturing execution systems
  • Laboratory, clinical or manufacturing database systems

When done properly, computer system validation can be a smooth and efficient process. There are, however, several common mistakes that companies make when undertaking CSV that can result in a stalled or failed process. Let’s examine some of these mistakes in more detail.

Common Computer System Validation Mistakes

The goal of computer system validation is to document that computerized systems will work properly in all situations, consistently producing accurate results that enable regulatory compliance and the fulfillment of user requirements. CSV testing activities are conducted throughout the software development lifecycle (SDLC) – from system implementation to retirement.

There are many reasons why CSV activities can fail. Some of the more common ones that we see include:

Poor Planning. As with any project that involves technology, it is important to create a good plan. A Validation Plan should therefore be created prior to the start of any validation activities. This plan should detail the approach for maintaining the validated status of the system over the full software development lifecycle (SDLC) and satisfy all regulatory policies and industry best practices (e.g., GAMP 5). It should also document the scope of validation activities, the testing approach, the testing team and their responsibilities, and the system acceptance criteria, as well as site-specific information if the system being tested is used at multiple sites.

Poorly Defined Requirements. Any system implementation, upgrade, extension, etc. should be preceded by a thorough workflow analysis to develop a clear set of system requirements. Without clear and precise requirements, CSV will not be able to adequately verify that the system is functioning as intended. Well-designed functional and/or user requirements should be numbered and listed in a matrix that is traceable to other validation documents (e.g., Test Scripts, Design Specifications, Validation Summary Report, etc.).
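
As an illustration of what such a matrix can look like in practice, the sketch below keeps the requirement-to-test mapping as structured data and flags uncovered requirements; the requirement IDs (URS-001, etc.) and script names (TS-OQ-010, etc.) are hypothetical.

```python
# Minimal sketch of a requirements traceability check.
# Requirement IDs and test-script names are hypothetical examples.
requirements = {
    "URS-001": "System shall restrict sample editing to authorized users",
    "URS-002": "System shall record an audit trail entry for each result change",
    "URS-003": "System shall calculate final results from raw data",
}

test_scripts = {
    "TS-OQ-010": ["URS-001"],
    "TS-OQ-011": ["URS-002", "URS-003"],
}

# Invert the mapping: which requirements are covered by which scripts?
coverage = {req_id: [] for req_id in requirements}
for script, covered in test_scripts.items():
    for req_id in covered:
        coverage.setdefault(req_id, []).append(script)

# Any requirement with no linked test script is a traceability gap.
for req_id, scripts in coverage.items():
    status = ", ".join(scripts) if scripts else "NOT COVERED"
    print(f"{req_id}: {status}")
```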

Ambiguous Test Scripts. Poorly defined requirements lead to ambiguous test scripts. Without precise requirements, it's hard to know exactly what you are testing for, and therefore difficult to confirm through CSV that the system in question is fulfilling its intended use or purpose.

Inadequate Definition of Expected Results. The expected results or acceptance criteria for CSV testing should be clearly and precisely defined in the Validation Plan.
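
For example, an expected result stated as a concrete, testable assertion leaves no room for interpretation at execution time; the values and tolerance in this sketch are invented for illustration.

```python
# Hypothetical example of a precisely defined expected result.
# "The calculated mean potency shall be 98.0% +/- 0.5%" is testable;
# "the result should look reasonable" is not.
measured_mean = 98.2   # value captured during test execution
expected = 98.0        # expected result from the test script
tolerance = 0.5        # acceptance tolerance from the Validation Plan

passed = abs(measured_mean - expected) <= tolerance
print("PASS" if passed else "FAIL")
```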

Using Vendor Test Scripts. Vendor test scripts typically only validate the base system requirements and will thus not be sufficient to ensure regulatory compliance.

Inexperienced Project Team. The CSV project team should have experience with CSV and knowledge of regulatory guidelines/compliance, laboratory processes, and the technology being validated. In addition, outsourcing to a third party to augment the validation team with subject matter expertise may be necessary in some cases.

Inadequate Attention on the Project. Many times, a project will fall behind schedule simply because team members have to devote too much time to their day jobs. Members of the project team should be expected to have CSV as their main responsibility during the course of the project. An adequate project team, along with individual responsibilities, should be clearly laid out in the Validation Plan.

Wasting Time on Low Value Testing Activities. CSV activities can be time-consuming and costly. In order to avoid cost and time overruns, a risk-based assessment should be performed on the system to determine required test cases and the optimal level of testing for each. The project team should focus on what is practical and achievable for those aspects of the system that affect quality assurance and regulatory compliance.
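
One way to operationalize such an assessment is a simple scoring model in which impact and likelihood scores drive the depth of testing; the scales and thresholds below are illustrative assumptions, not values prescribed by GAMP 5.

```python
# Illustrative risk scoring for test planning; the scales and thresholds
# are hypothetical and would be defined in the Validation Plan.
def test_rigor(impact: int, likelihood: int) -> str:
    """Map a 1-3 impact and 1-3 likelihood score to a testing level."""
    score = impact * likelihood
    if score >= 6:
        return "full scripted testing with documented evidence"
    if score >= 3:
        return "scripted testing of critical paths"
    return "vendor documentation plus smoke test"

# Example: an audit-trail function is high impact (3), medium likelihood (2).
print(test_rigor(impact=3, likelihood=2))
```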

Inadequate Documentation. Comprehensive documentation over the full SDLC needs to be created for all CSV processes and results in order to satisfy regulatory agencies. Not doing so can cause you to fail an FDA audit.

Conclusion

Computer system validation is an important part of confirming the accuracy and integrity of your data, along with ensuring product safety and effectiveness. Effective, risk-based validation of computerized systems is also an important part of maintaining regulatory compliance. Inefficient or ineffective CSV processes prevent projects from being delivered on time and within budget and may also result in regulatory action that can be legally and financially devastating to an organization.

Astrix Technology Group has over 20 years’ experience helping scientific organizations conduct effective CSV processes to reduce compliance risk and mitigate data integrity concerns and business liability issues. Our computer system validation professionals provide you with a best practice CSV methodology, along with the peace of mind that comes from knowing your CSV documentation has been produced by experts.

About Astrix Technology Group

Astrix Technology Group is an informatics consulting, professional services, and staffing company dedicated to serving the scientific community for over 20 years. We shape our clients' future, combining deep scientific insight with an understanding of how technology and people will impact the scientific industries. Our focus on value-engineered solutions, on-demand resource and domain requirements, and flexible, scalable operating and business models helps our clients find future value and growth in scientific domains. Whether focused on strategies for laboratories, IT, or staffing, Astrix has the people, skills, and experience to effectively shape client value. We offer highly objective points of view on Enterprise Informatics, Laboratory Operations, Healthcare IT, and Scientific Staffing, with an emphasis on business and technology, leveraging our deep industry experience.

Pharma 4.0 and Validation 4.0 – The Challenges of Implementing

Pharma 4.0™ and Validation 4.0, as part of the plan, require resources, information technology, a holistic control strategy, and a cultural shift within the organization. All areas need to come together to ensure that the business can make progress toward improvement and achieve its objectives. This can lead to challenges, given the extensive impact on the organization.

Challenges Impacting the Implementation of Pharma 4.0 and Validation 4.0

People, Culture, and Skillset Challenges

Manufacturers and regulators will need to make cultural adjustments and innovate to manage the myriad data, computing, and automation hazards on the way to full adoption of Pharma 4.0™ and Validation 4.0 guidelines and the technologies required to get there. Achieving optimization will require closing knowledge and training gaps as the industry adopts a new paradigm and infrastructure based on digitized, interconnected enterprise systems that rely on computational power, communications technologies, cybersecurity, and advanced controls.

For example, implementing AI and other advanced technologies in pharmaceutical manufacturing will require a variety of expertise beyond traditional biology, chemistry, and process engineering. Data scientists, computational and systems engineers, IT specialists, and AI experts will most likely be needed. At least initially, regulators and businesses may be competing for the same small pool of talent in these areas. New labor force training standards are undoubtedly on the way, and comprehensive training programs will be required.

There will also be a need to understand the operation of the smart devices that will enable operators and supervisors to monitor and operate equipment remotely. These technologies will simplify changeover, setup, maintenance, and life cycle management. The workforce will also need to operate cross-functionally, integrating several disciplines.

Incorporating best practices and new technologies into the business is required. The difficult part is changing the culture and persuading individuals to accept a new way of doing business after having done things a certain way for many years. Organizational change management (OCM) will be necessary. OCM is the "application of a systematic process and set of tools for driving the people side of change to accomplish a desired outcome," according to Prosci, the global leader in change management best practices research. Additionally, strong communications, with a leadership team eager to drive positive experiences through technology, will be required.

Planning and Process Challenges

Because the potential for Pharma 4.0™ is so great and so wide-ranging, many pharmaceutical companies are racing to implement the technology without first defining their primary goals and the problems they are seeking to solve.

If these two important questions are not answered before starting down the path to Pharma 4.0™, the direction will be unclear, and many firms will lose sight of their objectives and goals. The organization’s business processes will undergo significant changes as a result of the major technologies required for digital transformation, and those changes must be acknowledged and understood before they are implemented. The company needs to have a strategy and a plan to get there.

Pharma 4.0™ and Validation 4.0 require a holistic control strategy: a strategy that follows the product from research to development, to tech transfer, and then through to commercial manufacturing. With this strategy there also needs to be synergy between digital automation and guidelines, and an enhancement of the quality manufacturing focus, where Quality Target Product Profiles are required for all products. With this control strategy, data from machines and components is immediately available without traversing various systems, and operators' remarks can be automatically recorded, making the continuous improvement process easy to implement.

Technology Challenges

Transitioning to Pharma 4.0™ and leveraging Validation 4.0 guidelines requires incorporating new technologies and ensuring that everything works together to produce reliable data across the value chain. The technology needs to provide a means to integrate data and eliminate the silos that may exist throughout the organization today. These data silos can exist across process areas, applications in use, and organizational groups; a few examples are business quality data, IT quality data, product quality data, and supplier quality data. In many situations, data is not collected using the same methods, and this incongruent data has a negative impact on the decision process, often leading to inefficiencies and sometimes to costly inaccuracies.

Organizations need to look at technology solutions across the value chain. From the company’s business partners and the customer perspective, the organization needs to be able to make informed decisions based on data compiled from various sources and to communicate specific aspects of that data accurately to suppliers, vendors, and customers.

Innovative technologies also necessitate new processes and more regulatory scrutiny. Manufacturers must remain nimble and able to adopt new skills in order to improve manufacturing and provide high-quality products.

To connect the essential instruments and equipment, the technology (tools, devices, and IT systems) must be on an open platform with common data standards and, most likely, be cloud-based to enable data sharing. Integration and traceability, as well as automation, rely on information systems that eliminate needless manual interaction while also lowering regulatory scrutiny. Leveraging the right technology across the organization, integrated so that it provides consistent and accurate information, is imperative.

Operating within existing regulatory frameworks is another challenge for Pharma 4.0™ and technical innovation. Due to a lack of regulatory precedent, the industry may continue to use old practices even though new processes would lower the overall regulatory burden and improve quality in the long run.

Summary

Moving to Pharma 4.0™ and incorporating Validation 4.0 guidelines impacts the entire organization. Challenges in the areas of people, process, and technology influence whether an organization meets its Pharma 4.0™ objectives. Succeeding with this new approach will require training and a cultural shift, as well as an understanding of the objectives of the move, the processes involved, and a holistic control strategy. Additionally, there is the technology aspect: an understanding of the new technologies is required so that everything works together to produce reliable data across the value chain.

Why it Matters to You

Before an organization embarks on an initiative to implement Pharma 4.0™ and Validation 4.0 guidelines, there needs to be an understanding of the areas that could present challenges, as these challenges impact the organization's success. In this blog, we discuss:

  • The key challenges faced by organizations looking to implement this new approach.
  • The impact of training and culture on the move.
  • The role a holistic control strategy plays in the implementation.
  • Technology considerations when implementing this new approach.

About Astrix

For over 25 years, Astrix has been a market leader in dedicated digital transformation & staffing services for science-based businesses. Through our proven laboratory informatics, digital quality & compliance, and scientific staffing services, we deliver the highly specialized people, processes, and technology to fundamentally transform how science-based businesses operate. Astrix was founded by scientists to solve the unique challenges that science-based businesses face in the laboratory and beyond. We're dedicated to helping our clients speed and improve scientific outcomes to help people everywhere.

Validation 4.0 – The Benefits to the Life Sciences Industry

The objective of Validation 4.0 is to provide a risk-based method for process performance qualification that involves a uniform, coordinated approach to computer system validation. It is based on the Pharma 4.0™ operating model and includes a thorough control plan, as well as digital maturity and data integrity by design.

The Benefits of Validation 4.0

Validation 4.0 ensures new technologies are adopted and validated to meet patient safety and product quality criteria and to follow the Pharma 4.0™ operating model. Given that Validation 4.0 enables Pharma 4.0™, there are a number of benefits that Life Sciences organizations will see, including:

  • Improved Quality – By focusing on the critical thinking and risk management required to validate the various technologies in the ecosystem, such as smart devices, Validation 4.0 helps ensure optimal quality. Validation includes visibility into the data exchanged with the support networks of suppliers, CDMOs, CMOs, and other external organizations that are part of the value chain. By incorporating the guidelines of Validation 4.0, organizations can ensure a high level of data integration across the organization as well as with suppliers, and can enhance the visibility and resiliency of the value chain.
  • Lower Cost of Operations – By leveraging new technologies as part of the Pharma 4.0™ framework and incorporating the guidelines of Validation 4.0, the organization can lower the overall cost of quality across the organization.
  • Lower Risk – By leveraging Validation 4.0 strategies and incorporating the new technologies of Pharma 4.0™, the organization can better mitigate risk. This requires an approach that builds design aspects and controls into the value chain at various phases and continuously, rather than as a single check toward the end of the process, thereby controlling and mitigating risks throughout.
  • Faster Time to Market – By leveraging the technologies of Pharma 4.0™ and the Validation 4.0 guidelines, the organization can better control quality and safety in real time and manage any deviation, supporting a faster time to market for products.
  • Better Visibility Across the Value Chain – By incorporating the new technologies of Pharma 4.0™ along with the guidelines of Validation 4.0, the organization gains a complete and concise view across the value chain, ensuring better control of the various internal and external processes.

Summary

By leveraging the guidelines of Validation 4.0, a Life Sciences organization can realize significant benefits. It can improve product quality while lowering the cost of operations. Risk is also lower, given the focus on controls throughout the value chain and at various phases to continuously check for issues. Additionally, the organization will get to market more quickly by leveraging new technologies and Validation 4.0. Another significant benefit is visibility across the value chain, ensuring things are running smoothly from both an internal and an external perspective.

Why it Matters to You

When implementing any new technology or guidelines, it is critical to understand the benefits you will receive by doing so. Validation 4.0 is a key component of Pharma 4.0™ and will provide significant benefits to any Life Sciences organization. In this blog we discuss:

  • The key benefits of Validation 4.0 to the Life Sciences organization.
  • How the organization’s product quality and product safety are impacted.
  • How it lowers the cost of operations and the risk level.
  • How it improves the time to market and visibility across the value chain.

About Astrix

For over 25 years, Astrix has been a market leader in dedicated digital transformation & staffing services for science-based businesses. Through our proven laboratory informatics, digital quality & compliance, and scientific staffing services, we deliver the highly specialized people, processes, and technology to fundamentally transform how science-based businesses operate. Astrix was founded by scientists to solve the unique challenges that science-based businesses face in the laboratory and beyond. We're dedicated to helping our clients speed and improve scientific outcomes to help people everywhere.

Best Practices for Instrument Validation and Qualification

Analytical instruments provide important scientific data that serves to ensure manufactured products meet specifications. These instruments run the gamut from simple apparatus to complex systems that combine a metrological function with software control. Laboratories operating in regulated environments are required to conduct instrument validation tests in order to produce documented evidence that instruments are fit for their intended use and operate in a controlled manner to produce accurate results. This evidence helps ensure confidence that products being produced by the manufacturer are both safe and efficacious for public consumption.

Because of their potential for impacting product quality, laboratory instruments are key targets of FDA inspections. During an inspection, the FDA will expect to see definitive evidence that instrument qualification schedules effectively control your manufacturing and testing processes. Lack of (or insufficient) qualification procedures and/or documentation are in fact frequently cited deviations in FDA inspectional observations and warning letters. In order to ensure your organization is compliant with regulations, let’s explore the relevant details regarding analytical instrument qualification and documentation.

Types of Instruments

The term “instrument” in this blog refers to any apparatus, equipment, instrument or instrument system used for analyses. The USP General Chapter <1058> Analytical Instrument Qualification classifies instruments into three categories to manage risk in the instrument qualification process:

Group A: Standard laboratory apparatus with no measurement capability or usual requirement for calibration (e.g., evaporators, magnetic stirrers, vortex mixers, centrifuges, etc.). Proper function of instruments in this group can be determined through observation, and thus no formal qualification activities are needed for this group.

Group B: Instruments providing measured values as well as equipment controlling physical parameters (such as temperature, pressure, or flow) that need calibration. Examples of instruments in this group are balances, melting point apparatus, light microscopes, pH meters, variable pipets, refractometers, thermometers, titrators, and viscometers. Examples of equipment in this group are muffle furnaces, ovens, refrigerator-freezers, water baths, pumps, and dilutors. Often, Group B instruments will only require calibration, maintenance or performance checks to verify proper function. The extent of activities necessary may depend on the criticality of the instrument for ensuring product quality.

Group C: Computerized laboratory systems that typically consist of an analytical instrument controlled by a separate workstation running instrument control, data acquisition, and data processing software. Group C instruments require proper qualification protocols, often including software validation, to ensure their proper functioning.

Examples of instruments in this group include the following:

  • atomic absorption spectrometers
  • differential scanning calorimeters
  • dissolution apparatus
  • electron microscopes
  • flame absorption spectrometers
  • high-pressure liquid chromatographs
  • mass spectrometers
  • microplate readers
  • thermal gravimetric analyzers
  • X-ray fluorescence spectrometers
  • X-ray powder diffractometers
  • densitometers
  • diode-array detectors
  • elemental analyzers
  • gas chromatographs
  • IR spectrometers
  • near-IR spectrometers
  • Raman spectrometers
  • UV/Vis spectrometers
  • inductively coupled plasma-emission spectrometers

While categorizing laboratory instruments into three groups provides a useful starting point when discerning what kind of qualification protocol is necessary for an instrument, it should be noted that the same type of instrument can fit into more than one category depending on its intended use.

In addition, due to the wide diversity of laboratory instruments in use, and the different ways these systems are used, a single prescriptive approach for instrument qualification would be neither practical nor cost-effective. In general, a risk-based approach to instrument qualification should be followed to determine the extent of qualification activities necessary for any instrument. Generally speaking, the more critical an instrument is to product quality, and the more complex the instrument, the more work will be required to ensure adequate qualification status.

Software Validation with Instrument Qualification

It is becoming more and more difficult to separate the hardware and software parts of many modern analytical instruments. The software part of the more complex analytical instruments can be divided into different groups:

Firmware: Many instruments contain integrated chips with low-level software (firmware) that is necessary for proper instrument functionality. In most cases, the firmware cannot be altered by the user and is therefore considered a component of the instrument itself. Thus, no qualification of the firmware is needed – when the instrument hardware is qualified at the user-site, the firmware is also essentially qualified. Group B instruments fall into this category. In this case, the firmware version should be recorded as part of Installation Qualification (see below) activities, and any firmware updates should be tracked through change control of the instrument.

Some instruments come with more sophisticated firmware that is capable of fixed calculations on the acquired data, or even firmware that enables users to define programs for the instrument’s operation. Any firmware calculations need to be verified by the user, and firmware programs need to be defined and verified by the user to demonstrate that they are fit for intended purpose. User-defined programs need to be documented in change control and access to them should ideally be restricted to authorized personnel.
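
A minimal sketch of the record-keeping described above, with a hypothetical instrument ID and version strings: the firmware version is captured as an IQ baseline, and any later difference is flagged for change control.

```python
# Hypothetical sketch: capture the firmware version at IQ and flag changes.
iq_record = {"instrument_id": "BAL-0042", "firmware": "2.1.7"}  # recorded at IQ

def check_firmware(current_version: str) -> None:
    """Compare the installed firmware against the IQ baseline."""
    if current_version != iq_record["firmware"]:
        print(f"Firmware changed {iq_record['firmware']} -> {current_version}: "
              "raise a change-control record and assess requalification.")
    else:
        print("Firmware matches IQ baseline; no action needed.")

check_firmware("2.2.0")
```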

Instrument Control, Data Acquisition, and Processing Software: More complex analytical instruments are typically controlled by a separate workstation computer running instrument control, data acquisition, and data processing software. Since the software is needed for data acquisition and calculations, both the hardware and software are essential for obtaining accurate analytical results from the instrument.

The software in these more complex instruments can be classified into three different types:

  • non-configurable software that can’t be modified to change the business process
  • configurable software that includes tools from the vendor to modify the business process
  • configurable software with customization options (i.e., custom software or macros to automate the business process)

In these cases, the software is needed to qualify the instrument, and instrument operation is necessary when validating the software. As a result, software validation and analytical instrument qualification (AIQ) can be integrated into a single activity to avoid duplication.

Qualification Process and Required Documentation

AIQ is not an isolated event, but instead consists of interconnected activities that occur over the lifetime of the instrument. The first step involves the creation of user requirements, which effectively specify the operational and functional requirements that the instrument is expected to fulfill. The user requirements must define every requirement relating to safety, identity, strength, purity, and quality of the product.

The next steps in qualifying an instrument and establishing fitness for purpose proceed as follows:

Design Qualification (DQ): The DQ seeks to demonstrate that the selected instrument has all the capabilities necessary to satisfy the requirements. As such, the DQ will document the requirements, along with all decisions made in selecting an instrument vendor. This information will help ensure that the instrument can be successfully implemented for the intended purpose. Verification that instrument specifications meet the desired requirements may be sufficient for commercial off-the-shelf (COTS) instruments. However, it may be wise for the user to verify that the supplier has adopted a robust quality system that serves to ensure the vendor specifications are reliable. If the use of the instrument changes, or if it undergoes a software upgrade, it is important for the user to review and update the DQ documentation.

Installation Qualification (IQ): The IQ documents the activities necessary to establish that the instrument was received as designed and specified, is correctly installed in the right environment, and that this environment is suitable for the proper use of the instrument. Depending on the results of a risk assessment, IQ may apply to any new instrument, pre-owned instrument, onsite instrument that has not been previously qualified (or qualified to industry standards), or a qualified instrument that is being moved to another location.

Operational Qualification (OQ): The OQ documents the activities necessary to verify that the instrument functions according to its operational specifications in the user environment. OQ demonstrates fitness for selected use and should reflect the user requirements. OQ activities should simulate actual testing conditions, including worst-case scenarios, and be repeated enough times to assure reliable testing results.

Testing activities in the OQ phase should include the following parameters:

  • Fixed Parameters – These tests measure the instrument’s non-changing parameters (e.g., length, height, weight, voltage inputs, acceptable pressures, loads). These parameters will not change over the life of the instrument and therefore do not need to be retested. If the user trusts the manufacturer supplied specifications for these parameters, these tests may be waived.
  • Software Functions – When applicable, OQ testing should include critical elements of the configured software to show the instrument works as intended. Functions applicable to data acquisition, analysis, security and reporting, along with access control and audit trails, should be tested under actual conditions of use.
  • Secure data storage, backup, and archiving – When applicable, secure data storage, backup and archiving should be tested at the user’s site.
  • Instrument Function Tests – Instrument functions that are required by the user should be tested to confirm that the instrument is operating as the manufacturer intended. Supplier information can be used to identify specifications for this testing, as well as to design tests that verify the instrument meets specifications in the user environment.
  • Software configuration and/or customization – Configuration or customization of instrument software should be documented and occur before any OQ testing.

Performance Qualification (PQ): Also sometimes called user acceptance testing (UAT), PQ testing is intended to demonstrate that an instrument consistently performs according to specifications appropriate for its intended use. PQ should be performed under conditions simulating routine sample analysis. As consistency is important in PQ, the test frequency is usually much higher than in OQ. Testing can be done each time the instrument is used, or scheduled at regular intervals.
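
As a hypothetical illustration of checking "consistent performance," repeated PQ measurements of a reference standard can be trended against a precision limit; the data, the use of relative standard deviation, and the 1.0% limit below are all invented for the example.

```python
import statistics

# Hypothetical PQ data: repeated measurements of the same reference standard.
measurements = [100.1, 99.8, 100.0, 100.3, 99.9, 100.2]
rsd_limit_pct = 1.0  # illustrative precision acceptance limit

mean = statistics.mean(measurements)
rsd_pct = 100 * statistics.stdev(measurements) / mean  # relative std. deviation

print(f"mean={mean:.2f}, RSD={rsd_pct:.2f}%")
print("PQ PASS" if rsd_pct <= rsd_limit_pct else "PQ FAIL - investigate")
```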

Preventative Maintenance, Periodic Reviews and Change Control

Preventative maintenance (e.g., calibration), periodic reviews, repairs, and other changes should be documented as part of instrument qualification requirements. When an instrument malfunctions, the cause should be investigated and documented. Once maintenance activities, changes, upgrades, moves and reinstallations at another location, or repairs are complete, the relevant IQ, OQ, and PQ tests should be rerun to verify the instrument is operating satisfactorily.

Critical instruments should undergo periodic review to confirm that the system is still under effective control. Areas for review may include: qualification/validation status, change control records, backup and recovery systems for records, correctness and completeness of records produced by the instrument, test result review and signoff, and user procedures.

A change control process should be established in order to guide the assessment, execution, documentation and approval of any changes to laboratory instruments, including firmware and software. All details of the change should be documented, and users should assess the effects of the changes to determine if requalification (IQ, OQ or PQ) activities are necessary. It is important to note that, depending on the nature of a change, qualification tests may need to be revised in order to effectively evaluate the instrument’s qualification status after the change.
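
The assessment step can be captured as a simple decision table mapping a change type to the requalification activities to consider; the categories and mappings below are illustrative assumptions rather than a regulatory standard.

```python
# Illustrative change-control decision table mapping a change type to the
# requalification activities to consider; the categories are hypothetical.
REQUALIFICATION = {
    "firmware update":        ["OQ"],
    "software upgrade":       ["IQ", "OQ", "PQ"],
    "relocation":             ["IQ", "OQ", "PQ"],
    "like-for-like repair":   ["PQ"],
    "preventive maintenance": ["PQ"],
}

def activities_for(change_type: str) -> list[str]:
    """Return the qualification stages to rerun after a change."""
    # Unknown change types default to a full requalification assessment.
    return REQUALIFICATION.get(change_type, ["IQ", "OQ", "PQ"])

print(activities_for("firmware update"))   # ['OQ']
print(activities_for("relocation"))        # ['IQ', 'OQ', 'PQ']
```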

Conclusion

Properly qualified analytical instruments provide scientific data that supports a high level of confidence in the quality of the finished product. Instrument qualification is an important part of compliance for laboratories in regulated industries and is important for ensuring product quality and safety in any industry. Failure to qualify instruments properly can lead to serious consequences for an organization as a result of compliance violations and poor product quality.

Instrument qualification plans should be documented in the Validation Master Plan (VMP) and implemented by documenting user requirements and following the DQ/IQ/OQ/PQ protocols outlined in this blog. In order to demonstrate a commitment to manufacturing quality products, organizations should work to:

  • develop a VMP that encompasses the entire validation program
  • implement well-thought-out AIQ programs
  • implement a regular requalification plan for critical instruments (annually at minimum)
  • maintain current SOP documentation
  • take appropriate steps to ensure data integrity and data security

While a commitment to quality requires much effort and focus, industry leading organizations understand that good quality practices and culture lead to a more efficient work environment, improved employee satisfaction, and ultimately increased profitability.

Best Practices for Computer System Validation

Computer system validation (CSV) is a documented process that is required by regulatory agencies around the world to verify that a computerized system does exactly what it is designed to do in a consistent and reproducible manner. These regulatory agencies require CSV processes to confirm the accuracy and integrity of data in computerized systems in order to ensure product safety and effectiveness. In the United States, for example, the FDA requires pharmaceutical companies to perform CSV for systems that support the production of the following products:

  • Pharmaceuticals
  • Biologicals
  • Medical devices
  • Blood and blood components
  • Human cell and tissue products
  • Infant formulas

Computer system validation is required when configuring a new system or making a change in a validated system (upgrades, patches, extensions, etc.).

CSV processes should be based on applicable regulations and guidance, best practices for the domain, and the characteristics of the system being validated. In this blog, we will discuss best practice recommendations for efficient and effective risk-based CSV assessment and testing.

Computer System Validation 101

With regard to computer system validation, a "computer system" in an FDA-regulated laboratory is not just computer hardware and software. A computer system can also include any equipment and/or instruments connected to the system, as well as the users who operate the system and/or equipment using Standard Operating Procedures (SOPs) and manuals.

Computer system validation helps to ensure that both new and existing computer systems consistently fulfill their intended purpose and produce accurate and reliable results that enable regulatory compliance, fulfillment of user requirements, and the ability to discern invalid and/or altered records. CSV utilizes both static and dynamic testing activities that are conducted throughout the software development lifecycle (SDLC) – from system implementation to retirement.

The FDA defines software validation as “Confirmation by examination and provision of objective evidence that software specifications conform to user needs and intended uses, and that the particular requirements implemented through software can be consistently fulfilled.” Computer systems need to be examined to confirm that the system will work in all situations. Additionally, all validation activities and test results need to be documented.

All CSV activities should be documented with the following:

  • System inventory and assessment – determination of which systems need to be validated
  • User requirement specifications – clearly define what the system should do, along with operational (regulatory) constraints
  • Functional requirement specifications – clearly define how the system will look and function so the user can achieve the user requirements
  • Validation Plan (VP) – defines the objectives of the validation and the approach for maintaining validation status
  • Validation risk assessments – analysis of failure scenarios to determine the scope of validation efforts
  • Validation Traceability Matrix – cross-reference between user and functional requirements and verification that everything has been tested
  • Network and Infrastructure Qualification – documentation showing that the network and infrastructure hardware/software supporting the application system being validated has been installed correctly and is functioning as intended
  • Installation Qualification (IQ) Scripts and Results – test cases checking that the system has been installed correctly in the user environment
  • Operational Qualification (OQ) Scripts and Results – test cases checking that the system does what it is intended to do in the user environment (see the sketch after this list)
  • Performance Qualification (PQ) Scripts and Results – test cases checking that the system does what it is intended to do with trained people following SOPs in the production environment, even under worst-case conditions
  • Validation Report – a review of all activities and documents against the Validation Plan
  • System Release Documentation – documents that validation activities are complete and the system is available for intended use
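
As a sketch of how one OQ test case from the list above might be recorded so that the expected result, actual result, and tester are all traceable (all identifiers and values are hypothetical):

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical structure for recording one OQ test case and its outcome.
@dataclass
class OQTestCase:
    case_id: str
    requirement_id: str        # traces back to the user/functional requirement
    procedure: str
    expected_result: str
    actual_result: str = ""
    executed_by: str = ""
    passed: Optional[bool] = None   # None means not yet executed

    def record(self, actual: str, tester: str, passed: bool) -> None:
        """Record the executed outcome so the evidence is reviewable."""
        self.actual_result, self.executed_by, self.passed = actual, tester, passed

case = OQTestCase(
    case_id="OQ-015",
    requirement_id="URS-002",
    procedure="Modify a released result and inspect the audit trail.",
    expected_result="Audit trail captures old value, new value, user, and timestamp.",
)
case.record(actual="All four fields captured.", tester="J. Smith", passed=True)
print(case)
```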

Best Practices for Computer System Validation

Develop Clear and Precise Functional and User Requirements. One of the biggest mistakes companies make when starting an informatics project is to not do the strategic planning necessary to ensure success. The first step in any laboratory informatics project should always be a thorough workflow and business analysis. This process allows the development of clear and precise functional and user requirements that are tailored to your unique operating environment to a high degree of specificity and defined at a level that can be addressed through the new software. Without clear and precise requirements, CSV will not be able to adequately verify that the system is functioning as intended.

Perform Risk-Based CSV. CSV takes a lot of time and IT resources to accomplish, so it is wise to follow a flexible GAMP 5 approach that utilizes a risk-based assessment of the system to determine the required test cases and the optimal level of testing for each. CSV efforts should concentrate on what is practical and achievable for the critical elements of the system that affect quality assurance and regulatory compliance. Benefits of this risk-based approach include reduced cost, lower business risk, and a shorter duration of validation efforts.

Create a Good Validation Plan. Like any technical endeavor, CSV should be guided by a good plan created before the project starts. This plan will define the objectives of the validation and the approach for maintaining validation status over the full SDLC, and will satisfy all regulatory policies and industry best practices (e.g., GAMP 5). The validation plan should be created by people who have good knowledge of the technology involved (i.e., informatics systems, instruments, devices, etc.) and should serve to minimize the impact of the project on day-to-day lab processes.

The validation plan should detail the following:

  • Project Scope – outlines the parts of the system that will be validated, along with the deliverables/documentation for the project. Validation activities are only applied to aspects of the system that will be utilized by the company.
  • Testing Approach – defines the types of data that will be used for testing, along with the kinds of scenarios that will be tested.
  • Testing Team and Responsibilities – lists the members of the validation team, along with their roles and responsibilities in the validation process.
  • Acceptance Criteria – defines the requirements that need to be satisfied before the system is considered suitable for use in regulated activities.

Create a Good Team. The project team should have CSV experience and knowledge of regulatory guidelines/compliance, validation procedures, laboratory processes, and the technology (e.g., informatics software, laboratory devices and instruments, etc.) being validated. It is important that the team is big enough so that members are not stretched too thin during the project. Outsourcing to a third party to augment the validation team with subject matter expertise may be appropriate in some instances.

Avoid Ambiguous Test Scripts. This mistake is related to the importance of developing clear and precise functional and user requirements for the system in question, as described above. Precise requirements lead to precise validation testing that confirms the system is fulfilling its intended use. Additionally, vendor test scripts typically only validate the base system requirements and will not be sufficient to ensure regulatory compliance.

Create Good Documentation. CSV processes and results need to be clearly documented over the full SDLC, to the extent that the documents are sufficient to pass an audit by regulatory agencies. Having project team members with a good understanding of regulatory guidelines is an important part of creating the necessary documentation.

Audit Third-Party Providers. In addition to performing CSV on internal systems, an FDA-regulated company needs to be prepared to audit third-party service providers (e.g., CROs), along with vendors of critical applications and cloud-based services (SaaS). The manufacturer of an FDA-regulated product is ultimately responsible for the integrity of the data that supports the product’s efficacy and safety, so if third-party vendors or service providers are used, the manufacturer needs to take appropriate steps to ensure that they are operating under standards that would hold up under an FDA inspection.

A risk-based assessment should be conducted to determine if an audit is necessary. At the minimum, formal agreements that clearly detail responsibilities must exist between the manufacturer and any third parties that are used to provide, install, configure, integrate, validate, maintain or modify a computerized system.

Conclusion

Effective, risk-based validation of computerized systems is an important part of maintaining regulatory compliance and product quality in modern laboratories. Inefficient or ineffective CSV processes prevent projects from being delivered on time and within budget and can also result in regulatory action. With regard to the FDA, for example, regulatory action due to failure to perform adequate CSV can be legally and financially devastating to an organization.

If you have additional questions about computer system validation, or would like to have an initial, no obligations consultation with an Astrix informatics expert to discuss your validation project, please feel free to contact us.
