Laboratory Compliance Archives - Astrix
Expert Services and Staffing for Science-Based Businesses

Navigating Global Regulations and Guidance Around Data Integrity Guidance
https://astrixinc.com/blog/navigating-global-regulations-and-guidance-around-data-integrity/

Data integrity provides assurance that data records are reliable and accurate. Regulations meant to ensure data integrity in pharmaceutical laboratories are found in several parts of 21 CFR governing GxP areas and have been enforced by the FDA for decades. While many of these regulations were initially developed for paper-based processes, 21 CFR Part 11 was published in 1997 to extend data integrity regulations into the modern era with electronic records and electronic signatures.

All of the data integrity principles that were developed for the paper and ink era still apply to electronic systems and data capture. However, there are several trends in the pharmaceutical industry that make data integrity compliance especially challenging for today’s laboratories:

  • increased utilization of electronic technologies
  • increased availability and usability of data
  • evolving data integrity regulations around electronic technologies
  • evolving business models (e.g., globalization)

Regulatory agencies worldwide are increasingly focusing their efforts on data integrity in GxP laboratories. This increased focus has led to a number of guidance documents on data integrity being published in recent years, several of which are discussed below.

With the spotlight of global regulatory inspections shining on data integrity in manufacturing and GMP areas, you need to know what is new and how it impacts data integrity across GxP. Astrix Technology Group recently sponsored a webinar conducted by Joseph Franchetti, FDA Regulatory Compliance Specialist, that highlighted the current thinking of regulatory agencies on data integrity. In this blog, we summarize some of the most important points Mr. Franchetti made in that webinar.

Data Integrity Regulations

Important GMP regulatory requirements related to data integrity are described in 21 CFR Part 211:

  • Instruments must be qualified and fit for purpose
  • Software must be validated
  • Any calculations used must be verified
  • Data generated in an analysis must be backed up
  • Reagents and reference solutions should be prepared correctly with appropriate records
  • Methods used must be documented and approved
  • Methods used must be verified under actual conditions of use
  • Data generated and transformed must meet the criterion of scientific soundness
  • Test data must be accurate and complete and follow procedures
  • Data and reportable value must be checked by a second individual to ensure accuracy, completeness, and conformance with procedures
  • Laboratory control records should include complete data derived from all tests conducted to ensure compliance with established specifications and standards, including examinations and assays.

In this last point, “complete data” refers to the raw data/observations generated in the course of an analysis, the associated metadata that provides the proper context, and, when using computerized systems, the audit trail that shows how and why the data was modified and by whom.
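
To make this concrete, here is a minimal sketch of how one electronically captured result could carry the "complete data" elements described above: the raw observation, the contextual metadata, and an append-only audit trail. The class and field names are illustrative assumptions, not a regulatory schema.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass(frozen=True)
class AuditEntry:
    timestamp: datetime
    user: str
    action: str   # e.g., "created", "modified", "reprocessed"
    reason: str   # why the action was taken

@dataclass
class LabResult:
    raw_value: float           # raw observation as generated by the instrument
    metadata: dict             # context: sample ID, instrument, method, units, analyst...
    audit_trail: list = field(default_factory=list)   # append-only history of changes

result = LabResult(
    raw_value=98.7,
    metadata={"sample_id": "S-1042", "instrument": "HPLC-07", "method": "ASSAY-12", "units": "% label claim"},
)
result.audit_trail.append(AuditEntry(datetime.now(timezone.utc), "jdoe", "created", "initial acquisition"))
```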

Recent Guidance Documents on Data Integrity

While companies in the United States typically look to the FDA for guidance on data integrity regulations, it can also be valuable to stay up to date with regulatory guidance published by agencies in other parts of the world, as regulatory agencies worldwide are all moving in the same direction with respect to thinking on data integrity. Towards that end, the Medicines and Healthcare products Regulatory Agency (MHRA) in the United Kingdom published a guidance document in March 2015 entitled GMP Data Integrity Definitions and Guidance for Industry.

This guidance document made a number of helpful suggestions for systems (both electronic and paper-based) designed to ensure data integrity. These systems should include:

  • Access to clocks (synchronized) for recording timed events
  • Accessibility of batch records at locations where activities take place so that ad hoc data recording and later transcription to official records is not necessary
  • Control over blank paper templates for raw data recording (i.e., blank templates should not be freely available)
  • User access rights which prevent, or audit trail, data amendments
  • Automated data capture or printers attached to equipment such as balances, pH meters, etc.
  • Proximity of printers to relevant activity
  • Control of physical parameters (time, space, equipment) that permit performance of tasks and recording of data as required
  • Access to sampling points (e.g., for water systems)
  • Access to raw data for staff performing data checking activities

This guidance document also defined the acronym ALCOA+, which summarizes the key principles necessary for data integrity:

ALCOA

Attributable – Data must record who performed an action and when. If a record is changed, it must be clear who changed it and why. This information should be linked to the source data.

Legible – Data must be recorded permanently in a durable medium and be readable.

Contemporaneous – The data should be recorded at the time the work is performed, and the date/time stamps should follow in order.

Original – Data records should be the original record or a certified true copy.

Accurate – Any data errors or editing of data are recorded with documented amendments.

ALCOA+

Complete – Data records should contain all data, including any repeat or reanalysis performed on the sample (21 CFR 211.194).

Consistent – There should be consistent application of data time stamps in the expected sequence.

Enduring – Data should be recorded in a permanent, maintainable form for its useful life.

Available – Data should be available/accessible for review/audit for the life-time of the record.

This document also sets the expectation that pharmaceutical companies, importers and contract labs will review the effectiveness of their data governance systems to ensure data integrity, and that companies outsourcing activities should verify the adequacy of comparable systems at the contract acceptor.

Two other recent guidance documents that contain valuable information and are worth going through are MHRA’s ‘GXP’ Data Integrity Guidance and Definitions published in March 2018 and the FDA’s Data Integrity and Compliance With cGMP published in April 2016.

Meeting Regulatory Expectations

Data integrity issues happen for a number of reasons. Some of the more common include:

  • Ignoring SOPs to meet deadlines
  • Insufficient education and understanding among personnel
  • Insufficient data integrity controls built into informatics systems
  • Improper or nonexistent instrument qualification and computer system validation procedures
  • Cutting corners to save money
  • Lack of regular internal and/or external data integrity audits
  • Lack of a culture of quality – difficult to fix, as it involves people

It is important to develop a data integrity strategy to mitigate risks across the full data lifecycle. This strategy should contain the following elements:

  • Predict Violations – Predict the most likely data integrity breaches by identifying vulnerabilities within the organization.
  • Prevent Violations – Train employees, design & validate systems, add security to systems, focus on external sources (e.g., contractors, vendors, etc.), establish Good Documentation Practices (GDPs), educate and enforce good behaviors.
  • Detect Violations – Monitor and identify issues not predicted or prevented to provide rapid response through data and audit trail review, system administrative audit trail review, and internal and/or external audits.
  • Respond to Violations – Efficient management of efforts to develop policy, address audit findings and produce corrective actions.

Most companies are not proactive with regard to data integrity (predict, prevent, detect) and thus end up largely reacting to data integrity violations after they occur.
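
As a concrete illustration of the "detect" element, the sketch below scans a hypothetical audit trail export for amendments that lack a documented reason or that were made by users outside an authorized list. The entry format, field names and user list are assumptions for the example, not any particular system's export.

```python
AUTHORIZED_EDITORS = {"jdoe", "asmith"}   # hypothetical list of users allowed to amend records

def review_audit_trail(entries):
    """Flag amendments with no documented reason or made by unauthorized users."""
    findings = []
    for entry in entries:
        if entry.get("action") in {"modified", "deleted"}:
            if not entry.get("reason"):
                findings.append({**entry, "finding": "amendment without a documented reason"})
            if entry.get("user") not in AUTHORIZED_EDITORS:
                findings.append({**entry, "finding": "amendment by an unauthorized user"})
    return findings

sample_trail = [
    {"record": "S-1042", "user": "jdoe", "action": "modified", "reason": "integration parameters corrected"},
    {"record": "S-1043", "user": "temp01", "action": "deleted", "reason": ""},
]
for finding in review_audit_trail(sample_trail):
    print(finding["record"], "-", finding["finding"])   # each finding feeds the "respond" step
```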

With respect to data governance, the PIC/S 2016 draft guidance document sets the expectation that organizations will design systems that “ensure that data, irrespective of the process, format or technology in which it is generated, recorded, processed, retained, retrieved and used will ensure a complete, consistent and accurate record throughout the data lifecycle.” Pharmaceutical companies should consider the design, operation and monitoring of each of the following processes to comply with the principles:

  • Data development
  • Database operations management
  • Data security management
  • Reference and master data management
  • Data warehousing and business intelligence management
  • Document and content management
  • Metadata management
  • Data quality management
  • Data architecture management

The PIC/S document recommends a risk-based approach to data governance, where “Manufacturers and analytical laboratories should design and operate a system which provides an acceptable state of control based on the data integrity risk, and which is fully documented with supporting rationale.”

Finally, there are a number of steps that are important for organizations to take in order to ensure compliance with data integrity regulations:

  • Embed verification activities into internal audit processes
  • Enforce Good Documentation Practices
  • Train your internal auditors to understand what to look for when detecting data integrity deficiencies
  • Create an awareness among staff so they can assist with this endeavor and report concerns before they become full-fledged issues
  • Seek quality external support to assure completely unbiased, third-party investigations and/or to enhance your internal investigation program

Why Astrix

Astrix Technology Group is a laboratory informatics consulting, regulatory advisory, and professional services firm that has focused on serving the scientific community since 1995. Astrix professionals have the skills and expertise necessary to architect, implement, integrate and support best-in-class solutions for your organization’s laboratory environment.

Our professionals can also help to ensure that your laboratory systems and staff comply with the FDA’s data integrity mandates. We provide experienced professionals knowledgeable about FDA regulations to conduct a thorough data integrity assessment of your laboratory informatics environment to identify data integrity risks. Working with an external consultant that has expertise in data integrity evaluations to audit your laboratory environment is best practice, as an expert with fresh eyes will be able to effectively locate issues you missed.

Conclusion

Regardless of the regulatory authority, all guidance documents have the same flavor around the foundational terminology and information on technology controls. Compliance with electronic data integrity requires a fairly complex design that not all out-of-the-box systems have in place. All organizations utilizing electronic systems must implement risk-based controls for data integrity, as these controls serve to both prevent and detect data integrity issues.

Data integrity is maintained through processes and procedures, training and vigilance:

  • Detailed system use and maintenance procedures with data integrity discipline built in
  • Strict, verified adherence to procedures
  • Regular training for applicable personnel focusing on data integrity topics

FDA’s Guidance on Genetic Testing: What You Need to Know
https://astrixinc.com/blog/laboratory-compliance-post/fdas-guidance-on-genetic-testing-what-you-need-to-know/

In traditional medical care, diseases are typically detected via chemical changes associated with a particular condition and all patients receive a similar dose of a medication as treatment. While this one-size-fits-all approach has led to great advancements in medicine, it has by no means been successful for all patients.

Over the last decade, a new technology known as next generation sequencing (NGS) has been paving the way for precision medicine (PM) – a more personalized approach where disease prevention and treatment is tailored to the individual based on their genetics, environment and lifestyle. The goal of PM is to be able to prescribe the right treatments to the right patients at the right time.

Unlike traditional diagnostics which measure a limited number of analytes, NGS can identify millions of DNA variants in a single test, allowing scientists to identify or ‘sequence’ large sections of a person’s DNA in just a couple of hours. Access to human genetic profiles is allowing researchers to identify disease-causing variants and develop treatments, and clinicians to match patients to suitable treatments with increasing precision. Advances in PM have already led to a variety of FDA-approved treatments that are tailored to an individual’s genetic profile, or the genetic profile of a tumor.

Over the past several years, the FDA has been working with stakeholders from across the genomics community to develop regulations that encourage innovation and ensure that genetic tests provide accurate and meaningful results for patients. On April 13, 2018, the FDA issued two separate finalized guidance documents related to next-generation sequencing (NGS), both of which are discussed below.

These guidance documents provide NGS test developers with recommendations for designing, developing and validating tests, as well as using genetic variant databases to support clinical validity.

With these new guidance documents, the FDA has laid out a flexible and adaptive regulatory approach that accommodates the rapidly evolving nature of NGS technologies, streamlined the regulatory pathway for NGS-based tests, and provided a reasonable assurance of testing safety and effectiveness. Let’s take a look at some of the key parts of these guidance documents.  

Guidance on Public Genetic Variant Databases to Support Clinical Validity

Increasing use of NGS technologies in both research and clinical settings is supporting the identification of many new genetic variants that can hold much clinical significance. Unfortunately, this information is often stored in a manner that is not accessible to researchers. The new FDA guidance encourages the creation of FDA-recognized public genetic variant databases to promote data sharing of evidence that supports the clinical validity of genetic and genomic-based tests. Such databases will aggregate and curate reports of human genotype-phenotype relationships to a disease or condition and include publicly available documentation of evidence supporting such linkages.
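
To illustrate the idea, the sketch below shows the kind of genotype-phenotype assertion record such a database might curate, linking a variant to a condition along with the supporting evidence. All field names and values are illustrative assumptions, not a schema required by the FDA.

```python
# Illustrative only: a single curated assertion linking a variant to a condition with evidence.
variant_assertion = {
    "variant": {"gene": "EXAMPLE1", "hgvs": "c.123A>G", "assembly": "GRCh38"},
    "condition": "example inherited condition",
    "assertion": "pathogenic",                                   # the clinical-significance claim
    "evidence": [
        {"type": "publication", "id": "PMID:0000000"},           # placeholder identifier
        {"type": "functional_study", "summary": "loss of function demonstrated"},
    ],
    "review_status": "expert panel reviewed",
    "last_evaluated": "2018-06-01",
}
```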

To become an FDA-recognized publicly available human genetic variant database that can be used to support clinical validity of genetic variant assertions, the database must meet the following criteria:

  • Operates in a manner which provides sufficient information and assurances to assess the quality of its source data, evidence review, and assertions regarding variants
  • Provides transparency regarding its data sources, how it operates, and how it evaluates variant evidence
  • Collects, stores, and reports data and conclusions in compliance with all applicable requirements regarding protected health information, patient privacy, research subject protections, and data security
  • Contains genetic variant information generated using validated methods.

The new Guidance outlines a three-step process for genetic variant databases to gain recognition from the FDA:

  • Voluntary submission of detailed information (outlined in the Guidance) about the database to the FDA
  • FDA reviews genetic variant database policies and procedures for maintaining data and making variant assertions
  • The database must be maintained in a way that satisfies FDA standards for recognition and undergoes regular reviews.

Once a genetic variant database is recognized by the FDA, its assertions can be relied on in a premarket submission without requiring any additional scientific evidence. For companies developing genetic tests, being able to rely on the vast information available in public databases to verify clinical validity of the test means a faster path to FDA marketing clearance. The FDA hopes that this guidance will encourage crowdsourcing of clinical evidence, curating and data sharing in order to advance the development of high-quality PM treatments and diagnostics.

Guidance on Design, Development and Validation of NGS-Based In Vitro Diagnostics

This guidance document is part of the FDA’s efforts to create a flexible and adaptive approach to regulation of next generation sequencing (NGS)-based tests that will serve to foster innovation and also assure that the tests are accurate and meaningful. In this guidance, the FDA outlines important considerations for designing, developing and establishing the analytical validity of NGS-based tests intended to aid in diagnosis of germline diseases (conditions arising from inherited or de novo germline variants). This guidance does not apply to NGS-based tests designed for other purposes.   

The FDA intends the recommendations in this guidance to assist test developers directly. Additionally, given that the FDA has a widely used standards recognition program that facilitates use of consensus standards to meet premarket submission requirements for devices, and since standards can more rapidly evolve with changes in technology, the FDA intends the recommendations in this guidance to inform and hopefully spur the development of consensus standards by experts in the genomics community.

The FDA believes the recommendations in this guidance may serve to reasonably assure the safety and effectiveness of these tests, which would allow NGS-based tests intended to aid in the diagnosis of suspected germline diseases to be candidates for classification as Class II (moderate risk) devices via the de novo process. If classified as Class II, subsequent NGS-based tests would be reviewed through the 510(k) program, significantly streamlining the review process. The hope is that, with adherence to the recommendations in this guidance (or consensus standards that address them), along with the utilization of FDA-recognized publicly available human genetic variant databases as described in the previous section, the FDA can consider exempting these kinds of NGS-based tests from premarket review in the future.

The specific recommendations in this guidance for NGS-based germline tests are extensive. Some of the aspects addressed include:

  • Test Design Considerations – design standards to ensure a test consistently meets performance metrics appropriate for the indications for use of the test (e.g., define the specific indications for use for a given test, document specific test features that are needed to assure development of a test that meets users’ needs, document acceptable specimen types to be used for the test, etc.)
  • Test Performance Characteristics – set of performance metrics that should be assessed when analytically validating NGS-based tests intended to aid in the diagnosis of suspected germline diseases (e.g., test accuracy, test reproducibility and repeatability, limit of detection, analytical specificity, etc.); a simple accuracy calculation sketch follows this list
  • Quality Metrics – suggested test run quality metrics used to determine if tests should be accepted (e.g., coverage thresholds, performance thresholds, specimen quality, strand bias, etc.)
  • Recommendations for Performance Evaluation Studies – features to incorporate when evaluating test performance (e.g., perform validation studies, evaluate and document accuracy by comparison to a method identified as appropriate comparator by FDA, use specimens that reflect actual specimen type and population test is indicated for, assess test limits, etc.)
  • Presentation of Test Performance in Labeling – describes what information should be included in labeling to document test performance (e.g., aspects of test design, specimen type, results for test accuracy and precision/reproducibility, probability of test failure, limitations, etc.)
  • Test Reports – describes information that should be included in test reports (e.g., relationship between reported variants and the clinical presentation, description of genomic and chromosomal regions detected by the test, performance study summary, etc.)
  • Modifications – provides guidelines for documentation, validation and resubmission when modifications are made to cleared 510(k) devices
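
As referenced in the test performance item above, the sketch below shows one common way accuracy of a variant-calling test can be summarized against an appropriate comparator method: positive and negative percent agreement. The function and its inputs are illustrative assumptions, not the FDA's prescribed calculation.

```python
def percent_agreement(test_calls, comparator_calls):
    """Both arguments map a variant site to True (variant called) or False (no variant called)."""
    tp = sum(1 for site, truth in comparator_calls.items() if truth and test_calls.get(site, False))
    fn = sum(1 for site, truth in comparator_calls.items() if truth and not test_calls.get(site, False))
    tn = sum(1 for site, truth in comparator_calls.items() if not truth and not test_calls.get(site, False))
    fp = sum(1 for site, truth in comparator_calls.items() if not truth and test_calls.get(site, False))
    ppa = 100.0 * tp / (tp + fn) if (tp + fn) else float("nan")   # positive percent agreement
    npa = 100.0 * tn / (tn + fp) if (tn + fp) else float("nan")   # negative percent agreement
    return ppa, npa
```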

Conclusion

The new field of genomic testing and research is experiencing dramatic growth due to a rapidly evolving technology base. The new FDA guidance documents described above provide a flexible framework to generate the data needed to facilitate the FDA’s review of NGS-based tests. As such, these guidance documents help to ensure that NGS-based tests provide accurate and meaningful results, while at the same time lowering the barrier to innovation by giving developers new tools that support efficient test development and validation. Ultimately, it is patients who will benefit, as these FDA recommendations will be utilized to speed the development of high-quality precision medicine treatments and diagnostics that will improve patient outcomes.

About Astrix

For over 25 years, Astrix has been a market leader in delivering innovative solutions through world-class people, process, and technology that fundamentally improve scientific outcomes and quality of life everywhere. Founded by scientists to solve the unique challenges life sciences and other science-based businesses face, Astrix offers a growing array of strategic, technical, and staffing services designed to deliver value to clients across their organizations.

Laboratory Relocation Best Practices
https://astrixinc.com/blog/laboratory-compliance-post/laboratory-relocation-best-practices/

The pace of change in the pharmaceutical and biotech industries means that most companies eventually must face the daunting task of relocating their laboratories. Whether due to mergers, acquisitions, funding changes or simply organic growth, a laboratory relocation is an extraordinarily complex undertaking that will impact your laboratory’s scientists, research and business goals.

A laboratory relocation involves moving high-end analytical instrumentation, hazardous materials, products and samples, and sometimes even live animals. It will require shutting down laboratory equipment, along with safely packing and shipping instruments, samples, materials, devices, computer hardware and potentially data to the new location. Once all necessary items have been moved to the new location, they will need to be unpacked, and equipment will need to be requalified and/or revalidated.

Whether you are moving your lab across the hall, street or country, a laboratory relocation is never a routine exercise. The reality is that no two laboratories are alike – each will have a set of unique challenges that will need to be addressed with care. In this blog, we will discuss critical best practices that should be followed in all cases in order to make sure your laboratory relocation is a smooth, safe and efficient process that minimizes downtime and disruption for your business.

Best Practices for Laboratory Relocation

While a successful laboratory relocation can serve as a catalyst for company growth and enhanced innovation, a poorly managed relocation can be a disaster for your organization. Potential ways in which this “disaster” can unfold include:

  • Failure to preserve the integrity of ongoing research
  • Significant delays in ongoing research
  • Damage to sensitive equipment
  • Excessive costs to complete the move

Relocating your laboratory(s) can be an extremely complicated process. As such, there are a number of best practice recommendations that are important to follow to make your move go smoothly, some of which include:

Conduct a Site Assessment. It is important for the relocation team to meet with the lab manager to understand the background of the relocation, the nature of research being conducted, and any other important issues that need to be considered. A thorough assessment of both the old and new facilities should be conducted that includes the following:

  • Complete asset inventory including all instruments, PCs, etc. Each instrument’s location, configuration, operational condition, and usage should be documented.
  • Confirm needs of temperature sensitive chemicals, biological samples, and supplies
  • Determine permits required for hazardous materials
  • Record additional assets to be relocated including tables, hoods, chairs, etc.
  • Map physical layout of the old and new facilities – include loading dock access schedules

Create a Detailed Plan and Schedule. A typical laboratory relocation can take three to six months of planning before a single piece of equipment is moved. When relocating a laboratory, having a plan that considers every detail is critical, particularly if you are moving more than one site. Once the scope of work is established, a project plan should be developed that includes every operational aspect of the move, complete with timetables, ownership and logistics. The plan should coordinate sending and receiving facilities, provide a detailed schedule, identify and prioritize critical assets, detail asset placement in the new space, describe anticipated logistical obstacles and contingencies, include a risk management plan, identify any important transportation requirements, etc. Be sure to allow enough time to create a plan that accurately captures all of your laboratory’s requirements. Once the plan is created, have all involved parties review and approve it.

Project Management Expertise. Laboratory relocation requires ongoing monitoring and reassessment. Even the best internal teams will struggle to anticipate every conceivable complication. An experienced project manager (PM) is therefore critical to ensure good coordination and communication. In addition to facilitating good communication, the PM will need to organize, track and report on the various steps of the lab move.

Communication Plan.  Just a few of the many different types of details that will need to be communicated include:

  • Scientists will need to know the date their equipment will no longer be available
  • Decision-makers need to know how long it takes to decommission certain equipment
  • Site managers need to know when their loading docks will be accessed
  • Operations people will need to have a detailed overall assessment of the project

Develop a communication plan that involves all stakeholders to avoid disconnects between scientists, lab managers, management, facility managers, instrument vendors, and the people executing the move. Be sure to have a kick-off meeting with all involved and conduct weekly meetings to ensure everyone stays on the same page.

Build the Right Internal Team. It is important that you identify and include all the key stakeholders from your company that will need to be involved (e.g., site managers, financial/budget managers, EH&S, scientists, key decision makers, etc.). Additionally, if your move is over a long distance, you will need duplicate teams at both ends of the project.

Regulatory Compliance. An important consideration for any lab relocation is making sure all transport regulations are met. Depending on the types of material involved, along with the old and new locations, the project team may need to align with EPA, OSHA, DOT, FDA, and IATA regulations. Local and facility regulations may also need to be adhered to. In heavily regulated environments, such as pharmaceutical manufacturing and testing, movers must also comply with GLP/GMP guidelines. In this case, appropriate documentation should be maintained before, during and after the move.

Chain of Custody. Make sure you have a chain of custody in place for every piece of equipment, product or sample that you are moving so you know who is responsible for an item as it is going through the move process. This is especially important if you will be utilizing multiple vendors in the move.
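
As a simple illustration, the sketch below appends one entry to a chain-of-custody log for each hand-off of an asset, so responsibility is traceable end to end. The fields and example values are assumptions, not a regulatory template.

```python
custody_log = []   # append-only; earlier entries are never edited or deleted

def record_transfer(asset_id, from_party, to_party, timestamp_utc, condition):
    """Record one hand-off of an asset during the move."""
    custody_log.append({
        "asset_id": asset_id,                # e.g., instrument serial number or sample barcode
        "from": from_party,
        "to": to_party,
        "timestamp_utc": timestamp_utc,      # ISO 8601
        "condition_on_transfer": condition,
    })

record_transfer("HPLC-07", "Site A lab manager", "moving vendor", "2019-10-01T08:30:00Z", "decommissioned and packed")
record_transfer("HPLC-07", "moving vendor", "Site B receiving", "2019-10-02T14:05:00Z", "received, crate intact")
```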

Conclusion

The future productivity and profitability of your organization are impacted by your ability to execute a laboratory relocation successfully. That said, a lab relocation is a set of complex, interconnected activities that requires careful and detailed planning, efficient coordination, and proper execution, all within strict timelines and production schedules. While managing a lab demands its own set of skills, responsibilities, and knowledge, overseeing the intricate inner workings of a relocation is typically not an internal core competency. In addition, most laboratory personnel assigned to a relocation typically already have full-time jobs, which only increases the chance that important details will be overlooked.

If you don’t have personnel with the required skills and experience, it’s important to hire a third-party professional that does. A standard relocation company is not ideal, however, as not all project or facilities management companies are experienced in laboratory relocations. Ideally, you want a company that understands the life science industry and specializes in laboratory logistics. Your mover should understand the laboratory environment and instrumentation, and have project management, inventory management, and field service engineering expertise. Hiring a qualified lab mover mitigates risk, improves efficiency, saves money, and enables scientists and lab staff to focus on their research.

 

Best Practices for Instrument Validation and Qualification
https://astrixinc.com/blog/laboratory-compliance-post/best-practices-for-instrument-validation-and-qualification/

Analytical instruments provide important scientific data about manufactured products that serves to ensure that they meet specifications. These instruments run the gamut from simple apparatus to complex systems that combine a metrological function with software control. Laboratories operating in regulated environments are required to conduct instrument validation tests in order to produce documented evidence that instruments are fit for intended use and operate in a controlled manner to produce accurate results. This evidence helps to ensure confidence that products being produced by the manufacturer are both safe and efficacious for public consumption.

Because of their potential for impacting product quality, laboratory instruments are key targets of FDA inspections. During an inspection, the FDA will expect to see definitive evidence that instrument qualification schedules effectively control your manufacturing and testing processes. Missing or insufficient qualification procedures and/or documentation are in fact frequently cited deviations in FDA inspectional observations and warning letters. In order to ensure your organization is compliant with regulations, let’s explore the relevant details regarding analytical instrument qualification and documentation.

Types of Instruments

The term “instrument” in this blog refers to any apparatus, equipment, instrument or instrument system used for analyses. The USP General Chapter <1058> Analytical Instrument Qualification classifies instruments into three categories to manage risk in the instrument qualification process:

Group A: Standard laboratory apparatus with no measurement capability or usual requirement for calibration (e.g., evaporators, magnetic stirrers, vortex mixers, centrifuges, etc.). Proper function of instruments in this group can be determined through observation, and thus no formal qualification activities are needed for this group.

Group B: Instruments providing measured values as well as equipment controlling physical parameters (such as temperature, pressure, or flow) that need calibration. Examples of instruments in this group are balances, melting point apparatus, light microscopes, pH meters, variable pipets, refractometers, thermometers, titrators, and viscometers. Examples of equipment in this group are muffle furnaces, ovens, refrigerator-freezers, water baths, pumps, and dilutors. Often, Group B instruments will only require calibration, maintenance or performance checks to verify proper function. The extent of activities necessary may depend on the criticality of the instrument for ensuring product quality.

Group C: Computerized laboratory systems that typically consist of an analytical instrument controlled by a separate workstation running instrument control, data acquisition, and processing software. Group C instruments will require proper qualification protocols, often including software validation, to ensure their proper functioning.

Examples of instruments in this group include the following:

  • atomic absorption spectrometers
  • differential scanning calorimeters
  • dissolution apparatus
  • electron microscopes
  • flame absorption spectrometers
  • high-pressure liquid chromatographs
  • mass spectrometers
  • microplate readers
  • thermal gravimetric analyzers
  • X-ray fluorescence spectrometers
  • X-ray powder diffractometers
  • densitometers
  • diode-array detectors
  • elemental analyzers
  • gas chromatographs
  • IR spectrometers
  • near-IR spectrometers
  • Raman spectrometers
  • UV/Vis spectrometers
  • inductively coupled plasma-emission spectrometers

While sorting laboratory instruments into these three groups provides a useful starting point when determining what kind of qualification protocol is necessary for an instrument, it should be noted that the same type of instrument can fall into more than one group depending on its intended use.

In addition, due to the wide diversity of laboratory instruments in use, and the different ways these systems are used, a single prescriptive approach for instrument qualification would be neither practical nor cost-effective. In general, a risk-based approach to instrument qualification should be followed to determine the extent of qualification activities necessary for any instrument. Generally speaking, the more critical an instrument is to product quality, and the more complex the instrument, the more work will be required to ensure adequate qualification status.
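
The sketch below illustrates such a risk-based decision in code, mapping an instrument's USP <1058> group and criticality to a set of qualification activities. The specific activities listed are assumptions for the example; the actual extent of qualification should come from your own documented, risk-based rationale.

```python
def qualification_activities(group, critical_to_quality):
    """Return an illustrative set of activities for a USP <1058> instrument group."""
    if group == "A":
        return ["verify proper function by observation"]
    if group == "B":
        activities = ["calibration", "maintenance", "performance checks"]
    elif group == "C":
        activities = ["DQ", "IQ", "OQ", "PQ", "software validation"]
    else:
        raise ValueError("unknown USP <1058> group")
    if critical_to_quality:
        activities.append("periodic requalification and review")
    return activities

print(qualification_activities("C", critical_to_quality=True))
```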

Software Validation with Instrument Qualification

It is becoming more and more difficult to separate the hardware and software parts of many modern analytical instruments. The software part of the more complex analytical instruments can be divided into different groups:

Firmware: Many instruments contain integrated chips with low-level software (firmware) that is necessary for proper instrument functionality. In most cases, the firmware cannot be altered by the user and is therefore considered a component of the instrument itself. Thus, no qualification of the firmware is needed – when the instrument hardware is qualified at the user-site, the firmware is also essentially qualified. Group B instruments fall into this category. In this case, the firmware version should be recorded as part of Installation Qualification (see below) activities, and any firmware updates should be tracked through change control of the instrument.

Some instruments come with more sophisticated firmware that is capable of fixed calculations on the acquired data, or even firmware that enables users to define programs for the instrument’s operation. Any firmware calculations need to be verified by the user, and firmware programs need to be defined and verified by the user to demonstrate that they are fit for intended purpose. User-defined programs need to be documented in change control and access to them should ideally be restricted to authorized personnel.

Instrument Control, Data Acquisition, and Processing Software: More complex analytical instruments are typically controlled by a separate workstation computer running instrument control, data acquisition, and processing software. Since the software is needed for data acquisition and calculations, both the hardware and software are essential for obtaining accurate analytical results from the instrument.

The software in these more complex instruments can be classified into three different types:

  • non-configurable software that can’t be modified to change the business process
  • configurable software that includes tools from the vendor to modify the business process
  • configurable software with customization options (i.e., custom software or macros to automate the business process)

In these cases, the software is needed to qualify the instrument and instrument operation is necessary when validating the software. As a result, the software validation and analytical instrument qualification (AIQ) can be integrated into a single activity to avoid duplication.

Qualification Process and Required Documentation

AIQ is not an isolated event, but instead consists of interconnected activities that occur over the lifetime of the instrument. The first step involves the creation of user requirements, which effectively specify the operational and functional requirements that the instrument is expected to fulfill. The user requirements must define every requirement relating to safety, identity, strength, purity, and quality of the product.

The next steps in qualifying an instrument and establishing fitness for purpose proceed as follows:

Design Qualification (DQ): The DQ seeks to demonstrate that the selected instrument has all the capabilities necessary to satisfy the requirements. As such, the DQ will document the requirements, along with all decisions made in selecting an instrument vendor. This information will help ensure that the instrument can be successfully implemented for the intended purpose. Verification that instrument specifications meet the desired requirements may be sufficient for commercial off-the-shelf (COTS) instruments. However, it may be wise for the user to verify that the supplier has adopted a robust quality system that serves to ensure the vendor specifications are reliable. If the use of the instrument changes, or if it undergoes a software upgrade, it is important for the user to review and update DQ documentation.

Installation Qualification (IQ): The IQ documents the activities necessary to establish that the instrument was received as designed and specified, is correctly installed in the right environment, and that this environment is suitable for the proper use of the instrument. Depending on the results of a risk assessment, IQ may apply to any new instrument, pre-owned instrument, onsite instrument that has not been previously qualified (or qualified to industry standards), or a qualified instrument that is being moved to another location.

Operational Qualification (OQ): The OQ documents the activities necessary to verify that the instrument functions according to its operational specifications in the user environment. OQ demonstrates fitness for selected use and should reflect the user requirements. OQ activities should simulate actual testing conditions, including worst-case scenarios, and be repeated enough times to assure reliable testing results.

Testing activities in the OQ phase should include the following parameters:

  • Fixed Parameters – These tests measure the instrument’s non-changing parameters (e.g., length, height, weight, voltage inputs, acceptable pressures, loads). These parameters will not change over the life of the instrument and therefore do not need to be retested. If the user trusts the manufacturer supplied specifications for these parameters, these tests may be waived.
  • Software Functions – When applicable, OQ testing should include critical elements of the configured software to show the instrument works as intended. Functions applicable to data acquisition, analysis, security and reporting, along with access control and audit trails, should be tested under actual conditions of use.
  • Secure data storage, backup, and archiving – When applicable, secure data storage, backup and archiving should be tested at the user’s site.
  • Instrument Function Tests – Instrument functions that are required by the user should be tested to confirm that the instrument is operating as the manufacturer intended. Supplier information can be used to identify specifications for this testing, as well as to design tests that verify that the instrument meets specifications in the user environment.
  • Software configuration and/or customization – Configuration or customization of instrument software should be documented and occur before any OQ testing.

Performance Qualification (PQ): Also sometimes called user acceptance testing (UAT), PQ testing intends to demonstrate that an instrument consistently performs according to specifications appropriate for its intended use. PQ should be performed under conditions simulating routine sample analysis. As consistency is important in PQ, the test frequency is usually much higher than in OQ. Testing can be done each time the instrument is used, or scheduled at regular intervals.

Preventative Maintenance, Periodic Reviews and Change Control

Preventative maintenance (e.g., calibration), periodic reviews, repairs and other changes should be documented as part of instrument qualification requirements. When an instrument malfunctions, the cause should be investigated and documented. Once maintenance activities, changes, upgrades, relocation and reinstallation, or repairs are complete, the relevant IQ, OQ and PQ tests should be run to verify that the instrument is operating satisfactorily.

Critical instruments should undergo periodic review to confirm that the system is still under effective control. Areas for review may include: qualification/validation status, change control records, backup and recovery systems for records, correctness and completeness of records produced by the instrument, test result review and signoff, and user procedures.

A change control process should be established in order to guide the assessment, execution, documentation and approval of any changes to laboratory instruments, including firmware and software. All details of the change should be documented, and users should assess the effects of the changes to determine if requalification (IQ, OQ or PQ) activities are necessary. It is important to note that, depending on the nature of a change, qualification tests may need to be revised in order to effectively evaluate the instrument’s qualification status after the change.

Conclusion

Analytical instruments can provide a high level of confidence in the quality of finished product through scientific data if they are qualified properly. Instrument qualification is an important part of compliance for laboratories in regulated industries and is important to ensure product quality and safety in any industry. Failure to qualify instruments properly can lead to serious consequences for an organization as a result of compliance violations and poor product quality.

Instrument qualification plans should be documented in the Validation Master Plan (VMP) and implemented by documenting user requirements and following the DQ/IQ/OQ/PQ protocols outlined in this blog. In order to demonstrate a commitment to manufacturing quality products, organizations should work to:

  • develop a VMP that encompasses the entire validation effort
  • implement well-thought-out AIQ programs
  • implement a regular requalification plan for critical instruments (annually at minimum)
  • maintain current SOP documentation
  • take appropriate steps to ensure data integrity and data security

While a commitment to quality requires much effort and focus, industry leading organizations understand that good quality practices and culture lead to a more efficient work environment, improved employee satisfaction, and ultimately increased profitability.

Tips for Maintaining Laboratory Data Security
https://astrixinc.com/blog/laboratory-technology/tips-for-maintaining-laboratory-data-security/

On June 27th, 2017, a massive ransomware attack infiltrated computer systems and locked up files (via encryption) at companies around the world and government ministries in Ukraine. Merck & Co. was among those affected. Merck employees arrived in their offices in the morning to find a ransomware note on their computers with hackers demanding payment to release critical files.

Upon learning of the attack, the company disabled its email system, and 70,000 employees were forbidden from touching their computers and told to go home. All told, Merck’s global manufacturing, research and sales operations were impacted for nearly a week, costing the company an estimated $310 million in the third quarter of 2017 due to increases in the cost of goods sold and operating expenses, along with lost sales.

Pharmaceutical companies typically have large quantities of research, clinical trial and patient data, along with critical intellectual property (IP), to protect. As a result, Life Science companies are an enticing target for cybercriminals. But it’s not just external threats from hackers or malware that pharmaceutical companies face; there are also internal threats from disgruntled, malicious or noncompliant employees that need to be addressed.

According to a recent report by Gartner, businesses spent over $114 billion worldwide for security services and products in 2018 alone. With expanding connectivity of information systems, laboratory instruments, computer work-stations, and mobile devices to the internet and wireless networks, data protection and security have become critical components of laboratory information technology (IT) infrastructure for the pharmaceutical industry.

As industry-leading pharmaceutical organizations strive to integrate legacy systems and digitize all aspects of the product lifecycle in order to gain a competitive edge, the need to continuously protect all forms of data in all locations and transmissions can become a challenging task. In this blog, we provide some best practice recommendations for maintaining laboratory data security.

Hardware Security

Out-of-date operating systems. One of the biggest data vulnerabilities in pharmaceutical companies occurs because laboratories often run proprietary, highly customized software on lab computers. This prevents timely operating system upgrades and security patches, leaving these computers vulnerable to hackers. It is common to see Windows XP or even Windows 95 operating on laboratory computers, making security patches on these machines virtually impossible.

Windows XP was in fact leveraged by the hackers in the Merck data breach described above. If out-of-date operating systems are required to run certain software in your laboratory, best practice is to isolate those machines behind several layers of protection and keep them segregated from your main network.

Malware Protection.  Modern anti-virus software can protect computers from computer viruses, trojan horses, computer worms, spyware, ransomware, etc. This software is critical due to the increased number and severity of cyber-attack threats. All laboratory computers should have robust anti-virus software installed and configured for automated, regular virus definition updates and file scanning. Various internet software suites can provide additional security by adding firewalls and application access control or privacy features. Additionally, workforce policies and procedures and employee education are highly recommended to prevent at risk behaviors.

Interfaced instruments. Laboratory instruments these days often run widely used Windows operating systems and communicate over standard Transmission Control Protocol/Internet Protocol (TCP/IP) networks. This exposes them to the same security issues as laboratory computers, meaning the instrument’s operating system and hardware protection must be approached similarly to typical computers.

Mobile devices. Increased use of mobile devices (e.g., smartphones, tablets, etc.) and wireless medical devices can create significant data security challenges for pharmaceutical companies. Companies should take steps to develop secure authentication for mobile devices, along with the ability to track and secure mobile devices remotely by locking or wiping out information.

Network Security

On-premise hosting data security. With internet connectivity, data security involves not only safeguarding computers themselves, but also protecting the network and the information that is stored and transmitted. Until fairly recently, many pharmaceutical laboratories licensed informatics software such as laboratory information management systems (LIMS) from a vendor and installed it directly on laboratory servers and computers in what is known as an on-premise deployment. This hosting option allows strong data security, with data protected by the company firewall, although using cloud-based services (e.g., attachments in Gmail, Dropbox/Box, etc.) from behind your firewall is a possible point of failure.

External and cloud-based hosting data security. In recent years, several vendors have begun to offer deployment compatible with external hosting, offering full application functionality accessed through a device’s web browser and hosted at a third-party data center. In addition, there are now fully external cloud-based “Software-as-a-service” offerings. While on-premise hosting behind the company firewall can provide the best data security, data security in the cloud has come a long way in recent years, and single-tenant, private cloud hosting options do exist with enhanced security.

The bottom line is that, when utilizing external or cloud-based hosting for informatics software, it is critical for companies to do a thorough audit of the network vendor to ensure adequate security is in place. What does their network configuration look like? How often do they perform security audits? This vendor audit can either be done by a skilled in-house IT team or outsourced to a qualified external consultant.

HIPAA compliance. In the United States, the Health Insurance Portability and Accountability Act (HIPAA) mandates certain administrative, physical, and technical safeguards to ensure the privacy and protection of patient medical information and health records. For internet transmission of patient data, the minimum security requirements stipulated by HIPAA utilize Secure Sockets Layer (SSL)/TLS protocols, which encrypt data in transit, to protect patient confidentiality. Web servers supporting SSL protocols have web addresses starting with “https” instead of “http.” Virtual private network technology provides another option for HIPAA compliance, securing communication over public networks by creating a secure tunnel and encrypting all data.
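
As a small illustration of enforcing encrypted transmission, the sketch below refuses to send data unless the endpoint URL uses HTTPS and then opens a certificate-verified TLS connection using Python's standard library. The endpoint URL is a hypothetical placeholder, and this check alone is not a complete HIPAA control.

```python
import socket
import ssl
from urllib.parse import urlparse

def open_verified_tls_connection(url):
    """Refuse non-HTTPS endpoints and verify the server certificate before any data is sent."""
    parsed = urlparse(url)
    if parsed.scheme != "https":
        raise ValueError("patient data may only be transmitted to HTTPS/TLS endpoints")
    context = ssl.create_default_context()                  # enables certificate and hostname checks
    raw_sock = socket.create_connection((parsed.hostname, parsed.port or 443))
    return context.wrap_socket(raw_sock, server_hostname=parsed.hostname)

# conn = open_verified_tls_connection("https://results.example-lab.org/upload")   # hypothetical endpoint
```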

Application Security

Passwords. Another important aspect of ensuring data security in pharmaceutical laboratories is making sure that the informatics solutions themselves have best practice data security features. Written policies are required to document the manner in which employees gain access to information systems. All applications should have passwords enabled as the main method of user authentication. Organizations should formulate procedures for creating, changing and safeguarding passwords that allow access to systems with critical data. Regulations also mandate that strong passwords be enforced for logging in to all information systems and medical applications. Applications should have security questions enabled to permit easy recovery of lost passwords.

Two-factor authentication. Sometimes a strong password simply isn’t enough to prevent hackers from accessing your applications. Best practice is to utilize two-factor authentication (2FA) to make sure accounts don’t get hacked. After entering a password, users are prompted to enter a code generated by an authenticator application or sent to their smartphone. Applications utilized by pharmaceutical laboratories should ideally have 2FA capabilities.

Role-based access control. Pharmaceutical laboratory information systems should also have the ability to control which users can use the system, what information they have access to, and what they can do with the data (e.g., read only, or the ability to change or delete data) with role-based permissions. Role-based access control (RBAC) allows access to information in the system based on the specific role of the user.

Computer systems should also have the ability to accommodate special situations and override standard RBAC settings in case of emergency. Finally, organizations should have procedures in place to maintain the RBAC system, adjusting permissions when necessary and terminating access when employees leave the company.
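
A minimal RBAC sketch, assuming simple role names and actions rather than any specific product's permission model, looks like this:

```python
ROLE_PERMISSIONS = {
    "analyst":  {"read", "create"},
    "reviewer": {"read", "approve"},
    "qa":       {"read", "approve", "audit"},
    "admin":    {"read", "create", "approve", "audit", "manage_users"},
}

def is_allowed(role, action):
    """Allow an action only if the user's role explicitly grants it (deny by default)."""
    return action in ROLE_PERMISSIONS.get(role, set())

assert is_allowed("reviewer", "approve")
assert not is_allowed("analyst", "delete")   # no role here grants deletion of records
```

The deny-by-default lookup is the key design choice: a role that is missing or misspelled grants nothing.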

Personnel Security

Employee training and compliance. All laboratory personnel should have a basic understanding of data security threats and comply with company policies and procedures designed to prevent them. Security awareness training should be provided for all employees at the time of their hire, and this initial training should be reinforced periodically with follow-up security reminders. Compliance must be enforced in the laboratory and needs to be monitored on a regular basis through an audit and risk analysis process. Training and compliance software is available to get signoff from your lab personnel that they understand data security.

Peripheral devices. Any IT hardware, namely computer terminals, should be viewed as a potential site for rogue employees to extract data via peripheral storage devices (e.g., USB thumb drives, eSATA disk drives). Software/hardware solutions need to be in place to prevent this scenario unless the employee is specifically authorized. Appropriate steps also need to be taken to prevent employees from working with non-company machines.

Conclusion

While there are clear benefits to connected operations in pharmaceutical laboratories in terms of data integrity, innovation and operational efficiency, this connectivity also creates serious security risks that must be addressed. There is no magic bullet when it comes to data security – no single methodology or technology will get the job done. Data security efforts must therefore be comprehensive using multiple layers of protection.

Companies should undertake regular assessments of data security risks and develop a comprehensive data security strategy that uses multiple layers of protection in each of the areas described in this article – hardware security, network security, application security and personnel security. Additionally, data security concerns need to be a big part of the selection process when choosing applications to implement in your laboratory.

Astrix has over 20 years of experience implementing laboratory informatics applications in ways that maximize data security. Our experienced professionals help implement innovative informatics solutions that allow organizations to turn data into knowledge, increase organizational efficiency, improve quality and facilitate regulatory compliance.

 

The post Tips for Maintaining Laboratory Data Security appeared first on Astrix.

Best Practices for Computer System Validation https://astrixinc.com/blog/lab-informatics/best-practices-for-computer-system-validation/ Sun, 17 Jun 2018 20:01:53 +0000

Computer system validation (CSV) is a documented process that is required by regulatory agencies around the world to verify that a computerized system does exactly what it is designed to do in a consistent and reproducible manner. These regulatory agencies require CSV processes to confirm the accuracy and integrity of data in computerized systems in order to ensure product safety and effectiveness. In the United States, for example, the FDA requires pharmaceutical companies to perform CSV for systems that support the production of the following products:

  • Pharmaceuticals
  • Biologicals
  • Medical devices
  • Blood and blood components
  • Human cell and tissue products
  • Infant formulas

Computer system validation is required when configuring a new system or making a change to a validated system (upgrades, patches, extensions, etc.).

CSV processes should be based on applicable regulations and guidance, best practices for the domain, and the characteristics of the system being validated. In this blog, we will discuss best practice recommendations for efficient and effective risk-based CSV assessment and testing.

Computer System Validation 101

With regard to computer system validation, a “computer system” in an FDA-regulated laboratory is not just computer hardware and software. A computer system can also include any equipment and/or instruments connected to the system, as well as the users who operate the system and/or equipment using Standard Operating Procedures (SOPs) and manuals.

Computer system validation helps to ensure that both new and existing computer systems consistently fulfill their intended purpose and produce accurate and reliable results that enable regulatory compliance, fulfillment of user requirements, and the ability to discern invalid and/or altered records. CSV utilizes both static and dynamic testing activities that are conducted throughout the software development lifecycle (SDLC) – from system implementation to retirement.

The FDA defines software validation as “Confirmation by examination and provision of objective evidence that software specifications conform to user needs and intended uses, and that the particular requirements implemented through software can be consistently fulfilled.” Computer systems need to be examined to confirm that they will perform as intended across the full range of expected operating conditions. Additionally, all validation activities and test results need to be documented.

All CSV activities should be documented with the following:

  • System inventory and assessment – determination of which systems need to be validated
  • User requirement specifications – clearly define what the system should do, along with operational (regulatory) constraints
  • Functional requirement specifications – clearly define how the system will look and function so that the user requirements can be achieved
  • Validation Plan (VP) – defines the objectives of the validation and the approach for maintaining validation status
  • Validation risk assessments – analysis of failure scenarios to determine the scope of validation efforts
  • Validation Traceability Matrix – cross-reference between user and functional requirements and the test cases that verify them, confirming that everything has been tested (a simple illustration appears after this list)
  • Network and Infrastructure Qualification – documentation showing that the network and infrastructure hardware/software supporting the application system being validated has been installed correctly and is functioning as intended
  • Installation Qualification (IQ) Scripts and Results – test cases checking that the system has been installed correctly in the user environment
  • Operational Qualification (OQ) Scripts and Results – test cases checking that the system does what it is intended to do in the user environment
  • Performance Qualification (PQ) Scripts and Results – test cases checking that the system does what it is intended to do with trained people following SOPs in the production environment, even under worst-case conditions
  • Validation Report – a review of all activities and documents against the Validation Plan
  • System Release Documentation – documents that validation activities are complete and the system is available for its intended use
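As referenced in the traceability matrix item above, the following Python sketch illustrates the underlying idea: map each requirement to the test cases that verify it and flag anything left uncovered. The requirement and test-case IDs are invented for the example.

```python
# Hypothetical sketch of a validation traceability matrix: map each user
# requirement to the test cases that verify it and flag any requirement
# with no test coverage. IDs are invented examples.
requirements = ["UR-001", "UR-002", "UR-003"]
test_cases = {
    "OQ-010": ["UR-001"],
    "OQ-011": ["UR-001", "UR-002"],
    "PQ-005": ["UR-002"],
}

coverage = {req: [] for req in requirements}
for test_id, covered in test_cases.items():
    for req in covered:
        coverage.setdefault(req, []).append(test_id)

for req, tests in coverage.items():
    status = ", ".join(tests) if tests else "NOT COVERED"
    print(f"{req}: {status}")
# UR-003 prints as NOT COVERED, signalling a gap to close before release.
```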

Best Practices for Computer System Validation

Develop Clear and Precise Functional and User Requirements. One of the biggest mistakes companies make when starting an informatics project is failing to do the strategic planning necessary to ensure success. The first step in any laboratory informatics project should always be a thorough workflow and business analysis. This analysis allows the development of clear and precise functional and user requirements that are tailored to your unique operating environment with a high degree of specificity and defined at a level that can be addressed through the new software. Without clear and precise requirements, CSV cannot adequately verify that the system is functioning as intended.

Perform risk-based CSV. CSV takes significant time and IT resources to accomplish, so it is wise to follow a flexible, GAMP 5-aligned approach that uses a risk-based assessment of the system to determine the required test cases and the optimal level of testing for each. CSV efforts should concentrate on what is practical and achievable for the critical elements of the system that affect quality assurance and regulatory compliance. Benefits of this risk-based approach include reduced cost, reduced business risk, and a shorter validation effort.
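One common way to operationalize such a risk assessment is to score each system function on severity, probability, and detectability and let the combined score drive the depth of testing. The sketch below is a simplified, assumed scoring scheme loosely modeled on FMEA-style risk priority numbers; the scales and thresholds are illustrative, not a prescribed GAMP 5 calculation.

```python
# Illustrative risk scoring for deciding how much testing a system
# function needs. The 1-3 scales and thresholds are assumptions loosely
# modeled on FMEA-style risk priority numbers, not a GAMP 5 mandate.
def risk_priority(severity: int, probability: int, detectability: int) -> int:
    """Each factor scored 1 (low) to 3 (high); a higher product means higher risk."""
    return severity * probability * detectability

def testing_rigor(rpn: int) -> str:
    if rpn >= 18:
        return "full challenge testing with documented evidence"
    if rpn >= 8:
        return "targeted functional testing"
    return "vendor documentation / basic verification"

# Example: a calculation that feeds batch release (high severity), configured
# in-house (medium probability of failure), and hard to detect if wrong.
rpn = risk_priority(severity=3, probability=2, detectability=3)
print(rpn, "->", testing_rigor(rpn))  # 18 -> full challenge testing ...
```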

Create a Good Validation Plan. Like any technical endeavor, CSV should be guided by a good plan that is created before the project starts. This plan should define the objectives of the validation and the approach for maintaining validation status over the full SDLC, and it should satisfy all applicable regulatory policies and industry best practices (e.g., GAMP 5). The validation plan should be created by people who have good knowledge of the technology involved (i.e., informatics systems, instruments, devices, etc.) and should serve to minimize the impact of the project on day-to-day lab processes.

The validation plan should detail the following:

  • Project Scope – outlines the parts of the system that will be validated, along with the deliverables/documentation for the project. Validation activities are only applied to aspects of the system that will be utilized by the company.
  • Testing Approach – defines the types of data that will be used for testing, along with the kinds of scenarios that will be tested.
  • Testing Team and Responsibilities – lists the members of the validation team, along with their roles and responsibilities in the validation process.
  • Acceptance Criteria – defines the requirements that need to be satisfied before the system is considered suitable for use in regulated activities.

Create a Good Team. The project team should have CSV experience and knowledge of regulatory guidelines/compliance, validation procedures, laboratory processes, and the technology (e.g., informatics software, laboratory devices and instruments, etc.) being validated. It is important that the team is big enough so that members are not stretched too thin during the project. Outsourcing to a third party to augment the validation team with subject matter expertise may be appropriate in some instances.

Avoid Ambiguous Test Scripts. This point relates to the importance of developing clear and precise functional and user requirements for the system in question, as described above. Precise requirements lead to precise validation testing that confirms the system fulfills its intended use. Additionally, vendor test scripts typically only validate the base system requirements and will not be sufficient on their own to ensure regulatory compliance.

Create Good Documentation. CSV processes and results need to be clearly documented over the full SDLC, to the extent that the documents are sufficient to pass an audit by regulatory agencies. Having project team members with a good understanding of regulatory guidelines is an important part of creating the necessary documentation.

Audit Third-Party Providers. In addition to performing CSV on internal systems, an FDA-regulated company needs to be prepared to audit third-party service providers (e.g., CROs), along with vendors of critical applications and cloud-based services (SaaS). The manufacturer of an FDA-regulated product is ultimately responsible for the integrity of the data that supports the product's efficacy and safety, so if third-party vendors or service providers are used, the manufacturer needs to take appropriate steps to ensure that they operate under standards that would hold up to an FDA inspection.

A risk-based assessment should be conducted to determine whether an audit is necessary. At a minimum, formal agreements that clearly detail responsibilities must exist between the manufacturer and any third parties used to provide, install, configure, integrate, validate, maintain, or modify a computerized system.

Conclusion

Effective, risk-based validation of computerized systems is an important part of maintaining regulatory compliance and product quality in modern laboratories. Inefficient or ineffective CSV processes prevent projects from being delivered on time and within budget and can also result in regulatory action. With the FDA, for example, regulatory action due to failure to perform adequate CSV can be legally and financially devastating to an organization.

If you have additional questions about computer system validation, or would like an initial, no-obligation consultation with an Astrix informatics expert to discuss your validation project, please feel free to contact us.

The post Best Practices for Computer System Validation appeared first on Astrix.
