FDA Data Integrity Archives - Astrix
https://astrixinc.com/tag/fda-data-integrity/
Expert Services and Staffing for Science-Based Businesses (last updated Wed, 09 Aug 2023)

What the FDA Guidance on Data Integrity Means for Your Lab
https://astrixinc.com/blog/fda-guidance-for-data-integrity/ (published Wed, 11 May 2022)

The post What the FDA Guidance on Data Integrity Means for Your Lab appeared first on Astrix.

Regulatory agencies worldwide are increasingly focusing their efforts on data integrity in GxP laboratories. This increased focus led to a number of guidance documents on data integrity being published in 2016.

In its “Data Integrity and Compliance with cGMP” guidance document, the FDA cites a “troubling” trend of violations involving data integrity “increasingly” being observed in its cGMP inspections. A recent analysis by PricewaterhouseCoopers’ Health Research Institute (HRI) shows that a growing number of pharmaceutical companies have been warned by the FDA for data integrity violations since 2010: from 2010 to 2012, the FDA warned five companies for such violations; between 2013 and 2015, the number was 24. The issue has been particularly troubling for offshore API manufacturers. According to a recent article published in BioProcess International, since May 2013 the FDA has issued warning letters to more than 10 large Indian pharmaceutical companies (including Sun, USV Ltd., Wockhardt Ltd., and RPG Life Sciences Ltd.) over problems with data integrity. These issues typically revolve around not reporting failed results, conducting unofficial analyses, deleting electronic data, disabling audit trails in electronic data capture systems, fabricating training records, reanalyzing failed samples until passing results are obtained, back-dating data, and not reporting stability failures.

Given that the FDA is clearly concerned about data integrity and intends to enforce cGMP rules related to this topic, how can you be sure that your operations are aligned with the FDA’s current thinking on data integrity?

Data Integrity and Why It’s Important

Data integrity is the maintenance and assurance of the accuracy and consistency of data over its entire lifecycle. With regard to pharmaceutical manufacturing, the FDA expects that all data submitted to the agency in an effort to gain drug approval is complete, consistent and accurate. Data integrity is a fundamental principle in pharmaceutical manufacturing that enables traceability of a batch back to its origin and thereby ensures that drugs are made and tested according to required quality standards.

From the perspective of regulatory agencies, data integrity violations arise from poor systematic control of data management systems. A lack of proper controls can result in data errors due to human error or a lack of knowledge, or in intentionally hidden, falsified or misleading data. The three most common data integrity issues cited by the FDA in inspections from 2013 to 2015 were the lack of controls to prevent alteration of data by staff, failure to maintain records of accurate data, and delayed reporting of data. Data integrity violations can result in financial loss for pharmaceutical companies due to facility shutdown, product recalls, import bans, and delayed or denied drug approvals. Additionally, FDA warning letters divert worker attention away from their daily activities towards corrective and preventive actions, which can cost significant time and money. These violations can also tarnish the company’s reputation and provide competitors with an opportunity to increase their market share.

Overview of the FDA Guidance

The FDA’s “Data Integrity and Compliance with cGMP” guidance document is organized in question and answer format, and is specifically focused on the interpretation of aspects of the regulations for cGMP (21 CFR 211) that pertain to data integrity issues in a pharmaceutical manufacturing environment. The document makes it clear that companies need to institute adequate controls and oversight to ensure data integrity. The FDA expects firms to “implement meaningful and effective strategies to manage their data integrity risks based upon their process understanding and knowledge management of technologies and business models.” Companies that do not have adequate data integrity controls and oversight in place are considered to be in violation of GMP rules, even if the FDA has not found any instances of actual data deletion or manipulation. In other words, the FDA is applying a “guilty until proven innocent” approach to data integrity.

The main purpose of the guidance seems to be to provide clear and concise answers to common issues in an easy-to-follow Q&A format. It starts with key definitions and then addresses critical issues such as the inclusion or exclusion of data from decision-making processes, access control, validation of data, good documentation practices, audit trails, electronic and paper raw data, training, ways to handle data falsification, and the scope of an FDA audit of data. Each issue is explained with specific examples and clear instructions on the FDA’s expectations.

So, what are some of the key aspects of the FDA’s guidance document that impact regulated cGMP laboratories?

  • ALCOA: Companies should establish data integrity controls to ensure that data are Attributable, Legible, Contemporaneously recorded, Original or a true copy, and Accurate. The “contemporaneously recorded” stipulation means that data must be documented and saved to storage at the time it is created. It is not acceptable, for example, for an analyst to record data on a piece of paper and later transcribe this information to a controlled laboratory notebook.
  • Metadata and Audit Trails: Metadata is information that describes, explains, or otherwise makes it easier to retrieve, use, or manage data. Metadata could be a date/time stamp describing when the data was acquired, a user ID of the person who conducted the test that generated the data, the instrument ID used to acquire the data, audit trails, etc. The FDA expects that data should be maintained throughout the record’s retention period with all associated metadata required to reconstruct the cGMP activity.
  • Audit Trail Review: The FDA recommends that “audit trails that capture changes to critical data be reviewed with each record and before final approval of the record.”
  • Computer Workflow Validation: The FDA recommends that you not only validate computer systems, but also validate them for their intended use or workflow. The FDA recommends that you implement controls to manage risks associated with each aspect of the computerized workflow – software, hardware, personnel, and documentation. The clear implication is that computer system validation should not be isolated within the IT department, but should instead be connected with the company quality unit.
  • Access to Computerized Systems: The FDA recommends that companies maintain computer system access controls in order to assure that changes to records can only be made by authorized personnel. Amongst other things, this means that each person accessing the computerized system must be uniquely identified and their actions within the system tracked and audit trailed. Additionally, the FDA recommends that companies “restrict the ability to alter specifications, process parameters, or manufacturing or testing methods by technical means where possible.”
  • Control of Blank Forms: As uncontrolled blank forms present an opportunity for data falsification and/or testing into compliance, the FDA recommends that all blank forms be uniquely numbered and tracked. Electronic workflows allow this process to be automated – a clear advantage over paper-based systems.
  • GMP Records: The FDA states that, “When generated to satisfy a GMP requirement, all data become a GMP record.” This means that the original record containing the data must be stored securely throughout the record retention period. Additionally, “The FDA expects processes to be designed so that quality data that is required to be created and maintained cannot be modified.”
  • Dynamic vs. Static Records: The FDA explains that, “For the purposes of this guidance, static is used to indicate a fixed-data document, such as a paper record or an electronic image, and dynamic means that the record format allows interaction between the user and the record content.” Electronic records from certain types of laboratory instruments are dynamic records, in that they can be modified by an analyst. Original copies of records, whether static or dynamic, must be securely maintained throughout the record retention period.
  • Electronic Signatures: When appropriate controls are in place, electronic signatures can be used in place of handwritten signatures in any cGMP required record. Companies using electronic signatures should document the controls used to ensure that they are able to identify the specific person who signed the records electronically and securely link the signature with the associated record.
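To make a few of these expectations concrete, here is a minimal Python sketch (the field names and values are hypothetical, not from the guidance) of a record carrying ALCOA-style metadata, with an electronic signature linked to the record content by a hash so that any later modification of the record is detectable:

```python
import hashlib
import json
from datetime import datetime, timezone

def sign_record(record: dict, signer_id: str) -> dict:
    """Attach an electronic signature cryptographically linked to the record."""
    payload = json.dumps(record, sort_keys=True).encode()
    return {
        "signer": signer_id,
        "signed_at": datetime.now(timezone.utc).isoformat(),
        # Hashing the content links the signature to this exact record.
        "record_digest": hashlib.sha256(payload).hexdigest(),
    }

# A contemporaneously recorded result with ALCOA-style metadata.
record = {
    "result": 99.7,                          # Accurate: the value as acquired
    "analyst": "jdoe",                       # Attributable: unique user ID
    "instrument_id": "HPLC-07",              # traceable to the instrument
    "acquired_at": "2016-04-14T09:32:05Z",   # Contemporaneous timestamp
}
signature = sign_record(record, signer_id="qa.reviewer")

# Any later modification of the record breaks the signature linkage.
tampered = dict(record, result=100.0)
payload = json.dumps(tampered, sort_keys=True).encode()
assert hashlib.sha256(payload).hexdigest() != signature["record_digest"]
```

A production system would use a managed identity store and a proper digital-signature scheme rather than a bare hash, but the principle of securely binding the signature to the exact record content is the same.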

Conclusion

While a recent increase in data integrity violations has prompted the FDA’s increased focus in this area, the majority of non-compliance citations in routine inspections result from poor systems rather than intentional fraudulent activity. Many companies simply fail to employ robust systems with built-in features that inhibit data integrity failures. When developing data integrity practices for a cGMP environment, many companies make the mistake of relying on the saying “If it’s not documented, it didn’t happen” in order to guide their work. To reflect the FDA’s current thinking on the matter, however, this statement should be adjusted to read: “If it’s not documented completely, consistently, and accurately, it didn’t happen.”

The implications of the FDA’s recently released guidance document on data integrity are clear – the FDA takes both good documentation practice and data integrity seriously, and intends to enforce cGMP regulations with a “guilty until proven innocent” approach. It is therefore critical for companies to implement robust systems with effective data integrity controls and oversight in order to avoid unpleasant financial consequences from enforcement actions.


10 Ways to Ensure Data Integrity
https://astrixinc.com/blog/data-integrity/10-ways-to-ensure-data-integrity/ (published Fri, 31 Dec 2021)

The post 10 Ways to Ensure Data Integrity appeared first on Astrix.

Data Integrity can be expressed as the consistency and trustworthiness of data over the course of its lifecycle. The term can refer either to the state of your data or to the process of guaranteeing its quality and validity. The following ten practices help ensure data integrity across the organization.

1 – Standard Data Definitions

Organizations need a single definition of all data types across the organization. Data being reported across the organization should be standardized in format, applicability and validity to enable a clear and concise understanding.

2 – Cleaning and Monitoring Data

Bad data has a significant impact on data quality. Organizations should readily adopt a data cleansing approach to find, remove, and correct errors as much as possible. They should put into place an ongoing approach to maintaining the system’s health so that data integrity can be improved.

Verifying data against accepted statistical measurements is part of data monitoring and cleansing. Validating data against defined descriptions and finding linkages within the data are also part of the process. This approach also verifies the data’s uniqueness and assesses its reusability.
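As a small illustration of validating data against defined descriptions and verifying uniqueness, here is a Python sketch (the rule set and field names are hypothetical):

```python
# Hypothetical rules: each field is validated against a defined description.
RULES = {
    "sample_id": {"type": str, "unique": True},
    "purity_pct": {"type": float, "min": 0.0, "max": 100.0},
}

def validate(records):
    """Return (record index, field, problem) tuples for every rule violation."""
    errors, seen = [], set()
    for i, rec in enumerate(records):
        for field, rule in RULES.items():
            value = rec.get(field)
            if not isinstance(value, rule["type"]):
                errors.append((i, field, "wrong type or missing"))
                continue
            if "min" in rule and not (rule["min"] <= value <= rule["max"]):
                errors.append((i, field, "out of range"))
            if rule.get("unique"):
                if value in seen:
                    errors.append((i, field, "duplicate"))
                seen.add(value)
    return errors

records = [
    {"sample_id": "S-001", "purity_pct": 98.2},
    {"sample_id": "S-001", "purity_pct": 101.5},  # duplicate ID, out of range
]
print(validate(records))
# [(1, 'sample_id', 'duplicate'), (1, 'purity_pct', 'out of range')]
```

Real cleansing pipelines add statistical checks (outlier detection, distribution tests) on top of such rule-based validation, but the pattern of comparing each record to its defined description is the core of the approach.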

3 – Perform Risk-Based Validation

Ensure that data quality and reliability are addressed in procedures related to system operations and maintenance.

The ISPE’s GAMP 5 (Good Automated Manufacturing Practice) categorizations should be used to estimate the validation complexity of your system and to decide the appropriate level of risk, and hence of validation, to be applied to the system.

During validation, keep track of all electronic data storage places, including printouts and PDF reports. Also, ensure that the frequency, roles, and responsibilities for system validation are defined in the quality management system. The approach taken to analyze important metadata, including audit trails and other details, should be outlined in the validation master plan. After initial validation is completed, periodic re-evaluations should be scheduled, again based on a risk-based approach.
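As an informal illustration, the risk-based idea can be thought of as a lookup from GAMP 5 software category to validation rigor. The category names below follow GAMP 5, but the approach descriptions are our own shorthand, not GAMP 5 text:

```python
# Illustrative only: GAMP 5 software categories mapped to a validation approach.
GAMP5_CATEGORIES = {
    1: ("Infrastructure software", "verify installation and configuration"),
    3: ("Non-configured product", "supplier assessment plus requirements-based testing"),
    4: ("Configured product", "additionally verify each configured workflow"),
    5: ("Custom application", "full lifecycle validation of design, code, and testing"),
}

def validation_approach(category: int) -> str:
    """Summarize the validation rigor implied by a GAMP 5 category."""
    name, approach = GAMP5_CATEGORIES[category]
    return f"Category {category} ({name}): {approach}"

print(validation_approach(5))
```

The point of the categorization is that a bespoke LIMS extension (category 5) warrants far deeper validation than an operating system (category 1), so effort is spent where risk is highest.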

4 – Validate Input Source of Data

Validate the input source of the data. This is especially crucial if data comes from an end user or unknown source, another application, or a third party. To guarantee that the data inputs are correct, processes should be put in place to verify and validate them.

Periodic validation of data is critical to ensure that data processes are not corrupted.

5 – Audit the Audit Trails

An audit trail needs to be a permanent record of all data in a system that includes all changes that have been made to the database or a file. The GxP-relevant data needs to be defined and changes to that data should be tracked in the audit trail. To be relevant in GxP compliance, an audit trail must be able to answer the following questions: Who? What? When? And why?

To test the audit trail functionality, assign responsibilities and schedules. The scope of an audit trail examination should be determined by the system’s complexity and intended use.

It is important to recognize that audit trails are made up of discrete event logs, history files, database queries, reports, or other mechanisms that show system events, electronic records, or raw data contained inside the record. This represents data spread across multiple sources and locations, and hence a good audit trail testing strategy should take all of these sources into account.
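A minimal Python sketch of the idea: an append-only audit trail that captures who, what, when, and why for each change (a real system would persist entries to protected storage and restrict access):

```python
from datetime import datetime, timezone

class AuditTrail:
    """Append-only log capturing who, what, when, and why for each change."""

    def __init__(self):
        self._entries = []  # history is only ever appended to, never edited

    def log(self, who: str, what: str, why: str) -> None:
        self._entries.append({
            "who": who,
            "what": what,
            "when": datetime.now(timezone.utc).isoformat(),
            "why": why,
        })

    def entries(self):
        # Return copies so callers cannot alter the recorded history.
        return [dict(e) for e in self._entries]

trail = AuditTrail()
trail.log("jdoe", "changed purity_pct 98.2 -> 98.4", "transcription error corrected")
print(len(trail.entries()))
```

Auditing the audit trail then means verifying that every GxP-relevant change actually produced such an entry, across all of the logs and history files the system writes.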

6 – Change Control

Ensure that system software enhancements, especially when incorporating new features, comply with evolving regulations. Also, keep up with changes by collaborating with providers and updating systems as needed. Choose systems that are simple to upgrade when new hardware or other system inputs are added.

7 – Encrypting The Data

Encryption can also be a useful tool for maintaining data integrity across the organization. It assures that even if someone has access to data, they will be unable to view it unless they have the decryption key.

It’s useful in situations when attackers can quickly obtain files stored in a database by hijacking the server or acquiring files via database hacking.
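Python’s standard library does not include symmetric encryption (production systems typically use a vetted library such as the `cryptography` package), but the closely related tamper-evidence idea can be sketched with a keyed HMAC: anyone without the key can read the bytes, yet cannot forge a valid integrity tag after altering them. The key and record below are hypothetical:

```python
import hashlib
import hmac

SECRET_KEY = b"example-key-never-hardcode-in-production"  # hypothetical key

def seal(data: bytes) -> str:
    """Return a keyed digest; without the key an attacker cannot forge it."""
    return hmac.new(SECRET_KEY, data, hashlib.sha256).hexdigest()

def verify(data: bytes, tag: str) -> bool:
    """Constant-time check that the data still matches its integrity tag."""
    return hmac.compare_digest(seal(data), tag)

record = b"batch=B123;assay=99.7"
tag = seal(record)
print(verify(record, tag))                      # True
print(verify(b"batch=B123;assay=100.0", tag))   # False: data was altered
```

Encrypting the data adds confidentiality on top of this; combined, an attacker who steals database files can neither read the records nor alter them undetected.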

8 – Perform Penetration Testing and Audits

Penetration testing, or having an ethical hacker try to break into the company’s database and uncover weaknesses, is an effective way to ensure data integrity. It will assist in uncovering areas of weakness and allow organizations to resolve potential vulnerabilities quickly.

Organizations should also ensure security assessment includes the following elements:

  • Information crucial to the business
  • Techniques hackers might leverage to gain access to critical data
  • Measures to detect and prevent future hacking attempts
  • Employee cybersecurity best practices

9 – Develop Process Maps for Critical Data

The organization should also have control over how and where the data is being used, along with who is using it, in order to maintain data integrity.

Creating process maps for vital data is critical. This provides the organization with much better control over how the data is used. This will also support the business putting in place suitable security and regulatory compliance procedures.

10 – Promote a Culture of Integrity

Cultivating a culture of transparency, honesty, and integrity is also important to ensuring Data Integrity.

All those who touch the organization’s data need to take responsibility and ownership for that data. Everyone in the company needs to report situations where other members break the rules or fail to meet their obligations regarding the information.

These small measures will ensure that your entire organization stays on track and that your data integrity is preserved.

Summary

Ensuring data integrity requires a focus on specific areas that help safeguard the organization’s data. It involves directing the company’s efforts towards standard data definitions, meaning a single definition of each data type across the organization. It also requires cleaning and monitoring the data to find, remove, and correct errors.

Ensuring data integrity also calls for risk-based validation, validation of the input sources of data, and auditing of the audit trails. Additionally, the organization needs to secure its data by encrypting it and by having ethical hackers perform penetration testing. The organization also needs to create process maps for critical data; this provides much better control over how the data is used and supports the business in putting suitable security and regulatory compliance procedures in place. Last but not least, the organization should cultivate a culture of data integrity by ensuring that all team members who touch or use the data take ownership of it.

All of these steps help guarantee data integrity and will bring the organization greater success in that area.

Why It Matters to You

Data integrity is critical to a life science organization’s business. Data quality is imperative to ensure product quality and patient safety. In this blog, we discuss:

  • The key areas to focus on in order to ensure Data Integrity.
  • What is involved in ensuring Data Integrity.
  • Why these areas are important in ensuring Data Integrity.
  • How to incorporate these areas into the business to ensure Data Integrity.

About Astrix

For over 25 years, Astrix has been a market leader in dedicated digital transformation & staffing services for science-based businesses. Through our proven laboratory informatics, digital quality & compliance, and scientific staffing services we deliver the highly specialized people, processes, and technology to fundamentally transform how science-based businesses operate. Astrix was founded by scientists to solve the unique challenges which science-based businesses face in the laboratory and beyond. We’re dedicated to helping our clients speed & improve scientific outcomes to help people everywhere.


Trends in FDA Data Integrity 483s and Warning Letters for Pharmaceutical Companies
https://astrixinc.com/blog/lab-compliance/trends-in-fda-data-integrity-483s-and-warning-letters-for-pharmaceutical-companies/ (published Sun, 16 Feb 2020)

The post Trends in FDA Data Integrity 483s and Warning Letters for Pharmaceutical Companies appeared first on Astrix.

The manufacture of pharmaceutical drugs is a highly complex process that involves advanced scientific analysis and instrumentation at all stages of production and storage. In order to guarantee the safety and efficacy of both human and veterinary drugs, the FDA strives to verify data integrity in all cGMP records used to document and guide the manufacturing process.

The FDA first identified failures in data governance and data integrity in the year 2000 with a warning letter issued to Schein Pharmaceuticals that cited a lack of proper controls over computerized laboratory systems. In the years since, the FDA’s focus on compliance with data integrity regulations in facility inspections has increased significantly. We have written extensively about how the FDA defines data integrity and about the current FDA regulations and guidance in this area.

Enforcement actions by the FDA due to failure to comply with data integrity regulations can result in serious financial consequences for an organization due to facility shutdown, product recalls, import and/or distribution bans, delayed or denied drug approvals, substantial remediation costs, and loss of customers due to a damaged reputation. In addition, manufacturers who are found in violation of data integrity regulations may lose the trust of the FDA and face more frequent and in-depth inspections in the future.

As such, it is critical for organizations to take appropriate steps to ensure compliance with applicable data integrity regulations governing the production of pharmaceutical drugs. In this blog, we will discuss recent trends in FDA warning letters and 483s in the pharmaceutical industry with regards to data integrity, and also highlight the importance and benefits of incorporating independent data integrity assessments into your organization’s quality management system (QMS) as part of cGMP audit programs.

FDA Form 483s and Warning Letters

The FDA conducts all inspections of regulated products, research sites and manufacturers through its Office of Regulatory Affairs (ORA). When the FDA inspects a pharmaceutical company’s facility, they can either show up unannounced or alert the company ahead of time. Once the inspection is complete, the inspectors can communicate any observed conditions and/or practices that may be in violation of FDA requirements through a Form 483 or a warning letter. While both of these documents serve to inform sponsors and principal investigators of issues that require corrective action, there are important differences in terms of the seriousness of the infraction(s) being highlighted:

FDA Form 483

An FDA Form 483 is essentially a list of identified regulatory deficiencies that an ORA inspector provides to company management at the end of an inspection. As the FDA expects these deficiencies to be remediated, it is important to respond thoughtfully to the FDA within 15 days, detailing your plan for corrective actions, in order to avoid a warning letter. A Form 483 response will usually require input from many different parts of your organization, so it should be all hands on deck upon receiving the 483 to facilitate a good response detailing a comprehensive plan of action within that window.

FDA Warning Letter

An FDA warning letter is issued by ORA inspectors for more serious compliance deficiencies, often involving previous Form 483s that have not been effectively remediated. Warning letters should be taken very seriously and answered in a timely fashion within the required time frame. A comprehensive remediation plan needs to be developed, implemented and adhered to, along with consistent communication with the FDA during the process to avoid further enforcement actions.

Trends in Pharmaceutical Company Form 483s and Warning Letters Citing Data Integrity Violations

The Big Data and AI analytics firm Govzilla found that, regardless of company size, roughly 50% of all global drug 483s issued over the five-year period from 2014 to 2018 cite data integrity concerns. Data integrity violations are even more prevalent in warning letters, with 79% of global drug warning letters during this period citing data integrity issues. Additionally, the total number of FDA warning letters referencing data integrity deficiencies has increased significantly in recent years.

While 21 CFR Part 11 is known as the data integrity rule, deficiencies in Part 11 are rarely cited in 483s or warning letters – almost all deficiencies cited are failures to comply with cGMP predicate rules (specifically, 21 CFR Parts 210, 211 and 212). Two predicate rules that are frequently cited are Parts 211.68 and 211.194.

Part 211.68 specifies requirements for “Automatic, Mechanical and Electronic Equipment.” Common citations in this area include:

  • The company did not implement effective computer system controls to ensure only authorized individuals had access to the systems.
  • Access was not consistent with appropriate roles and responsibilities. For example, laboratory analysts could delete or modify data, change configuration settings (e.g., disable audit trails), and adjust date and time stamps for electronic data to falsify when the data was initially acquired.
  • Data was not backed up appropriately to allow data reconstruction activities in the future.
  • Audit trail data did not match data in printed chromatograms.

Part 211.194 is cited when companies do not review and include all relevant data when making lot release decisions. Common citations in this area include:

  • Companies fail to review critical data and/or metadata that would allow them to identify out of specification (OOS) events that require investigation in lot release decisions.
  • The company falsifies test results, destroys data, or does not have the data necessary to support a test result.
  • Laboratory analysts reprocess or manipulate data, or delete OOS data, so the result will meet acceptance criteria.

Other recently cited deficiencies include:

  • Utilizing “pre-injections” of product samples outside of full sample sets to determine if results pass acceptance criteria, and then deleting/ignoring results if they fail.
  • Disabling audit trails intermittently to obscure results
  • Deletions or modifications of results
  • Use of integration suppression settings to minimize data that would likely cause an OOS result
  • Aborting test runs with no justification

One common theme in recent warning letters to pharmaceutical companies has been the clear and consistent encouragement by the FDA to employ “independent” data integrity assessments as part of the strategy for remediating identified issues. A few examples include:

  • On August 10th, 2018, a warning letter was issued to the manufacturing facility of Kyowa Hakko Bio Co., Ltd. in Japan that stated: “We recommend that a qualified third party with specific expertise in the area where potential breaches were identified should evaluate all data integrity lapses.”
  • On March 26th, 2019, a warning letter was issued to the manufacturing facility of Winder Laboratories, LLC in Georgia that stated: “Your quality system does not adequately ensure the accuracy and integrity of data to support the safety, effectiveness, and quality of the drugs you manufacture…. We strongly recommend that you retain a qualified consultant to assist in your remediation.”
  • On June 13th 2019, a warning letter was issued to the manufacturing facility of Akorn Inc. in Illinois that stated: “Your quality system does not adequately ensure the accuracy and integrity of data to support the safety, effectiveness, and quality of the drugs you manufacture…. We acknowledge that you are using a consultant to audit your operation and assist in meeting FDA requirements.”

Conclusion

Data integrity is a growing focus of the FDA, and it is therefore critical for organizations to ensure compliance with applicable data integrity regulations governing the production of pharmaceutical drugs. While this blog specifically highlights FDA trends in Form 483s and warning letters for pharmaceutical drug manufacturers, the recommendations communicated are also applicable to biotech companies, clinical research and CROs, medical device manufacturers, R&D laboratories, etc.

In order to ensure compliance, working with an external consultant that has expertise in data integrity evaluations to audit your laboratory environment is best practice, as an expert with fresh eyes will be able to effectively locate data integrity issues you may have missed. A data integrity assessment performed by a qualified team of regulatory experts provides a number of important benefits for your organization:

  • Strengthens Your Organization’s Data Integrity Focus – Helps reinforce the fact that your organization is committed to data integrity compliance for all company employees.
  • Provides Peace of Mind – You know that your data integrity issues have been identified and are being addressed.
  • Reduces Costs – The cost of remediating data integrity issues identified by the FDA is generally much more significant than when the issues are proactively identified and corrected internally.
  • Keeps You Focused on Your Core Business – Proactively identifying and correcting data integrity issues allows your organization to spend less time on compliance issues so you can stay focused on your core business.

Astrix Technology Group has an experienced team of expert informatics consultants that bring together technical, strategic, regulatory and content knowledge to provide the most effective solutions to problems faced by scientific organizations. Astrix provides experienced professionals knowledgeable about FDA regulations to conduct a thorough assessment of your laboratory informatics environment to identify data integrity risks. If you have any questions about Astrix Technology Group service offerings, or if you would like have an initial consultation with someone to explore how to reduce your compliance risk around data integrity, don’t hesitate to contact us.


Enhancing Data Integrity in Clinical Research with Blockchain Technology
https://astrixinc.com/blog/lab-compliance/enhancing-data-integrity-in-clinical-research-with-blockchain-technology/ (published Thu, 08 Aug 2019)

The post Enhancing Data Integrity in Clinical Research with Blockchain Technology appeared first on Astrix.

Verification of data integrity is a critical part of the FDA’s mission to ensure the safety, efficacy and quality of human and veterinary drugs, biological products, and medical devices. As these products must undergo extensive testing before being approved for the public, gathering high-quality, reliable and statistically sound data is an important goal for all clinical research. There are, however, converging trends challenging the ability of legacy data management systems to establish and sustain data integrity throughout all phases of clinical studies:

  • Increasing complexity of clinical trials and the supporting bioanalytical studies
  • Enormous amount and variety of data generated
  • Clinical trial networks comprising many parties and sites
  • Increasingly sophisticated devices for gathering clinical data

Every once in a while, a technology comes along that has the potential to act as a significant disruptive force. Blockchain technology may very well be the next big disruptive technology and is likely to find applications in nearly every major industry to reduce cost, increase efficiency and, more significantly, improve trust. Trust is foundational to all businesses, and blockchain enables companies to seamlessly establish both trust and transparency in data-related business processes in a non-centralized, and therefore scalable and cost-effective, way.

Blockchain technology was invented in 2008 with the creation of bitcoin by the pseudonymous Satoshi Nakamoto. It’s important to understand that bitcoin is merely an application of blockchain: bitcoin is a financial asset that uses blockchain technology as its ledger. Given the necessity of trust and transparency in financial transactions, finance was a natural first use case for blockchain technologies. The ability of blockchain to facilitate trust in a decentralized manner at relatively low cost has drawn the attention of researchers in the pharmaceutical industry, and efforts to apply the technology to data integrity problems within clinical trials are increasing.

According to Statista, global blockchain technology revenues will experience massive growth in the coming years, with the market expected to climb to over 23.3 billion U.S. dollars in size by 2023. In this blog, we will discuss ways in which blockchain technology is being applied to enhance data integrity in clinical research.

What is Blockchain Technology?

Ledgers are the foundation of accounting: a secure record of transactions is essential to the role of money in our society. Blockchain technology utilizes a distributed peer-to-peer computer network to create a digital ledger (a form of database) using cryptographic techniques to store transaction records in a verifiable and unalterable way. Each server or node in the network can independently verify each data entry, and modification of any data in the chain requires alteration of all subsequent entries in the chain plus the consensus of the network. In this manner, with sufficient time and distribution of transactions across the network, the entire chain of entries is considered secure by design and fault tolerant due to the distribution of the data.
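The hash-chaining mechanism described above can be sketched in a few lines of Python. This is a simplified illustration only, with no networking, consensus, or signatures; the record contents are invented for the example:

```python
import hashlib
import json
import time

def block_hash(block):
    """Hash the block's contents, which include the previous block's hash."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def append_block(chain, data):
    """Append a new entry whose hash commits to the entire prior chain."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    block = {"data": data, "prev_hash": prev_hash, "timestamp": time.time()}
    block["hash"] = block_hash({k: v for k, v in block.items() if k != "hash"})
    chain.append(block)
    return block

chain = []
append_block(chain, {"sample": "S-001", "result": "pass"})
append_block(chain, {"sample": "S-002", "result": "fail"})

# Tampering with an earlier entry invalidates its stored hash (and thus
# every later link, since each prev_hash commits to it):
chain[0]["data"]["result"] = "pass (edited)"
recomputed = block_hash({k: v for k, v in chain[0].items() if k != "hash"})
print(recomputed == chain[0]["hash"])  # False: the stored hash no longer matches
```

Because each entry's hash covers the previous entry's hash, an attacker would have to recompute every downstream block and then win network consensus, which is what makes large-scale, well-distributed chains tamper-evident in practice.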

When applied effectively, blockchain networks inherently resist modification of the data stored in the transaction record. Transaction data stored in a blockchain is highly resistant to theft and tampering, as it is not kept in a central repository but instead distributed across dozens or even thousands of geographically dispersed network nodes. The distributed nature of the network, the use of timestamped records, and cumulative cryptographic verification all help assure that the stored data remains intact and immutable. Additionally, the application of public key cryptography to the transactional data makes the entries attributable and non-repudiable. Interestingly, these map closely onto the key characteristics of data integrity as defined by the FDA: attributable, legible, contemporaneous, original, and accurate (ALCOA).
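The attributability and non-repudiation properties can be illustrated with a standard-library-only sketch. Note the hedge: real blockchains use asymmetric signatures (e.g., ECDSA or Ed25519), which the Python standard library does not provide, so this sketch substitutes per-user keyed hashes (HMAC) as a stand-in; the user names and keys below are invented:

```python
import hmac
import hashlib

# Hypothetical per-user secrets. A real system would use asymmetric key pairs
# so that verifiers never hold the signing key; HMAC is a simplified stand-in.
USER_KEYS = {"analyst_jsmith": b"secret-key-1", "qa_mlee": b"secret-key-2"}

def sign_entry(user, record: bytes):
    """Attach an attributable tag: only the holder of the user's key can produce it."""
    tag = hmac.new(USER_KEYS[user], record, hashlib.sha256).hexdigest()
    return {"user": user, "record": record, "tag": tag}

def verify_entry(entry):
    """Recompute the tag for the claimed user and compare in constant time."""
    expected = hmac.new(USER_KEYS[entry["user"]], entry["record"],
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, entry["tag"])

entry = sign_entry("analyst_jsmith", b"HPLC run 42: assay 99.1%")
print(verify_entry(entry))  # True: the entry is attributable to analyst_jsmith

entry["record"] = b"HPLC run 42: assay 101.5%"  # any edit breaks attribution
print(verify_entry(entry))  # False
```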

Furthermore, the ability of a blockchain to serve as an immutable audit trail makes the technology well suited for any company focused on data integrity and regulated or audited by third parties like the FDA. In addition, blockchain network participants can store, exchange and view information in the database without needing to establish preexisting trust between the parties (for example, via a negotiated legal contract); trust is in effect hardcoded into the blockchain protocol.

It is important to note that not all problems require, or would be most effectively solved by, a blockchain solution. The creation and maintenance of the ledger is currently a costly exercise. There have been efforts to address this cost: through implementations designed for more general problems (for example Ethereum, a blockchain application platform based on a distributed virtual machine), and through "blockchain as a service" offerings, with Microsoft and Amazon Web Services recently adding blockchain networks to their cloud platforms.

Blockchain solutions are ideal for data records that are meant to be shared between partners in a network where transparency and collaboration are important. Specifically, blockchain could be considered appropriate when:

  • multiple parties generate transactions that represent shared data;
  • the parties need to be able to independently verify that the transactions and the current state of the shared data are valid;
  • there are no trusted third-party services able to manage the shared data efficiently (e.g. an escrow);
  • enhanced security is needed to ensure integrity of the system.

Private vs. Public Blockchains

Blockchain databases can be categorized into two main types: public and private. Bitcoin is an example of a public blockchain, with thousands of computer nodes distributed worldwide doing the work required to verify the network. These nodes are run by participants who perform the computational work of verifying transactions (known as "mining") and who receive rewards paid in the cryptocurrency created by the network. Attributes of a public blockchain include open access, a high degree of decentralization, a large distribution of work that can result in fewer verified transactions per unit time, and transaction verification and overall security via Proof of Stake or Proof of Work (consensus mechanisms that secure and stabilize the network).

Private blockchains, on the other hand, are more common in industry applications of blockchain technology. These types of blockchains have a smaller, less widely distributed network (nodes typically run by stakeholders), restricted access to the network, higher potential throughput and scalability, and transaction verification by authorized network participants.

Hybrid blockchains are a more recent approach: a public blockchain with the ability to designate some data as public and fully open while other data remains private and accessible only to authenticated entities. Some recent private blockchain implementations also leverage public blockchains for the cryptographic payload only; this approach has an interesting advantage when considering data storage compliance with geographic regulations such as the European Union's General Data Protection Regulation (GDPR).
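The "cryptographic payload only" pattern can be sketched as follows: personal data lives in an erasable off-chain store, while only an opaque digest is anchored on the public ledger. This is a simplified in-memory illustration, and the record IDs and data are invented:

```python
import hashlib

off_chain_store = {}   # mutable store holding personal data (erasable under GDPR)
public_ledger = []     # append-only ledger holding only opaque digests

def anchor(record_id, personal_data: bytes):
    """Keep the data off-chain; anchor only its SHA-256 digest on the ledger."""
    off_chain_store[record_id] = personal_data
    digest = hashlib.sha256(personal_data).hexdigest()
    public_ledger.append({"id": record_id, "digest": digest})
    return digest

def verify(record_id):
    """Integrity check: recompute the digest and compare with the anchored value."""
    data = off_chain_store.get(record_id)
    if data is None:
        return None  # data was erased; the ledger retains only a meaningless digest
    anchored = next(e["digest"] for e in public_ledger if e["id"] == record_id)
    return hashlib.sha256(data).hexdigest() == anchored

anchor("patient-007", b"name=Jane Doe; consent=2023-01-15")
print(verify("patient-007"))        # True: off-chain data matches its anchor
del off_chain_store["patient-007"]  # "right to erasure": the chain stays intact
print(verify("patient-007"))        # None: data gone, ledger unchanged
```

The design point is that erasing the off-chain data satisfies a deletion request without rewriting the ledger, since a bare SHA-256 digest reveals nothing about the erased content.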

Enhancing Data Integrity in Clinical Research with Blockchain Technology

Good-quality data from clinical trials requires good security, proper content (metadata) and an immutable audit trail. Blockchain technology is a potential component of ensuring these capabilities: it brings the strength of cryptographic validation to each individual transaction and to the record set as a whole. Any data integrity issue in a blockchain results in an immediate indication of where and when the problem happened, along with the contributor of the offending transaction.
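This pinpointing property can be shown with a self-contained sketch: walking a hash-linked ledger and reporting the index of the first entry that fails verification. The entries are invented for illustration:

```python
import hashlib
import json

def entry_hash(data, prev_hash):
    """Deterministic hash over an entry's data and its link to the prior entry."""
    return hashlib.sha256(json.dumps([data, prev_hash]).encode()).hexdigest()

def first_broken_link(chain):
    """Return the index of the first inconsistent entry, or None if intact."""
    prev = "0" * 64
    for i, entry in enumerate(chain):
        if entry["prev_hash"] != prev or entry_hash(entry["data"], prev) != entry["hash"]:
            return i
        prev = entry["hash"]
    return None

# Build a tiny three-entry ledger, then tamper with the middle record.
chain, prev = [], "0" * 64
for data in ["dose recorded", "adverse event logged", "visit completed"]:
    h = entry_hash(data, prev)
    chain.append({"data": data, "prev_hash": prev, "hash": h})
    prev = h

print(first_broken_link(chain))   # None: the chain verifies end to end
chain[1]["data"] = "adverse event deleted"
print(first_broken_link(chain))   # 1: verification fails exactly at the edit
```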

Blockchain effectively allows proof of the integrity and provenance of the data independently of the production of the data. Blockchain-based, immutable, timestamped records of clinical trial results and protocols could potentially reduce the incidence of fraud and error in clinical trial records by eliminating the potential for outcome switching, selective reporting and data snooping.

Additionally, it is very difficult for researchers in the current environment of siloed informatics systems to share and maintain the provenance of data. Blockchain-based data systems could allow for seamless data sharing between organizations, thereby reducing data integrity issues stemming from errors and incomplete data.

As an example, researchers at the University of California San Francisco (UCSF) recently developed a proof-of-concept method for using a private blockchain to make data collected in the clinical trial process immutable, traceable, and more trustworthy. The researchers used data from a completed Phase II clinical trial and tested resilience to data tampering through their proof-of-concept web portal service. They effectively demonstrated that data entry, storage, and adverse event reporting can be performed in a more robust and secure manner, can withstand attacks from the network, and are resilient to corruption of data storage. For this approach to be implemented in the real world, it's worth noting that a regulator acting as a centralized authority would likely need to instantiate the private blockchain, operate the web portal, register all parties, and keep a ledger of the blockchain's transactions.

Another example of blockchain technology in action was provided by scientist.com, a leading online marketplace for outsourced research in the life sciences, with the recent launch of their DataSmart™ platform based on a proprietary blockchain implementation. The platform is designed to ensure data integrity in research data from clinical and preclinical stages of drug development that is submitted electronically to regulators. DataSmart™ also allows pharmaceutical and biotechnology companies to demonstrate that critical supplier data has not been tampered with and remains unaltered.

To improve visibility of data quality and analysis, researchers from several different companies recently collaborated to develop TrialChain, a hybrid blockchain-based platform intended for application to biomedical research studies. This approach includes provisions to permit modifications of original data created as a routine part of downstream data analysis to be logged within the blockchain. In addition, the use of a hybrid blockchain structure in TrialChain allows for public validation of results alongside the additional authorization required for full access to data.

Finally, pharmaceutical giants Pfizer, Amgen and Sanofi have teamed up to find the most effective ways to utilize blockchain technology, from ensuring data integrity to speeding up clinical trials and ultimately lowering drug development costs.

Conclusion

With the ability to deliver immutable timestamped records, audit trails and data provenance, in a highly secure and attributable manner, blockchain technologies could indeed have a huge impact on clinical research worldwide. While blockchain obviously can’t prevent errors at the source (e.g., an incorrect recording of a test result), a blockchain-enabled clinical research system could provide a strong mechanism to ensure the integrity of the datasets themselves.

As in other industries, blockchain also holds promise for reducing costs in many aspects of the drug lifecycle – from discovery to manufacturing. From the ability to concisely and securely track data provenance as part of analytics to meeting regulatory serialization requirements in the supply chain, there are many interesting potential applications of blockchain technology.

While the use of blockchain technology in clinical research is still in its infancy, there is much interesting activity in applying blockchain technology to solve clinical data management challenges. Many pharmaceutical companies are forming outside partnerships and creating in-house initiatives to explore blockchain technologies in clinical research areas like patient recruitment, patient data sharing, data privacy, preventing data tampering and improving research reproducibility.

As with the application of any new technology, biopharma companies looking to implement blockchain technology in their clinical research should consider working with a quality informatics consultant well established in the domain, and consider very deliberately how the technology will fit in the overall systems architecture.

Data Integrity: The Importance of a Quality Culture
https://astrixinc.com/blog/lab-compliance/data-integrity-the-importance-of-a-quality-culture/ Tue, 14 May 2019

Verification of data integrity is a critical part of the FDA’s mission to ensure the safety, efficacy and quality of human and veterinary drugs, biological products, and medical devices. As such, the FDA’s expectation is that all data generated to support the quality of manufactured products is both reliable and accurate.

Compliance violations involving data integrity have led to numerous regulatory actions by the FDA in recent years, including warning letters, import alerts, and consent decrees. In 2018 alone, the FDA issued 54 warning letters that had references to data integrity and data management deficiencies in pharmaceutical companies, 10 of which were in the United States. An analysis of 2018 warning letters by FDAzilla found that 45% of GMP-related warning letters issued to pharmaceutical companies based in the United States included a data integrity deficiency.

Recently, the FDA has begun to link compliance with data integrity regulations with the specific culture of an organization. The FDA wants to see companies develop a “quality culture” or quality focus that is integrated throughout the organization. The idea is that the more mature an organization’s quality culture, the more reliable the product support data (i.e., data integrity) will be.

The business case for data integrity compliance is clear: lower compliance and financial risk, less rework, fewer supply interruptions to the market, improved productivity and operational performance, and so on. In many companies, however, compliance problems persist as a result of reactive rather than proactive thinking about maintaining a reliable state of compliance and quality throughout the organization. Given the FDA's recent interest in quality culture, let's explore best practices for establishing an organizational quality culture that supports compliance with data integrity regulations.

Quality Culture Best Practices

Data integrity violations can be the result of many factors: employee errors, lack of awareness of regulatory requirements, poor procedures or not following procedures, insufficient training, intentional acts of falsification, software or system malfunction, poor system configuration, etc. In order to avoid risk, companies involved in developing, testing, and manufacturing APIs, intermediates, or pharmaceutical and biological products should work to establish an organizational quality culture that:

  • promotes and encourages ethical conduct
  • demonstrates the company’s commitment to compliance with data integrity regulations
  • requires the prevention and detection of data integrity deficiencies

Key aspects of this culture include:

Management Engagement. In its data integrity guidance, Data Integrity and Compliance with Drug cGMP: Questions and Answers, published in December 2018, the FDA states, "It is the role of management with executive responsibility to create a quality culture where employees understand that data integrity is an organizational core value and employees are encouraged to identify and promptly report data integrity issues. In the absence of management support of a quality culture, quality systems can break down and lead to cGMP noncompliance."

The effect of management engagement and behavior on the success of a company’s data governance efforts should not be underestimated. The company’s data governance policies need to be strongly endorsed at the highest levels of the organization. For an organization to achieve optimal compliance with data integrity regulations, the focus on quality compliance must start at the top. Management must lead by example and set the tone to be emulated by individuals at all levels within the company.

Employee Training. It is critical for companies to establish training programs that properly educate all employees on the fundamental principles of data integrity, company requirements for data integrity, requirements of regulatory agencies, expected employee conduct as a condition of performing GxP functions, and disciplinary consequences for poor conduct. All new employees should go through this training prior to performing GxP activities, and each employee should receive an annual refresher training. At the conclusion of the annual refresher, best practice is to have each employee sign a certification statement confirming that he or she has adhered to company standards around data integrity over the past year, including reporting any violations about which they became aware.

Data Integrity training should also include education on why data integrity is so important to the company. Ultimately, in a successful quality culture, employees adopt a quality mindset, not because they have to, but because they understand the importance of data integrity to the company and the risks of noncompliance.

Open and Transparent Communication. Companies should strive to encourage personnel to be transparent about data integrity deficiencies so management has an accurate understanding of the risks and can provide necessary resources to mitigate them. An essential element of any successful quality culture is transparency in terms of open reporting by employees of any deviations, errors, omissions or aberrant results that impact data integrity. Management must set the proper tone to encourage open communication by working to create a culture where people listen to one another, and by not punishing people for honest mistakes. In addition, employees should be given the option to report issues anonymously if local laws permit.

Computer Access Controls. Computer systems need secure access controls to assure that changes to records can only be made by authorized personnel, and these controls need to be strictly enforced. Among other things, this means that each person accessing a computerized system must be uniquely identifiable (by username and password, or other biometric means), must not share their login information with others, and must have their actions within the system tracked via an audit trail. Additionally, rights to alter files and settings (e.g., a system administrator role) should not be assigned to those responsible for record content.
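These controls can be illustrated with a minimal sketch, using hypothetical user names and an in-memory role table; a real system would integrate with authentication infrastructure and a durable, protected audit log rather than plain Python structures:

```python
from datetime import datetime, timezone

# Hypothetical role table: the sysadmin manages the system but, per the
# segregation-of-duties principle above, cannot edit record content.
ROLES = {"analyst_jsmith": "analyst", "admin_tpark": "sysadmin"}
audit_trail = []   # append-only log: who did what, and when

def change_record(user, record, field, new_value):
    """Apply a change only for authorized, uniquely identified users, and log it."""
    if ROLES.get(user) != "analyst":
        audit_trail.append({"user": user, "action": "DENIED",
                            "time": datetime.now(timezone.utc).isoformat()})
        raise PermissionError(f"{user} is not authorized to alter record content")
    old_value = record.get(field)
    record[field] = new_value
    audit_trail.append({"user": user, "action": f"{field}: {old_value} -> {new_value}",
                        "time": datetime.now(timezone.utc).isoformat()})

record = {"sample": "S-001", "result": "98.7%"}
change_record("analyst_jsmith", record, "result", "99.1%")
try:
    change_record("admin_tpark", record, "result", "100.0%")  # sysadmin: blocked
except PermissionError as e:
    print(e)
print(len(audit_trail))  # both the change and the denied attempt are logged
```

Note that even the denied attempt leaves an audit entry, which is what lets reviewers reconstruct not just what changed but what someone tried to change.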

Data Integrity Investigations. As discussed in the above-mentioned FDA Guidance document, regardless of how a data integrity violation is discovered (e.g., third party audit, FDA audit, internal tip, etc.), all identified data integrity errors “must be fully investigated under the cGMP quality system to determine the effect of the event on patient safety, product quality, and data reliability.” The investigation should determine the root cause and ensure the necessary corrective actions are taken. Necessary corrective actions may include hiring a third-party auditor, removing individuals responsible from cGMP positions, improvements in quality oversight, enhanced computer systems, creation of mechanisms to prevent recurrences, etc.

Data Integrity Audits. Companies should include data integrity assessments in GxP audit programs. Audits may be conducted by internal staff in the Quality unit, or by an independent third party. If audit functions are outsourced to an external consultant, be sure to verify that auditors have appropriate training in data integrity evaluations. Utilizing a quality external consultant with expertise in data integrity evaluations for your GxP audit is best practice, as an expert with fresh eyes will likely be able to locate any data integrity issues you missed. The periodic review results, along with any gaps and corresponding remediation activities, must be documented.

The following items should be part of any audit of laboratory technology systems:

  • Review of existing Software Validation Lifecycle policies, SOPs, etc.
  • Identification of paper data, electronic data, raw data, and static and dynamic data
  • Intended use of computer systems (e.g., SOPs, workflows, etc.)
  • Use of notebooks and forms associated with computer systems
  • System security and access to data (e.g., user types, groups, roles, accounts, etc.)
  • Electronic signatures and audit trails
  • Data retention policies and availability of data
  • Data backup, archive, restore and recovery processes
  • Training for support and use of computer systems that collect, generate, store, analyze, and/or report regulated data

Third-Party Data Integrity Audits. Manufacturers are responsible for ensuring that quality system elements have been established at each of their third-party suppliers (e.g., outsourced services, purchased raw materials) that provide products or services on behalf of or at the behest of the manufacturer. Manufacturers need to verify that suppliers operating as an extension of the manufacturer have established appropriate policies, standards, procedures, or other documents that define requirements for their employees to ensure compliance with data integrity regulations.

Conclusion

Ensuring data integrity means collecting, documenting, reporting, and retaining data and information in a manner that accurately, truthfully and completely represents what actually occurred. The FDA expects pharmaceutical firms to identify, manage, and minimize data integrity risks associated with their data, products, equipment, technology, processes, and people. It is therefore critical for companies to develop a quality culture that supports data integrity compliance in order to avoid unpleasant financial consequences from enforcement actions.

It is important to note that company executives and management are ultimately responsible for creating a quality culture that will support data integrity compliance. By assessing the maturity of their quality culture and making intentional plans for improvement, senior management can generate significant business value by improving product quality and reducing their organization's compliance and financial risk.
