lab informatics Archives - Astrix
https://astrixinc.com/tag/lab-informatics/
Expert Services and Staffing for Science-Based Businesses

Outsourcing your Next Laboratory Informatics Project
https://astrixinc.com/blog/outsourcing-your-next-laboratory-informatics-project/
Sun, 08 May 2022

As the complexity of the global R&D environment has grown over the last several decades, biopharma companies have found it increasingly difficult to achieve scalable growth with their in-house teams alone. As a result, outsourcing laboratory informatics projects has become a more commonplace way to reduce costs and gain efficiencies.

In the accounting and consultancy firm Deloitte’s comprehensive 2014 Global Outsourcing and Insourcing Survey, 53% of respondents indicated that they were currently outsourcing part of their IT function, while 26% indicated that they planned to in the future. Over 30% of organizations participating in the Deloitte 2016 Global Outsourcing Survey indicated that they planned to do more outsourcing in the future.

Clearly, the practice of outsourcing is on the rise, largely driven by technological advancements and the promise of higher efficiency at a lower cost. Outsourcing these days, however, is often far more than just a cost-reduction strategy. Increasingly, companies are using outsourcing as a way to drive innovation and maintain competitive advantage. There are a number of significant benefits accrued by companies that use outsourcing as part of their business strategy:

  • Cost-reduction – Outsourcing allows companies to take advantage of less expensive labor markets available elsewhere. Outsourcing can also improve efficiency in business processes, thereby reducing costs.
  • Improved flexibility and agility – You can sign (or cancel) a temporary or long-term contract with one or multiple partners, depending on your needs. Companies can also easily ramp their labor force up or down as the business condition or situation changes.
  • Increased focus on core business – Outsourcing peripheral projects and services allows you to avoid the time and expense involved in expanding your in-house team, and instead focus on and strengthen your core competencies.
  • Enhanced innovation – Outsourcing allows you to engage with highly skilled, specialized professionals and/or transformative technologies and services when you need them to move the needle forward on your innovation efforts.
  • Increased speed of delivery – Depending on your business, outsourcing to other time zones may allow round-the-clock productivity, speeding delivery of products and services. Additionally, outsourcing can quickly increase the volume and/or skill of your workforce to speed business processes.
  • Scalable growth – Efficiencies that flow from the benefits described above allow companies to maintain or even improve profitability as they grow.

Life science R&D organizations have relied on external CROs for many years, but outsourcing models for laboratory informatics are just beginning to emerge. While the potential benefits are compelling, outsourcing a laboratory informatics project is not without challenges and risks, especially when outsourcing to a foreign country. Outsourcing lab IT projects to domestic contractors is a more established model, yet significant cost savings can be realized by hiring a foreign contractor. Generally speaking, when the required skills can't be found domestically at a price that is conducive to scalable growth, companies have two options for outsourcing to foreign countries – nearshoring or offshoring. Let's explore what each of these models has to offer and their strategic implications.

Defining the Terms

Outsourcing is a general term that refers to contracting work done by either foreign or domestic third-party firms, while offshoring and nearshoring refer to contracting work done by foreign firms specifically.

Nearshoring refers to a company contracting a part of its workload to an external company located across national borders, but generally within its own region. It includes neighboring countries with similar time zones, which may even possess a similar culture as the outsourcing client. Common nearshoring locations for U.S.-based companies include Canada and Mexico, as well as many other countries in the Caribbean and Central and South America – Costa Rica, Argentina, Brazil, El Salvador, Dominican Republic and the U.S. Virgin Islands.

Offshoring, on the other hand, is the relocation of a business process to a more distant location. China, India and the Philippines are the quintessential offshoring locations for U.S.-based companies. In this case, the partner is located far enough from the home company to operate in an entirely different economic environment, time zone and legal framework, likely with significant cultural and language differences as well.

Offshoring

Offshoring began in earnest in the United States as a corporate strategy in the late 1970s. At that time, the practice was mainly confined to the manufacturing sector, as companies sought to increase profits by moving their manufacturing operations overseas to areas with less expensive labor pools. By the 1990s, with the birth of the internet, the practice started to move into the white-collar sector as well, as businesses now had access to highly educated workers all over the world.

Offshoring highly technical work like laboratory informatics can present a number of challenges that may affect the value you realize from your project, such as language barriers, time zones and cultural differences. Effectively coordinating informatics teams across multiple locations and time zones is not easy. When half your team is wide awake and working hard while the other half is still asleep halfway around the world, effective management and communication can be extremely difficult, potentially leading to cost increases and project delays.

This time lag often means your domestic employees work late nights or early mornings to create a communication window with the extended team, which of course can contribute to stress, burnout and/or job dissatisfaction. Even if you shift the time differential burden to the offshore team by requiring them to work off hours, quality and productivity can suffer. In addition, you run the risk of increased turnover on the offshore team, causing significant project time and cost overruns.

Communication challenges can be especially problematic if the offshore contractor doesn’t speak the language of your domestic team well, or has a strong accent that is difficult for your domestic team to understand – misunderstandings may occur that could cause errors or require a project rework. Cultural differences can also contribute to communication challenges. In a 2006 survey of 200 U.S. business executives whose companies had outsourced business processes, for example, two-thirds said that miscommunication due to cultural differences had caused problems.

Another potential offshoring issue is that the offshoring team, while highly educated and skilled, may lack significant practical experience in the laboratory informatics domain, especially in your region/country with its unique regulations and best practices. It is critical that your informatics consultant at least has practical experience with the platform being implemented in the type of laboratory you are running (i.e., R&D, QC/QA, etc.). Often, foreign consultants lack this kind of crucial experience.

Finally, if your laboratory handles proprietary company information, you may be putting your company's intellectual property (IP) at risk by consulting with an offshore provider, especially if the culture and/or region where the contractor resides does not enforce IP laws strictly. Data security can sometimes be improved dramatically by choosing a nearshoring option instead.

While offshoring can certainly generate significant cost-savings for your organization if done right, the above-described issues have contributed to many examples of offshoring failures. A 2005 survey of over 5,000 corporate executives around the globe by Ventoro[1], an outsourcing consulting and market research company, found that one-third of companies involved with offshoring projects had to shift some or all of the off-shored operation back on-shore – at significant cost and risk to project success.

Nearshoring

Given the challenges with offshoring, a nearshoring model has recently emerged that addresses many of these issues while still providing cost-saving benefits. How? Nearshoring is very similar to offshoring, except that the company transfers a business process to a location within its own region rather than to a distant offshore country. The close proximity (or at least similar time zone) of the nearshoring partner eliminates much of the stress and many of the communication issues inherent in offshoring. The similar time zone and greater cultural similarity give both parties an opportunity to align work schedules, hold ad hoc calls and generally reach a level of integration not possible in the offshoring model.

The closer proximity of the parties involved in nearshoring also makes it possible to exchange visits more often, ensuring a smoother workflow and better project coordination and management. Face-to-face meetings help mitigate unexpected management costs and setbacks that are directly related to project confusion. This is especially important for complex, multi-team laboratory informatics projects that need to be discussed in great detail to ensure a successful on-time delivery.

Making the Decision

So how do you choose the right outsourcing partner for your laboratory informatics project? If price is a significant issue, you should certainly consider nearshoring or offshoring. In general, nearshoring can provide a solid partner support structure without many of the issues associated with offshoring models. That said, there are usually far more offshoring options to choose from. Regardless of which model you decide to go with, a trustworthy partner should:

  • Be able to provide several references that demonstrate quality work done for other companies in your country.
  • Speak your language fluently and be culturally compatible with your company.
  • Have practical, hands-on experience with laboratory informatics in the type of lab that you are operating. Scientific domain knowledge is also important.
  • Be financially stable. You don’t want to be abandoned in the middle of your project.
  • Have the appropriate technology and infrastructure in-house that is necessary to generate quality services.
  • Understand your business objectives, and make sure the project accomplishes them.
  • Be able to support and work with you throughout the full project lifecycle – assessment, requirements, design, implementation, follow-up and support.
  • Have the ability to task the same team members on your project from start to finish.

Obviously, it is crucial to do due diligence when evaluating potential foreign contractors. The consequences of making the wrong decision can be disastrous to your company's bottom line.

Conclusion

Digitizing your laboratory environment is a key to maintaining efficiency and driving innovation. Yet, laboratory informatics projects require a high level of skill, knowledge and experience in order to deliver the results you need on time and on budget. Often, you cannot go it alone and need to find the right partner with the appropriate expertise to drive the project forward.

If a partner with the required skills can’t be found domestically at a price that is conducive to scalable growth, outsourcing to foreign countries is an option, but due diligence must be done in order to make sure a good partner is chosen. Of the two options available, nearshoring has the potential to offer the operational efficiencies of a domestic partner, but with cost savings comparable to offshoring. Nearshoring with a leading domestic partner like Astrix offers these cost savings, combined with the peace of mind of knowing that your nearshoring team is part of a trusted domestic laboratory informatics company with over 20 years of successful project experience.

 

Addressing Data Management Challenges in Genomics Research
https://astrixinc.com/blog/lab-informatics/top-data-management-challenges-in-genomics-research/
Sat, 26 Dec 2020

The Human Genome Project, successfully completed on April 14, 2003, took approximately 13 years and 3 billion dollars to map the human genome. Today, next-generation sequencing (NGS) technology allows a human genome to be sequenced in just a few days for a few hundred dollars.

With ultra-high throughput, speed and scalability, NGS allows researchers to study biological systems in ways never before possible. Combined with clinical data, readily accessible genomic data is driving a revolution in drug discovery by accelerating the development of personalized, targeted treatments in a medical model known as Precision Medicine. 

The new field of genomics research, combined with the potential of Precision Medicine to drive improved patient care, has spawned significant growth in the biotechnology industry, with a recent report by Grand View Research projecting that the global biotechnology market will reach 727.1 billion USD by 2025. The promise of Precision Medicine that was envisioned decades ago with the mapping of the human genome is fast becoming a reality.

While genomic research is paving the way for significant advances in healthcare, its practice has led to new challenges for biotechnology firms in terms of increasing data volume, variety and complexity. Given that the data representing a single human genome can take up to 100 gigabytes of storage space, biotech companies engaged in genomic research must develop the computing infrastructure and skills to manage, store, analyze and interpret massive quantities of highly complex data. In this blog, we will explore the challenges and best practices involved in addressing data management in genomics research.

Challenges and Solutions for Genomic Workflows

While access to genomic data has improved dramatically over the last decade, processing and analytics for these massive data sets have become the new bottleneck. For pharmaceutical and biotech companies engaged in genomics research, a modern laboratory information management system (LIMS) that is flexible enough to deal with the volume, variety and complexity of genomic data sets is now a necessity. A modern LIMS that integrates with instrumentation and enterprise systems is a critical tool for automating the lab to handle the increased throughput.

Biotech firms may be new companies looking to replace paper-based systems with a new informatics solution, or academic/pharma spin-offs with legacy systems that need to be transitioned to a more modern solution. Either way, a number of challenges will need to be addressed during implementation of the LIMS solution in order to achieve an efficient, scalable and cost-effective laboratory environment.

So what are the key data management challenges in genomics research?

#1 – Complex Workflows and Sample Tracking. NGS biologics add complexity to sample tracking and workflows compared with previous small-molecule-based tools and workflows. As opposed to distinct and easy-to-track small-molecule samples, compounds and batches, biologics involve complex aspects such as sequences, ligated variants and linkages. In addition, processing of biological samples is highly complex and involves the use of cell lines and other biologics building blocks, resulting in the need to track sample lineage and progeny over time.

Additionally, NGS workflows can involve different sequencing technologies, assays, sample preparation kits/protocols, automation robots and analytical instruments. These workflows can span several different departments and user types, and many of these aspects are subject to frequent change (e.g., reagent kit upgraded, new liquid handler comes on-line, workflow adjusted with new data-dependent trigger, etc.).

#2 – R&D Externalization and Data Security. Genomics research requires diverse teams (e.g., bioinformaticians, clinicians, computational biologists, etc.) to work together to process and analyze genomic and clinical data and extract the insights that drive innovative therapies. Legacy systems often don't communicate with each other and lack the integration to facilitate good collaboration, effectively locking researchers into silos and hampering drug discovery.

It is increasingly common for biotech firms to externalize aspects of R&D to CROs, academia and/or smaller biotech firms that provide specific skills, resources and technology. In this case, the parent biotech will need communication and data exchange capabilities to collaborate effectively with its new partners, where the partners can access only the data they need to see while the parent company has secure access to all of the data in the appropriate format. An informatics solution needs to be in place that facilitates management of R&D processes and data security across different locations, time zones and user groups with different permissions, and that is flexible enough to handle a changing web of external partners. A cloud-based LIMS can be particularly effective in this regard, while also providing the scalability to handle massive genomic datasets.

#3 – High Volume of Data. With scientists generating gigabytes of data with every experiment, it is critical to choose a solution that can handle the throughput and volume of data. Challenges remain for cloud solutions, as integration with on-premise instrumentation and other systems presents risks. You will want to do a full analysis to choose a vendor with modern, RESTful APIs and a robust security model.
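
To make the API point concrete, here is a minimal sketch of what registering an NGS sample against a LIMS REST API might look like. The endpoint URL, field names and token handling are purely illustrative assumptions, not any specific vendor's interface.

    import json
    import urllib.request

    # Hypothetical LIMS endpoint and API token -- placeholders, not any vendor's real API.
    LIMS_URL = "https://lims.example.com/api/v1/samples"
    API_TOKEN = "replace-with-a-real-token"

    def register_sample(sample_id: str, project: str, run_id: str) -> dict:
        """Register an NGS sample with the LIMS over a REST API and return the created record."""
        payload = json.dumps({
            "sample_id": sample_id,
            "project": project,
            "sequencing_run": run_id,
        }).encode("utf-8")
        request = urllib.request.Request(
            LIMS_URL,
            data=payload,
            headers={
                "Content-Type": "application/json",
                "Authorization": f"Bearer {API_TOKEN}",
            },
            method="POST",
        )
        with urllib.request.urlopen(request) as response:
            return json.load(response)

    # Example usage (illustrative only):
    # record = register_sample("S-0001", "oncology-panel", "RUN-0042")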

#4 – Regulatory Compliance. Finally, given that genomic research typically involves human samples, biotech companies must adhere to a variety of regulations – CLIA, GDPR, HIPAA – administered by US and international agencies, in addition to GxP and 21 CFR Part 11. As such, the LIMS solution chosen will need to have a robust set of controls that support compliance with applicable regulations and standards on data integrity and privacy. Additionally, the project team needs to have the skills, knowledge and experience necessary to properly develop and apply these controls during the implementation to ensure regulatory compliance for your business.

Best Practices for Implementing a LIMS in a Biotech Startup

Biotech startups typically need to implement a modern, robust LIMS that has the flexibility and scalability to support the business both now and as it evolves, offers an affordable total cost of ownership (TCO), and can integrate effectively with both enterprise systems and instrumentation. Selecting, implementing and integrating a LIMS solution for genomics workflows can be a challenging endeavor, however.

One of the biggest mistakes companies make when starting a complex informatics project is omitting the strategic planning necessary to create the solid foundation that ensures success. Success, in this case, is defined by the implementation meeting the following metrics:

  • Delivered on time
  • Delivered within budget
  • Meets or exceeds requirements, including the necessary security and compliance requirements
  • Provides a high level of business value
  • Provides a high level of user adoption

In order to meet these metrics, a comprehensive and proven methodology should be followed that leverages a workflow and business analysis to maximize business value for your organization. In this initial phase of the project, business analysts with domain, industry and system knowledge document the current state of laboratory operations, as well as an optimized set of future-state requirements that encompass the flexibility necessary for genomic workflows.

When designing system requirements, it is beneficial to consider the key data management issues in genomics research. An easy way to take these into account is to address the 6 Vs in your technology selection process:

  • Volume (large amounts of data)
  • Variety (cellular, molecular, genomic and preclinical data)
  • Velocity (data is collected constantly at high velocity)
  • Veracity (accuracy of the data)
  • Vulnerability (security and compliance)
  • Vocabulary (data standardization is a challenge)

Once a set of optimized future-state requirements is generated, it is important to design a laboratory informatics architecture that is aligned with business goals, along with a roadmap to deployment, before engaging in a proper technology selection process for your organization.

With the proper foundation laid for your project in this way, a skilled project team can engage with system design, implementation, integration, validation and user training with a high degree of success. But even when a new system is successfully deployed to researchers, the system will need to be maintained and evolve over time, and any legacy systems still in use will need to be maintained as well.

Conclusion

The explosion of data generated by the rapid pace of discovery in genomics research has great potential to aid in the development of innovative new therapies that will significantly improve patient care. To deliver on that potential, however, biotech companies generally need to invest in dynamic informatics systems that can help process and analyze the high volume, variety and complexity of genomic datasets. Legacy LIMS are generally not flexible or extensible enough to handle NGS workflows, and attempts to adapt these systems to support genomic research are unlikely to succeed. Modern, cloud-based LIMS are often ideal for genomics research. As such, your project team should have the expertise and experience necessary to help you determine the ideal hosting architecture for your LIMS solution.

Astrix has over 20 years of experience facilitating successful LIMS implementation projects in pharmaceutical and biotech companies. Our professionals have the experience and knowledge to help you implement innovative informatics solutions that turn data into knowledge, increase organizational efficiency, improve quality and facilitate regulatory compliance.

LIMS Master Data Best Practices Part 5 – Mergers & Acquisitions
https://astrixinc.com/blog/lims-implementation/lims-master-data/lims-master-data-best-practices-part-5-mergers-acquisitions/
Wed, 29 Jan 2020

A recent study by JP Morgan determined that the value of global merger and acquisition (M&A) deals in 2018 was 4.1 trillion dollars with a total deal count of 2,342. M&As are common in several industries such as pharmaceutical, biotech, food and beverage, oil and gas, and others, but the pharmaceutical industry likely sees more M&A activity than any other industry, both in terms of the number of deals and the amount of money spent on the acquisitions.

Reasons for M&As are numerous and diverse and include:

  • Improve profitability
  • Increase efficiency
  • Automate manual processes
  • Shorten time to value
  • Enhance business intelligence
  • Acquire R&D data
  • Expand product portfolio
  • Maximize growth
  • Innovate core business models
  • Mitigate technology disruption
  • Implement game-changing strategic moves

While the potential benefits of an M&A are compelling, there are also potential pitfalls lurking that can swallow large amounts of both money and resources. One of these pitfalls involves the problems that can occur when attempting to merge two different companies' data infrastructures.

Integrating and consolidating the master data in disparate enterprise systems is one of the most critical, yet costly and time-consuming, challenges that need to be met in an M&A. In part 5 of our LIMS master data best practices series, we will discuss best practices that can help guide the strategy for consolidating and managing LIMS master data in mergers and acquisitions.

Master Data Best Practices for M&As

Any organization undergoing an M&A will be significantly increasing IT infrastructure and the amount of master data that needs to be managed and maintained, as well as the cost of doing so. As such, each application should be analyzed to determine how it aligns with the company’s future-state vision and brings value to the organization over its lifetime in a process known as application portfolio rationalization.

There is also a strong need to aggregate and consolidate data to provide for post-merger operational efficiency and also quick wins for the short term. Merging and harmonizing disparate LIMS and their data into a single functional operating environment is not a simple task and can put enormous strain on a company’s IT department if not planned and executed effectively. Even if you don’t need to merge two separate LIMS into one, master data in your LIMS will need to be adjusted to accommodate new and/or altered workflows. Having a scalable master data plan in place, as we discussed in part 3 of our LIMS Master Data Best Practices series, can help to facilitate this process.

Effective master data management (MDM) during an M&A is an important enabler of everything from business continuity to post-merger innovation. Some of the key aspects of a successful MDM strategy for LIMS master data include:

Conduct an Audit of Systems and Data. When a company integrates an acquisition or engages in a merger, the sooner the data integration team is involved, the smoother the integration is likely to be. The first thing that should be done by IT during an M&A is to conduct a full system and data inventory in order to understand and document the current data landscape. Some data challenges to consider and document include:

  • Data may be captured, managed and maintained differently
  • Data standards may be different
  • Data processes, procedures and methods may be different
  • Data quality may be different
  • Data strategies may be different

An inventory of this nature can be difficult to accomplish if either organization lacks good IT and data governance/documentation. This is why it is important to have change control procedures in place. A good place to start is to identify business, subject matter and data experts across both organizations and form a data team to research and determine what documentation is available for this initial phase. A few key questions to ask when trying to document the current data landscape include:

  • Where is the master data currently located (systems, apps and files)?
  • How does data traverse these different systems?
  • Who owns data?
  • Who manages data?

If a change control procedure (see part 4 of our LIMS Master Data Best Practices series) is in place that includes a data migration plan, the master data plan, and how to conduct reviews and audits, these questions will be easy to answer.

Do Strategic Planning. Once the current data landscape is documented, the next step is mapping out the future state workflows that will determine your master data configuration. Laboratory workflows utilize LIMS master data, and an M&A means workflows will likely need to be altered and new workflows added into the system. The first step in the process is for business analysts to conduct a series of interviews with business stakeholders to document the current state of laboratory business processes, technology and IT architecture in both organizations. In addition, analysts should discuss the merger at a high level with the organization’s management team to understand the goals, aspirations and objectives of the desired future state of the laboratory.

Once the current state is fully documented, the project team will work to create a future state model by defining the goals, workflows and requirements of the desired future-state. If you choose to harmonize and standardize laboratory operations on a new LIMS or a legacy system, the technical, business and user requirements from the strategic planning phase are utilized to guide the technology selection, implementation and integration process.

Map Out the Data Structure. With future-state workflows established, the next steps are determining the data fields to be entered into the LIMS, establishing naming conventions, and mapping out the data structure. We covered this in detail in part 2 of our LIMS Master Data Best Practices series. In addition, if the Master Data Plan we discussed in part 1 of our series exists for the LIMS that will be used in the new operating environment, the process of mapping out the new data structure will be much easier. Of course, this Master Data Plan will need to be updated to reflect the new operating environment. Once the data structure is mapped, configuration of the LIMS can begin.

Standardize and Migrate Master Data. Master data from the acquired company may need to be standardized before being migrated into the LIMS that will be used for the new unified operating environment. Data migration for a project of this nature can be a significant challenge. The Data Migration Plan we discussed in part 4 of our series should guide this process. Master data from the acquired company will likely need to be extracted, translated and loaded into a new location. Questions that help determine your data migration/management strategy include: How are we going to get master data out of the acquired company's systems and into the LIMS? How are we going to harmonize data across multiple sites?
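
As a simplified illustration of the extract-translate-load step described above, the sketch below reads a hypothetical CSV export of the acquired company's test master data, translates legacy test codes into the target LIMS naming convention, and writes a load file. The file names, column names and code mappings are assumptions for illustration only, not a prescribed migration tool.

    import csv

    # Hypothetical mapping from the acquired company's legacy test codes to the
    # harmonized codes used in the target LIMS -- illustrative values only.
    TEST_CODE_MAP = {
        "ASSAY_HPLC_01": "HPLC-PURITY",
        "MICRO_TPC": "MICRO-TOTAL-PLATE-COUNT",
    }

    def translate_row(row: dict) -> dict:
        """Translate one extracted master-data row into the target naming convention."""
        return {
            "test_code": TEST_CODE_MAP.get(row["legacy_test_code"], row["legacy_test_code"]),
            "test_name": row["legacy_test_name"].strip().upper(),
            "unit": row["unit"] or "N/A",
        }

    def migrate(extract_file: str, load_file: str) -> None:
        """Extract legacy master data from a CSV export, translate it, and write a load file."""
        with open(extract_file, newline="") as src, open(load_file, "w", newline="") as dst:
            reader = csv.DictReader(src)
            writer = csv.DictWriter(dst, fieldnames=["test_code", "test_name", "unit"])
            writer.writeheader()
            for row in reader:
                writer.writerow(translate_row(row))

    # migrate("acquired_company_tests.csv", "lims_load_file.csv")  # hypothetical file names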

Data management/migration for an M&A is typically a big job. It is therefore important to formulate a Master Data Management/Migration Strategy before any migration or LIMS configuration has happened, in order to avoid significant time and cost overruns as the project proceeds and to minimize disruption to your lab operations during migration activities.

Once the data has been migrated, regulatory requirements mandate that it must be validated to make sure it is accurate and has been transferred properly. This validation should be guided by your Change Control procedures. Even if your company is not in a regulated industry, validating migrated data is important to make sure your data is sound.

Conclusion

Mergers and acquisitions can help drive your competitive advantage, but they can also paralyze a business and generate significant unexpected costs that minimize the value proposition of the merger. Following the best practice recommendations described in this blog helps to establish a single version of the truth in your LIMS, which is critical to ensuring your laboratory maintains operational efficiency, data integrity and regulatory compliance. Effective LIMS master data management in an M&A is also important to ensure your laboratory continues to produce the valuable business intelligence that drives innovation and competitive advantage for your organization.

Astrix is a laboratory informatics consulting firm that has been serving the scientific community since 1995. Our experienced professionals help implement innovative solutions that allow organizations to turn data into knowledge, increase organizational efficiency, improve quality and facilitate regulatory compliance. If you have any questions about our service offerings, or if you would like to have an initial, no-obligations consultation with an Astrix informatics expert to discuss your master data strategy or LIMS implementation project, don't hesitate to contact us.

About Astrix Technology Group

Scientific resources and technology solutions delivered on demand

Astrix Technology Group is an informatics consulting, professional services and staffing company that has been dedicated to serving the scientific community for over 20 years. We shape our clients' future, combining deep scientific insight with an understanding of how technology and people will impact the scientific industries. Our focus on value-engineered solutions, on-demand resource and domain requirements, and flexible and scalable operating and business models helps our clients find future value and growth in scientific domains. Whether focused on strategies for laboratories, IT or staffing, Astrix has the people, skills and experience to effectively shape client value. We offer highly objective points of view on Enterprise Informatics, Laboratory Operations, Healthcare IT and Scientific Staffing, with an emphasis on business and technology, leveraging our deep industry experience.

For More Information

For more information, contact Michael Zachowski, Vice President at Astrix Technology Group, at mzachowski@astrixinc.com.

Selecting an ELN to Maximize IP Protection
https://astrixinc.com/blog/lab-informatics/selecting-eln-maximize-ip-protection/
Mon, 09 Sep 2019

In the race to patent or publish in life sciences, time is critical. For any researcher, securing Intellectual Property (IP) is equally critical. In order to receive a patent on an invention, inventors must be able to document the work they did to prove that the invention substantially performs against the claim. Additionally, this proof must be corroborated by someone (e.g., custodian of records) not directly involved in the work. Effective documentation and corroboration of the work done on an invention is also necessary to protect IP against challenges to the U.S. Patent and Trademark Office (USPTO) and possible litigation from competing inventors.

One of the most hotly contested research areas currently is CRISPR, the gene editing technology platform that has spawned a billion-dollar industry and countless biotech startups. This technology was developed by multiple organizations, including researchers from the Broad Institute, Harvard University, MIT and the University of California, Berkeley. It was first developed in prokaryotic cells at UC Berkeley; the Broad Institute later implemented the technology in eukaryotic cells. At issue is whether genome editing in eukaryotic (mammalian) cells was made obvious/trivial once it had been accomplished in non-mammalian cells. This dispute has required both sides to pore over experiment records to search for evidence to support their arguments, as well as to establish timing and a full audit trail of events leading to the invention. If your organization is working on bleeding-edge technology platforms such as CRISPR or novel biologic therapeutic classes, considering your IP protection strategy is business critical and could have major impacts on your company's future.

Today, most established companies and many academic labs involved in R&D have made the shift from paper to electronic laboratory notebooks (ELN) in order to create a searchable repository of experimental data that researchers can access to facilitate collaboration. An ELN can also be a key element in your IP protection strategy by supporting scientists in keeping complete, accurate and witnessed records of all research. If you are looking for a new ELN, or want to bolster your IP protection for an existing ELN implementation, there are many elements to consider in order to effectively secure IP.

Baseline Features in ELNs for Securing IP

R&D partnerships continue to increase, and many organizations are looking to the cloud to host ELN systems to open up data sharing and experiment data capture across the globe. With new and emerging cloud-based ELN vendors having developed commercial offerings in the last five years or so, it is important to take a close look at the ‘baseline’ features and security controls that should be there to protect your IP. Many of these features have in fact been commoditized and readily available in most platforms for the last ten years, but it is always important to check the boxes – particularly when you are working with a smaller, less established ELN vendor that may be new to the market.

Features which you should look for when choosing an ELN to support IP protection include:

21 CFR Part 11 Regulations on Electronic Records and Signatures

The regulations established in 21 CFR Part 11 dictate many of the features that should be in any ELN. As such, any ELN should have the ability to time, date and user stamp any data entered into the system. Equally important, audit trails should be available to track all the changes to a record, as well as chain of custody for the data.  Electronic signatures should be assigned as a result of a controlled workflow that is consistent with the organization’s policies and procedures for witnessing experiments.
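
As an illustration of what these controls amount to in practice, the sketch below models a minimal, append-only audit-trail entry with user, date and time stamps. It is a conceptual example only, not the data model of any particular ELN, and the field names and sample values are assumptions.

    from dataclasses import dataclass, field
    from datetime import datetime, timezone

    @dataclass(frozen=True)
    class AuditEntry:
        """One audit-trail entry: who changed what, when, and why."""
        user: str
        action: str        # e.g. "create", "update", "sign"
        record_id: str
        old_value: str
        new_value: str
        reason: str
        timestamp: str = field(
            default_factory=lambda: datetime.now(timezone.utc).isoformat()
        )

    audit_trail: list = []  # entries are only ever appended, never edited or deleted

    def record_change(user, record_id, action, old_value, new_value, reason):
        """Append a user-, date- and time-stamped entry to the audit trail."""
        entry = AuditEntry(user, action, record_id, old_value, new_value, reason)
        audit_trail.append(entry)
        return entry

    # record_change("jdoe", "EXP-1042", "update", "pH 7.2", "pH 7.4", "transcription error corrected")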

Data/Experiment Hiding

Any quality ELN will have the ability to close off access to all or parts of the experiment to selected user groups.  With so many collaborators and experiments being developed jointly across organizations, it is critical that the system provide a way to fence off certain programs, projects, or individual experiments to the appropriate people – creating a path to assign IP ownership rights accordingly.

Electronic (Structured) Environment for Projects, Programs and Targets

Paper notebooks are stand-alone – meaning you can’t easily back them up, you can’t search by keyword, and when they’re archived, they are essentially lost. Trying to find information in a paper notebook world is often so cumbersome and difficult that it’s faster to just redo the experiment. Having the information in an electronic format increases the likelihood that the knowledge will be shared rather than lost, and sometimes more importantly, can help show a concerted progression of work against a piece of intellectual property.

To make it easier to navigate, the system should support many ways of structuring the experimental data, consistent with the scientific programs, projects and targets. Being able to quickly pull up all of the sample data by scientist and experimental results for a given program or project by target class or other meaningful criteria is not only convenient but may also be critical for building your case in court or simply working with a collaborator to flesh out ownership rights.

Archiving Immutable Experiment Records

In order to secure IP, the ELN should support the creation of an immortalized, time-stamped, unalterable record for archiving (usually a PDF of the completed experiment) that documents everything that was done during the experiment. This document needs to be archived either within the ELN itself or in a database behind the company firewall.
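
One simple way to demonstrate that an archived record has not changed is to store a cryptographic fingerprint alongside it. The sketch below computes a SHA-256 hash and an archive timestamp for a completed-experiment PDF; the file name is hypothetical, and a real archive would also capture signatures and witness information.

    import hashlib
    from datetime import datetime, timezone
    from pathlib import Path

    def archive_fingerprint(pdf_path: str) -> dict:
        """Compute a SHA-256 fingerprint and archive timestamp for a completed-experiment PDF.

        Storing this manifest alongside the archived file lets a records custodian later
        demonstrate that the record has not been altered since it was archived.
        """
        data = Path(pdf_path).read_bytes()
        return {
            "file": pdf_path,
            "sha256": hashlib.sha256(data).hexdigest(),
            "archived_at_utc": datetime.now(timezone.utc).isoformat(),
        }

    # manifest = archive_fingerprint("EXP-1042_final.pdf")  # hypothetical file name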

Security Features

ELNs must protect company IP from possible theft by both external parties and company employees. For protection against internal theft, your ELN should have a folder-based security paradigm in which security permissions (read, write, modify, delete) are set at the root folder of a notebook, project, study or experiment. These rights then cascade down to the new folders and entities underneath the root. Once an experiment record has been approved and locked down with the electronic signature workflow, it can be released to a wider audience for data-sharing purposes. Users performing searches in the system should only be able to see records for which they have been given viewing rights.
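
The sketch below illustrates the cascading idea: a folder with no explicit permissions inherits them from the nearest ancestor that has them. It is a conceptual model of the paradigm, not the security implementation of any specific ELN, and the group and folder names are invented.

    class Folder:
        """A node in the notebook hierarchy; permissions set at the root cascade downward."""

        def __init__(self, name, parent=None, permissions=None):
            self.name = name
            self.parent = parent
            # Explicit permissions override the parent's; None means "inherit".
            self.explicit_permissions = permissions

        def permissions(self):
            """Walk up the tree until an explicitly assigned permission set is found."""
            node = self
            while node is not None:
                if node.explicit_permissions is not None:
                    return node.explicit_permissions
                node = node.parent
            return {}  # no access by default

    # The project root grants one group read/write; child experiments simply inherit it.
    root = Folder("Project-CRISPR", permissions={"discovery-group": {"read", "write"}})
    experiment = Folder("EXP-1042", parent=root)
    print(experiment.permissions())  # {'discovery-group': {'read', 'write'}}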

Records Custodian

Why are unalterable experiment records archived in a secure environment so critical? The U.S. Federal Rules of Evidence (FRE) govern whether evidence like laboratory records is admissible in a patent interference or a civil case. There are two ways to admit laboratory notebook records. First, the author who created the records of the experiments can testify as to their validity. This would satisfy the requirement that the evidence was made by the declarant. But what if the case comes many years later, and the inventor is no longer available?

The second way is to classify them in what is commonly called the “business records” exception to hearsay (Rule 803). This exception allows business materials, e.g., lab notebooks, to be admitted as evidence if a proper foundation is established to prove their authenticity. This consists of five elements:

  • Records must be kept in the ordinary course of the business.
  • The particular record at issue must be one regularly kept.
  • The record must be made by or from a knowledgeable source.
  • The record must be made contemporaneously.
  • The record must be accompanied by testimony from a custodian.

An often-overlooked consideration is the custodian. A custodian is an individual who, in the usual course of a company’s operation, is responsible for the management of business records. A custodian can be called into court to attest to the company’s policies and procedures for documenting and securing the records desired to be admitted. They will be asked to substantiate that procedures are generally followed and the records in question adhere to those standards. Without a designated custodian, there is a risk of non-admittance of company records in a court case.

Best Practices for Laboratory Notebook Policies and Procedures

An ELN can be a big improvement over paper for IP protection, in that it helps to enforce established policies and core capabilities such as time stamping and audit trails. The organized and searchable records that an ELN facilitates can also better prove diligence and reduction to practice. If a company does not have good notebook policies in place to facilitate custodian of record, however, an ELN is no better than paper for securing IP. Good practices always trump technology. Once the necessary foundational notebook policies are in place to protect IP, an ELN can be leveraged to further improve the current state.

Some best practice ELN policies include:

Recording Experiments

Notebook records can be vital in winning or losing a patent battle, and all records must be incontrovertible. Scientists need to create an accurate, complete and organized record of their work. Laboratory notebooks should include a clear and detailed description of:

  • The experiment’s purpose
  • How and when the experiment was performed
  • What materials and reagents were used (e.g., sources, lot numbers, animal numbers for antisera, etc.).
  • What results were obtained
  • Citations for publications referenced in notebook entries
  • All software used for data analysis.
  • Where related computer files, samples, slides, or photos are stored
  • Interactions with collaborators, including discussions, data, and exchanges of materials
  • Any other information that you or another scientist in your field would need to precisely replicate your experiments

Signing and Witnessing Electronic Records

  • All experiments should be signed and dated.
  • Joint work should be signed by all the contributors. The text should set forth who is responsible for each part.
  • Entries should be witnessed (including the date of the witnessing) in a timely manner.
  • Entries for crucial or pivotal experiments should be witnessed immediately upon completion.
  • All other entries should be dated within a reasonable period after completion. To be a proper witness, a person must be able to understand the entries they are witnessing.
  • Any entry which relates to a possibly patentable invention should be signed and dated by two witnesses — with their signatures under the caption, “disclosed to and understood by….”
  • A possible co-inventor or other person who was directly involved in the described experiments IS NOT a valid witness.

Archiving Electronic Records

Like paper records, electronic records must be immortalized and witnessed:

  • Electronic entries should be regularly (preferably daily, or at least weekly) immortalized in an unalterable form (e.g., read only disk or tape, electronic archive).
  • Each unalterable form should be marked with an identification of the software (including version number) needed to read the data contained therein.
  • The creator of the unalterable form should sign and date the disk or tape or record.
  • The unalterable form should be signed and dated by two witnesses other than the primary creator.

Security

Additionally, proper governance of signed, dated, witnessed, archived unalterable experiment records is vital to prevent potential internal theft of sensitive IP. An appropriate balance needs to be set in terms of who can access stored experimental data – data sharing between scientists needs to be enabled to prevent unnecessary duplication of work, but access must not be so wide that unnecessary opportunities are provided to unscrupulous employees to download and depart with confidential data.

Conclusion

The emergence of ELN technology has helped countless organizations organize and retain research knowledge in ways that support IP protection. A quality ELN, however, is only one element of a successful strategy to protect your IP – poor documentation and inconsistent business processes can drive up risk regardless of the technology chosen.

Companies that are considering an ELN for IP protection must focus first on establishing a proper foundation for IP management with good notebook policies and business practices. Once this is in place, an ELN should be selected that best supports the company's IP protection needs as well as its other requirements. Finally, the ELN should be configured/customized to support and enforce the business practices and notebook policies. A quality informatics consultant can support you with the business process analysis and technology selection, development and implementation necessary to optimize your IP protection strategy.

Best Practices for Implementing Informatics Systems for R&D Collaborations
https://astrixinc.com/blog/lab-informatics/best-practices-for-implementing-informatics-systems-for-rd-collaborations/
Mon, 15 Jul 2019

In today’s global economy, scientific organizations in many different industries are turning to collaboration with external partners to fuel their R&D pipelines with flexible networks of researchers. These external collaborations can take many forms – research institutes, industry and academic partners, contract research organizations (CROs), contract development and manufacturing organizations (CDMOs), external testing laboratories, consortia, etc.

Many organizations combine numerous partners in diverse ways across multiple research projects. Even in simpler models, any collaboration with an external partner is typically not static, but evolves over time. Therefore, sponsoring organizations are often changing the business processes around the collaboration frequently and rapidly.

While external collaboration can provide many benefits including improved flexibility, enhanced innovation, and reduced time-to-market, externalized R&D activity introduces unique data and IT challenges. Some of these include:

  • Synchronization of partner and in-house data across different transactional systems
  • Maintaining secure and appropriately partitioned data
  • Harmonizing master data to facilitate high quality data flow
  • Developing appropriate long-term plans for data management, including potential data repatriation in an efficient manner
  • Protecting intellectual property (IP) and managing joint IP

These challenges can result in additional costs and limit the benefits of external collaborations. At the least, they introduce risks for sponsoring organizations. All too often, unfortunately, these data and information management aspects of a collaboration are not fully considered until problems arise. In this blog, we discuss key characteristics of informatics systems for collaborations, along with best practices for implementing a collaboration platform.

Approaches to R&D Collaboration Data Management

Nearly all R&D informatics systems are designed and implemented only to meet internal R&D requirements. Also, organic growth of internal R&D activities often leads to a tangled web of processes and systems with significant assumptions incorporated into the ecosystem. These latent aspects of systems frequently become impactful when considering the flow of data outside of the R&D organization, and how to open internal systems and/or their data to external collaborators. Some examples of system characteristics important in collaborative data flows are:

  • User identity and access management processes and technology
  • Data access control models
  • Processes that require multiple systems with human-only integrations (“sneakernets”)

Sometimes these limitations make it infeasible to use the internal systems and processes with an external collaborator. Although it may seem more efficient to design our systems with external collaborations in mind, the reality of delivering informatics capabilities on budget and in time almost always means this does not happen. When faced with supporting external collaborations, this leaves the following choices:

  • Use the collaborator’s system. If the collaborator is in the business of collaborations, they are potentially more likely to have systems that would meet the challenges above.
  • Transfer data in email attachments. This is the lowest-common-denominator approach and unfortunately tends to be the status quo.
  • Implement a new informatics capability.

There are important sub-aspects to the implementation of a new capability. First is the relationship to the existing system(s). If the current system is meeting requirements and is only insufficient for use in a collaboration, then considering how the system might be extended is an appropriate course of action. If the current system is lacking, or there’s a likelihood of long-term multiple collaborations, then a strategic assessment with the development of a roadmap to a solution architecture that meets all needs is essential.

If either a significant extension of an existing system or a new system is needed, then a cloud-first solution architecture should be considered. Cloud-first systems have several distinct qualities that make them a logical choice to meet the needs for R&D collaboration data management. Specifically, these qualities are:

  • Configurable by intent
  • Based on a tenant model for data and configuration
  • Built for automated deployment

Key Characteristics of a Cloud Collaboration Platform

Some important characteristics of potential candidates for a cloud-based R&D collaboration data management solution are:

Configuration. An ideal platform is highly configurable, allowing organizations to define sites, projects and user roles, along with the access permissions for each (a minimal configuration sketch follows the list below). The user authorization mechanism should be able to incorporate company-specific identity and directory systems for ease of use by scientists, rather than requiring separate identity and password management for the collaboration system. The platform should also support a range of core R&D capabilities, potentially including some of the functionality of:

  • A flexible, multi-disciplinary ELN
  • A portal that allows scientists to share reports, protocols, and other documents
  • Data capture, analysis and visualization capabilities
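
The sketch below shows one way such site/project/role configuration could be expressed and checked. The structure, role names and permission labels are illustrative assumptions, not any vendor's configuration schema.

    # Illustrative configuration for a single collaboration environment: sites, projects
    # and user roles, each role with an explicit set of access permissions.
    COLLABORATION_CONFIG = {
        "environment": "collab-partner-a",
        "sites": ["sponsor-hq", "partner-a-lab"],
        "projects": {
            "oncology-screen": {
                "roles": {
                    "sponsor-scientist": {"read", "write", "sign"},
                    "partner-scientist": {"read", "write"},
                    "partner-reviewer": {"read"},
                },
            },
        },
    }

    def can(user_role, action, project, config=COLLABORATION_CONFIG):
        """Return True if the role is allowed to perform the action on the project."""
        roles = config["projects"].get(project, {}).get("roles", {})
        return action in roles.get(user_role, set())

    # can("partner-reviewer", "write", "oncology-screen")  -> False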

Deployment. An effective cloud-based platform allows quick creation of separate collaboration environments for use with specific partners. Each environment should represent a secure data partition. Data from an environment should be extractable and mergeable into other environments.

Security. Cloud providers should have ISO accreditation for their systems, technology, processes, and data centers. Data should be encrypted at rest and during transit using well-defined current best practice encryption techniques. The collaboration platform data architecture should have strong isolation across tenants and include logging of all system access and use.

Integration. The cloud platform should have a complete and robust programming interface (API) for integration with the internal systems of either organization in the collaboration. The platform must support bi-directional integration and data syncing between on-premise systems and cloud applications.
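
A minimal sketch of such bi-directional syncing is shown below, assuming both the cloud platform and the internal system expose a "modified since" query and an upsert (PUT) endpoint. All URLs, routes and field names are hypothetical, used only to illustrate the pattern.

    import json
    import urllib.parse
    import urllib.request

    # Hypothetical endpoints: both the cloud collaboration platform and an internal ELN
    # are assumed to expose a "modified since" query and an upsert (PUT) call.
    CLOUD_API = "https://collab.example.com/api/v1"
    INTERNAL_API = "https://eln.internal.example.com/api/v1"

    def fetch_changes(base_url, since_iso):
        """Return records modified after the given ISO-8601 timestamp."""
        query = urllib.parse.urlencode({"modified_since": since_iso})
        with urllib.request.urlopen(f"{base_url}/records?{query}") as resp:
            return json.load(resp)

    def upsert(base_url, record):
        """Create or update a record on the other side of the sync."""
        req = urllib.request.Request(
            f"{base_url}/records/{record['id']}",
            data=json.dumps(record).encode("utf-8"),
            headers={"Content-Type": "application/json"},
            method="PUT",
        )
        urllib.request.urlopen(req)

    def sync(last_sync_iso):
        """One bi-directional pass: push internal changes up, pull cloud changes down."""
        for record in fetch_changes(INTERNAL_API, last_sync_iso):
            upsert(CLOUD_API, record)
        for record in fetch_changes(CLOUD_API, last_sync_iso):
            upsert(INTERNAL_API, record)

    # sync("2019-07-01T00:00:00Z")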

Best Practices for Implementing a Cloud Collaboration Platform

There are several best practices that should be followed to successfully implement an effective cloud collaboration platform. These include:

Strategic Planning. One of the most important steps in successfully implementing a cloud collaboration platform is the planning necessary to ensure project success. Towards this end, the first steps in the project should include a thorough workflow and business analysis in order to develop the optimized future-state requirements that guide the technology selection process. In addition, an end-state solution architecture should be developed, along with a strategic roadmap to deployment. Good strategic planning helps ensure the deployment effectively and efficiently meets business and technical needs.

Change Management. It is important to carefully consider the cultural impact, employee training, and new policies that are necessary to ensure the success of the new collaborative environment. Since collaborative R&D informatics systems by definition involve employees of multiple organizations, attention to change management for these systems is of paramount importance.

Efficient Testing. Bandwidth requirements for cloud computing are significant, and load and volume testing are important to ensure that the system performs acceptably. Waiting until late in the project to discover that your system cannot handle the data transport requirements causes unnecessary scrambling to meet implementation goals.

Effective Validation. Since some vendors claim prevalidation for their cloud-based software, it is important to understand exactly what scope of Installation/Operational/Performance Qualification this covers. Compliance requirements mandate validation in the user's environment, so prevalidation does not suffice to fully satisfy regulations. Working with the vendor to clearly establish individual and joint responsibilities for validation prevents unnecessary duplication and establishes an overall credible approach.

A Detailed SLA. Working with the vendor to create a detailed SLA is one of the most important things you can do to ensure a successful implementation. Without a well-written SLA, your organization could be in for many unpleasant surprises and additional expenses down the road. In addition to system change management processes and requirements to maintain compliance, an important and often overlooked aspect of the SLA is data storage, including controls of underlying data replication related to availability and disaster recovery.

Conclusion

In today's increasingly collaborative R&D landscape, creating and managing informatics systems that help scientists handle, analyze and share information is critical for organizations to enhance innovation and remain competitive. Cloud-based collaborative platforms can provide a secure, scalable and flexible approach for handling the wide array of data types, sources and partnerships involved in modern collaborative research. These systems allow organizations to spin up robust collaborative environments easily with minimal IT support.

When implemented properly, cloud-based research informatics systems as a complement to R&D collaborations can provide important benefits to your organization:

  • Effective use of data produced from the collaboration
  • Increased scientist productivity
  • Enhanced organizational flexibility and agility
  • Reduced IT costs per user

There are attractive benefits to a cloud-based collaboration research informatics system, but implementation of the platform can be a difficult endeavor that requires much skill and planning to execute successfully. The project team should follow a proven, comprehensive methodology in order to ensure that the implementation provides significant business value for your organization.

Astrix Technology Group has over two decades of experience in the laboratory informatics domain. Our professionals bring together the technical, strategic and content knowledge necessary to help you efficiently select, configure, implement and validate a cloud collaboration platform that best meets your needs and keeps your total cost of ownership low. Whether your deployment utilizes public, private, or a hybrid cloud architecture, our experienced and skilled professionals can make the process of implementing or migrating to a cloud collaboration platform far more cost effective and efficient. Contact us today for more information on leveraging the cloud to improve agility, reduce cost and advance collaboration when working on new scientific discoveries and technological innovation.
