LIMS Master Data Archives – Astrix
https://astrixinc.com/category/blog/lims-implementation/lims-master-data/

LIMS Master Data Best Practices Part 5 – Mergers & Acquisitions
https://astrixinc.com/blog/lims-implementation/lims-master-data/lims-master-data-best-practices-part-5-mergers-acquisitions/
Wed, 29 Jan 2020 13:43:44 +0000
A recent study by JP Morgan determined that the value of global merger and acquisition (M&A) deals in 2018 was 4.1 trillion dollars with a total deal count of 2,342. M&As are common in several industries such as pharmaceutical, biotech, food and beverage, oil and gas, and others, but the pharmaceutical industry likely sees more M&A activity than any other industry, both in terms of the number of deals and the amount of money spent on the acquisitions.

Reasons for M&As are numerous and diverse and include:

  • Improve profitability
  • Increase efficiency
  • Automate manual processes
  • Shorten time to value
  • Enhance business intelligence
  • Acquire R&D data
  • Expand product portfolio
  • Maximize growth
  • Innovate core business models
  • Mitigate technology disruption
  • Implement game-changing strategic moves

While the potential benefits of an M&A are compelling, there are also potential pitfalls lurking that can swallow large amounts of both money and resources. One of these potential pitfalls involves the problems that can occur when attempting to merge two different companies’ data infrastructures.

Integrating and consolidating the master data in disparate enterprise systems is one of the most critical, yet costly and time-consuming, challenges that need to be met in an M&A. In part 5 of our LIMS master data best practices series, we will discuss best practices that can help guide the strategy for consolidating and managing LIMS master data in mergers and acquisitions.

Master Data Best Practices for M&As

Any organization undergoing an M&A will be significantly increasing IT infrastructure and the amount of master data that needs to be managed and maintained, as well as the cost of doing so. As such, each application should be analyzed to determine how it aligns with the company’s future-state vision and brings value to the organization over its lifetime in a process known as application portfolio rationalization.

There is also a strong need to aggregate and consolidate data to provide post-merger operational efficiency, as well as quick wins in the short term. Merging and harmonizing disparate LIMS and their data into a single functional operating environment is not a simple task, and it can put enormous strain on a company’s IT department if not planned and executed effectively. Even if you don’t need to merge two separate LIMS into one, master data in your LIMS will need to be adjusted to accommodate new and/or altered workflows. Having a scalable master data plan in place, as we discussed in part 3 of our LIMS Master Data Best Practices series, can help to facilitate this process.

Effective master data management (MDM) during an M&A is an important enabler of everything from business continuity to post-merger innovation. Some of the key aspects of a successful master data management (MDM) strategy for LIMS master data include:

Conduct an Audit of Systems and Data. When a company integrates an acquisition or engages in a merger, the sooner the data integration team is involved, the smoother the integration is likely to be. The first thing that should be done by IT during an M&A is to conduct a full system and data inventory in order to understand and document the current data landscape. Some data challenges to consider and document include:

  • Data may be captured, managed and maintained differently
  • Data standards may be different
  • Data processes, procedures and methods may be different
  • Data quality may be different
  • Data strategies may be different

An inventory of this nature can be difficult to accomplish if either organization lacks good IT and data governance/documentation. This is why it is important to have change control procedures in place. A good place to start is to identify business, subject matter and data experts across both organizations and form a data team to research and determine what documentation is available for this initial phase. A few key questions to ask when trying to document the current data landscape include:

  • Where is the master data currently located (systems, apps and files)?
  • How does data traverse these different systems?
  • Who owns data?
  • Who manages data?

If a change control procedure (see part 4 of our LIMS Master Data Best Practices series) is in place that includes a data migration plan, the master data plan, and how to conduct reviews and audits, these questions will be easy to answer.
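To make the inventory concrete, here is a minimal sketch (in Python, with entirely hypothetical system names, owners, and downstream connections) of how a data team might record the answers to these questions so they can be queried later:

```python
from dataclasses import dataclass, field

@dataclass
class InventoryEntry:
    """One row of the system-and-data inventory compiled during the audit."""
    system: str                      # e.g. "LIMS-SiteA" (hypothetical)
    location: str                    # where the master data lives
    owner: str                       # business owner of the data
    steward: str                     # who manages day-to-day changes
    feeds_into: list = field(default_factory=list)  # downstream systems

inventory = [
    InventoryEntry("LIMS-SiteA", "datacenter-east", "QC Manager",
                   "LIMS admin", feeds_into=["ERP", "CDS"]),
    InventoryEntry("LIMS-SiteB", "on-prem lab server", "QA Director",
                   "Site IT", feeds_into=["ERP"]),
]

# A first harmonization question: which downstream systems do both sites feed?
shared = set(inventory[0].feeds_into) & set(inventory[1].feeds_into)
print(sorted(shared))  # -> ['ERP']
```

Even a flat list like this answers the "where is the data, who owns it, and what consumes it" questions at a glance, and it can grow into a formal catalog later.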

Do Strategic Planning. Once the current data landscape is documented, the next step is mapping out the future state workflows that will determine your master data configuration. Laboratory workflows utilize LIMS master data, and an M&A means workflows will likely need to be altered and new workflows added into the system. The first step in the process is for business analysts to conduct a series of interviews with business stakeholders to document the current state of laboratory business processes, technology and IT architecture in both organizations. In addition, analysts should discuss the merger at a high level with the organization’s management team to understand the goals, aspirations and objectives of the desired future state of the laboratory.

Once the current state is fully documented, the project team will work to create a future state model by defining the goals, workflows and requirements of the desired future state. Whether you choose to harmonize and standardize laboratory operations on a new LIMS or on a legacy system, the technical, business and user requirements from the strategic planning phase are used to guide the technology selection, implementation and integration process.

Map Out the Data Structure. With future state workflows established, the next steps are determining the data fields to be entered into the LIMS, establishing naming conventions, and mapping out the data structure. We covered this in detail in part 2 of our LIMS Master Data Best Practices series. In addition, if the Master Data Plan we discussed in part 1 of our series exists for the LIMS that will be used in the new operating environment, the process of mapping out the new data structure will be much easier. Of course, this Master Data Plan will need to be updated to reflect the new operating environment. Once the data structure is mapped, configuration of the LIMS can begin.

Standardize and Migrate Master Data. Master data from the acquired company may need to be standardized before being migrated into the LIMS that will be used for the new unified operating environment. Data migration for a project of this nature can be a significant challenge. The Data Migration Plan we discussed in part 4 of our series should guide this process. Master data from the acquired company will likely need to be extracted, translated and loaded into a new location. Questions to ask that help to determine your data migration/management strategy include: How are we going to get master data out of the acquired company’s systems and into the LIMS? How are we going to harmonize data across multiple sites?
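As a rough illustration of the extract, translate, and load step described above (the field names, unit mappings, and naming conventions below are invented for the example, not taken from any particular LIMS):

```python
def transform(record: dict) -> dict:
    """Translate one acquired-company record into the target LIMS conventions.
    Field names and mappings here are purely illustrative."""
    unit_map = {"deg C": "°C", "deg F": "°F"}  # harmonize unit spellings
    return {
        "assay_name": record["test_id"].upper().replace(" ", "-"),
        "unit": unit_map.get(record["unit"], record["unit"]),
        "site": record.get("site", "UNKNOWN"),  # flag records missing a site
    }

# Extract: rows pulled from the acquired company's system
extracted = [{"test_id": "line 1 ph", "unit": "pH"}]

# Transform, then the result would be loaded into the target LIMS
loaded = [transform(r) for r in extracted]
print(loaded)  # -> [{'assay_name': 'LINE-1-PH', 'unit': 'pH', 'site': 'UNKNOWN'}]
```

Keeping the translation rules in one reviewable function (rather than scattered manual edits) makes the harmonization decisions explicit and repeatable across sites.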

Data management/migration for an M&A is typically a big job. It is therefore important to formulate a Master Data Management/Migration Strategy before any migration or LIMS configuration has happened, in order to avoid significant time and cost overruns as the project proceeds and to minimize disruption to your lab operations during migration activities.

Once the data has been migrated, regulatory requirements mandate that it must be validated to make sure it is accurate and has been transferred properly. This validation should be guided by your Change Control procedures. Even if your company is not in a regulated industry, validating migrated data is important to make sure your data is sound.
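A programmatic spot-check can supplement (never replace) the formal, documented validation required in regulated environments. One common approach, sketched here with made-up records, is to compare record counts and field-order-independent checksums between source and target:

```python
import hashlib

def fingerprint(record: dict) -> str:
    """Field-order-independent checksum of one master data record."""
    canonical = "|".join(f"{k}={record[k]}" for k in sorted(record))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

# Invented records: the same two assays exported from source and target systems
source = [{"assay": "SF03-L1PH", "unit": "pH"}, {"assay": "SF03-L2PH", "unit": "pH"}]
target = [{"unit": "pH", "assay": "SF03-L1PH"}, {"assay": "SF03-L2PH", "unit": "pH"}]

assert len(source) == len(target), "record count mismatch"
missing = {fingerprint(r) for r in source} - {fingerprint(r) for r in target}
print(len(missing))  # -> 0 (every source record arrived intact)
```

Because the checksum sorts field names first, it tolerates differences in column order between the two exports while still catching any changed or dropped value.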

Conclusion

Mergers and acquisitions can help drive your competitive advantage, but they can also paralyze a business and generate significant unexpected costs that can serve to minimize the value proposition of the merger. Following the best practice recommendations described in this blog helps to establish a single version of the truth in your LIMS, something that is critical to ensuring your laboratory maintains operational efficiency, data integrity, and regulatory compliance. Effective LIMS master data management in an M&A is also important to ensure your laboratory continues to produce the valuable business intelligence that drives innovation and competitive advantage for your organization.

Astrix is a laboratory informatics consulting firm that has been serving the scientific community since 1995. Our experienced professionals help implement innovative solutions that allow organizations to turn data into knowledge, increase organizational efficiency, improve quality and facilitate regulatory compliance. If you have any questions about our service offerings, or if you would like to have an initial, no-obligation consultation with an Astrix informatics expert to discuss your master data strategy or LIMS implementation project, don’t hesitate to contact us.

About Astrix Technology Group

Scientific resources and technology solutions delivered on demand

Astrix Technology Group is an informatics consulting, professional services and staffing company dedicated to serving the scientific community for over 20 years. We shape our clients’ future, combining deep scientific insight with an understanding of how technology and people will impact the scientific industries. Our focus on value-engineered solutions, on-demand resource and domain requirements, and flexible and scalable operating and business models helps our clients find future value and growth in scientific domains. Whether focused on strategies for laboratories, IT or staffing, Astrix has the people, skills and experience to effectively shape client value. We offer highly objective points of view on Enterprise Informatics, Laboratory Operations, Healthcare IT and Scientific Staffing with an emphasis on business and technology, leveraging our deep industry experience.

For More Information

For more information, contact Michael Zachowski, Vice President at Astrix Technology Group, at mzachowski@astrixinc.com.

LIMS Master Data Best Practices Part 4 – Quality Control
https://astrixinc.com/blog/lims-implementation/lims-master-data/lims-master-data-best-practices-part-4-quality-control/
Mon, 13 Jan 2020 20:53:25 +0000
So far in our LIMS master data best practices series, we have discussed how to define master data and create a Master Data Plan, how to effectively extrapolate master data from current records to configure your system, and how to configure your master data so it will be easy to maintain and scale as your organization grows and the system matures.

The Master Data Plan, along with other documents we have discussed in previous blogs in this series, are part of an overall quality control process. High quality data is:

  • Valid – The data follows the current rules established for its type
  • Unique – The data is not duplicated or redundant elsewhere in the system
  • Consistent – The data is the same across all environments and systems where it is used
  • Timely – The data represents only what is required at the present point in time
  • Accurate – The data is free of errors and representative of reality
  • Complete – All values required for use by customers and system users are available in the system

Any changes to master data within a LIMS (or any other GxP-governed computerized system) should follow a formal Change Management SOP that defines the procedures for maintaining master data validity. A good change management procedure will cover all of the above-listed aspects of data quality to ensure master data quality is maintained and the data remains fit for its intended use.

A good Change Management SOP will be risk-based, meaning that changes with potential for higher impact will have a more extensive procedure than changes with lower impact. The SOP should have instructions on how to assess risk, how to handle objects of each risk class, and explain what triggers initiation of the change control process. Let’s look at some of the key information that should be included in your Change Management SOP in more detail.

Routine Change Control

Routine change control is the process defined for low and medium risk changes made to the LIMS. Routine change control is triggered on an as needed or recurring basis depending on requirement changes or a need for system maintenance. By defining both the process and what triggers the process, your SOP helps ensure the master data in the LIMS is an accurate representation of the current business requirements at that point in time. Key processes for updating master data that should be formalized in your SOP include:

Update all data impacted by the change. When making changes to your LIMS, be sure to consider all fields in the system that will be affected. Adding instructions to the SOP to check the LIMS Master Data Plan will help to ensure all data impacted by the change is updated at the same time. When it is not, the result is a cycle of continuous changes that cost time and effort.

Be sure to update all environments and systems impacted by the change. Most LIMS systems have a secondary environment where major changes can be developed without affecting the production system. When no major changes are needed, these environments get out of sync if they are not updated regularly. When the environment needs to be used, it often takes time to update before any work can take place. Likewise, if a change is made to a system that the LIMS uses as a reference (such as adding a new laboratory instrument), and the other system is not updated as well, data consistency and accuracy are lost. During any change, it is important to include a check of these other environments and systems.

Avoid data duplication. Another issue to be aware of is that duplicates (or similar data) are often created when a deactivated field is added as a new item instead of re-activating the original field, or when a new list item is added in uppercase when a lowercase version already exists in the database. If a new workflow or product is added, fields are created specific to the product or workflow that can be very similar to existing fields. Through these seemingly minor changes data uniqueness is lost over time.
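A simple normalization step can catch many of these case and whitespace near-duplicates before a new item is added. This sketch uses hypothetical item names:

```python
def normalize(name: str) -> str:
    """Case- and whitespace-insensitive key used to spot near-duplicate entries."""
    return " ".join(name.strip().lower().split())

existing = ["pH Meter", "Conductivity", "TOC Analyzer"]     # items already in the LIMS
candidates = ["PH METER", "Viscosity", " toc  analyzer "]   # proposed new items

existing_keys = {normalize(n) for n in existing}
duplicates = [c for c in candidates if normalize(c) in existing_keys]
print(duplicates)  # -> ['PH METER', ' toc  analyzer ']
```

Running a check like this as part of the change control workflow preserves data uniqueness without relying on each administrator remembering every existing entry.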

Perform periodic review and audits. One of the best ways to keep data consistent and unique is to perform periodic reviews and audits. Change Management SOPs should include how to conduct reviews and audits, a risk-based schedule of how often data should be reviewed, and templates to perform the review. A few suggestions of what to review are:

  • User Accounts: For companies that must comply with 21 CFR Part 11 and EU Annex 11 regulations, user accounts must be audited on a regular basis. Even for companies that are not under these regulations, auditing user accounts is highly recommended to ensure your data is secure.
  • Sample points and sample schedules: Because these are activated and deactivated often, it is easy to forget when they have been deactivated for long periods of time. Those that are no longer used should be moved to an archived or decommissioned status.
  • Instruments and equipment: Even when instruments and equipment are standalone systems, they are often held in lists in other systems such as a LIMS. Those that can interface with a LIMS or other system, such as HPLC or MS instruments, are often modular, and the modules can be changed out. These should be reviewed to verify the current status and to decommission or archive anything that is no longer in use.
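As a sketch of the kind of check a periodic review might automate (the items, dates, and one-year threshold below are illustrative, not a regulatory requirement), a script can flag anything deactivated longer than an agreed retention window:

```python
from datetime import date, timedelta

# Hypothetical master data items with their deactivation dates (None = active)
items = [
    {"name": "HPLC-01", "deactivated_on": None},
    {"name": "SamplePoint-WTR-7", "deactivated_on": date(2019, 1, 15)},
    {"name": "MS-OLD-2", "deactivated_on": date(2019, 11, 1)},
]

def stale_items(items, as_of, max_age_days=365):
    """Items deactivated longer than max_age_days: candidates for archiving."""
    cutoff = as_of - timedelta(days=max_age_days)
    return [i["name"] for i in items
            if i["deactivated_on"] is not None and i["deactivated_on"] < cutoff]

print(stale_items(items, as_of=date(2020, 6, 1)))  # -> ['SamplePoint-WTR-7']
```

The review template in the SOP can then reference a report like this, so each audit starts from the same objective list rather than from memory.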

High Risk Changes

High risk changes to a LIMS are often triggered by the need for a data migration or an upgrade to a system. Processes that should be formalized in your Change Management SOP when updating master data for high risk changes include:

Define the extended process. For a high-risk change, the Change Control procedure should include pre-approval of the change, testing and validation of new data or processes, and post-approval. For data migration, a Data Migration Plan should be in place to manage the change. For system updates, design and development documentation and code review should be included if any programming or custom coding is involved.

If a development phase for the change is required, a snapshot of the current system should be used. This freezes data so developers can create changes without adding the complexity of routine updates. This is also a good time to create or update the Master Data Plan for all systems involved. Using the periodic review and auditing procedures and templates can help with this process.

Utilize a current snapshot of the system for validation activities. Development and testing can often last months and even years depending on the nature of the change. During this time, production data is growing and routine change control is still occurring, so data accuracy and consistency between the production and development environments are lost. When it comes time to validate new data or processes, the results may not be accurate because the system will have changed.

To combat this problem, a current snapshot of the system should be used for validation. If a validation environment is available, this means migrating the current master data from the production environment to the validation environment just before validation activities take place. If a validation environment is not available, current master data should be migrated to the development environment. This ensures data used is accurate, timely, and consistent with the current system state. It also reduces the risk of errors during the validation phase.

Once validation is complete, the change may be moved to the production environment. Post approval activities should include a verification check to ensure changes are complete before releasing the system to users.

Conclusion

A change management procedure is the key to ensuring master data quality is maintained. This procedure should be risk-based to provide appropriate instructions and triggers to identify and manage each level of change. Routine change management procedures will describe how to keep data accurate and timely and ensure data updates are valid and complete. Periodic reviews and audits as well as the Master Data Plan should be included to ensure data stays consistent and unique. High-risk changes must include how to deal with all aspects of data quality through the different phases of the change.

Be sure to tune in for part 5 of our Master Data Blog Series, where we will discuss important considerations for master data management in mergers and acquisitions.


LIMS Master Data Best Practices Part 3: Maintenance and Scalability
https://astrixinc.com/blog/lims-implementation/lims-master-data/lims-master-data-best-practices-part-3-maintenance-and-scalability/
Fri, 20 Dec 2019 15:11:05 +0000
Master data design has very important impacts over the lifecycle of a LIMS, as nearly every piece of functionality in the system revolves around the design of the master data. One of the most important aspects to any LIMS implementation is designing the master data so that it is easy to maintain and scale as the organization grows and business needs change. Some of the key benefits of configuring your master data to be maintainable and scalable include:

  • Easier to add and/or modify master data down the road
  • Increased system efficiency and reliability
  • Future system enhancements are less resource intensive
  • Better management for large volumes of data
  • Increased user acceptance
  • Increased ROI

In short, focusing on maintainability and scalability when configuring your master data really helps improve the lifespan and usability of your LIMS. In this blog, we will provide some best practice tips on how to set up master data so it will be easy to maintain and scale as your organization grows and the system matures.

Configuring Master Data for Maintainability

Once a LIMS system is configured, users often must live with the master data rules set during configuration. While change control can be used to update the configuration, sometimes the process is so cumbersome that it isn’t worth the effort, or the changes are so numerous that the system never reaches a finalized state. The tips in this section are ones we have found most helpful during the configuration process for making master data easier for users to create and update as the system evolves.

Put System reserved words and System specific rules where they can be found. Every LIMS system has reserved keywords that can’t be used as a data name. These keywords are usually buried in the installation documentation and are hard to find. In addition, every LIMS system also has different conventions for using special characters or uppercase and lowercase letters. Instead of just being part of the “tribal knowledge” passed down through system administrators, these reserved words and specific rules should be included in the master data document where they will be easy to find long after the initial installation has been completed.

Create naming conventions based on business structure. Use naming codes that refer to your business processes or structure such as site, product specification or variation, lab type, or manufacturing line. This creates a uniform system that will change less often and reduces the overall amount of static data.

For example: Let’s say you have a multi-site deployment of a LIMS system and need to perform a pH check of manufacturing lines:

Good assay name: SF03-L1PH (San Francisco site Building 3 – Line 1 pH)

Bad assay name:  LINE3PH (line 3 pH)

If the specification for the product changes, the good assay name will tell you which assay needs to be updated. With the bad assay name, there is no way to tell.
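A naming convention like this can also be enforced mechanically. The pattern below encodes the hypothetical "site, building, line, test" convention from the example above; a real deployment would adapt it to its own business structure:

```python
import re

# <site: 2 letters><building: 2 digits>-L<line number><test code>
# e.g. "SF03-L1PH" = San Francisco, Building 3, Line 1, pH test
ASSAY_NAME = re.compile(
    r"^(?P<site>[A-Z]{2})(?P<building>\d{2})-L(?P<line>\d+)(?P<test>[A-Z]+)$"
)

def parse_assay_name(name: str):
    """Return the name's components, or None if it breaks the convention."""
    m = ASSAY_NAME.match(name)
    return m.groupdict() if m else None

print(parse_assay_name("SF03-L1PH"))
# -> {'site': 'SF', 'building': '03', 'line': '1', 'test': 'PH'}
print(parse_assay_name("LINE3PH"))  # the "bad" name above fails the check
# -> None
```

A validator like this can run whenever new master data is proposed, so non-conforming names are rejected before they ever reach the production system.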

Use references in naming conventions. It is best to avoid naming master data after a previous LIMS system or a specific instrument name. In these cases, once the system or instrument no longer exists, the name no longer has any meaning. Instead, add a field or lookup table to use as a reference.

Good assay name: Chem-8260-GCMS (Chemistry dept – EPA method number – instrument type reference)

Bad assay name: SL8260-MS123 (Sapphire LIMS method 8260 – Mass spec #123)

Use summary analyses. Where instrument interfaces are being used, such as Empower, analyses should be separated into a raw data analysis and an independent reporting or summary analysis. This setup provides a few key benefits:

  • Data integrity will be maintained, because the method can be locked down. Analysts will still have the ability to manipulate the data in the reporting or summary analysis, but the raw data won’t be manipulated.
  • It provides flexibility for method changes. By having a separate method analysis, you only need to change one method instead of several if an update is needed due to an instrument change or a new instrument.
  • Having a separate raw data analysis frees the instrument from being held while the raw data is being analyzed. Additional replicates can be run, the instrument can be taken offline for maintenance, or another run can be set up.

Try not to tie your metadata to a name – instead use a field. When a defined name is used it must be hard coded into the system. This is time consuming for developers during initial setup. If any changes need to be made after the system is configured, it must be done by a developer and changes must go through change control and possibly re-validation. By creating fields instead, changes can be done in the front end of the system and the change control process is much easier.

Custom tables and fields should all start with a prefix. This separates custom tables and fields from what is pre-defined in your system. A prefix can be used to group related objects based on your master data map. This is very useful for data review or if you have a multi-site deployment strategy. A custom table prefix could be used to designate tables designed for a specific site or business process. Some of the benefits of using a prefix are:

  • If a custom table has a prefix, you don’t need to create a prefix on the fields inside the table.
  • It creates a recognizable and uniform standard that can be re-used.
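A prefix convention is also easy to audit automatically. In this sketch, "X_" is an assumed prefix and the table names are invented; in practice the built-in list would come from the vendor's data dictionary:

```python
CUSTOM_PREFIX = "X_"  # hypothetical site convention for custom tables

tables = ["SAMPLE", "X_STABILITY_PULL", "ANALYSIS", "STABILITY_SCHED"]
builtin = {"SAMPLE", "ANALYSIS"}  # tables shipped with the LIMS

# Custom tables (anything not built in) that are missing the agreed prefix
violations = [t for t in tables
              if t not in builtin and not t.startswith(CUSTOM_PREFIX)]
print(violations)  # -> ['STABILITY_SCHED']
```

Running this during periodic reviews keeps customizations clearly separated from vendor objects, which simplifies upgrades and multi-site comparisons.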

Work with the database administrator to manage growing data. Over time, your data will grow. It’s easy to let lists grow to a size where users must scroll forever to find what they are looking for, or tables to a width where it becomes difficult to see all the columns on the screen. As data grows, work with the system database administrator to create indices or queries to manage it. This helps to maintain system performance.
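As a small illustration of what creating indices to manage growing data can look like, using SQLite as a stand-in for the LIMS database and an invented sample table, an index on the most-filtered column lets the query planner avoid scanning the whole table:

```python
import sqlite3

conn = sqlite3.connect(":memory:")  # stand-in for the LIMS database
conn.execute("CREATE TABLE sample (id INTEGER PRIMARY KEY, status TEXT, site TEXT)")
conn.executemany(
    "INSERT INTO sample (status, site) VALUES (?, ?)",
    [("COMPLETE", "SF03")] * 1000 + [("ACTIVE", "SF03")] * 5,
)

# Index the column users filter by most, so worklists stay fast as data grows
conn.execute("CREATE INDEX idx_sample_status ON sample (status)")

plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM sample WHERE status = 'ACTIVE'"
).fetchone()
print("idx_sample_status" in plan[-1])  # -> True: the planner uses the index
```

The same idea applies to any production database: index the handful of columns that drive everyday lookups, in consultation with the DBA, rather than indexing everything.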

Configuring Master Data for Scalability

The Master Data Plan document will be the guide on how to scale your master data. This document is usually written during the development phase of any software update or data migration project. But often it encompasses only the initial configuration. To make your plan scalable, the Master Data Plan should include instructions on how to deal with data as it grows once the initial configuration is done.

Schedule who and when for all tasks. When putting together the project timeline, adequate time and resources should be given for completing master data tasks. Too often master data is left to the last minute or the time and resources needed are underestimated. Missed entries are rationalized away with the belief that it can be entered as needed. Often this results in a deployment that is never fully completed.

When writing the Master Data Plan for the project, make sure you identify who will be entering data, who will do the testing, and when tasks will be completed. This provides the check that everything is complete for go-live. If data will be entered after go-live, include it in the plan. Then, expand on this baseline to explain the who and when for entering and testing data into the future.

Define how data is transferred into the system (Data Migration Plan). The main aspect of any upgrade or implementation is how to migrate master data from the old system (or no system) into a new one. Your Master Data Plan should include the list of tasks needed to put all the master data into the new system. As the configuration is built, tasks will fall into an order required by the system. For example, when migrating an analysis and its components, the component table may need to be migrated into the system before the analysis table.

As this information is recorded, it becomes the Data Migration Plan. This provides the ability to import or export large amounts of data, so when you have a new product, you can potentially add the master data as groups of tables instead of entering pieces individually. It’s a much faster and cleaner method of adding data that can be verified using scripts instead of manually entering fields one at a time.
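The load-order problem described above (component tables before the analyses that reference them) is a topological sort over table dependencies. This sketch uses Python's standard graphlib module (3.9+) with invented table names:

```python
from graphlib import TopologicalSorter  # Python 3.9+

# Hypothetical table dependencies: each table lists the tables it references,
# which must therefore be loaded first.
depends_on = {
    "component": [],
    "analysis": ["component"],
    "product": ["analysis"],
    "specification": ["product", "analysis"],
}

load_order = list(TopologicalSorter(depends_on).static_order())

# Every table appears after everything it depends on
print(load_order.index("component") < load_order.index("analysis"))      # -> True
print(load_order.index("analysis") < load_order.index("specification"))  # -> True
```

Deriving the order from a dependency map, rather than hard-coding a task list, means the migration plan stays correct as new tables are added to the configuration.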

Configure the data map for growth and provide rationale. The Master Data Plan will include the data map outlining the business units involved, workflows, how workflows relate to each other, and the list of master data fields from each workflow. Instead of leaving the data plan as just a list of the initial setup, include how to manage data as it grows over time. Be sure to include the rationale behind why the plan was configured as it was so it is easy to understand how to expand it in the future.

Consider how master data is created as your business grows. Some questions to consider are:

  • What are the criteria to determine if master data fields are added or not?
  • Will new tables or sub-tables need to be created?
  • How will new workflows be added?
  • How will lists be handled when they get too big?
  • What happens when data is no longer needed?
  • Why are some third-party systems included (e.g., ERP, manufacturing), while others were not (training, document management)?
  • Will instruments, equipment, or another system be incorporated in the future?

Answers to these and similar questions provide a framework for expansion that is easy to understand.

Define the naming convention to be used and the rationale behind it. The rationale for naming conventions should also be included, for the same reasons as the rationale for the configuration. This includes the rules and variations behind corporate and site field names. For a small deployment of only a few labs or sites, there may not be any variations to consider. For a large deployment, however, there could be many site variations. If the naming convention is based on your business structure, the rules can be specific, because the business structure is less likely to change.

Conclusion

When creating master data for a new LIMS, there are many things that should be done to ensure the data is easy to manage and can grow as your system matures. We’ve provided a number of key best practice recommendations in this blog that will help you improve maintainability and scalability in your LIMS when configuring your master data. Following these recommendations will ultimately help you increase the ROI of your LIMS over its full lifespan. Be sure to tune in for part 4 of our Master Data Blog Series, where we will discuss more best practice recommendations for master data quality control and change management.


LIMS Master Data Best Practices Part 2: Extrapolation of Master Data from Your Current Records https://astrixinc.com/blog/lims-implementation/lims-master-data/lims-master-data-best-practices-part-2-extrapolation-of-master-data-from-your-current-records/ Tue, 03 Dec 2019 01:59:50 +0000 http://localhost/astrix/?p=3320 As we mentioned in part 1 of our LIMS Master Data Best […]

As we mentioned in part 1 of our LIMS Master Data Best Practices series, master data is the core information that needs to be in place in the system in order for a LIMS to function as intended. It’s essential for all parts of the organization to agree on both the meaning and usage of master data. As such, effective master data configuration is critical to the success of any LIMS implementation and migration, not to mention your organization as a whole.

Developers configuring master data in a new LIMS will need to determine what master data fields are needed and where to put them in the system. In order to do this, relevant business process workflows and data points must be identified and properly mapped in the system. In this part 2 blog of our Master Data Best Practices series, we will detail best practices for identifying and mapping your LIMS master data, along with an effective methodology to ensure your mapping exercise maximizes business value for your organization.

Configuring Master Data

One of the biggest mistakes companies make during a LIMS implementation or migration project is to try to dump master data into a system after it has already been configured.  In this scenario, time and effort are often lost as team members realize they are missing data during validation. In order to successfully configure master data, best practice is to map out the data structure before beginning the configuration process.

Step 1: Identify Relevant Process Workflows

The first step in mapping out data structure is to map out your business process workflows.

Figure 1:  Example of receiving process workflow

Business processes are the backbone of data generation. At each step of the process, employees gather information that may be reported directly to a customer, an auditor, or to management. Not all of this data will necessarily be entered into the LIMS, but it may nonetheless be needed to support the data that will be. By mapping out business processes and potential data points you will effectively form the “bones” for your master data configuration. Some potential process examples include:

  • Material receiving for incoming components
  • Reagent creation or building block synthesis
  • Environmental monitoring
  • Manufacturing equipment and instrument monitoring or quality checks
  • Laboratory testing
  • Stability and retain testing
  • Quality assurance, batch, or lot release
  • Complaints
  • Investigations

Step 2: Determine the Data Fields to be entered into the LIMS

With the relevant workflows established, you will now want to examine the records kept at each step in the workflow to identify the data fields recorded at the step.

When you think of master data in a LIMS, you may think of just the static data needed to set up analyses, test results, and the limits for those results. But in a modern LIMS, master data is much more complex. Most systems include document management, instrument management, training, and inventory tracking out of the box. This means master data can include your procedure effective dates, version numbers, user training dates, instrument calibration dates, reagent expirations, inventory levels, and much more. Identifying your master data points may therefore involve more than just looking at what comes into and goes out of the laboratory.

When looking for master data fields, you want to examine both paper and electronic records. Some examples of paper records include:

  • Invoices
  • MSDS sheets
  • Reports
  • Bench worksheets
  • Labels
  • Logs
  • Procedures or Work instructions

Some examples of systems containing electronic records include:

  • Databases
  • Invoicing systems
  • Document management systems
  • Electronic notebooks
  • Excel spreadsheets

Even if you don't plan to use some fields immediately, it can be worth recording them, as you may want to add them in the future. Having a record of the information and how it relates to other workflows provides valuable input for future updates.

Figure 2: Example fields from a workflow

Step 3: Establish Naming Conventions and Map out the Data Structure

With the data fields identified, you’ll want to compare these fields across workflows to find common fields and establish naming conventions. Naming conventions need to make sense and be able to scale with your organization as business needs change. Next, you’ll want to create a master data map detailing each workflow that will be represented in the system along with relevant data points (fields). This master data map will provide a number of important benefits:

  • Serves as the guide for the project team to configure the system.
  • Defines requirements for how data is entered and viewed.
  • Helps determine what will be shown on screens and reports.
  • Makes decisions easier to reach, since they can be based on types of data instead of individual data points.
  • Reduces the risk of missing data in your final system.
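As a rough illustration, a master data map can be prototyped as simple structured data before it ever touches the LIMS. The workflows and field names below are hypothetical; the point is that fields shared across workflows, which are the prime candidates for a common naming convention and a single master data definition, become easy to detect:

```python
# A minimal, hypothetical master data map: each workflow lists the fields
# (data points) it produces. None of these names come from a real system.
master_data_map = {
    "material_receiving": ["material_name", "supplier", "lot_number", "received_date"],
    "laboratory_testing": ["material_name", "analysis", "result_limit", "instrument_id"],
    "stability_testing":  ["material_name", "analysis", "protocol_id", "pull_date"],
}

def shared_fields(data_map):
    """Return fields appearing in more than one workflow, sorted by name."""
    counts = {}
    for fields in data_map.values():
        for field in set(fields):
            counts[field] = counts.get(field, 0) + 1
    return sorted(f for f, n in counts.items() if n > 1)
```

In this toy map, `shared_fields` would flag `analysis` and `material_name` as cross-workflow fields that need one agreed definition rather than several local ones.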

Involve Users

Creating the master data map is an excellent opportunity to involve all users in the migration or implementation process. The people who will use the system daily are the best resource for gathering data fields, and they are the ones who have to live with the decisions made when negotiating naming conventions. By involving them in the process, users are introduced to changes earlier and given a voice in the end outcome, which reduces their resistance to change. Involving users in part of the overall design can also reduce training time.

Conclusion

Creating a detailed map of your master data structure is a critical piece of any LIMS implementation project. Creating this map is often a bigger job than anticipated, and it is therefore best practice to conduct this exercise in the beginning stages of your project.

Prior to this mapping exercise, however, it is important to conduct a thorough workflow and business analysis in order to optimize operational workflows in the laboratory. As process workflows must be identified during master data mapping exercises, optimizing your workflows prior to master data configuration helps to ensure your new LIMS will maximize business value for your organization.

LIMS Master Data Best Practices Part 1: Defining the Terms https://astrixinc.com/blog/lims-implementation/lims-master-data/lims-master-data-best-practices-part-1-defining-the-terms/ Fri, 25 Oct 2019 12:50:26 +0000 http://localhost/astrix/?p=3251

Globalization and outsourcing trends, along with technological advancements that have dramatically increased the volume, complexity and variety of data, have created significant data management challenges for modern scientific laboratories. Most laboratories have responded to these challenges by implementing a Laboratory Information Management System (LIMS) that automates business processes and data capture associated with laboratory workflows. With these systems come vast amounts of data, and ensuring you are managing your LIMS master data properly begins with understanding the key terms.

LIMS implementations usually demand a substantial investment of time, money and resources, typically costing hundreds of thousands to millions of dollars and requiring hundreds of person-days to accomplish. Failure of a LIMS project can be a huge waste of time and resources, and a financial disaster for the organization involved. As such, it is critical to get a LIMS implementation right the first time in order to preserve your return on investment.

One important facet of any successful LIMS implementation and/or migration is the design and configuration of master data. In our experience, many companies involved in LIMS implementations tend to focus on software testing and configuration and put off dealing with master data until the end of the project. This is a huge mistake. Master data design and configuration is typically a much bigger job than anticipated and has multimillion-dollar impacts down the road on things like operational efficiency, time to market and LIMS return on investment (ROI).

In an effort to help organizations understand the importance and implications of master data and avoid project delays and cost overruns, we’ve put together a series of articles to highlight LIMS master data best practices. Some of the topics that will be covered in future articles in this series include:

  • Master data configuration pitfalls
  • Extrapolation of master data from your current paper records
  • Master data naming conventions
  • Strategies for handling master data in mergers and acquisitions (M&As)
  • Designing your master data for maintainability and scalability
  • Evolution of master data and change management
  • Master data quality control
  • Master data harmonization

In this part 1 article of our LIMS master data series, we’ll define master data and discuss the importance of developing a master data plan for your LIMS implementation. Without further ado, let’s dive into our LIMS master data series!

What is Master Data?

Master data can be thought of as the information that needs to be in place in the LIMS for users to be able to use the system as intended. Master data is core, top-level, non-transactional, static data that will be stored in disparate systems and shared across the enterprise, and possibly even beyond to external partners. As master data establishes a standard definition for business-critical data, its accuracy is very important, because it collectively represents a common point of reference and "single source of truth" for your organization. As such, everyone across the organization must agree on master data definitions, standards, accuracy, and authority.

Within most LIMS applications, there are two types of data that come into play: static (defined) data and dynamic (transactional) data. Dynamic data is the data users enter into the system as part of their daily activities, such as test results, samples, and batches or lots of a product. Master data is typically the static data that defines the structure of the system.

Master data and dynamic data are connected in the sense that the only way that dynamic data can be created is if master data already exists in the system. For example, in order to record a sample of a product for testing (transactional data) in a LIMS, the product name (master data) must exist in the LIMS so that the sample can be associated with a particular product in the system.
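This dependency can be sketched in a few lines of code. The class below is a toy model, not any real LIMS API: it simply enforces that a sample (dynamic data) can only be logged against a product that already exists as master data:

```python
# Toy model of the master/dynamic data dependency. The class and method
# names are illustrative, not taken from a real LIMS.
class Lims:
    def __init__(self):
        self.products = set()   # master data: defined product names
        self.samples = []       # dynamic data: samples logged by users

    def add_product(self, name: str) -> None:
        """Register a product as master data."""
        self.products.add(name)

    def log_sample(self, product: str, sample_id: str) -> None:
        """Record a sample; fails if no master data exists for the product."""
        if product not in self.products:
            raise KeyError(f"no master data for product: {product}")
        self.samples.append({"product": product, "sample_id": sample_id})
```

The failure mode here mirrors real implementations: transactional work stalls the moment the master data it depends on is missing, which is why master data is loaded and verified first.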

In most LIMS applications, various templates provide the ability to house the master data as lists/tables of values that will be used throughout the system. Master data typically includes core data entities like products, materials, specifications, sample types, analyses, lists, locations, reagents, instruments, environmental monitoring schedules, stability protocol templates and users. That said, universal specifications of master data items are not possible, as different laboratory types and/or LIMS will typically have different objects/entities identified as the master data.

Master data is foundational to business success. Even minor issues with master data can cause significant operational problems, and these problems will only be magnified as the organization scales, or reintroduced anytime new products or facilities are implemented. In order to avoid project delays and cost overruns for a LIMS implementation, it is critical to design and configure the master data properly. Towards this end, every LIMS implementation project should include a comprehensive Master Data Plan to ensure success.

Creating a Master Data Plan

In order to ensure a successful LIMS implementation, it is important to create a well-thought-out Master Data Plan. The plan should include collecting all the master data that needs to be entered, deciding on a testing strategy to verify that the data has been entered accurately, creating a proper naming convention for your master data, and scheduling an appropriate amount of time for entering the data into the system and testing it.

A Master Data Plan is a formal document that identifies the following:

  • The rationale for different aspects of the plan (e.g., why you have a specific naming convention)
  • List of the organization’s master data that needs to be put into the system
  • Schedule for when specific tasks need to be done
  • The place(s) where the master data is created
  • The people who will be doing the work of entering and testing the data. Note that these people need to have appropriate training for the job.
  • How data is transferred into the system (e.g., the data migration plan)
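One lightweight way to keep such a plan honest is to treat its required sections as a checklist that can be verified against a draft. The section names below are assumptions paraphrased from the list above, not a formal standard:

```python
# Hypothetical skeleton of a Master Data Plan, with the sections from the
# checklist above treated as required keys so gaps are caught before sign-off.
REQUIRED_SECTIONS = {
    "rationale",        # why each convention and configuration was chosen
    "master_data_list", # the organization's master data to be entered
    "schedule",         # when specific tasks need to be done
    "data_sources",     # the place(s) where the master data is created
    "personnel",        # trained people doing the entry and testing
    "migration_plan",   # how data is transferred into the system
}

def missing_sections(plan: dict) -> set:
    """Return the required sections a draft plan does not yet cover."""
    return REQUIRED_SECTIONS - plan.keys()
```

A draft covering only, say, the rationale and schedule would immediately surface the remaining gaps instead of leaving them to be discovered during validation.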

One of the most important aspects of the Master Data Plan is determining what data needs to go into the system. This involves assessing both your existing data, to determine what should be classified as master data, and the master data entities available in the LIMS being implemented, to determine which ones you will use and how. Note that this assessment can also be an opportunity to do some housecleaning on your data. For example, you may decide not to add master data for any test older than 5 years.

Another important feature of the Plan should be deciding on a naming convention. Here, it is important to get agreement on master data naming conventions amongst your user base so that they will be able to easily search for the data they need. Additionally, in organizations with multiple sites, using naming conventions that allow users to find their site-specific master data is crucial.

In a regulated environment, testing and documentation of testing may need to be included as part of your validation package. Towards this end, it is important that the person who tests the master data be different from the person who creates it. The person who does the testing must also understand the data they are testing and be trained in both the testing procedure and how the test results need to be documented. In addition, the Master Data Plan should document the procedure for updating master data in the system when necessary.

Conclusion

Your organization’s master data should serve to reduce the cost and time to integrate new facilities and enhance your organization’s flexibility to comply with regulations or enter new markets successfully. Over time, the master data contained in your LIMS will likely expand as your business expands with new products, facilities and regulatory bodies. Efficient master data management (MDM) will thus become critical to your operations. Be sure to tune in for the remaining parts of our master data series, where we will discuss the important best practices necessary to ensure your master data is designed and configured to deliver maximum business value for your organization.
