The ultimate guide for lab planning and scheduling digitalization






The landscape of biopharmaceutical lab operations has evolved drastically in the last decade. Long gone are the days of unconstrained budgets and long R&D projects with flexible deadlines. Pharmaceutical companies must develop drugs faster than ever, use their resources more efficiently, and avoid capacity bottlenecks before they happen. In commercial supply chains, operational excellence experts are now turning their focus to making quality operations more efficient, but soon realize that quality labs are fundamentally different from manufacturing: sample flows are not like material flows, and planning highly educated analysts and specialized instruments is not like planning machines and production lines.

We believe in a digitally enabled lab future where the traditional spreadsheet is replaced by a connected system that optimizes operational lab performance in a smart and cost-efficient way. That’s why we put together this lab operations digitalization guide for lab operations leadership interested in achieving lab operational excellence.

In this guide, we’ll give you insights and instructions to successfully digitalize your laboratory planning and scheduling:

  • The future of pharma QC labs and the importance of lab digitalization
  • Why Excel is outdated and not cost-efficient for resource management
  • How to optimize your QC lab resource planning in six steps
  • How to retain your most valuable asset: lab analysts
  • Use case: building your business case for lab automation using a digital performance twin
  • How to choose the right lab resource planning & scheduling software


The future of pharma QC labs and the importance of digitalization

Lab of the future, an ever-evolving concept

For people new to the idea, ‘digitalizing the lab’ sounds like a synonym for moving toward a ‘paperless lab’. But with more and more companies having a lab digitalization strategy top of mind, we see the discussion moving beyond the visible outcome (the ‘paperless lab’) and toward the overall value that digitalization can bring.

In one of our meetings, somebody summarized it beautifully: “the initial goal was to go paperless, but it’s about more than just that. The real value of digitalization is not that it replaces paper. It’s that it should make our processes more fluent, resulting in gains in efficiency and ultimately allow us to develop new medicine faster”.

In many of the other meetings, we heard different versions of the same story: seeing lab digitalization, not as a means to go paperless but rather to optimize process fluency.

Why you need to move beyond paperless

In a continuing effort to reduce costs and increase lab efficiency, many companies are looking to digitalize and automate laboratory processes, e.g. by going into more detail in LIMS and/or ELN. Such extensions can include capturing all method details (with or without a CDMS) and go as far as bidirectional instrument integration. It’s clear such digitalization helps to increase data integrity.

This makes the core business, what you deliver to your internal client, paperless. However, will it make everything in the lab itself paperless? Is the calibration plan integrated into those systems? Is method development included? And ultimately, will it help to optimize your lab performance?

McKinsey published an interesting article on the vast improvement potential they see in pharma QC labs. The article positions three possible horizons for which QC leaders can aim:

  1. Digitally enabled labs
    Labs where samples are still pulled into the lab for testing, with a lot of emphasis on connected instruments, data science, and advanced planning and scheduling.
  2. Automated labs
    Labs that have 60-80% of testing in the lab and 20-40% on the shop floor. The focus is on automated testing and non-testing processes.
  3. Distributed quality control
    This focuses on real-time testing at the line, enabling review by exception and parametric release.

In this area McKinsey sees substantial improvement potential:

  • Cost reductions of 25% to 40% in digitally enabled labs, coming from the elimination of documentation work and from advanced scheduling
  • Lab lead-time reductions of 10% to 20%
  • Testing productivity increases of up to 30% by implementing advanced lab scheduling
  • Sample-taking and preparation productivity increases of up to 80%
  • 80% fewer deviations through advanced analytics
  • 90% faster deviation closing

In the past decades, a lot of attention went (and is still going) to manufacturing automation. QC labs have not really been in the spotlight. But as volume volatility and product complexity increase, labs can no longer cut it with a LIMS and some instrument connections. QC labs are becoming a critical operation in the supply chain and deserve equal attention.

According to McKinsey, QC labs need to develop their vision, business case and transformation roadmap. The QC “pharmacists” need support from “digital engineers” that can help them break through perceived constraints on technological capabilities and system validation requirements.

McKinsey also positions lab planning and scheduling as a way to accelerate the process. Rather than targeting a fully tested end-to-end future-state prototype, they suggest testing and rapidly scaling up high-value solutions to capture quick wins: “schedule automation and optimization can be implemented quickly and start generating significant value even if a lab is not yet mostly paperless and fully digitized“.

The challenges of lab process optimization

Therefore, let’s think beyond just digitally capturing the information and switch our focus to process optimization. For example:

Hidden campaign rules

There are hidden business rules in the testing work, e.g. how you do campaigning. These rules are often undocumented and not digitalized, let alone automated. Automating the campaigning process is an opportunity not only to minimize non-value-added work but to fully optimize throughput.

Unclear prioritization of work

Is the prioritization of work clear? What can you deliver by when? To answer this, you need two additional steps:

  • What is the total picture of what lies ahead of you? Not all that info might be in the IT systems mentioned above. For example, do they contain and quantify calibration work? Method development work? R&D support?
  • Who can do this work and when?

The silver lining

And that brings us to the silver lining: yes, information might be ‘paperless’, but as long as everything remains unconnected, it brings little added value. Labs should therefore shift their focus and aim toward digital process optimization of the whole lab.

Planning & scheduling the digitally enabled lab

Let’s take an oversimplified example. Let’s say a lab supervisor wants to create a daily schedule for the QC team and needs four pieces of information to create the puzzle:

  1. What test methods do you need to perform? Do you have the complete demand picture?
  2. According to which rules are you campaigning samples?
  3. Which lab analyst is competent to do which test method? This is typically known in LIMS, but LIMS can’t plan the demand accordingly.
  4. What is the availability of your lab analysts? This is often known somewhere (an HR system), but you can’t schedule as long as you don’t have the complete demand picture.

In the paperless lab, (parts of) this information is stored digitally but, as a whole, it remains unconnected. This means you still need to rely on your lab supervisor’s undocumented expertise to connect the dots (manually campaign received samples, manually assign the work to the right analysts, and so on). So, what’s the added value here?

Digital process optimization means you have a resource management system that connects information silos and automatically:

  • Synchronizes received samples from LIMS.
  • Assembles those samples into campaigns according to rules you can configure.
  • Proposes a schedule based on competencies, due dates, constraints, and analyst & instrument availabilities.
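As a toy sketch of these three automated steps — all names, rules, and data shapes are illustrative assumptions, not the Binocs API:

```python
from collections import defaultdict

# Hypothetical data, as if synchronized from LIMS.
samples = [
    {"id": "S1", "method": "HPLC-001", "due": "2024-06-03"},
    {"id": "S2", "method": "HPLC-001", "due": "2024-06-04"},
    {"id": "S3", "method": "KF-010", "due": "2024-06-03"},
]
competencies = {"Ana": {"HPLC-001"}, "Ben": {"KF-010"}}
max_campaign_size = 2  # a configurable campaigning rule

# Assemble samples into campaigns: group by test method, cap the size,
# taking the most urgent samples first.
by_method = defaultdict(list)
for s in sorted(samples, key=lambda s: s["due"]):
    by_method[s["method"]].append(s)
campaigns = [
    group[i:i + max_campaign_size]
    for group in by_method.values()
    for i in range(0, len(group), max_campaign_size)
]

# Propose a schedule: hand each campaign to a qualified analyst.
schedule = [
    (c[0]["method"], [s["id"] for s in c],
     next(a for a, quals in competencies.items() if c[0]["method"] in quals))
    for c in campaigns
]
print(schedule)
```

A real system would add due dates, instrument availability, and many more constraints, but the mechanics — group, chunk, match to competencies — stay the same.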

Why Excel is outdated and not cost-efficient for resource management

The hidden costs of Excel

Many lab managers have become spreadsheet gurus of sorts. Spreadsheets let a business set things up fast and are extremely flexible, but at some point they mutate into unrelenting monsters: uncontrolled, error-prone, and eating a massive number of working hours. This results in hidden costs, for the four following reasons.

1. No single source of truth

In nearly all the cases we analyzed, each team maintains its own version of the input data. In workforce capacity management there are often common sources of demand. Ideally, you capture them once and feed all the delivery teams.

The reality is often different. Teams consult the same data sources separately. They have their own meetings with project managers, supply chain planners, etc. Then they enter their part of the data in their own planning Excel. They create their own Excel data dumps from the BI solution (business intelligence) or databases.

Let’s assume you have 5 teams. Each team spends 8h/month to capture the same data.

2. Uncertainty regarding data accuracy

This will sound familiar… In interviews where we try to understand how a team lead works with his/her planning Excel, we almost always hear: “… but I should check if my sheet is up to date, and the result is correct because…”.

How confident are teams in the input, calculations, and outputs of their sheets? Basic, simple sheets won’t have that problem, but then the added value might be lower as well. So each time a team consults the output of the planning Excel, it spends time checking whether the result is correct. Let’s not exaggerate: cost = 4h/month.

3. Requirements and planners change

Excel planning sheets never come with a manual. They are the result of a quest for quick wins. Team leads have a lot of bullets in their job description, but building Excels is not one of them. And organizations change, so the brains behind the Excel move on.

After one or two years the resource management sheet no longer matches the current requirements, and the brains that could adapt it are out of sight. So you start again. Let’s assume you have to redo the Excel planning sheet every 3 years: the initial one-time cost of 40h becomes a recurring cost.

4. Inaccurate capacity planning

“I understand your way of thinking, but what if we outsource this part of the activity? Or what if we train our staff? Should we hire more people?” And then the lights go out, because the planning sheet was not built to respond to those kinds of questions.

That’s the point where team leads make one-time clones of their “operational sheet” and start juggling with data. By the time they have to present and get challenged by their manager, they’ve forgotten how they arrived at the result. It all had to be done fast, didn’t it?

So for the case: 3 such questions a year, costing 8h/case.

This hidden factory is expensive

In lean six sigma the hidden factory is defined as “the extra useful, positive output that would theoretically be possible if the energy directed at creating waste were released”.

We assume that the initial creation of the resource planning sheet is useful. Entering data and evaluating output to prepare planning decisions is useful as well, but not if the same work is replicated across all teams: with 5 teams, 4 of them end up as waste. The table below shows how the hidden factory cost builds up to 45K/year for maintaining Excel planning sheets for 5 teams of 10 people.
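As a rough sketch of how such a hidden-factory cost can build up from the four reasons above — the blended hourly rate is an assumption, not given in the source:

```python
HOURLY_RATE = 60  # EUR/h -- an assumed blended cost, not stated in the source
TEAMS = 5

# 1. Duplicate data capture: 4 of the 5 teams re-enter data already captured once.
duplicate_capture_h = (TEAMS - 1) * 8 * 12   # 8 h/month per redundant team
# 2. Checking whether the sheet is up to date before trusting its output.
accuracy_checks_h = (TEAMS - 1) * 4 * 12     # 4 h/month per redundant team
# 3. Rebuilding each sheet from scratch every 3 years (40 h one-time cost).
rebuild_h = TEAMS * 40 / 3
# 4. Ad-hoc what-if questions: 3 per year at 8 h each, per team.
what_if_h = TEAMS * 3 * 8

total_h = duplicate_capture_h + accuracy_checks_h + rebuild_h + what_if_h
print(f"{total_h:.0f} h/year ≈ €{total_h * HOURLY_RATE / 1000:.0f}K/year")
```

With these assumed inputs the waste comes out around 760 hours a year, in the same ballpark as the 45K/year figure.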


In many organizations with expensive staffing and volatile resource demand planning, capacity planning tools have a potential of 5% to 10% re-allocatable FTE. This annual value of 200 to 500K goes far beyond the marginal 22K saving we calculated in the Hidden Factory example.

These are a few of the many reasons why pharmaceutical companies are currently implementing resource management solutions. The example above is more of an awareness exercise to get a feel for the marginal business case. So how do you optimize your resource planning and scheduling?

How to optimize your QC lab resource planning in six steps

Step 1: structure demand and all sources of demand.

Demand comes from multiple sources. You should consolidate the most important sources of demand for your QC lab, such as lot release and product stability testing, deviation and out-of-spec investigations, method validations and transfers, instrument and facility qualifications or calibrations, regulatory activities, reagent and standards management, and much more. These are the most common ways of consolidating data:

  • Automatic capture: integration with existing business systems such as supply chain planning, ERP or LIMS applications captures a large part of the lab demand automatically. As such, demand for release and stability testing is captured without human intervention.
  • Easy upload: transfer the content from existing Excel workbooks.
  • Manual data entry and changes in a user interface, to complete the full picture.

Step 2: manage your lab team capacity.

Lab supervisors also need to manage the competences and availability of analysts and technicians. As the team manager, you have to keep track so you can swiftly exchange resources in periods of peak demand, and still oversee it all. You need to be able to:

  • Manage competences: compose resource teams and define who is qualified to perform what type of test methods;
  • Manage availability: schedule the availability of your teams. You should be able to take into account all sorts of unavailability such as GMP training, meetings, administration time, holidays, long leaves, etc.

In periods of peak demand, QC labs exchange technicians. This is common practice between release teams, for instance, where HPLC-certified analysts often jump in with different teams. It is important not to lose focus, to guarantee the timely delivery of all the work.

Step 3: manage services.

While QC labs perform tests for different demand channels, modeling test methods to standardize work will help you focus on real improvements instead of sub-optimizations. It starts with the definition of, e.g.:

  • Workload and lead times;
  • Preparation, set-up and execution activities;
  • Campaign/batch sizes;
  • Instrument capacity constraints (e.g. number of runs per week).

Defining standardized work is a common approach in lean six sigma projects, as a result of value-stream mapping.
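Such a standardized service definition lends itself to a simple data structure. A minimal sketch — field names and figures are illustrative assumptions, not a Binocs data model:

```python
from dataclasses import dataclass

# Illustrative sketch of a standardized test-method "service" definition.
@dataclass
class TestMethodService:
    name: str
    workload_h: float        # hands-on analyst hours per campaign
    lead_time_days: float    # elapsed time from start to reported result
    prep_h: float            # preparation / set-up hours
    max_campaign_size: int   # samples that can share one run
    instrument: str
    max_runs_per_week: int   # instrument capacity constraint

hplc_assay = TestMethodService(
    name="HPLC assay", workload_h=4.0, lead_time_days=2.0, prep_h=1.5,
    max_campaign_size=8, instrument="HPLC-01", max_runs_per_week=10,
)
# Hours per sample fall as campaigns fill up -- one payoff of standardizing.
print(hplc_assay.workload_h / hplc_assay.max_campaign_size)
```

Once every test method is captured this way, capacity calculations and what-if scenarios become simple arithmetic over these records.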

Step 4: analyze capacity and get insight.

After the demand and capacity planning and services standardization work has been completed, you can calculate your resource requirements. How easily can you shape reports to get answers to these types of questions?

  • Which teams can commit available time?
  • Which specific competences should team leaders develop or acquire?
  • What instruments have shortfalls?
  • How do the value-added versus non-value-added activities compare?
  • What is the workload for a specific project?
  • Which efficiency-improvement projects have the most impact?
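As a toy illustration of the kind of insight step 4 is after — all demand and capacity figures are invented for illustration:

```python
# Compare forecast demand hours with available hours per team and flag
# shortfalls; the numbers are made up for this sketch.
demand_h = {"Release": 620, "Stability": 480, "Micro": 300}    # h/month
capacity_h = {"Release": 560, "Stability": 520, "Micro": 310}  # h/month

for team in demand_h:
    gap = capacity_h[team] - demand_h[team]
    status = "can commit time" if gap >= 0 else "shortfall"
    print(f"{team:10s} {gap:+5d} h  ({status})")
```

The same comparison, broken down by competence or instrument instead of team, answers the other questions in the list above.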

Step 5: simulate work and create scenarios.

Common practice in QC labs are the “what if?” questions that help you discover possible issues before they occur and compare different solutions before deciding on the best option. We discovered a way to play around and calculate the return on investment of scenario questions such as:

  • What if you change the campaign/batch size?
  • What if you reduce lead times or workload?
  • What if demand volume and mix changes occur?
  • What will be the impact of new product introductions?
  • What if you enlarge the test capacity by training new analysts?
  • What if you add extra technicians?
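A minimal sketch of how such what-if levers can be compared — all baseline figures are invented for illustration:

```python
# Toy utilization model: demand hours over available analyst hours.
def utilization(samples, h_per_sample, analysts, h_per_analyst):
    return samples * h_per_sample / (analysts * h_per_analyst)

base = dict(samples=400, h_per_sample=1.2, analysts=4, h_per_analyst=130)
scenarios = {
    "baseline": base,
    "+20% demand": {**base, "samples": 480},
    "larger campaigns (-25% h/sample)": {**base, "h_per_sample": 0.9},
    "one extra technician": {**base, "analysts": 5},
}
for name, params in scenarios.items():
    print(f"{name:35s} {utilization(**params):.0%}")
```

A real digital twin layers competencies, instruments, and lead times on top, but the principle is the same: change one lever, recompute, compare.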

Step 6: schedule task lists and track progress.

Last, but not least, is making sure things get done now that the planning has been completed. Providing teams with a comprehensive task list and tracking the work progress is what makes the difference. It will help you feed the performance dashboard with KPIs such as adherence to plan, throughput and foresight.



How to retain your most valuable asset: lab analysts

With the ever-growing focus on improving lab efficiency and service levels, it’s, now more than ever, crucial not to lose sight of your most valuable asset: your people.

A high turnover is particularly frustrating in labs because it takes quite some time to get new people up to speed: it’s not uncommon for it to take more than six months before junior analysts are trained to execute certain test methods according to the correct execution standards.

This means that seniority in your workforce, and thus keeping your people close, pays off:

  • The initial ‘training phase’ of junior lab analysts requires a significant time investment. Making sure these people stay with you as long as possible is thus pivotal to safeguard the return on that investment.
  • The more widespread the competencies in the workforce (decentralized seniority), the more versatile you become in dealing with unexpected events, times of high workload, etc.

The rule of thumb: people are not machines

The rule of thumb for employee retention is straightforward: amid optimizations, lean lab exercises, increased lab efficiency, better service levels, and so on, it’s crucial to keep reminding everyone involved that people are not machines. Although this might sound self-explanatory, it’s often challenging to find the right balance in a lab context.

For instance, how can you optimize your ways of working within the boundaries of what is possible and comfortable for your people? Here are 4 guidelines to keep in mind when answering that question:

Guideline 1: people like to know what’s ahead

Imagine how frustrating it must be for a lab analyst to see that their schedule completely changed when they arrive in the lab in the morning. Knowing what they can expect increases stability and reduces stress. Aim to implement processes to stabilize the schedule as much as possible (next two weeks) and to avoid last-minute changes.

Guideline 2: people like a variety of work

In theory, if you gave the same set of tasks to the same people until the end of time, they’d become extremely skilled at them, which would increase efficiency levels. In reality, though, no one likes to do the same thing over and over again, day in, day out. Therefore, this guideline is simple: make sure your people can work on different projects and tasks. Two quick examples:

  • Make sure junior lab analysts move on from doing a small number of test methods over and over (think water testing) once they start to get the hang of a wider variety of test methods.
  • Make sure your more senior lab analysts are given the chance to spend sufficient time in the lab and are not grinding their days doing only double-checks or complex administration. In other words, make sure they can still do what they are most passionate about.

Guideline 3: people like to be treated equally

We all have experience with those inevitable periods where everybody is in ‘fire-fighting mode’. There is an unexpected workload peak, throughput needs to increase, and you count on your more senior analysts to help put out the fire (above-average campaign sizes, minimized idle time, above-average utilization, …). And that’s OK, as long as these periods of increased pressure are limited in time and discussed with the people in question beforehand.

Just be careful not to take away the wrong lesson: after the imminent danger has passed, it might be tempting to structurally link the throughput of your people to their seniority (higher seniority = higher throughput). Be mindful of this, because:

  • Your more senior people will notice they consistently need to work harder than other analysts. This could leave them feeling unequally treated and the constant pressure could result in burn-out.
  • As your senior people would already be working as efficiently as possible, you’d lose your ‘buffer’ to respond to the next unexpected workload peak.


Guideline 4: people don’t like surprises

Although you’ll never be able to completely avoid ‘fire-fighting’ every once in a while, it’s worth thinking of ways to reduce it to a minimum – to make sure you are not putting too much strain on your people, too frequently.

The key question you could ask yourself: do you have the ability to forecast:

  • When workload will peak?
  • Which of those peaks will put significant strain on your capacity (people & equipment)?
  • And, based on the answers to the two questions above: what can you do today to avoid or better deal with those workload peaks (think upskilling people, hiring and training new analysts, leveling workload, …)?

How the Binocs algorithm is built around these 4 guidelines

Of course, we wouldn’t be talking about these guidelines without taking them to heart. People are not machines, and that’s also the idea around which Binocs is built.

With all that in mind, you might wonder how to build your business case for laboratory automation. That’s exactly what we’re going to cover in the next section.

A use case: building your business case for lab automation using a digital performance twin

In 2008, a survey on laboratory automation concluded that most lab workers (88%) believed they would rely on lab automation in the future. Yet it’s 2021, and 72% of scientists now say their sector is lagging behind. The lab of the future seems to have been put on hold. Society’s fascination with automation focuses on “flashier” projects instead.

Using automated systems in the lab can save researchers from performing time-consuming and repetitive tasks. This frees them up to carry out more specialized processes.

Also, automating laboratory tasks improves the efficiency of experimental processes by speeding up tasks, using lower quantities of reagents, cutting waste and allowing for higher throughput of experiments. This higher efficiency leads to lower running costs of the laboratory.

Sounds great! But before getting ahead of yourself, there are two critical questions to ask yourself before implementing:

  • What exactly are you going to automate?
  • Can you justify the investment (ROI)?

Question 1: Which lab activities should you automate?

Automated labs use robots or more specific advanced automation technologies to perform all kinds of repeatable tasks like sample delivery and preparation – for a multitude of test methods.

In an ideal world, lab automation vendors can start from a detailed workflow analysis or dashboard. They want to clearly identify which test methods are being executed in the lab, and which ones eat up most of your lab analysts’ time. After all, in pursuit of maximizing your ROI, you don’t want to automate a test that you do once a month; you’re looking to focus on automating high-volume testing. Just like anything else in life, lab work is subject to Pareto’s principle: your lab analysts spend the majority of their time on only a relatively small number of test methods. The ideal starting point to identify which lab work you want to automate is therefore twofold:

  1. Do a Pareto analysis. Accurately rank all of your test methods based on how many hours analysts spend executing them.
  2. Analyze your role-cards / standard work workflow and determine which specific activities within that workflow are ideal candidates to automate.
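The first step can be sketched in a few lines — the hours per test method below are invented for illustration:

```python
# Pareto ranking: find the test methods that together absorb ~80% of
# analyst time; these are the prime automation candidates.
hours = {"Assay": 520, "Dissolution": 310, "Water": 90, "pH": 40, "KF": 40}
total = sum(hours.values())

ranked = sorted(hours.items(), key=lambda kv: kv[1], reverse=True)
cumulative = 0.0
candidates = []
for method, h in ranked:
    cumulative += h / total
    candidates.append(method)
    if cumulative >= 0.8:
        break
print(candidates)
```

In this invented example, two of the five methods already cover more than 80% of the hours, which is exactly the concentration Pareto’s principle predicts.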

Using Pareto analysis in Binocs

Although it sounds simple, it’s definitely not. Accurately determining how much time is spent on each test method and identifying ideal candidates for automation requires accurate and accessible data. This data should measure actual time spent, taking into account failures, re-tests, etc., measured not only for the test method as a whole but also for each activity within that workflow.

Binocs can definitely help here. It captures the time spent and makes it accessible in reports so you can analyze it from different angles – for example, a Pareto analysis on actual analysis time per test method.

You are then able to zoom in on specific test methods to identify the activities that could be automated (e.g. sample preparation) and how much time you’d potentially save.

Question 2: How much efficiency will you actually gain?

There is no question: lab automation provides a tremendous opportunity to increase lab efficiency. Nonetheless, managers often face financial constraints and need to justify the proposed investment. The question to answer: “what exactly is the value of the reduction in inefficient use of laboratory resources (such as productivity of personnel) that would be gained through technology and automation over time?”

Until a couple of years ago, there was no reliable benchmark to predict the ROI of automating your lab. Like with most new technologies, some labs just jumped in, crossed their fingers, and measured the return on investment after the fact. And even then, the estimations were relatively shaky and limited to measuring the output/FTE before and after automation.

By now, labs can rely on industry papers and testimonials to better grasp what lab automation could mean in terms of ROI, before the fact. While these are useful resources, the estimations typically have a range (“20%–80% productivity gains”) too large to be useful for quantified decisions. That’s because, for these estimations to make sense, they need to cover different lab environments, varying lab automation maturity levels, and the like. Other estimations are very specific measurements of lab X, which makes you question how well they extrapolate to your use case.

In other words: the estimations are too unspecific to your lab’s way of working. They are merely an indication that there is a great opportunity in lab automation, less so an estimation of the ROI you need to justify your investment. To make an accurate forecast of the ROI your lab could achieve with lab automation, you need simulation capabilities (a digital lab performance twin).

Simulating what-if scenarios in Binocs

In Binocs, for example, you can use what-if scenarios to simulate the lab automation you’d like to implement. You can then compare your lab performance with automation against your lab performance without automation.

In doing so, the ROI is no longer limited to taking a snapshot of ‘output/FTE’. You can answer questions like: ‘how will it impact our planning adherence? How will it impact our resource utilization? How many extra projects will we be able to take on? And how will our lab performance evolve over time (forward-looking: months or years to come)?’

Are you optimally banking on your time savings?

As mentioned, automation provides great opportunities to automate repetitive, manual tasks like pipetting. This frees up the time of your highly educated analysts, which they can spend on more specialized processes. However, the paradox is this: every minute of manual work you save only creates value when you actually use it productively. Therefore, do not focus solely on the automation itself, but also on how you’ll productively redistribute your analysts’ time. Having an intelligent scheduling co-bot (a digitally enabled lab) can help to maximize returns.


Wonder what could be the potential ROI of digitalizing your laboratory planning and scheduling? We created a simple yet powerful calculator that could give you a glimpse at your potential savings and help you build a business case.

Calculate your potential lab savings today using our calculator.

Calculate ROI of digitalization

How to choose the right lab resource planning & scheduling software

Are you thinking about digitalizing your laboratory resource planning & scheduling? Maybe already listing a couple of potential software vendors? You want to find a cloud-based, laser-focused enterprise software that makes the end user’s life easier and integrates smoothly with all your other lab management systems. Here are four tips to help you out.

Tip 1: Go for ’true’ cloud software

Software consultants that come to your office and install a customized version of their software on your local server (on-premises software) sounds like something from a distant past. By now, everyone seems to agree: the cloud is the future. With all kinds of ‘cloud-based solutions’ sprouting left and right, it’s sometimes easy to forget why you want it in the first place.

So, a quick reminder: you always want to be working with the latest version of the software, and your investment needs to include all future updates and enhancements for years to come. An important requirement: there is one version of the software in the cloud, used by and constantly updated for all clients (= true cloud).

However, not every ‘cloud-based software’ is created equal. Sometimes every client runs a customized version of the software in their own piece of the cloud. Whenever there is a new software update, custom versions, although technically in the cloud, risk missing the boat. Apart from some bug fixes, the system you use tomorrow remains remarkably close to the system you buy today; the longer you use it, the more outdated it becomes.


  • One version of the software is constantly updated for all clients. True cloud means you have a product that is constantly improved based on feedback, not only from you but from all users, all of them facing similar challenges.
  • The system is configurable to match your ways of working (configure in the UI, not in the code).
  • This means you are not just buying the features that software offers today but also buying all future updates.

Tip 2: Don’t buy custom software

Do you remember the feeling of opening a Lego box as a kid? It’s full of blocks of different sizes, in different colors, … With a bit of imagination, you can build a thousand different things. Some software is like a Lego do-it-yourself kit: it contains components and building blocks you can use to build a solution yourself.

In contrast to Lego, in the case of software, the feeling of ‘endless opportunities’ is less appealing. Especially when it needs to solve very specific challenges unique to your situation (e.g., competency-based scheduling, specific constraints, business logic, sample campaigning rules, …). You’re probably already imagining a small army of consultants to first map your processes & needs, and then another army of programmers using the building blocks to build a solution that never quite fits.

These ‘do-it-yourself kits’ are often called ‘platforms.’ Just like with Lego: they work great if you’re building a relatively straightforward solution, less so for more complex things, like scheduling people & equipment in a lab context. Choose wisely.

A powerful alternative is a solution specifically created for your ‘niche’ market, like, in the case of Binocs: digital resource planning for laboratories, CGT or resource management offices. It’s built around the best practices, experiences & feedback from dozens of clients that face challenges very similar to yours and can therefore be implemented very quickly.


  • Don’t re-invent the wheel if you don’t need to.
  • Look for software built around a solution that fits your unique challenges.
  • You want to go live in a matter of weeks, not months.

Tip 3: Choose laser-focused enterprise software with the end-user in mind

You know what it feels like: you go to a store, you’re comparing a few products, and you’re sold to the one that sounds like the best idea. It’s got the most features, the packaging looks hot and everything seems great. But then you get home, and it doesn’t deliver. It’s not as easy to use as you thought it would be. It has too many features you don’t need. You just bought an “in-store” good product.

Most enterprise software is “in-store” good. That’s because the people who buy enterprise software — IT managers, HR managers, etc. — often focus on configurability, control and the number of features. They see all the bells and whistles from companies such as SAP and IBM and fall for the “one system does it all” sales pitch.

But once the software is in use, it becomes apparent that typical enterprise software vendors aren’t mindful of the employees who have to work for hours in their system on tasks that should take 5 minutes. It becomes clear that they care about the one-night stand with the managers to close the sale and forget the long-term relationship with the users.

That’s why it’s important to document all your user requirement specifications, whether they’re functional or non-functional: demand management, capacity management, scheduling & tracking, compliance, integration, scalability… If you don’t know which questions to ask, we’ve got you covered! Simply download our free URS template with more than 70 user requirements specifications.


  • Choose software with the end-user in mind
  • Be wary of enterprise software vendors that pretend to solve a thousand problems.
  • Look for laser-focused software solutions that solve a very particular problem for the end-user, a system that’s great at a limited set of tasks but executes them exceptionally well.

Tip 4: Look beyond APIs & standard interfaces

Laboratories have more and more digital systems in place: LIMS, ELN, ERP/SAP, … Whenever you’re adding new software to the mix, it needs to integrate smoothly (at a bare minimum, you need to be able to import and export all data via APIs).

Setting up interfaces is often a painful process. The typical interface approach uses code & programming to transform data from the sending system so the receiving system can understand it. This is a rigid approach, as it requires a clear contract and requirements upfront, making it challenging to make changes after the fact.

The coding requires quite some time of scarce internal IT resources. Not only to write the initial interface but also to cope with the inevitable changes afterwards. Look beyond APIs and standard interfaces. Make a realistic assessment of how smoothly the new system can be integrated.


  • Verify the vendor’s approach to how they set up interfaces: can they guarantee it will be a smooth process that doesn’t require a lot of your IT resources?
  • Look for low-code or zero-code approaches.
  • Be careful with ‘standard interfaces’: they are not the solution.

Download our complete URS template

Unsure what user requirements specifications you should take into account when comparing resource planning and scheduling software? We’ve got you covered!

We tailored a template of 74 user requirement specifications for you. It covers a whole range of functional and non-functional requirements: capacity management, services management, integrations, scalability, security, etc.

Make an informed decision by downloading your free copy now.

Download URS template


You now have the knowledge and tools to successfully digitalize your laboratory planning and scheduling operations. Of course, this process will take time; it will require change management and many meetings to align all the stakeholders. But biopharma laboratories that digitalize their lab operations management today will have a competitive advantage tomorrow over those that start their digitalization process later. You don’t need LIMS, ERP or any other digital tool first to start your journey towards laboratory operational excellence. Remember: “schedule automation and optimization can be implemented quickly and start generating significant value even if a lab is not yet mostly paperless and fully digitized“.