
Rethinking Automation

Tue, 02/26/2013 - 4:11pm
Matt Kirtley, Product Manager, Agilent Technologies Inc., Santa Clara, Calif.

What is needed to adapt to the changing needs of life science researchers?

Figure 1. Eight independent pipettor heads provide maximum efficiency, speed, and flexibility. Images: Agilent Technologies Inc.

Until now, life science researchers had a narrow set of expectations for automation systems. The main focus of laboratory automation providers has been to develop liquid handling systems for high-throughput workflows processing very large sample numbers, primarily in screening laboratories. In this model, all samples go through the same protocol, usually confined to a small number of simple steps, such as serial dilutions or simple plate-to-plate liquid transfers. While automation has been effective in these instances, the changing needs of life science research have caused scientists to rethink it and to expect more capabilities and easier access.

Researchers continue to want high throughput, but they also want to improve data quality and translate those gains into shorter research timelines and reduced costs. Driven by the need to produce high-quality data in a high-throughput environment, researchers look to reduce manual intervention and automate workflows from the sample preparation phase through data analysis.

Highly flexible systems that facilitate integration of multistep processes and extend automation to more steps in the workflow can satisfy this new expectation. The industry has responded to this need with recent introductions that extend the use of automation to a wide range of life science assays and workflow steps, increasing access and allowing laboratory personnel to craft their own flexible automation solutions. Workflows can now benefit from flexibility, modularity, and integration, so that a finite set of well-designed tools can be linked together by a broader group of laboratory personnel to tackle complex, multistep processes.

The benefit of introducing greater flexibility is very apparent in some of the most common laboratory applications. For example, speeding up sample preparation for next-generation sequencing and liquid chromatography-mass spectrometry allows scientists and laboratory personnel to employ sensitive and sophisticated techniques on medium to large sample numbers, while maintaining the highest levels of data integrity. One factor driving this trend is the need to gain information earlier in the drug discovery process in order to reduce costs. It is a great advantage to run primary and secondary assays in parallel to shorten the overall timeline and reduce risks when choosing candidate compounds to pursue in later development stages. As a result, it is possible to advance candidates based on broader criteria, and thus increase the likelihood of success down the road. Beyond raising throughput in these instances, automation also improves reproducibility in complex multistep protocols by reducing the likelihood of random execution errors.

Automation flexibility can be improved with features that address significant bottlenecks impacting the speed of the overall process. First, the ability to integrate multiple instruments frees scientists from individually defined solutions and enables flexible, modular options that can be modified based on existing laboratory instrumentation and changing experimental conditions. Second, the software driving the instrumentation should reflect the user's perspective, rather than the traditional instrument focus, as this makes it easier for laboratory personnel to transfer their methods and protocols directly into automated solutions.

Figure 2. A 3D planning simulator allows the user to drag and drop components and tasks onto the system, and then run a simulated workflow to troubleshoot for potential problems before engaging the actual instruments.

With regard to liquid handling, two key features that greatly increase the flexibility of an instrument are the pipetting head and the presence of robotic arms. Incorporating a multispan pipettor with truly independent x- and y-axis variability (Figure 1) makes it possible to move beyond the confines of high-density formats to a wide variety of commercially available vials, tubes, and plates, including types that are assay specific, such as MALDI and Caco-2 plates. While many automation systems incorporate robotic arms either as part of the main system or as add-ons, their functionality varies. An ideal robotic arm should have the finesse and articulation necessary to carry out critical assay steps on deck, such as the assembly of vacuum manifolds. It should also reach well off deck to pass plates to detectors and incubators, integrating complete workflows and reducing bottlenecks caused by a reliance on manual intervention. The need for increased flexibility also calls for extending modularity to key components of the instrument, such as individual pipetting cards and easy-to-move on-deck accessories.

In order to facilitate access to these added instrument benefits, concurrent software changes are needed to keep the learning curve shallow, simplify the transfer of protocols from laboratory to laboratory, and smooth integration with other systems. One significant change that can accomplish these goals is modification of the user interface to make it more relevant to a scientist's workflow, rather than focused on the automation instrumentation. Intuitive programming using common laboratory tasks is more meaningful than linked individual motions. For example, "transfer" or "dilute" are terms much more relevant to a scientist than the word "move".
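To illustrate the difference between task-level and motion-level programming, consider the following hypothetical sketch. It is not any vendor's actual API; every class, method, and well name here is invented purely to show how one scientist-level command such as "dilute" might expand into a sequence of low-level instrument motions:

```python
# Hypothetical sketch of a task-oriented protocol layer. All names are
# illustrative only and do not correspond to a real automation API.

class Protocol:
    def __init__(self):
        self.steps = []  # the low-level motion list the instrument would run

    def transfer(self, source, dest, volume_ul):
        """One scientist-level 'transfer' expands into four motions."""
        self.steps += [
            ("move_to", source),
            ("aspirate", volume_ul),
            ("move_to", dest),
            ("dispense", volume_ul),
        ]

    def dilute(self, sample, diluent, factor, final_volume_ul):
        """A dilution expressed as intent; volumes and motions are derived."""
        sample_vol = final_volume_ul / factor
        self.transfer(sample, "mix_well", sample_vol)
        self.transfer(diluent, "mix_well", final_volume_ul - sample_vol)

p = Protocol()
# One 10-fold dilution to 200 uL, written the way a scientist thinks about it
p.dilute(sample="A1", diluent="trough_1", factor=10, final_volume_ul=200)
print(len(p.steps))  # the single 'dilute' task expanded into 8 motions
```

The point of such a layer is that the researcher never writes "move"; the intent-level vocabulary is translated into motions by the software.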

Another useful innovation beneficial to automation users is the incorporation of a 3D simulator (Figure 2) that allows researchers to see what will happen in their protocol before they engage the instruments. This gives novice end users the opportunity to test their complex protocols before pressing "go". Integration and flexibility are also enhanced by the ability to use the output of other systems, such as laboratory information management systems (LIMS) and plate readers. Dynamic scheduling, which adapts to workflow changes based on what is available on the instrument deck at any given time, can also increase flexibility.
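As a rough illustration of dynamic scheduling, the sketch below (all names and the data model are invented, not drawn from any real scheduler) picks the next runnable task according to which deck positions are currently free, rather than following a fixed sequence:

```python
# Hypothetical sketch of deck-aware dynamic scheduling. Task names and
# deck positions are invented for illustration.

def next_task(pending, free_positions):
    """Return the first pending task whose required deck position is free,
    or None if every required position is currently occupied."""
    for task in pending:
        if task["position"] in free_positions:
            return task
    return None

pending = [
    {"name": "wash_plate", "position": "wash_station"},
    {"name": "read_plate", "position": "reader_nest"},
]
free = {"reader_nest"}  # the wash station is busy with another plate

task = next_task(pending, free)
print(task["name"])  # read_plate runs first because its position is free
```

A fixed schedule would stall at the occupied wash station; a dynamic one keeps the system productive by reordering around it.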

As the needs of life science research evolve, scientists are beginning to rethink automation and turn towards providers that are adapting to their changing expectations. Recent product introductions include examples of systems developed to better meet the needs of a wider range of potential end users. Automation systems that incorporate flexibility, modularity, and accessibility with improved user-friendly instrumentation and software stand the best chance of meeting these requirements. As a result, they can improve data quality, reliability, and throughput across the complete spectrum of standard workflows in genomics, proteomics, cell biology, drug and antibody screening, and ADME/TOX applications.
