Archives

Building a Smart Laboratory 2014

The 2014 edition of ‘Building a Smart Laboratory’ is now published and available on the Scientific Computing World website.

Aimed at newcomers to the concept of a Smart Laboratory and those about to embark on integrating disparate lab functions, Building a Smart Laboratory 2014 covers instrumentation; automation; data acquisition; data management; and informatics. It shows how scientists can turn information and data into knowledge that will be valued and used by managers elsewhere in their organisation.

Compiled and edited by industry expert John Trigg, together with other international specialists in laboratory systems, Building a Smart Laboratory 2014 will take readers on a guided tour of the ‘smart’ laboratory and help them with the practical aspects of joining up their science.

How Smart is your Laboratory?

The progressive incorporation of information technology into all aspects of laboratory operations over recent decades has resulted in fundamental changes in laboratory work. Laboratory information has traditionally been managed on paper, typically in the form of the paper laboratory notebook, worksheets and reports, providing a simple and portable means of recording ideas, hypotheses, descriptions of laboratory apparatus and procedures, results, observations and conclusions. As such, the lab notebook and associated documentation served as both a scientific and a business record.

The introduction of digital technologies to the laboratory has brought about significant change. From the basic application of computational power to undertake scientific calculations at unprecedented speeds, through to the current situation of extensive and sophisticated laboratory automation, black-box measurement devices and multi-user information management systems, technology is making glassware and paper notebooks increasingly rare in the laboratory landscape. The evolution of sophisticated laboratory instrumentation, data and information management systems and electronic record keeping has revolutionised the process of acquiring and managing laboratory data and information. The underlying principles of the scientific method, however, are unchanged, supporting the formulation, testing and modification of hypotheses by means of systematic observation, measurement and experimentation.

In this context, a Smart Laboratory seeks to deploy modern tools and technologies to improve the efficiency of the scientific method by providing seamless integration of systems, searchable repositories of data of proven integrity, authenticity and reliability, and the elimination of mindless and unproductive paper-based processes. Furthermore, the accumulation of ever-increasing volumes of laboratory data opens up the possibility of using sophisticated algorithms to extract understanding and meaning, further supporting innovation and product and process enhancement.

At the heart of the Smart Laboratory is a simple model (Figure 1) that defines the conceptual, multi-layered relationship between data, information and knowledge.

Figure 1: The layered relationship between data, information and knowledge

The triangle represents the different layers of abstraction that exist in laboratory workflows. These are almost always handled by different systems. The ‘experiment’ level is the focal point for cross-discipline collaboration; the point at which the scientific work is collated and traditionally handled by the paper laboratory notebook. Above the experiment level is a management context that is handled by established groupware and document management tools at the ‘programme’ level, and by standard ‘Office’ tools at the ‘project’ level. Below the experiment level there is an increasing specialisation of data types and tools, typically encompassing laboratory instrumentation and multi-user sample and test management systems.

The triangle also represents the transformation of data to knowledge: the journey from data capture to usable and reusable knowledge that is at the heart of the Smart Laboratory. The computerisation of these different layers has typically happened at a pace driven by available technologies rather than by any coordinated strategy, and as long as experiments are recorded in paper laboratory notebooks, the opportunity to complete the journey from data to knowledge in a seamless way is seriously challenged. The introduction of laboratory informatics tools has therefore opened up the possibility of a more strategic approach which, in theory at least, offers the opportunity for an integrated and ‘Smart’ solution.
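To make the layered model concrete, here is a minimal sketch of how each layer abstracts the one below it, from raw instrument readings to an actionable conclusion. All names, figures and the specification limit are hypothetical, invented purely for illustration:

from statistics import mean

# Data layer: raw instrument readings (hypothetical nitrate assay, mg/L).
raw_data = [12.1, 12.3, 11.9, 12.2]

# Information layer: the data placed in context.
information = {
    "analyte": "nitrate",
    "method": "ion chromatography",
    "mean_mg_per_l": mean(raw_data),
}

# Knowledge layer: an actionable conclusion drawn from the information.
limit_mg_per_l = 50.0  # hypothetical specification limit
knowledge = ("within specification"
             if information["mean_mg_per_l"] < limit_mg_per_l
             else "out of specification")

print(information["mean_mg_per_l"], "->", knowledge)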

The underlying business drivers for a Smart Laboratory are process efficiency, laboratory productivity and error reduction. These criteria consistently appear at the heart of any purchasing justification as industries seek to gain competitive advantage through cost reduction and faster time-to-market. To meet this need, informatics tools have focused primarily on the elimination of waste (time, effort, errors), with the added bonus of providing a management perspective on laboratory performance.

But there are other underlying business issues that are now having an increasing impact on laboratory operations and are putting new demands on the Smart Laboratory. In recent years, the distribution of laboratory processes across geographic boundaries and third parties (externalisation) has become quite common. The benefit to the business is to take advantage of the lowest-cost source of commodity laboratory functions and, in some cases, to tap into external sources of expertise for research and development activities. Knowledge acquisition therefore becomes geographically dispersed, and laboratory informatics systems are able to meet this business need by providing relevant capabilities for collaboration and the sharing of laboratory data and information, with infrastructures and appropriate levels of access control to ensure adequate IP protection.

In addition to externalisation, two other major business issues now challenge the laboratory informatics market: first, the growing demand for more and better innovation; and second, the need to be sufficiently adaptable or agile to cope with relentless market, business and process changes. Historically, a considerable amount of scientific innovation came about through serendipity and the investigation of unexpected outcomes of planned experiments, where the primary objective was to advance scientific knowledge and understanding. Nowadays, innovation has evolved into a systematic, industrial and time-pressured process, dependent to a large extent on making sense of existing data, prior knowledge and evidence-based decision-making. Alongside the demand for innovation is the need for organisations to be more agile and adaptable to changes driven by externalisation, mergers and acquisitions, and other market forces, frequently with the need to consolidate systems, processes and workforces.

These are the challenges that the laboratory informatics industry now faces. Increased throughput and reduced cost and error rates are no longer enough. Managing laboratory data and information in an integrated way needs to provide not only acquisition and storage capabilities, but also better support for advancing science: extracting information and knowledge from ever-growing data repositories in order to make sense of the data, uncover correlations and support evidence-based decision-making. In addition, informatics systems need to be sufficiently flexible to adapt to changing business and operational needs. Herein lies the paradox: in order to justify the purchase and deployment of informatics systems, we depend on the ROI equation to quantify benefits, i.e. productivity, but the potential long-term benefits may arise through non-quantifiable factors such as better understanding, better decisions and better science. In other words, the emphasis needs to change from the elimination of waste (time, effort, errors) to providing greater capability, greater flexibility and more predictive approaches to supporting science.
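To make the paradox concrete, here is a toy return-on-investment calculation of the kind used in purchasing justifications. Every figure is hypothetical, invented for illustration:

# All figures are hypothetical, purely for illustration.
system_cost = 250_000          # licence, deployment and training
hours_saved_per_year = 2_000   # from eliminating paper-based steps
hourly_rate = 60
annual_saving = hours_saved_per_year * hourly_rate  # 120,000

# The classic ROI equation only admits quantifiable benefits.
roi_year_one = (annual_saving - system_cost) / system_cost
print(f"Year-one ROI: {roi_year_one:.0%}")  # -52%

# Better understanding, better decisions and better science never
# appear in this equation: that is the paradox described above.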

How data will transform business

Philip Evans gives a quick primer on two long-standing theories in strategy, and explains why he thinks they are essentially invalid as digital transformation drives change across business models in private and public institutions. The phenomenon will already be familiar to laboratory workers, particularly in the life sciences.

2014 S-Lab Awards open for applications

The 2014 S-Lab Awards are now open. They cover all aspects of laboratory good practice and innovation, including design, effective use, technical support, informatics and resource efficiency, in all sectors and in any country. The initial closing date is 3 February 2014, and the application form has been made as simple and time-efficient as possible. Past winners of the UK categories are Edinburgh, Liverpool, London South Bank, Loughborough, Manchester, Newcastle, Plymouth, Sheffield Hallam and St Andrews Universities; Imperial College, London; Oldham Sixth Form College; and University College, London. The 2014 Awards will be presented at the Supporting World Class Science conference at King’s College, London in September. See www.effectivelab.org.uk for more details.

ELNs 2013 – the times are a-changing… slowly

At last week’s annual European ELN Conference (ELNs, Data Analytics and Knowledge Management, to give it its full title), organised by IQPC, there was a definite wind of change in the air – well, more of a breeze, actually. Merger and acquisition activity amongst some of the major players in the laboratory informatics market has seen a marked shift in their product portfolios from data acquisition and throughput, with an emphasis on productivity, to the provision of tools to support data analysis and visualisation. This is an important change of emphasis, given the growing need for greater innovation to counteract the trend towards externalisation of ‘commodity’ laboratory processes. In other words, there is a trend towards doing better things, rather than just doing things better. The purchase and implementation of informatics tools depends on a good return on investment, typically measured in productivity terms, but the longer-term benefit of managing laboratory data and information in an integrated way, so that data analysis tools can be deployed to support R&D scientists, remains speculative and hard to quantify. It is important to remember that the role of informatics tools is to support science, and science ‘is a way of thinking much more than it is a body of knowledge’ (Carl Sagan). So the shift in emphasis is timely.

Unlike some industries, the pace of digital change in the laboratory world is relatively leisurely. It has taken some four decades to reach a point where (some) laboratories can consider themselves ‘electronic’ or ‘paperless’. At the hub of these laboratories is an informatics system comprising any one, or a combination, of the major tools: Laboratory Information Management Systems (LIMS), Electronic Laboratory Notebooks (ELN), Scientific Data Management Systems (SDMS) and Laboratory Execution Systems (LES). The trend over recent years has been the convergence of these tools; in each case the systems originally served a distinct market sector and were provided by quite separate vendor communities. That situation has changed to the extent that an increasing number of laboratory data and information functions can now be accommodated within a single-vendor solution, with a scope extending from data acquisition to data usage.
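As an illustration of this convergence, the sketch below imagines the common contract such a converged suite might expose across its modules. The class and method names (LabRecord, LabInformaticsTool, capture, search) are hypothetical, not taken from any actual product:

from abc import ABC, abstractmethod
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class LabRecord:
    """A minimal unit of laboratory data (hypothetical structure)."""
    record_id: str
    payload: dict
    created: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

class LabInformaticsTool(ABC):
    """The common contract a converged LIMS/ELN/SDMS/LES suite might share."""
    @abstractmethod
    def capture(self, record: LabRecord) -> None:
        """Acquire a record (instrument result, notebook entry, ...)."""
    @abstractmethod
    def search(self, query: str) -> list[LabRecord]:
        """Retrieve records: the 'data usage' end of the scope."""

class SDMS(LabInformaticsTool):
    """Scientific Data Management System: a searchable raw-data archive."""
    def __init__(self) -> None:
        self._store: dict[str, LabRecord] = {}
    def capture(self, record: LabRecord) -> None:
        self._store[record.record_id] = record
    def search(self, query: str) -> list[LabRecord]:
        return [r for r in self._store.values() if query in repr(r.payload)]

archive = SDMS()
archive.capture(LabRecord("run-001", {"instrument": "HPLC", "peak_area": 4821}))
print(archive.search("HPLC"))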

Another ‘tipping point’, or period of discontinuity, was highlighted in a presentation by Michael Elliott (Atrium Research). Over the years, ELNs have evolved in typical software fashion, with extended and more detailed functionality progressively adding to their complexity. But the simpler ‘paper on glass’ style of ELN has recently been gaining market share, with faster deployments and greater user acceptance (ease of use). If the trend continues to gain momentum, it may encourage greater modularity across the market, opening up the options for best-of-breed products to meet the specific demands of different types of laboratory.

The long-term issue of data standards was prominent on the conference agenda, with presentations about the Allotrope Foundation, the Pistoia Alliance and the AnIML standard. The lack of data standards has been debated frequently, but despite some limited and heroic efforts, standardisation has gained little momentum. Everybody understands the problem; everybody appreciates the benefits that standards could bring; but there has been no collective will to move towards a solution. It is not in the vendors’ business interests to take the initiative, although in fairness nearly all vendors express a willingness to comply with data standards if there is sufficient market demand. As a user community, we do not collectively put any pressure on vendors to comply, and so the initiative falls to collaborative bodies such as Allotrope and Pistoia to drive momentum.

The Allotrope Foundation has recently announced a partnership with Osthus Inc. to build an Open Framework for Laboratory Data, comprising open data standards, metadata repositories and open-source class libraries. The ELN Query Standard project initiated by the Pistoia Alliance has ground to a halt, allegedly due to a lack of agreement amongst the parties involved; it could, however, be revisited if there is sufficient interest amongst members. Meanwhile, there is good progress with the release of open-source tools for HELM (Hierarchical Editing Language for Macromolecules), a standard for biomolecular representation. The AnIML project continues to evolve, extending its original analytical scope to a wide range of scientific disciplines, including biological data.
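As a flavour of what such standards look like in practice, the sketch below parses the simple-polymer section of a HELM string. It is illustrative only, covering just the simplest single-letter-monomer case, and is not a conformant implementation of the published grammar:

import re

def parse_simple_polymers(helm: str) -> dict[str, list[str]]:
    """Parse the simple-polymer section of a HELM string.

    Illustrative sketch: handles single-letter monomers in the first
    '$'-separated section and ignores connections, polymer groups and
    annotations; the published grammar is considerably richer.
    """
    polymer_section = helm.split("$")[0]
    polymers: dict[str, list[str]] = {}
    for match in re.finditer(r"([A-Z]+\d+)\{([^}]*)\}", polymer_section):
        name, monomers = match.groups()
        polymers[name] = monomers.split(".")
    return polymers

# A tripeptide (Ala-Gly-Cys) expressed in HELM notation:
print(parse_simple_polymers("PEPTIDE1{A.G.C}$$$$"))
# -> {'PEPTIDE1': ['A', 'G', 'C']}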

The other topic that generated good discussion during the conference concerned the adoption of consumer technologies in the laboratory, in particular mobile devices and social media. With respect to mobile devices, there is an interesting balancing act between their desirability and their practical value. Limited screen sizes, gesture/touch navigation and typing (in gloves?), and their physical vulnerability in a laboratory environment all conspire against the business case for their use; but for genuine mobility, using ‘apps’ that offer specific functionality tailored to small screens, there is genuine potential. The social media argument is an interesting one. Although in the consumer world social media serve an entire spectrum of good, bad and ugly usage, the underlying principles are extremely relevant to communication, sharing and collaboration. For this reason they do have significant potential for the laboratory, particularly if they can be incorporated into the controlled environment of the informatics portfolio. The ‘cultural’ issues of user adoption remain, of course, but using ‘push’ principles to inform, rather than the traditional ‘pull’ (the information is there, but you have to find it), would seem to capture the real benefits of social media.
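As a sketch of the ‘push’ principle, here is a hypothetical in-process publish/subscribe bus in which subscribers are notified of laboratory events as they happen, rather than having to go and find the information. All names (LabEventBus, the topic string, the run identifier) are invented for illustration:

from collections import defaultdict
from typing import Callable

class LabEventBus:
    """Hypothetical in-process publish/subscribe bus: events are pushed
    to subscribers as they happen, instead of being pulled on demand."""

    def __init__(self) -> None:
        self._subscribers: dict[str, list[Callable[[dict], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, event: dict) -> None:
        for handler in self._subscribers[topic]:
            handler(event)  # 'push': the bus calls the subscriber

bus = LabEventBus()
bus.subscribe("experiment.completed",
              lambda e: print(f"Notify team: run {e['run_id']} has finished"))
bus.publish("experiment.completed", {"run_id": "HPLC-042"})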