Instrument makers are uploading analytical data

Along with the usual array of shiny new instruments, there was a sense of something changing at this year’s Pittsburgh Conference on Analytical Chemistry and Applied Spectroscopy (Pittcon) in Chicago, US. Several of the larger instrument manufacturers resisted what they described as ‘Pittcon pressure’, and eschewed the traditional wave of new instrument launches to focus on software, with a lot of talk about ‘integration’ and ‘workflows’.

[Image: cloud-based systems concept illustration (source: © iStock). Caption: More and more instruments are connecting to the cloud]

Waters and Thermo Fisher Scientific were among the providers promoting ‘cloud-based’ systems for handling the sheer volume of data modern instruments can generate, alongside ‘infrastructure as a service’ packages that bundle cloud-based data storage and computing power with operational software. These offerings are aimed particularly at the growing number of small companies without dedicated IT resources.

A cynic might suggest this focus on eking improved performance out of existing machines, rather than developing new ones, is an effort to save on R&D costs. Or an attempt to cash in on the latest digital trend. That may or may not be so, but it’s also a recognition that many modern instruments are robust and have long service lifetimes, as well as requiring significant capital investment. Customers are not going to upgrade to the latest systems just to get a couple of extra bells and whistles. But if you can offer them a complete software package that improves the productivity of their existing machines and analysts, you might be able to tap into a different budget line…

That said, the diversity of instrumentation available – and the likelihood that customers will run instruments from several different manufacturers side by side – means that software providers are obliged to make their products as flexible as possible. Anyone attempting to lock in ‘brand loyalty’ with software that doesn’t interface with other providers’ machines is unlikely to succeed.

But simply moving data to ‘the cloud’ does not magically make it more manageable or more useful. The real goal of these systems is to increase productivity by providing more powerful tools that improve both the speed and the quality of the information extracted from the data. As the machines get cleverer, the theory goes, analysts can spend less time poring over data and more time doing experiments. Whether that proves true depends on how well the providers listen to the needs of their customers.