Distributed measurement systems

National Instruments Aust Pty Ltd

Tuesday, 04 November, 2014


In almost every industry, traditional centralised data acquisition systems are being replaced by more distributed networks of measurement devices. This trend has recently been joined by the migration of computing into the cloud, and together they will continue to change not only how we acquire data, but also how we store, access and analyse it.

These data acquisition trends have been fuelled in part by Moore’s Law and increasing computing power, but the bulk of the shift has been driven by trends in the consumer electronics industry that are changing how we interact with data. The benefits of these new architectures are numerous, including reduced capital, installation and maintenance costs; more powerful analytics; and the ability to access data from anywhere.

Historically, data acquisition systems were large and fragile. They had to be separated from the harsh environments that could be present around a device under test (DUT) or were just too large to be placed around a device. This drove the centralised architecture of data acquisition systems that has been traditionally used. In a centralised architecture, data acquisition equipment is stored in a central rack or control room, where there is ample space and it can be protected from the test. Sensor cables are then run (sometimes hundreds of metres) from this central location to sensors and actuators throughout the test fixture.

Distributed systems

As applications become more complex, these home-run sensor wiring approaches become more difficult and costly to implement. Once labour and capital costs are included, running sensor cable can often be the single largest line item in installing a new data acquisition system. The alternative to the traditional centralised data acquisition system is to fragment and distribute the system around your application and run a single, inexpensive network cable back to the server or control room for data transfer. For example, in a wind turbine, any wire running from the blades into the central housing must first pass through a slip ring so the blades can spin freely. The more wires run out to the blades, the more complex the slip ring system required, multiplying points of failure and driving up system cost.

These distributed systems break the data acquisition system into smaller subsystems that are placed around the DUT, often in the test environment and as close to the measurement sensor as possible. They interact with the DUT locally, receiving commands from, and sending data for logging back to, a central server where the test operator is located. Additionally, computing can also be distributed close to the DUT, allowing smart data reduction or localised control algorithms without flooding the network with data or commands.

Advantages of a distributed architecture

This architecture offers several advantages over a centralised system. By breaking the large centralised system into a modular, distributed system, you create smaller and cheaper subsystems that you can more easily maintain and replace, should one fail. A modular system is also much more flexible, as nodes can simply be added onto the network or swapped out if measurement needs change. This ease and low cost of repair means more uptime and higher reliability for a measurement system. In contrast, with a centralised system, any change in measurement requirements raises the prospect of replacing, reinstalling and rewiring expensive capital equipment.

A distributed architecture also reduces wiring cost by running a single communication cable to the distributed subsystems rather than laying possibly hundreds of sensor wires throughout your test cell. This reduction in cabling can lower costs and, more importantly, increase measurement accuracy, because the shorter sensor wires to the distributed systems are less prone to noise, interference and signal loss. A sensor wire acts as an antenna as it travels to the data acquisition system, picking up electrical interference present in the room from fluorescent lights, motors and other seemingly benign sources. This interference can be combated with techniques like shielded cabling and twisted-pair wires, but these only increase the cost of the cabling used.

As an example of the cost of sensor wiring, in aerospace structural test cells like those used at Boeing, Airbus and Embraer, wiring and cabling is often upwards of 25% of the total test cell hardware and software budget. By running standard Ethernet cable instead of sensor wire, these large costs can easily be cut in half.

Finally, a distributed system can help offload processing from the main central computer. Many distributed data acquisition systems also have onboard intelligence that can be used to run analysis or reduce data to key values before uploading it to the central system. This architecture allows the creation of task-specific nodes in your system, with some of the analytics done on the DAQ device and the user interface (UI) handled by a separate computer. Because much of its processing has been offloaded, the central computer can be substantially cheaper than in a centralised architecture, focusing solely on the UI and data storage.
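As an illustrative sketch (not any particular vendor’s API), the onboard data reduction described above might collapse each block of raw samples into a handful of key values before sending anything over the network:

```python
import json
import math

def reduce_block(samples):
    """Reduce a block of raw samples to key values before upload.

    Sending only these summaries, rather than every raw sample,
    keeps network traffic from the node to the central server low.
    """
    n = len(samples)
    rms = math.sqrt(sum(s * s for s in samples) / n)
    return {
        "min": min(samples),
        "max": max(samples),
        "mean": sum(samples) / n,
        "rms": rms,
        "count": n,
    }

# Simulated block from a local ADC read (stand-in for real hardware I/O)
raw = [0.0, 1.0, -1.0, 0.5, -0.5]
payload = json.dumps(reduce_block(raw))  # what the node would transmit
print(payload)
```

Five raw readings shrink to one small JSON payload; on a node streaming thousands of samples per second, this is the difference between flooding the network and barely using it.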

Driven by smaller, cheaper and more powerful processing

Until recently, the benefits of implementing a distributed system were outweighed by the high cost of hardware small and powerful enough to embed throughout a test fixture. Over the past decade, however, the falling cost and size of processing power have driven down the price of distributed data acquisition hardware and fuelled adoption of this more efficient and flexible architecture. As processors and analog-to-digital converters (ADCs) become smaller, cheaper and more capable, they can more easily be embedded into small subsystems. Data acquisition systems no longer need the large amount of space available only in a centralised installation; they can now fit in packages small enough to distribute around a test, taking advantage of a distributed architecture and reduced sensor wiring.

Consumer trends

The advances in processor and ADC power and size have been greatly accelerated in the past five years by the explosion of embedded consumer devices. Triggered by smartphones, embedded processors are now prevalent in most consumer products, from thermostats to refrigerators. This massive increase in deployment has driven semiconductor manufacturers to further optimise their products for small, embedded systems. The same technological advancement can then be leveraged by data acquisition vendors that use common off-the-shelf parts to build more capable and cost-effective distributed products.

Additionally, this growth of embedded consumer devices is also changing our expectations about how we interact with electronic devices. Home computing tasks that used to be relegated to a single device - the family computer - are now distributed to different task-optimised products. Internet browsing is done on a tablet, pictures are stored on media servers or in the cloud, and videos and movies are watched on internet-capable TVs. By distributing the computing power, we have created task-specific products that are more efficient from both a productivity and cost perspective.

This trend is not staying at home either. As noted by Adam Richardson in Innovation X, business customer expectations are largely driven by the sum of a person’s experiences, including those in the consumer world. As engineers, scientists and technicians become more accustomed to interacting with this more distributed style of computing, they will begin to expect it more and more in the lab, and data acquisition companies are now, more than ever, capable of delivering on that expectation.

To the cloud

Recently, these trends have coupled with a new development in both business and consumer technology that is taking distributed data acquisition systems even further: cloud computing. By placing saved data in the cloud, whether in an internal, private cloud or a public one like Microsoft’s Azure, three advantages are gained: near-infinite processing power, near-infinite storage, and the ability to access the data from anywhere.

One of the key features of cloud computing is that it abstracts away individual processors, presenting itself to you and your data as a computer with effectively unlimited cores, capable of running analyses that could never be done on a single machine. This allows you to further optimise your distributed system by placing processing-intensive tasks in the cloud that could not run on a distributed DAQ node, nor on the central computer. Analysis that formerly locked up a system for hours or days can now be offloaded, leaving the computer free to continue collecting data or perform lighter calculations.
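The partition-dispatch-gather pattern behind this kind of offloading can be sketched in a few lines. Here a local worker pool stands in for cloud compute, and the analysis function is a hypothetical placeholder; the pattern is the same whichever backend runs the workers:

```python
from concurrent.futures import ThreadPoolExecutor

def analyse_chunk(chunk):
    # Placeholder for a processing-intensive task (e.g. an FFT or model fit)
    return sum(x * x for x in chunk)

def fan_out(record, chunk_size=4):
    """Split a long record into chunks and fan analysis out to workers.

    A local thread pool stands in here for cloud workers; the pattern
    (partition, dispatch in parallel, gather partial results) carries
    over unchanged to a real cloud backend.
    """
    chunks = [record[i:i + chunk_size] for i in range(0, len(record), chunk_size)]
    with ThreadPoolExecutor() as pool:
        partials = list(pool.map(analyse_chunk, chunks))
    return sum(partials)

print(fan_out(list(range(10))))  # prints 285, the sum of squares of 0..9
```

Because each chunk is independent, adding workers shortens the analysis roughly linearly, which is exactly the property that makes the cloud's "infinite cores" useful.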

The main advantage of cloud computing, however, is the ability to store virtually limitless amounts of data and to access that data from anywhere. Rather than being restricted to accessing data only on the computer that acquired it, you could envision a system where you scan a QR code on a wheel subassembly in a test cell with your smartphone and instantly see its entire test history. Cloud computing has the potential to shift the entire data acquisition workflow to a much more efficient model.

A distributed future

The future will see embedded systems continue to drive down the price and size of data acquisition systems, while increased consumer familiarity with smart devices and cloud computing will raise the expectation of task-specific nodes. Both of these trends will push data acquisition systems further into the field, away from the old, centralised design and towards a more efficient and effective distributed architecture. I would encourage you to begin evaluating your own data acquisition systems and consider whether it is time to evolve to distributed measurement systems.
