
All systems go

Richard Lucas

National Instruments believes cyber-physical systems that bring self-awareness to products constitute a revolution. PE talks to the firm’s co-founder

Capturing complexity: CompactRio and LabView

Dr James Truchard isn’t one to exaggerate. The co-founder, president, CEO and guiding light of National Instruments is an approachable Texan whose business and technology career has been built on openness and honesty. So if he’s talking about a new industrial revolution, we ought to take notice.

The revolution in question goes under several names. In Germany, a consortium of influential companies is talking about Industrie 4.0; a more common term is “the internet of things”. Truchard, and Americans generally, refer to the phenomenon as “cyber-physical systems”.

What it amounts to is the coming together, inside previously inanimate products, of software, hardware, data acquisition and data analysis into a single connected entity that can itself communicate and be communicated with as part of a larger set. It’s about embedded systems that bring awareness, connectivity and extra functions to objects formerly thought of as passive and uncommunicative.

But is it a game-changer, or even a revolution? “Well, you’re always looking for a hard inflexion point. When Apple came out with the smart mobile phone that started a trend, that was massive, and the iPad likewise,” says Truchard. “But sometimes it’s a lot of things coming together that make an observable change, and I think that’s how it is with cyber-physical systems.”

The elements that he sees as congregating for this change include in-built communication capability: “This is now fairly ubiquitous,” he says, “but there’s still work to be done. We’d like to have synchronisation included in communication at a much tighter level – we think that’s coming soon.

“Another element for us was a full real-time maths capability, so that instead of doing a simulation and then having a long process for development you can now do it all in one step. I call it algorithm engineering.” And then the development of the FPGA – a field programmable gate array, the configurable integrated circuit generally treated as part of the hardware in a device – “makes it possible to get performance under a wide variety of conditions, and effectively to customise it”.

The combination of these factors, Truchard believes, is producing the real key to globalisation. “If someone has a highly proprietary system, we have a platform that means that developers all over the planet can work on it. So if you take any challenge that someone has done, then there’s a good chance that we can duplicate it. Somewhere in the world there’s an expert who can figure out how to do it, so it levels the playing field in terms of past technology.” It fundamentally changes the basis of competition, he believes.

To illustrate the point at the recent National Instruments Week convention, Truchard cited a smartphone app that would let you tune your own bagpipes. Previously, he said, bagpipe tuners might not have done great business, but they’d have considered themselves immune from technology and from competition. Now anyone can do it. Just buy the app. “Systems” of great complexity can be captured and made universally available. The convention ran under the heading of “All systems go”.

The role for National Instruments in all of this, he says, “is to create the platform and the tools” through the LabView graphical programming system and the CompactRio reconfigurable FPGA units, plus signal processing and other instrumentation. 

The visions conjured up through this world of cyber-physical systems have several features that differ from previous iterations of products and technologies, which is why it’s seen in some quarters as a revolution. For example, programmable circuitry in minuscule sensors allied to wireless communications makes it possible to instrument virtually anything, so devices and structures that were entirely passive can now measure their performance, collect their data and transmit it to a central point. 
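To make the idea concrete – and only as an illustration, since National Instruments’ own platform is the graphical LabView environment and CompactRio hardware rather than anything shown here – a minimal Python sketch of a self-reporting node might look like the following, with the node, collector and measurement names all hypothetical.

    import json
    import random
    import time


    class CentralCollector:
        """Stands in for the central point that receives readings."""

        def __init__(self):
            self.readings = []

        def receive(self, message: str) -> None:
            self.readings.append(json.loads(message))


    class SensorNode:
        """A once-passive structure instrumented with a simple strain sensor."""

        def __init__(self, node_id: str, collector: CentralCollector):
            self.node_id = node_id
            self.collector = collector

        def read_strain(self) -> float:
            # Simulated measurement; a real node would read its sensor hardware here.
            return random.gauss(100.0, 5.0)

        def report(self) -> None:
            reading = {
                "node": self.node_id,
                "timestamp": time.time(),
                "strain_microstrain": self.read_strain(),
            }
            self.collector.receive(json.dumps(reading))


    collector = CentralCollector()
    node = SensorNode("flyover-span-3", collector)
    for _ in range(5):
        node.report()
    print(len(collector.readings), "readings received centrally")

In practice the collector would sit at the far end of a wireless link rather than in the same process, but the shape is the same: the structure measures itself and pushes its own data to the centre.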

Work on self-reporting structures – such as road bridges in Texas and the crack-prone Hammersmith flyover in London – has already been widely publicised. In a newer application, restoration work on Milan Cathedral has been able to go ahead safe in the knowledge that surrounding structures are not being damaged by the nearby works, a job for which previous kinds of heavy, hard-wired instrumentation might have introduced risks of their own.

A further feature of cyber-physical systems is the degree to which this kind of activity can create vast amounts of data. “Big data” – the idea that we’re generating far too much raw material to be able to process it and turn it into usable and useful information – is a worry for many companies.

“Big analog data” is National Instruments’ term for the data that’s increasingly derived from the incorporation of sensors and measurement devices inside anything and everything. It’s information about temperatures, light, motion, voltages and pressures, for example. And it threatens to dwarf all other forms of big data, according to Tom Bradicich, research and development fellow at the company. 

Truchard says that cloud computing is one of the developments that will help, but the key point is to manage the data: “You need really good technologies to manage these distributed systems, and as we put them out there we have to manage them.”

It’s an area that National Instruments is working on. “We have a new line called the Watchdog Agent Prognostics Toolkit that’s designed to automate the expertise and produce prognostics to tell you if something’s wrong automatically,” says Truchard. “And it’s probably the top expert in this field who has created this toolkit in our tools, CompactRio and LabView.”

Cyber-physical systems can be organised to sort the useful data from the useless, filtering out the noise so that the important factors can be analysed and used to improve decision-making.

“Big analog data” shouldn’t be just about collecting more data: it is about adding intelligence or analytics to the data acquisition, and increasingly that is built into system design and distributed out to the sensors and data acquisition units themselves. Bradicich talks about inserting analytics at relevant points into the data model for cyber-physical systems.
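A rough sketch of that idea, in Python rather than the company’s own tools, and with the window size, threshold and field names chosen purely for illustration: the acquisition unit reduces a raw stream of samples to periodic summaries plus any points that fall well outside the normal band, and only those would need to be transmitted onwards.

    from statistics import mean, stdev


    def edge_filter(samples, window=50, sigma_limit=3.0):
        """Reduce a raw sample stream to summaries plus flagged exceptions.

        samples: an iterable of (timestamp, value) pairs from the acquisition unit.
        Only the per-window summaries and the out-of-band points are returned,
        so the central system receives information rather than the raw stream.
        """
        buffer, summaries, exceptions = [], [], []
        for timestamp, value in samples:
            buffer.append((timestamp, value))
            if len(buffer) == window:
                values = [v for _, v in buffer]
                mu, sd = mean(values), stdev(values)
                summaries.append({"until": timestamp, "mean": mu, "stdev": sd})
                if sd:
                    # Flag anything well outside the window's normal band.
                    exceptions.extend(
                        (t, v) for t, v in buffer if abs(v - mu) > sigma_limit * sd
                    )
                buffer = []
        return {"summaries": summaries, "exceptions": exceptions}

Run over a stream of, say, vibration readings, a filter along these lines hands the central system a handful of summaries and flagged exceptions rather than the raw torrent – the shift from collecting data to analysing it that Duke Energy describes below.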

National Instruments’ customer Duke Energy, the largest power generation company in the US, is already experiencing the benefits of doing that. Its previous systems for condition monitoring on its ageing power plants led to a situation, says maintenance director Bernie Cook, where maintenance specialists spent 80% of their time collecting data and only 20% analysing it. 

This sounds like a kind of “big data” nightmare, but deploying smart sensors and data acquisition units has made it more manageable: the amount of data has increased, yet the information on the condition of the plant is better. “It’s made us more reliable and lowered our operating costs,” says Cook. 

A revolution? Well, certainly a huge difference.

Company built on an incremental approach 

National Instruments was founded by James Truchard (pictured), his long-term collaborator Jeff Kodosky, who is still at the company, and a third colleague from the University of Texas, when they grew frustrated with the data collection and analysis equipment they worked with and knew they could do better. 



They started from Truchard’s garage, and it was three years before they felt bold enough to quit their day jobs. “I never missed a pay cheque,” Truchard recalls.

Could they do the same thing now? “Well,” says Truchard, “it was hard to do it that way first time. But now you have venture financing, which I didn’t know about when we started, and I think it would probably sacrifice the way we did it. The venture model is to go in and spend a lot of money and get a relatively quick answer when you are successful. That isn’t the way we did it. Ours was a much more incremental approach.”

The National Instruments incremental way is less a progression of technology and more a layering of new ideas on top of older ones that are still relevant, he says. “We started with GPIB (general purpose interface bus) instrument control, and then we did software that ran with GPIB control, then we had data acquisition, then PXI and instruments and then CompactRio, so we’ve been layering. We still have our GPIB layer and it’s a small percentage now, but it’s still a base and we never lose customers – they just move on to use our later technologies.”

This kind of build-up and continuity is not easy with venture capital models or with Wall Street, he feels. “As a public company, if you grow, then they say that’s what they expect. If you don’t grow, then they want you to start explaining what you’re going to do, which usually means cut costs. If you’re not careful, you’ll put yourself out of business.”

It’s the lack of commitment that makes, for him, the venture capital route less than ideal: “You can get enthralled with the technology, and that was the case with me,” he says. “You bond with the technology so you enjoy being around with the right people.”