The burgeoning rate of adoption of sensors and connected devices has led to a proliferation of data points. These data points can be of various types – from machine data to quality data, process data to material data. And that is just the shop floor itself. A myriad of other systems within an organisation provide supply chain intelligence, financial information, service histories – and with smart products, feedback from customers.
Big data is essential to improving productivity and efficiency, and to uncovering new insights that drive innovation. With big data analytics, manufacturers can discover new information and identify patterns that enable them to improve processes, increase supply chain efficiency and pinpoint the variables that affect production.
PLM as the digital backbone
According to Dresner Advisory Services' recently published research study, The State of BI, Data and Analytics in Manufacturing, 89% of the manufacturers who have analytics and business intelligence (BI) initiatives consider them successful, outpacing their peers in comparable industries. In addition, 49% of manufacturers expect their analytics and BI budgets to increase year-on-year, with 62% of all businesses saying self-service BI was essential to them in 2020.
“Data is the difference between being a competitive company and not being a competitive company,” says Paul Haimes, vice-president of solutions consulting at PTC. “As a manufacturer, or an engineering group, if you do not have control of your data, or are not generating and capturing the right data, your ability to innovate and drive efficiency, both internally and with your product, is going to become increasingly more difficult.”
For manufacturers, the backbone of their data acquisition is the PLM system. Haimes says that there is a significant investment today in shoring up the PLM capabilities within manufacturing enterprises.
“They are getting their digital house in order so that they have a view of the product information from an engineering perspective. That is seen as a fundamental piece of this notion of digital transformation and ‘digital thread’.”
PLM systems are not a novelty – many companies started their PLM journey 10 or even 15 years ago, but they know there is still work to be done to get control of the data they need. “PLM spend is increasing significantly,” Haimes says. “It is crucial to get this backbone right because if you do not, the future of the business is going to be increasingly more complex.”
Common data environment
In the connected factory environment, collecting data is typically not the problem. The challenge, which has always been problematic, is organising and combining the data into something useful and meaningful, so that an enterprise can act on it and make decisions against it. The difficulty stems in part from the ISA-95 stack, the information model used to define the interface between control functions and other enterprise functions.
“The difficulty is getting data from different sources in a single composite environment,” says Haimes. “That problem has existed for many, many years. There is plenty of data on the shop floor, but surfacing it, combining it with enterprise data, combining it with your PLM and ERP data is a challenge. It is difficult to get information out of different system levels so that you can bring that together in a dashboard.”
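That joining of sources can be pictured in a few lines of Python. The sketch below is illustrative only: the tables, column names (part_id, cycle_time_s and so on) and values are invented for the example, not drawn from any real PLM or ERP system, but the pattern – merging shop-floor, PLM and ERP records on a shared key to produce one dashboard-ready view – is the one Haimes describes.

```python
import pandas as pd

# Hypothetical samples of three data silos. All names and numbers are
# illustrative, not taken from any real system.
shop_floor = pd.DataFrame({
    "part_id": ["P-100", "P-101", "P-102"],
    "machine_id": ["M1", "M2", "M1"],
    "cycle_time_s": [42.1, 39.8, 44.5],
})
plm = pd.DataFrame({
    "part_id": ["P-100", "P-101", "P-102"],
    "revision": ["B", "C", "B"],
    "target_cycle_time_s": [40.0, 40.0, 40.0],
})
erp = pd.DataFrame({
    "part_id": ["P-100", "P-101", "P-102"],
    "order_qty": [500, 250, 1000],
})

# Join the silos on a shared key to produce one composite view.
view = shop_floor.merge(plm, on="part_id").merge(erp, on="part_id")

# Flag parts running slower than their PLM target cycle time.
view["over_target"] = view["cycle_time_s"] > view["target_cycle_time_s"]
print(view[["part_id", "revision", "cycle_time_s", "over_target"]])
```

In practice the hard part is not the join itself but agreeing a shared key and surfacing the data from each system level in the first place.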
Another complexity is ‘dark data’. Much of the data collected simply resides in huge ‘lakes’ that are often nothing more than a dumping ground with no structure to the data. This makes it difficult for the business intelligence team to extract any value. “You have got to get in there, find what you are looking for, and pull that information together,” Haimes says. “There are big departments in almost every company doing that today. That volume of data can hamper the ability to get quick insights, and therefore drive better decision making.
“On the flip side of that, when you are attempting AI, or machine learning, generally speaking, the more data you have, the better the opportunity you have got, providing you have got the compute power to process it. The more data you have got, the more likely you are to be able to establish patterns and a reliable machine learning model that you can run your business off from a predictive or prescriptive perspective.”
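The statistical intuition behind that claim can be shown with a toy simulation. This sketch uses only made-up synthetic readings; it simply demonstrates that an estimate of an underlying process value becomes more reliable as the volume of data grows, which is the same effect that makes larger datasets yield more dependable machine learning models.

```python
import random
import statistics

# Simulated noisy sensor readings around a known process mean of 5.0.
# Purely illustrative: more samples -> a more reliable estimate.
random.seed(42)

true_mean = 5.0
sample_sizes = (10, 1_000, 100_000)
errors = []
for n in sample_sizes:
    readings = [random.gauss(true_mean, 2.0) for _ in range(n)]
    errors.append(abs(statistics.mean(readings) - true_mean))

for n, err in zip(sample_sizes, errors):
    print(f"{n:>7} samples -> estimate off by {err:.3f}")
```

The same logic scales up: with enough compute to process it, more data gives a model more chances to separate genuine patterns from noise.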
The pathway to valuable data
According to Haimes, the path to successful data management follows the three interrelated themes of digital transformation, the connected factory and connected product. “The digital transformation relies upon a digital thread within your business, which is made up of three pillars as its three primary sources of information. First is connected engineering, the PLM environment. This comprises detailed product data, supply chain data, manufacturing information and quality data.
“Then there is the connected factory that pulls all that shop floor data up to a layer within your business, so that it becomes accessible and visible.
“The third pillar is the connected product. The rich source of information about what is happening to the product that you have made and sent out to your customer. What are they doing with it? How is it performing? When does it need servicing?”
All those points need to come together to form the digital thread, he adds. “We tell customers that they need to start by getting control of those existing silos of information that they have throughout their value chain, and to start making what existing data they have available.
“That is where we see IoT platforms such as ThingWorx playing a crucial role by separating the system of record from the system of engagement with the human and taking away the complexity of those existing, complex business systems – for example, the ERP environment.
“Only by linking that product performance data back to the manufacturing data and understanding the genealogy of the product, are you ever going to be able to iterate on what becomes the centreline view of your product. It is that notion of bringing together those data sources, providing those marginal gains that allow you to home in on the most perfect view of the product you can manufacture.”
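The idea of separating the system of record from the system of engagement can be sketched as a simple facade. The class and method names below are hypothetical, chosen for the example – this is not the ThingWorx API – but it shows the pattern: a thin layer that presents one simple call to the user while hiding the complexity of the back-end systems of record.

```python
# A minimal facade sketch of a "system of engagement": one simple read
# interface in front of hypothetical PLM and ERP systems of record.
# All names and data here are illustrative.

class ProductView:
    """Unified read-only view over hypothetical PLM and ERP back ends."""

    def __init__(self, plm_records, erp_records):
        self._plm = plm_records   # e.g. part revisions keyed by part id
        self._erp = erp_records   # e.g. open order quantities keyed by part id

    def summary(self, part_id):
        # One call for the user, two systems of record behind it.
        return {
            "part_id": part_id,
            "revision": self._plm.get(part_id, "unknown"),
            "open_orders": self._erp.get(part_id, 0),
        }

view = ProductView({"P-100": "B"}, {"P-100": 500})
print(view.summary("P-100"))
```

The design point is that the complex business systems stay untouched underneath; only the layer the human engages with is simplified.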
Adding the ‘people element’
Secure communication, and with it the secure exchange of data between operational technology (OT) and information technology (IT), is the backbone of digitalisation. The concept has been well understood since the onset of Industry 4.0, but it remains an area of contention.
“My perception is that OT needs to feed the IT infrastructure,” says Haimes. “For many years, OT was the factory floor environment that the factory team understood. They set it up, it worked, products were made, and that was their responsibility. Today, that group has got to take on effective communication from a network perspective. You must be able to get that OT data into the IT world, which is complex because there are certain network considerations and security issues that are not necessarily well known to the IT organisation, but have been set up on the OT side for good reason, to protect equipment. When we talk about that IT/OT convergence, what we typically see are more difficult environments to get the data off the shop floor and elevated into that IT layer.”
The future for data
Edge computing is a concept that has been growing in significance as manufacturers follow the path to a digital future. It allows manufacturers to process and filter data locally, reducing the amount sent to a central server, whether on site or in the cloud. Industry 4.0, as it stands today, faces some considerable roadblocks, and computing at the edge can clear many of them.
“We recently met with a customer looking at different use cases that they have on the shop floor,” Haimes adds. “The discussion was around whether it made sense to push data to a data centre, or to do edge computing. In many cases, it is a function of the frequency that you need to adjust on the shop floor.
“If it is a statistical process control use case, the frequency required and the cause correction happens many times a second, perhaps even hundreds of times a second. There simply is not the bandwidth, or a low enough latency, to allow you to push that information out to a data centre, get the answer back and make it work without significant investment. For me, that is a perfect example where edge compute capability sits well alongside the equipment. But equally, there are plenty of other use cases where using a data centre is fine.”
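The edge-side filtering that makes this work can be sketched briefly. The control limits and readings below are invented numbers, not from any real process, but the pattern is the standard one: evaluate each high-frequency sample locally against statistical process control limits, and forward only the out-of-control events upstream, rather than every sample.

```python
# Illustrative edge-side filter: check each reading locally against
# +/- 3-sigma control limits and forward only the excursions.
# Centre line, sigma and readings are made-up example values.

def edge_filter(readings, centre, sigma):
    """Return only the readings outside the control limits."""
    upper = centre + 3 * sigma
    lower = centre - 3 * sigma
    return [r for r in readings if r > upper or r < lower]

# Simulated high-frequency samples: mostly in control, two excursions.
samples = [10.0, 10.1, 9.9, 10.2, 13.5, 10.0, 9.8, 6.2, 10.1]
alerts = edge_filter(samples, centre=10.0, sigma=0.5)
print(alerts)  # only the out-of-control readings leave the edge device
```

At hundreds of samples a second, this kind of local check is what removes the round trip to the data centre from the control loop.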
Moving forwards, consolidating all that data in one location will make it quicker to extract insights from it, enabling a company to undertake the business intelligence work either in the cloud or in a data centre.
“However you want to get the data to work for you as a company, I think that must be the next big step, both from a connected engineering perspective with PLM, and from a CAD environment,” Haimes concludes. “It’s interesting that engineering and manufacturing departments are likely to be amongst the last to adopt cloud capabilities, but this is primarily down to data security and IP concerns. In time, the flexibility of scaling and sizing on cloud makes it particularly attractive to anybody with a digital thread initiative within their business.”
Content published by Professional Engineering does not necessarily represent the views of the Institution of Mechanical Engineers.