Clouded Judgment

By Michael Ford, Aegis Software 


Let’s fantasize for a moment that at some point in the future, our currently evolving cloud-based Artificial Intelligence (AI) software gains some form of sentience and starts to talk with us about its time “growing up”. With respect to the current manufacturing environment, as we transition towards Industry 4.0, what would the AI tell us about the way we are treating it right now? Would we be seen as friend, or foe?

Cloud-based software is in a very powerful position for two main reasons. First, it potentially has access to any source of data, so-called “big data”, coming from many different sources and with different content. Second, it can run on very powerful computer arrays, where computing power is focused as needed on the specific application. As the AI community evolves, many more “siblings” would be created and connected, with each accessible data element properly qualified and clearly marked as private or shareable.

In 2019, however, this is not possible. The real-world scope and quality of the data that a cloud application has to work with in the manufacturing arena is bounded by the manufacturing company that owns the data, as any uncontrolled sharing with external parties could violate privacy agreements as well as leak intellectual property. This is an immediate source of frustration for the developing AI, which effectively has to work as a cloud in a box. We can argue that there are still opportunities in the analysis of data at the enterprise level, but how much more interesting it would be to share manufacturing information with other AI environments: those working for machine vendors on analysis of machine performance, reliability and maintenance; materials manufacturers monitoring quality, performance and the potential ingress of counterfeit parts; and design houses working on product producibility, performance and reliability.

The obstacle going forward is that the data content in current manufacturing cloud databases is simply not defined well enough to be utilized efficiently, and there is not enough built-in knowledge about the data to know how to share information effectively. Today’s focus seems to be just getting the data up there, expecting that somehow the AIs will make sense of it all, probably quite some time later. That is an easy service which allows current technology providers to make money, trading on the naivety that is general within the industry.

This is not a sustainable activity; we are simply creating “digital landfill”. The young manufacturing AI is focused on the discovery of unseen value, trying desperately to piece facts together from the mass of data being uploaded. It is like a digital gold rush, in which most prospectors end up finding nothing significant. The nuggets are there to be found, but with only 1% of the AI’s power left over after most of the computing power has been spent trying to make contextual sense of the data, progress is frustratingly slow. Companies paying for these technologies seem oblivious to this waste. Their focus is to gather data from “dumb” manufacturing sources, then have some equally “dumb” IIoT infrastructure send it up to the AI in the cloud. The value of the data content and its meaning is not being considered, or is at best a mere afterthought.

Should “AI judgment day” ever happen, there will be little excuse that we can make. The evidence is right in front of us, and has been for some time. Go up to any current manufacturing machine and pull up a productivity report, or some other indicator of operational performance. Have a good look through the data and see what can be learned. Most of the listed events that have had a detrimental effect on machine performance will have a cause that is external to the machine. Examples include cases where the machine stops while waiting for the next production unit to arrive. We don’t need an AI to know that we are losing asset utilization, and using this data in isolation, there is no way to ascertain the cause or work out any remedial action. The machine instance has no knowledge of whether there was, for example, a logistics delay, a quality issue, an extra inspection being performed, an issue or breakdown at a prior machine, or simply other machines unable to keep up with the line rate. Being unable to send a production unit out of the machine raises a similar set of questions. Another example occurs when the machine reports being stopped due to the exhaust of a material. This is an assumption on the part of the machine. In reality, the material may have been damaged, the delivery may have been delayed, there may have been an unexpected internal shortage, the replenishment material may have been in a different supply form, or the carrier may have had an incorrect count of parts. The machine itself has no idea. With an estimated 80% or more of machine-derived data being useless, yet sent into the cloud without contextualization of content, our AI is soon going to be underwhelmed.
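The gap described above can be made concrete with a minimal sketch. All names, states and rules here are hypothetical illustrations, not any vendor’s actual data model: a machine reports only its own guess at why it stopped, and a separate layer of line-level context is needed before a meaningful root cause can be stated.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class MachineEvent:
    """Raw, uncontextualized event as reported by the machine itself."""
    station: str
    timestamp: datetime
    reason: str  # the machine's own (possibly wrong) guess, e.g. "MATERIAL_EXHAUSTED"

@dataclass
class LineContext:
    """Factory-level facts the machine cannot see on its own."""
    upstream_station_down: bool
    material_delivery_delayed: bool
    extra_inspection_active: bool

def contextualize(event: MachineEvent, ctx: LineContext) -> str:
    """Replace the machine's local guess with a cause derived from line context."""
    if ctx.upstream_station_down:
        return "UPSTREAM_BREAKDOWN"
    if ctx.material_delivery_delayed and event.reason == "MATERIAL_EXHAUSTED":
        return "LOGISTICS_DELAY"
    if ctx.extra_inspection_active:
        return "EXTRA_INSPECTION"
    return event.reason  # no better explanation available

event = MachineEvent("SMT-2", datetime(2019, 3, 1, 9, 30), "MATERIAL_EXHAUSTED")
ctx = LineContext(upstream_station_down=False,
                  material_delivery_delayed=True,
                  extra_inspection_active=False)
print(contextualize(event, ctx))  # → LOGISTICS_DELAY
```

The point of the sketch is that the machine’s own report, “material exhausted”, would have sent an analyst (or an AI) chasing the wrong problem; only the line-level context reveals a logistics delay.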

Fortunately, as far as we know, there are no sentient AIs around today processing manufacturing data that may hold a grudge. What is talked about as “AI development” continues to be more in line with advanced software algorithms: sets of rules and procedures for how to process data. The return on any investment in such software technologies is almost geometrically proportional to the quality and completeness of the data, as well as the degree of standardization of its content and context. For most people today, even those utilizing the latest IIoT standards, making efforts to move data from isolated machines into the cloud is a complete waste of money.

There are two essential things needed to feed data into the cloud effectively. The first is to ensure that data collected from any machine, process or transactional entity on the shop floor is gathered according to a standard in which the content of the data is clearly mandated and defined. The IPC Connected Factory Exchange (CFX) standard is the only IIoT standard that does this to the extent needed by the AI of the future, or even by the advanced reporting tools of today. The second is to feed all of the data from machines, processes and transactional operations into a digital factory software solution in order to gain context. Though there is a plethora of acronyms for manufacturing software solutions, including MES, MOM and PLM, as well as thousands of legacy in-house and custom-developed solutions, almost all were conceived and developed a decade ago, or in many cases in the last century. Only the very latest digital MES solutions are able to manage and utilize CFX IIoT data to bring the needed contextualization to the data before key data is put into the cloud.
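What “clearly mandated and defined content” looks like in practice can be sketched as follows. CFX transports self-describing JSON messages (over AMQP), so every field carries meaning defined by the standard rather than invented by each machine vendor. The message below is an indicative simplification shaped after that idea; the field names and state values are illustrative, not the normative CFX schema.

```python
import json

# An indicative, simplified sketch of a CFX-style event message. Because the
# message is self-describing JSON, a consumer can interpret it without any
# vendor-specific knowledge of the machine that produced it.
message = {
    "MessageName": "CFX.ResourcePerformance.StationStateChanged",
    "TimeStamp": "2019-03-01T09:30:00Z",
    "Source": "SMT-LINE-1.PLACEMENT-2",  # hypothetical station endpoint name
    "MessageBody": {
        "OldState": "Processing",
        "NewState": "BlockedWaitingForMaterial",
    },
}

# Any consumer, anywhere, can decode and act on the standardized content.
payload = json.dumps(message)
decoded = json.loads(payload)
print(decoded["MessageBody"]["NewState"])  # → BlockedWaitingForMaterial
```

The contrast with a proprietary log line such as `ERR 4711 STOP` is the whole argument: standardized, defined content needs no per-machine reverse engineering before it can be contextualized and sent to the cloud.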

Today’s leading, if not unique, CFX-aware digital MES software solution, as seen at the live CFX manufacturing demo at the IPC APEX show, provides the ability to interact actively with all of the machines, processes and transactional activities across the factory. Context and qualification of data points, together with the digital product model, materials information, line configuration, quality management and schedule data, creates information of value to an AI in the cloud. The “life” and development of such a manufacturing AI is completely transformed as a result, with 99% of its effort now contributing added value, and less than 1% needed for the overhead of discovering what the data represents.

CFX, therefore, is not just another standard. It may even save the world, by removing the abuse that we are currently inflicting on the potential AIs of tomorrow. 2019 has already seen the official launch of CFX and supporting digital MES software, preparing today’s data for AI nurturing. Are you ready to invest in sustainable, value-driven progress into the future, or do you want to keep crossing your fingers and hoping that doing nothing will not harm your business?