Gaining Intelligence from Contextualized Data
by Anh Nguyen, CTO, Creative Electron Inc.
Data, data everywhere, but what to do with it?
We are inundated and often overwhelmed by data. It is everywhere, from the outputs of system-level software to the granular data we get from each piece of equipment on the SMT line. The problem is not getting more data; in fact, we are flooded with data. What we don’t have is intelligence from that data. This is where we need to consider contextualizing our data and adding more dimensions, or layers.
In diagnosing a disease, a doctor might have a single blood test reading, but without context it is difficult, if not impossible, to make an accurate diagnosis from that single sample. If we add the dimension of time and date, we can now see a trend in that reading that may have implications. Additional dimensions or layers, such as the typical reading for similar patients, can be added and compared across groups of patients who go on to display other symptoms and those who don’t. We can also add family history to create a multi-dimensional picture from the data. These layers help us make better decisions, and having them at our fingertips in real time allows us to make those decisions faster. This is how humans make decisions: by reviewing multiple data points simultaneously. This is how we derive intelligence from data.
In the world of electronics production, data becomes intelligence when it can be used to improve the performance of a process or a group of processes. At the end of the day, this intelligence is used to drive business goals such as improving efficiency and quality. For data to become information, it needs to be contextualized. For example, an x-ray inspection system can give you a data point that a specific BGA has 30% voiding, but you don’t know whether that is good or bad, or whether you should pass or fail that specific joint. It’s one data point, lost in time and space.
The first stage in contextualizing the data is to add a time domain. That’s where statistical process control comes in. Instead of looking at individual pieces of data, isolated in time and space, what statistical process control does is contextualize each measurement over time. Now we can see whether the reading we take is extraordinary or part of a trend of decaying performance that needs corrective action.
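As a minimal sketch of that first dimension, the following places a single void-percentage reading in the context of its own history using conventional 3-sigma control limits. The historical values and the `in_control` helper are illustrative assumptions, not the output of any real inspection system.

```python
# SPC sketch: contextualize one reading against the history of readings
# for the same joint type, using 3-sigma control limits.
# All numbers here are hypothetical examples, not real inspection data.
from statistics import mean, stdev

def control_limits(history, sigmas=3):
    """Return (lower, upper) control limits from historical readings."""
    mu = mean(history)
    sd = stdev(history)
    return mu - sigmas * sd, mu + sigmas * sd

def in_control(reading, history):
    """True if the reading falls inside the historical control limits."""
    low, high = control_limits(history)
    return low <= reading <= high

# Hypothetical historical void percentages for one BGA joint type:
history = [22.0, 24.5, 23.1, 25.0, 21.8, 24.2, 23.7, 22.9]

# The isolated 30% reading from the article, now seen in context:
print(in_control(30.0, history))  # → False (outside the control limits)
```

The same 30% figure that meant nothing on its own is now flagged as extraordinary relative to the process’s own history.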
Industry 4.0, and the widespread ability of production and inspection equipment to produce data, offer another dimension of contextualization: we can now contextualize our data against data derived from other equipment. Now we can see our void data within the context of the whole line, and whether the part meets the criteria of the other pieces of inspection equipment.
We can add as many dimensions as we have available to build this multidimensional environment, a multidimensional analysis engine for the manufacturing floor, one dimension at a time. For example, a third dimension can be incoming inspection results from goods-in. We might determine that an issue is associated with a package type, vendor, or batch. Let’s say we detect excess voiding in a specific BGA, even though the SPC (Statistical Process Control) check passes. The second dimension, from other machines, passes as well, but the results are marginal: a pass, yet a slight outlier from where they should be. The third dimension tells us that the specific component might be the suspect. Those three dimensions provide progressively more powerful information, so three data outputs together form a more powerful data set for deciding whether you should ultimately pass or fail that sample.
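The three-dimension walkthrough above can be sketched as a simple decision rule. The thresholds, field names, and the "review" disposition are hypothetical choices made for illustration; a real line would tune these against its own process data.

```python
# Hedged sketch of a three-dimension disposition, per the scenario in
# the text: SPC passes, other machines pass but are marginal, and
# incoming inspection implicates the component batch.
def disposition(spc_in_control, other_machines_margin, batch_flagged):
    """
    spc_in_control: dimension 1 -- is the void reading inside SPC limits?
    other_machines_margin: dimension 2 -- how close the other inspection
        results sit to their limits (0.0 = dead center, 1.0 = at limit).
    batch_flagged: dimension 3 -- was this component batch flagged at
        incoming inspection (goods-in)?
    """
    if not spc_in_control:
        return "fail"
    # Each dimension alone is a pass, but marginal results on the other
    # machines combined with a flagged batch justify a closer look.
    if other_machines_margin > 0.8 and batch_flagged:
        return "review"
    return "pass"

print(disposition(True, 0.9, True))   # → review
print(disposition(True, 0.3, False))  # → pass
```

No single dimension fails here; it is only the combination that surfaces the suspect component, which is the point of the multidimensional view.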
We don’t have to stop at three dimensions; we can build a four-, five-, or six-dimensional optimization engine whose output can be assessed by an AI (Artificial Intelligence) algorithm that can help us, in the future, contextualize that data set.
As an industry, and at a minimum, we should require all equipment manufacturers to provide us with a time-stamped series of data; ideally, those equipment manufacturers can also help contextualize the data in the first dimension, statistical process control.
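What such a minimum requirement might look like in practice is a record that carries its timestamp with it. The record shape below is an assumption for illustration only, not an industry standard; the field names are invented.

```python
# A minimal, hypothetical shape for time-stamped equipment data.
# Field names are illustrative assumptions, not a standard schema.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class Measurement:
    machine_id: str      # which piece of equipment produced the reading
    board_serial: str    # which board or assembly it applies to
    metric: str          # e.g. "bga_void_percent"
    value: float
    timestamp: datetime  # the required first dimension: when it was taken

# One reading, timestamped in UTC so data from different machines
# on the line can be aligned on a common time axis:
m = Measurement("xray-01", "SN-0042", "bga_void_percent", 30.0,
                datetime.now(timezone.utc))
```

With every reading stamped on a common clock, the SPC and cross-machine dimensions described above become straightforward joins over time.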
I think we can expect some of the world’s largest software companies to come into the manufacturing industry to help perfect and optimize this multi-dimensional process control system. There’s little doubt where our industry is heading and whilst we don’t foresee humans being replaced by this analysis, we do see an important role for artificial intelligence and machine learning algorithms in supporting, or augmenting, humans in making critical decisions regarding the quality control of the products we make every day.
About the author:
Anh Nguyen, CTO, Creative Electron
After receiving a degree in Electrical and Computer Engineering from the University of California San Diego, Anh had a successful career as a software engineer at Sun Microsystems and then Oracle. A superstar and industry visionary, she brings to us years of experience in machine learning, neural networks, and artificial intelligence. That’s why we were delighted when Anh decided to join our team to lead our R&D efforts. With a perfect blend of software and hardware development expertise, she understands how the perfect x-ray machine must work.