Whether it’s called artificial intelligence (AI), machine learning (ML), or expert systems, AI is in the news today. Elon Musk has warned us about the potential dangers of AI’s rapid adoption, and IBM has deployed AI in its Watson service to handle technical issues that require judgment. Science fiction has often incorporated machine intelligence into its stories: Skynet in the Terminator movies, HAL 9000 in 2001: A Space Odyssey, and the Cold War classic Colossus: The Forbin Project.

Putting the threatening scenarios aside, AI has the potential to help decision making in ambiguous situations. This is more than just following an automated flow chart; these are situations that normally require some judgment, historically from a person. This brings us to electronic test and test engineering. Does AI have a role here? To find out, I contacted a number of companies about their AI efforts and how they saw the future.

2001: A Space Odyssey featured an intelligent computer, HAL 9000, that became a foe to the crew of Discovery One. It is one of many fictional examples of AI turning evil. Could a modern HAL tell a test engineer, “I’m sorry, Dave, I can’t reorder the test sequence that way”? Image source: Pixabay

I primed the discussion by asking some specific questions:

  • Is AI applicable at all?
  • Are there products or services that use AI today?
  • Will there be products in the future that will use AI?
  • If so, which applications are the most promising?
  • What business model do you see for offering AI products and services?
  • What AI products and services do you offer today?

My first clue about AI’s appropriateness didn’t come from the responses; it came from the lack of responses. Several companies simply said they did not have AI efforts, or at least nothing they felt comfortable speaking about. So, we know AI isn’t mainstream in test, at least not yet.

From the responses I received, I did see one important correlation: the semiconductor test industry seems positioned to pursue AI.

Mark Hutner, DfX Manager, and Yi Zhang, Product Manager, at Teradyne answered jointly. They see promising applications of AI in two areas: customer production economics and overall equipment efficiency to avoid unplanned downtime. “Both will drive value for our customers via yield improvement, test coverage, improved product time to market and optimized test run-time,” they said. While test optimization is performed today in a rule-based fashion using adaptive test, they see promising advantages for AI here: “We see reductions in test time of 10% to 15%.”

Regarding customer production economics, Teradyne sees AI adding value by doing yield recovery after an initial fault is observed. This would enable more testing to recover marginally good dies. Combined with the AI-fueled adaptive test, this can have a large impact on product profitability. “It is reasonable to expect a product profitability improvement of more than 30% with a yield-recovery approach where average test time is optimized and yield is recovered,” said Hutner and Zhang.
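To make that concrete, here is a minimal sketch of what ML-driven adaptive test could look like: a classifier trained on upstream parametric measurements estimates each die’s probability of passing, and that probability decides whether downstream tests are skipped (cutting test time) or a marginal die gets extended testing (recovering yield). The data, model, and thresholds below are hypothetical illustrations, not Teradyne’s implementation.

```python
# Hypothetical sketch of ML-driven adaptive test: a classifier's
# predicted pass probability drives the per-die test flow.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(0)

# Synthetic training set: 8 upstream parametric measurements per die,
# labeled with the final pass/fail outcome from exhaustive testing.
X_train = rng.normal(size=(5000, 8))
y_train = (X_train.sum(axis=1) + rng.normal(scale=0.5, size=5000)) > 0

model = GradientBoostingClassifier().fit(X_train, y_train)

def disposition(upstream_measurements, skip_at=0.99, retest_at=0.40):
    """Choose a per-die test flow from the predicted pass probability."""
    p_pass = model.predict_proba(upstream_measurements.reshape(1, -1))[0, 1]
    if p_pass >= skip_at:
        return "skip remaining tests"        # high confidence: cut test time
    if p_pass >= retest_at:
        return "run extended test / retest"  # marginal die: try to recover yield
    return "run full test flow"              # likely failing: test everything

print(disposition(rng.normal(size=8)))
```

The thresholds embody the economics: raising `skip_at` protects quality at the cost of test time, while lowering `retest_at` spends more tester time chasing recoverable yield.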

Teradyne provides a data pathway to several offline data analysis tools because the infrastructure varies from customer to customer. Optimal+, a company offering end-to-end product analytics solutions for semiconductor and electronics companies, makes such tools. I spoke with Michael Schuldenfrei, a Technology Fellow at the company.

Finding the root causes of geographic yield issues across wafers is one application where AI is being deployed. Image source: Optimal+

Schuldenfrei pointed out that Optimal+ uses AI in several of its software products for semiconductor manufacturing, clustering groups of wafers with similar geographic or parametric failure signatures to identify common root causes. In electronics manufacturing, neural networks are used to analyze the output of various inspection steps to provide better classification of failures. “We use AI to identify cracks in PCBs,” said Schuldenfrei. AI is also used to predict the outcome of expensive or damaging process steps, such as burn-in, so that they can be skipped for parts that are predicted to pass.
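As a rough illustration of the clustering idea, the sketch below summarizes each wafer as a vector of per-zone fail rates and groups wafers with similar spatial signatures; each resulting cluster becomes a candidate for a common root cause. The zone layout, synthetic data, and choice of k-means are my own assumptions, not Optimal+’s algorithm.

```python
# Hypothetical sketch: cluster wafers by geographic failure signature.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)

# Each wafer is summarized as fail rates over 9 spatial zones
# (e.g., center, mid-radius ring, edge-ring segments).
edge_ring_issue = rng.normal([0.02] * 8 + [0.30], 0.02, size=(40, 9))
center_hotspot = rng.normal([0.25] + [0.02] * 8, 0.02, size=(40, 9))
healthy = rng.normal([0.02] * 9, 0.01, size=(40, 9))
signatures = np.clip(np.vstack([edge_ring_issue, center_hotspot, healthy]), 0, 1)

# Wafers in the same cluster likely share a root cause
# (for instance, an etch or CMP excursion affecting the wafer edge).
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(signatures)
for k in range(3):
    members = signatures[labels == k]
    print(f"cluster {k}: {len(members)} wafers, "
          f"mean zone fail rates {members.mean(axis=0).round(2)}")
```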

Schuldenfrei sees a promising future for AI. “As the complexity of chips, boards, and products continues to grow, AI is a critical component in handling the massive volumes of data generated during manufacturing, assembly, and test to find relevant and high value insights. As electronics become more prevalent in mission-critical applications like autonomous cars, quality and reliability requirements will drive significant investment in AI to identify potential bad parts during manufacturing and prevent them from being used.”

Optimal+ offers its products through annual subscriptions. Longer term, Schuldenfrei foresees a business model in which data scientists monetize their AI models by deploying them on third-party platforms, such as those from Optimal+. Such business models may allow AI to be deployed beyond failure analysis, manufacturing, and test optimization to embrace entire lifecycle analytics, which he calls the “Holy Grail.”

So, there are AI-based products. But do semiconductor companies actually use AI? Rohit Mittal, a director of engineering at Intel, described Intel’s use of machine learning in the EDN article Machine learning improves production test. He stated, “We described a methodology where difficult to measure parameters can be reliably predicted from routinely measured parameters during manufacturing test using a machine learning algorithm combined with an error compensation margin. The predicted values can then be used to dynamically set other yield-affecting parameters during manufacturing test. This methodology also enables detection of quality excursions that occur due to component or process changes, as it does not rely on fixed specifications derived from early R&D DVT testing.”
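The core of the methodology Mittal describes can be sketched in a few lines: fit a model that predicts the expensive parameter from routine measurements, derive an error-compensation margin from the model’s residuals, and guard-band the prediction against the spec limit. The linear model, the 4-sigma margin, and the spec limit below are hypothetical illustrations, not Intel’s actual flow.

```python
# Hypothetical sketch: predict a hard-to-measure parameter from routine
# measurements, then guard-band the prediction with an error margin.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(2)

# Characterization set: routine measurements X and the expensive
# parameter y, both measured on a sample of units.
X = rng.normal(size=(2000, 5))
y = X @ np.array([1.2, -0.7, 0.4, 0.0, 0.9]) + rng.normal(scale=0.2, size=2000)

model = LinearRegression().fit(X, y)

# Error-compensation margin: here, 4 sigma of the prediction residuals,
# so a guard-banded prediction rarely passes a truly failing unit.
residuals = y - model.predict(X)
margin = 4 * residuals.std()

SPEC_LIMIT = 2.0  # hypothetical upper spec limit on the parameter

def unit_passes(routine_measurements):
    """Pass only if the predicted value plus margin stays within spec."""
    pred = model.predict(routine_measurements.reshape(1, -1))[0]
    return (pred + margin) <= SPEC_LIMIT

print(unit_passes(rng.normal(size=5)))
```

The margin is the knob that trades yield for quality: a wider guard band rejects more marginal units but cuts the risk of test escapes.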

Thinking about the big picture, it is easy to envision why semiconductor manufacturing may be the first adopter of machine learning for test. Traditional electronic manufacturing typically relies on the assembly of known-good, pre-tested parts. In theory, if the design is right, only parts and process defects can lead to a failure. Semiconductor manufacturing is different: the process is everything, including the creation of the “parts.” Achieving a high yield is a multi-dimensional optimization challenge, something for which machine learning should be a useful tool.

While semiconductor test looks primed to exploit artificial intelligence, the outlook for electronic functional test is less certain. Image source: Bloomy Controls

So, what about traditional electronic manufacturing? Here, there is still some way to go. I spoke with Grant Gothing, CTO at Bloomy Controls. Bloomy has a rich history of creating functional test and data acquisition systems. Gothing told me that Bloomy currently uses rule-based algorithms to optimize production functional test systems. He sees AI as a future technology to aid in troubleshooting assemblies. The issue he highlighted was one of semantics: rule-based systems are not, strictly speaking, AI systems. Yet they are very useful in achieving many of the same results, giving insight and suggestions to technicians as they troubleshoot and turn on complex electronic assemblies. The question is: what do AI systems offer beyond rule-based systems or flow charts?
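To make that semantic distinction concrete, here is a toy contrast between the two approaches: a rule-based helper encodes fixed expert knowledge, while a model trained on (hypothetical) repair history can generalize to symptom combinations nobody wrote a rule for. Both snippets are illustrative sketches, not Bloomy’s code.

```python
# Toy contrast: fixed expert rules vs. a model learned from repair history.
from sklearn.tree import DecisionTreeClassifier

# Rule-based: a fixed mapping from failing test to suspect components.
RULES = {
    "5V_rail_low": "check U7 regulator and C12",
    "uart_no_response": "check U3 level shifter",
}

def rule_based_suggestion(failing_test):
    return RULES.get(failing_test, "no rule found; escalate to an engineer")

# Learned: features are [rail voltage, rail-low flag, UART-dead flag];
# labels are the repairs that fixed past boards. All values hypothetical.
symptoms = [[4.6, 1, 0], [4.7, 1, 0], [5.0, 0, 1], [5.1, 0, 1], [5.0, 0, 0]]
repairs = ["replace U7", "replace U7", "replace U3", "replace U3", "reflow J2"]
model = DecisionTreeClassifier(random_state=0).fit(symptoms, repairs)

print(rule_based_suggestion("5V_rail_low"))
print(model.predict([[4.65, 1, 0]])[0])  # generalizes to unseen symptom values
```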

Therein lies the critical question: will AI-enhanced products find a significant application space in functional test? From my own perspective, this is also a question of financial tradeoffs. After all, if yield becomes high enough, is there a compelling reason to direct resources to repairing non-working assemblies? Perhaps, but the answer is very situational.

So, returning to the question “Will artificial intelligence come to the test industry?”, my answer is a definitive yes for the semiconductor industry. There are already products and proven results. As for traditional electronic test systems in R&D or manufacturing, the answer isn’t clear. One thing is certain: all these tools are aids to engineers, not replacements for them. Engineers need not worry that machines will take their jobs, at least not yet.

Larry Desjardin is a regular contributor to EDN's Test Cafe. He served in several R&D and executive management positions with Hewlett-Packard and Agilent Technologies.

This article is part of an AspenCore Special Project on the application of AI at the edge, looking beyond the voice and vision systems that have garnered much of the press. For further insights into the hardware, implementations, and implications of AI, check out these other articles in this Special Project:


Innovations Pushing AI Toward the Edge
AI will allow developers to implement more complex embedded system behaviors, and new tools are allowing more developers to implement AI.

How AI changes the future of edge computing
While it makes sense to incorporate AI with edge computing, hardware and software components need to address several challenges including power consumption, processing capability, data storage, and security.

Hardware helping move AI to the edge
What type of processing power is required to move AI to the edge? Vendors have come up with a range of answers.

AI makes data storage more effective for analytics
Turning data into intelligence requires analysis. AI implemented in the storage controller can substantially speed that analysis, as this proof-of-concept demonstration shows.

Connectivity remains central to mainstreaming AI, machine learning workloads
There is a race to make AI results relevant, reliable, and readily available. Only those with AI models trained on the best machine/deep learning infrastructure, from the largest data sets, will survive.