
‘Deep learning won’t work in the real world’ – open-source AI is the only way to ‘reason’, says Dell

Higher-order analytics tools are being embedded in the industrial space, as data science and data capture practices advance, and as compute power and wireless networking become more flexible. But machine learning is limited in its current form, and the ‘fourth industrial revolution’ will slow from a gallop if AI techniques are not open-sourced, and AI solutions are not developed collaboratively and shared.

So said Dell at IoT Solutions World Congress 2019 in Barcelona last week, in a panel it also chaired, attended by the likes of Elisa, Hugo Boss, and Air Liquide. Enterprise IoT Insights will post further coverage of the session, drawing on these other participants. But the floor, here, goes to Dell, which cited recent remarks from Turing Award winner Yoshua Bengio, professor at the University of Montreal.

“Machine learning in itself is not going to work in the real world, for certain types of applications. Yoshua Bengio, the leading researcher in AI, said a few weeks ago deep learning in itself is not going to work,” commented Said Tabet, chief architect for emerging technologies and ecosystems at Dell, on stage in Barcelona.

In March, Bengio received the Turing Award for his work on deep learning and neural networks. He shared the award, and its $1 million prize, with two others: Geoffrey Hinton, at the University of Toronto, and Yann LeCun, at New York University. The trio have been dubbed the ‘godfathers of the AI boom’.

In an interview with Wired in August, Bengio said deep learning works well in controlled scenarios, but cannot ape human intelligence without the ability to reason about causal relationships. “Current approaches to machine learning assume the trained AI system will be applied on the same kind of data as the training data. In real life it is often not the case,” he told Wired.
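Bengio’s point is, at root, about distribution shift. The toy sketch below is invented for illustration; the data, model, and numbers are assumptions, not anything cited by Bengio or Dell. It trains a simple classifier on one data distribution, then scores it both on matching data and on data that has drifted, the way field conditions drift from lab conditions.

```python
# Illustrative sketch of distribution shift (invented data; not from the article).
# A model that looks accurate on data like its training set can degrade sharply
# when the deployment distribution moves away from the training distribution.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

def sample(n, shift=0.0):
    """Two Gaussian classes in 2D; `shift` nudges the whole distribution."""
    x0 = rng.normal(loc=-1.0 + shift, scale=1.0, size=(n, 2))
    x1 = rng.normal(loc=+1.0 + shift, scale=1.0, size=(n, 2))
    return np.vstack([x0, x1]), np.array([0] * n + [1] * n)

# Train on "lab" data, then test on matching data and on drifted data.
X_train, y_train = sample(500)
model = LogisticRegression().fit(X_train, y_train)

X_iid, y_iid = sample(500)                  # same distribution as training
X_drift, y_drift = sample(500, shift=2.5)   # the "real world" has moved

print("in-distribution accuracy:", accuracy_score(y_iid, model.predict(X_iid)))
print("shifted-data accuracy:   ", accuracy_score(y_drift, model.predict(X_drift)))
```

On a run like this, the first score typically lands above 90 per cent while the second collapses towards chance; that gap is the difference Bengio describes between controlled scenarios and real life.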

In Barcelona, Tabet noted that heavy industry, like manufacturing, has been running an algorithmic shuffle of one kind or another on its machine data for a generation, at least. But new jumps in connectivity and computing have advanced the capture and processing of data, and spurred the sector to make more rapid digital change. It is the same revolution that is sweeping every sector; but manufacturing is a hothouse for it.

Tabet said: “When it works, it definitely does a job, and enables you to go to another level. That evolution [with horizontal technologies] is taking us to the next level, where we are combining things like machine learning, deep learning, and now going next to a reasoning system.”

He explained: “Artificial intelligence – the way it has been used before, with decision support and logic-based systems; deductive, inductive, and so on – has been in manufacturing for fault detection and other areas for 30 or 40 years. With the intranet in the 1990s, it started to be socialised. But that evolution has started to be more successful in the last five years, particularly with IoT – and with machine learning, and deep learning.

“And that’s because of open source adoption of these technologies, and the fact we have access to compute and storage, [as well as] a lot of data batching techniques, and IoT sensors for capturing information. [Now] machine learning, as something that came from operations research and statistics, is relatively straightforward; it works well with specific examples.”
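As a rough illustration of that ‘specific examples’ point, here is a hypothetical sketch with invented machine telemetry, not an example given at the panel: a classic supervised model does well when it is trained on labelled instances of exactly the cases it must recognise, such as fault detection from sensor readings.

```python
# Hedged sketch of supervised learning on "specific examples" (invented data).
# Hypothetical machine telemetry: [temperature, vibration]; label 1 = fault.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(1)

normal = rng.normal([60.0, 0.2], [5.0, 0.05], size=(400, 2))   # healthy readings
faulty = rng.normal([85.0, 0.9], [5.0, 0.10], size=(400, 2))   # fault readings
X = np.vstack([normal, faulty])
y = np.array([0] * 400 + [1] * 400)

# With labelled examples of both states, a standard classifier scores highly
# on held-out data drawn from the same distribution.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=1)
clf = RandomForestClassifier(random_state=1).fit(X_tr, y_tr)
print("held-out accuracy:", accuracy_score(y_te, clf.predict(X_te)))
```

The caveat, per Bengio’s remarks above, is that the held-out test data here is drawn from the same distribution as the training data; the trick works less well once the factory floor stops resembling the training set.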

Open-source machine learning solutions, not walled-off models, are the only way the broad industrial IoT market, where the IT and OT worlds come together, will carry machine learning systems beyond highly-trained image recognition tasks (divining cats in internet searches, or errors on production lines), Tabet argued.

“We need that open source tooling, we need those capabilities of adoption, for the ecosystem, so all of our audience and the people that show up to the factories supporting different business models can actually train their systems to do that.”

Tabet also warned that deep learning must be properly managed. It is costly in terms of resources and expense, he said, and those costs will spiral if discipline is not maintained.

“With deep learning, particularly with unsupervised learning, using a lot of algorithms and frameworks, you can get into trouble if you don’t take care of your data. And I always caution [about] the cost in terms of resources and [money] – in critical or mission critical environments, that is really important.”
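A minimal sketch of that failure mode, assuming invented sensor data rather than anything shown in Barcelona: clustering raw industrial readings can be dominated by whichever feature has the largest numeric range, so a basic data-hygiene step such as scaling can change the answer entirely.

```python
# Hedged sketch: unsupervised learning on untended data (invented readings).
# Two machine states differ mainly in vibration (small numbers), while
# temperature (large numbers) is similar and noisy across both states.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler
from sklearn.metrics import adjusted_rand_score

rng = np.random.default_rng(2)

state_a = np.column_stack([rng.normal(70, 10, 300), rng.normal(0.2, 0.05, 300)])
state_b = np.column_stack([rng.normal(72, 10, 300), rng.normal(0.9, 0.05, 300)])
X = np.vstack([state_a, state_b])
truth = np.array([0] * 300 + [1] * 300)

# On raw data, k-means splits on noisy temperature; on scaled data, on vibration.
raw = KMeans(n_clusters=2, n_init=10, random_state=2).fit_predict(X)
scaled = KMeans(n_clusters=2, n_init=10, random_state=2).fit_predict(
    StandardScaler().fit_transform(X))

print("agreement with true states, raw data:   ", adjusted_rand_score(truth, raw))
print("agreement with true states, scaled data:", adjusted_rand_score(truth, scaled))
```

On a run like this, the first agreement score sits near zero and the second near one; the algorithm is identical, and only the care taken with the data differs.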

He added: “At the end of the day, if you are going to gather all that data – which requires resources [and money] from an IT and OT point of view – without ROI in mind, [then] the monetization will be difficult… And it needs to be monetized, because the infrastructure behind it, despite all the effort and innovation, is still a cost, and funding is required to support it.”

ABOUT AUTHOR

James Blackman
James Blackman has been writing about the technology and telecoms sectors for over a decade. He has edited and contributed to a number of European news outlets and trade titles. He has also worked at telecoms company Huawei, leading media activity for its devices business in Western Europe. He is based in London.