The world is vast and colorful. For entrepreneurs, the challenge is “simply” to seize the opportunities on offer, identify tomorrow’s technologies, and turn them into solutions. This is a basic principle that Hans Beckhoff himself has adhered to for the past 40 years. His inventive spirit seems to know no bounds. Human control technology is one of his current sources of inspiration for new ideas, as you will find out in the following interview. Together with Armin Pehlivan, he provides an exciting insight into a new method of machine programming.
Mr. Beckhoff, you focused on the IT sector from a very early stage. Why was this?
Hans Beckhoff: IT has always been the driving force behind the semiconductor industry. In contrast, the market for industrial electronics was relatively small. That’s why we decided in 1985 to use IT technology for our own purposes. And this combination of IT and automation technology has proved very fruitful from the beginning. By concentrating on PC-based control, we have always had the most powerful processors and the most advanced operating systems at our disposal, and have been able to build on this to create control technology that offers the best possible performance.
Edge computing is basically nothing new for Beckhoff Automation, then?
Hans Beckhoff: That’s right, because the required features of an edge computer are inherently available in our PC-based control technology. That said, the increased demand for edge controllers is beneficial for us. That’s because our compact IPCs, like the C6015, are often seen as the perfect complement to a classic controller for enabling edge computing.
Armin Pehlivan: Alongside our small DIN rail Embedded PCs, however, the EK9160 IoT bus coupler, for example, is also ideal for use as an edge device. It is a smaller, self-contained controller that can transfer all types of data – whether digital or analog – to the cloud, in some cases even with preprocessing. Moreover, the EK9160 is simple to install and configure, which makes it a perfect edge computing enabler for less tech-savvy users as well.
To what extent do humans act as a role model for future developments – particularly with regard to speech (listening/speaking), vision, and machine learning (intelligence)?
Hans Beckhoff: Nature has always been a great source of learning for us. Take human control technology, for example: with a few exceptions, evolution has decreed that it is regulated by a central nervous system. Our brain therefore acts as a modular central computer that accesses a complete process image of the entire body via our nervous system. A Beckhoff controller is designed similarly: we use EtherCAT as the communication medium, which forwards all collected signals to the controller. And just as our brain can simultaneously process multiple sensory inputs – visual, acoustic, or tactile – our TwinCAT control software can do the same with the help of appropriate algorithms.
Armin Pehlivan: Regarding the performance of our controllers, we again benefit from the achievements of the IT industry. PC-based control technology can be precisely scaled in terms of both hardware and software. Through multi-core and now many-core technology, we can theoretically distribute the computing capacity required for, say, classic process control, measurement technology, image processing, artificial intelligence, or motion across up to 256 cores, though in practice we deliver 48-core controllers.
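The idea of dedicating cores to individual control workloads can be sketched in a few lines of Python. This is purely illustrative – TwinCAT core assignment is configured in its engineering environment, not in code like this, and the task names and core counts here are hypothetical:

```python
# Illustrative sketch (not TwinCAT configuration): statically assign
# named control workloads to CPU cores in round-robin fashion, mirroring
# how a scalable controller might dedicate cores to PLC, vision, AI and
# motion tasks. All names are hypothetical.

def assign_tasks_to_cores(tasks, core_count):
    """Map each task name to a core index, cycling through the cores."""
    return {task: i % core_count for i, task in enumerate(tasks)}

tasks = ["plc_cycle", "measurement", "vision", "machine_learning", "motion"]
assignment = assign_tasks_to_cores(tasks, core_count=4)
# With 4 cores, the fifth task wraps around to core 0.
```

The same mapping scales unchanged from a 4-core industrial PC up to the 48-core controllers mentioned above: only `core_count` changes.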
What do you think are the future applications of TwinCAT Speech?
Hans Beckhoff: It is “normal” for humans to talk to one another in order to exchange views. From that point of view, speech input and output is a very natural human-machine interface from which good things can be expected in the future. A pilot project is currently underway in Austria in which speech input is used for programming, but Armin Pehlivan can provide you with first-hand information about this ...
Armin Pehlivan: Most of the speech applications implemented to date are limited to individual commands: the machine or system is simply instructed what to do, and it does it. Or, conversely, the machine uses speech output to indicate that it has a fault somewhere. Beckhoff Automation, however, has a much more ambitious goal: we want to have a proper dialogue with machines, with both sides asking and answering questions.
Could you share a few details about the ongoing speech project?
Armin Pehlivan: Unfortunately, I am not allowed to divulge details, but I can use an example to show what we are currently working on. Suppose I were to issue a command to another person: “Please buy a liter of milk.” I would automatically have triggered a larger chain of actions. To buy milk, the person concerned has to leave the house. They have to know which stores sell milk and where to find such a store. On the way there, they have to observe traffic rules. In the store itself, they have to interact with other people and line up at the checkout. Executing this one “buy milk” command therefore already involves a lot of background activity. And it is precisely this type of complex programming that we are now trying to solve with TwinCAT Speech. In the future, end customers should be able to reprogram their machine with voice commands.
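The “buy milk” example amounts to hierarchical task decomposition: one spoken command expands recursively into a chain of sub-actions. A minimal Python sketch of the idea – the command names, sub-steps, and the `TASK_LIBRARY` structure are purely illustrative, not part of the TwinCAT Speech product:

```python
# Illustrative sketch: expand a high-level spoken command into the chain
# of primitive actions it implies. All commands and steps are hypothetical.

TASK_LIBRARY = {
    "buy milk": [
        "leave the house",
        "find a store that sells milk",
        "travel there observing traffic rules",
        "interact with staff and line up at the checkout",
        "pay and return home",
    ],
    "find a store that sells milk": [
        "look up nearby stores",
        "pick the closest one",
    ],
}

def expand(command):
    """Recursively expand a command into a flat list of primitive actions."""
    steps = TASK_LIBRARY.get(command)
    if steps is None:           # unknown to the library: treat as primitive
        return [command]
    result = []
    for step in steps:
        result.extend(expand(step))
    return result

plan = expand("buy milk")       # one command -> six primitive actions
```

In a real system the library would be far richer and the primitives would map to machine operations, but the principle – a short utterance triggering a long action chain – is the same.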
Hans Beckhoff: It can therefore be assumed that a headset will be used in the future to communicate with machines.
What other future scenarios do you envision?
Hans Beckhoff: Machine vision is another topic that will undoubtedly become more prevalent than it has been to date. And parallels with humans can again be drawn here. The eyes act as our universal sensors, allowing us, among other things, to identify people and objects, estimate distances, or detect risks of collision. All of these things are likewise possible with cameras. We therefore anticipate that the use of 10 to 30 cameras per machine will become commonplace at some point, while the importance of traditional sensors gradually diminishes. Either way, we are optimally equipped for such a scenario with TwinCAT Vision, since we regard image processing as an integral part of our control technology and not as a separate technical function. We can therefore perform image processing evaluations at the same speed at which we operate the PLC.
What is the next big revolution we can expect from Beckhoff Automation?
Hans Beckhoff: We resolved when we established the company that we would unveil evolutionary developments every year and launch a genuine revolution on the market every five to seven years. And we have stuck to this roadmap up to the present day. Previous highlights have included a PC-based machine controller with integrated floppy disk drive (1985), the first-ever compact IPC with LCD display, and the bus terminal – now a completely standard component of automation technology, which we invented but unfortunately did not patent. TwinCAT came in 1996, followed by EtherCAT in 2003, which has now become a global standard for industrial communication. The presentation of TwinSAFE, our software-based safety solution, was a further technological milestone in our company’s history. The most recent “revolutions” to attract attention were the XTS linear transport system and the XPlanar flying motion system – a type of “flying carpet” for product transport. Because we are now able to create arbitrarily shaped magnetic fields, a lot more innovations are set to be unveiled in the coming years in the area of drive technology. That much I can already predict.
Armin Pehlivan: We are also anticipating significant potential for innovative solutions in the entire cloud area. We already have a number of concepts in this regard that are either at the planning stage or being implemented. We began with TwinCAT Cloud Engineering, which allows users to work with TwinCAT via an internet browser. The software therefore no longer needs to be installed on the user’s own computer in order to access the existing TwinCAT engineering and runtime products. This is an interesting prospect for many companies, since it allows them to use a web-based automation solution independently of their own central IT, and it is also ideal for collaborative work.
Where is the journey in the area of machine learning taking us?
Hans Beckhoff: The fundamental idea behind machine learning is to no longer follow the classic engineering route of designing solutions for specific tasks and then turning those solutions into algorithms, but rather to let the desired algorithms be learned from model process data. With the TwinCAT 3 function TF3800 we offer a high-performance execution module for trained classic machine learning algorithms, and with TF3810 one for trained neural networks. Both are what are known as inference machines, which our customers use as products. From a PLC perspective, these are function modules in which artificial intelligence algorithms are implemented. The algorithms or neural networks are trained in established frameworks such as PyTorch, TensorFlow or MATLAB®. What has been learned is then loaded into the inference machine as a description file. The standardized exchange format Open Neural Network Exchange (ONNX) is supported, so the worlds of automation and data science merge seamlessly. Training programs available in the community can therefore be used to train algorithms and implement them in our controller.
Armin Pehlivan: This creates a permanent cycle: we collect data at the machine, feed it into the training software, and the training result is then executed in the inference machine. Take a sawing machine as an example: a model can be trained on three values that are intrinsically difficult to link algorithmically in such a way that reliable status information about the sharpness of the saw blade is nevertheless obtained.
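The saw-blade example describes sensor fusion: a model trained offline learns to map readings that are hard to combine analytically onto one status value. A minimal pure-Python sketch, with hand-picked weights standing in for the result of training – in the real workflow described above, the parameters would be learned in a framework such as PyTorch and loaded into the inference machine via a description file like an ONNX export, and the sensor channels here are hypothetical:

```python
import math

# Illustrative sketch: a tiny "trained" model fusing three sensor
# readings (e.g. motor current, vibration, feed rate - hypothetical
# channels) into a single blade-sharpness score in (0, 1). The weights
# are hand-picked for illustration; in practice they would be learned
# offline and deployed as a description file.

WEIGHTS = [0.8, 1.2, -0.5]   # one weight per sensor channel
BIAS = -0.3

def sharpness_score(readings):
    """Logistic model: weighted sum of the readings mapped into (0, 1)."""
    z = sum(w * x for w, x in zip(WEIGHTS, readings)) + BIAS
    return 1.0 / (1.0 + math.exp(-z))

def blade_status(readings, threshold=0.5):
    """Turn the fused score into a discrete status for the operator."""
    return "sharp" if sharpness_score(readings) >= threshold else "dull"
```

The point of the cycle is that `WEIGHTS` and `BIAS` are not engineered by hand as they are here, but re-learned whenever fresh machine data flows back into the training software.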
Hans Beckhoff: Merging data from multiple sensors into a common, superordinate signal is generally something that works very well with machine learning and neural networks. We also use this technology in our own products; XPlanar, for example, is backed by a number of neural networks. Machine learning is fundamentally an interdisciplinary technology that will be used not only in industry but throughout society.
Where are the boundaries – will everything control itself one day?
Hans Beckhoff: If the first industrial revolution is regarded as boosting muscle power, then artificial intelligence can be viewed as boosting thinking power. Although it is now commonplace for us to let machines relieve us of various tasks, many people are extremely reluctant to relinquish the power of thinking and decision-making. We should in no way fear that we humans will become “useless.” After all, automation technology itself has a deeply rooted human aspect: mastering complex tasks requires a good basis for dialogue between the providers and users of products and solutions. To my delight, the Austrian Beckhoff team has been very good at gaining the trust of many customers. They are also frequently involved in really exciting projects, since Austrian customers are somewhat more open to “crazy new ideas” than their German counterparts. The method outlined by Armin Pehlivan of using speech input for programming is the best example of this.