Eng Lim Goh, HPE: On Swarm Intelligence, Quantum Supremacy, and the Cloud

Dr. Eng Lim Goh is Vice President and Chief Technologist for High Performance Computing and Artificial Intelligence at Hewlett Packard Enterprise. He previously worked as a technical director at Silicon Graphics for 27 years. His research interests include the differentiation of humanity as we move from analytics to inductive machine learning, deductive reasoning, and artificial general intelligence. He also continues his research into human perception of virtual and augmented reality.

He was awarded the NASA Exceptional Technology Achievement Medal as principal investigator of an experiment aboard the ISS on operating autonomous supercomputers during extended space missions. In addition to co-creating blockchain-based swarm learning applications, he oversees the deployment of AI in Formula 1 racing, the industrialization of the technology behind a champion poker bot, and the joint design of systems for simulating a biologically detailed mammalian brain. He holds six US patents, with five more pending.

HPE (Hewlett Packard Enterprise) is an American IT company founded in 2015, together with HP Inc., when the Hewlett-Packard Company was split. It inherited the corporate client segment of the business: it produces servers, supercomputers, data storage systems, storage networks, network equipment, and converged systems, and also builds cloud infrastructure.

“The cloud will remain important in the world of big data”

- Cloud technologies have long since moved beyond innovation and become a modern IT standard. What role do they play today in the development of new products?

- At HPE we have focused our computing development around the "edge to cloud" trend, mainly because most data is generated at the edge first. We have to transfer data from the edge to the cloud: data from supermarkets, from cars in the case of a connected car (a car that can communicate bidirectionally with other systems - HiTech), from the aviation industry, and from hospitals. In many cases we transfer data to the cloud in order to analyze it there and send the results back to the edge.

Cloud computing is important because it lets you use all the computing power concentrated in the cloud, whereas at the edge there is usually less. The traditional approach is first to collect data at the edge, then configure smart edge devices to send only the necessary information to the cloud. The cloud has all the computing resources to run machine learning, do the analysis, and obtain results that are sent back to the edge. That is why we believe the cloud will remain important in the world of big data.

- Why use artificial intelligence in building new data centers? What is its main purpose in this context?

- Data centers are becoming more complex, and users more demanding. As for complexity: today a data center holds a large number of central processors (CPUs) and graphics processors (GPUs) for AI, each with many cores. There are also large flows of data whose storage and movement must be organized. All of this consumes a lot of energy and increases the complexity of data centers.

GPU (graphics processing unit) - a graphics processor, a specialized device for processing graphics and visual effects. Unlike the CPU (central processing unit), the GPU architecture is better suited to parallel computing and has much faster memory. Modern GPUs are used not only for graphics processing but also for similarly structured mathematical calculations where processing speed matters most. For such workloads, a GPU can process data thousands of times faster than a CPU.
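As a rough illustration of the data parallelism a GPU exploits, here is a conceptual sketch in plain Python (not actual GPU code; the function names are made up): the same small kernel is applied independently to every element, so thousands of cores can each take a slice of the work.

```python
# Conceptual sketch of GPU-style data parallelism (plain Python, not GPU code).
# A GPU applies the same kernel to many data elements at once; here each
# simulated "core" processes its own slice of the input independently.

def kernel(x):
    # The per-element operation every core runs in lockstep.
    return x * x + 1.0

def run_on_cores(data, num_cores):
    # Split the data into one chunk per core; on real GPU hardware
    # these chunks would be processed simultaneously.
    chunks = [data[i::num_cores] for i in range(num_cores)]
    partial = [[kernel(x) for x in chunk] for chunk in chunks]
    # Re-interleave the per-core results back into the original order.
    out = [0.0] * len(data)
    for c, chunk in enumerate(partial):
        out[c::num_cores] = chunk
    return out

print(run_on_cores([0.0, 1.0, 2.0, 3.0], num_cores=2))  # [1.0, 2.0, 5.0, 10.0]
```

The point of the sketch is that the per-element work is independent, which is exactly the property that lets a GPU spread it across thousands of cores.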

Processor cores - independent processors assembled on one physical chip. This approach reduces the chip's physical size, power consumption, and heat dissipation, while significantly increasing performance without changing the processor architecture.

As for users, their requirements have also grown considerably. In the past, they bought equipment, launched it, and as long as the system worked, they were satisfied. Today they ask: "Are my applications running optimally?", since a direct increase in computing power does not always give a proportional increase in performance.

As a result, you have rising user requirements and growing data center complexity, which means you need to apply more AI that can examine the data and help make better decisions. The problem is that we alone do not have enough data to train the AI. About 10,000 customers joined our project and sent data about their data centers to the cloud. Now we send the results of the AI's processing back to each of these data centers to optimize their operation.

- Is AI already actively used in creating equipment for corporate clients? How soon should we expect similar technologies in office and home products?

- If you mean the ability to make forecasts based on history, then it is already very widely used. Today it is applied in many areas: in finance, to predict stock prices and when to sell and buy, or to price derivatives in financial markets; in medicine, to detect anomalies in X-rays. There are cars smart enough to understand that, for example, vibration in a shock absorber means something bad, and to alert the driver. Learning from history in order to make decisions and predictions has become a reality. But the bolder predictions, that some superhuman intelligence will appear, are still science fiction. However, it is important to start thinking about that now.

“Quantum computers, using optimization methods, will make AI learn faster”

- It is difficult for ordinary people to understand what exactly the quantum computers everyone is talking about today really are. How do you define them for yourself?

- To begin with, I don’t understand quantum mechanics. I do not understand quantum entanglement, superposition, or the collapse to a classical state upon measurement. But that is not important: I accept all three of these concepts; I accept that they exist. Since I am an engineer by training, I work with what I understand better. For example, the different energy levels of electrons in an atom: low, high, and very high. Entanglement is when two atoms come so close together that their states become entangled. And there is the collapse of the wave function, when an initially indeterminate system “selects” one of its admissible states as a result of measurement. Accepting these three concepts allows me, from an engineering point of view, to compare all the different quantum systems currently being developed for quantum information processing.

- Quite recently, Google made a lot of noise by announcing the achievement of “quantum supremacy.” Do you use quantum technologies in your own developments?

- I think we will get analog quantum technologies working in the next ten years. But in the digital sense, for a quantum computer to work like today's machines, it will take more than ten years. One of the biggest problems is how to keep entanglement and superposition stable long enough to perform calculations. Today's qubits produce many errors, and correcting them requires many additional qubits to support a single computational qubit. That is why I argue it will take more than ten years to reach the point where a quantum computer becomes better than classical computers. So there is still time, but when it arrives, it can radically change the order of things.

Quantum supremacy - the ability of quantum computing devices to solve problems that classical computers are practically unable to solve. Google had previously announced plans to demonstrate quantum supremacy by the end of 2017 using an array of 49 superconducting qubits, but the actual achievement was announced only on October 23, 2019, as a result of a collaboration with NASA. According to Google, “quantum supremacy was achieved on an array of 54 qubits, of which 53 were functional and were used to perform, in 200 seconds, calculations that would take a conventional supercomputer about 10,000 years.”

Qubit (from quantum bit) - a quantum digit, the smallest unit for storing information in a quantum computer. Like a bit, a qubit has two basis states, denoted |0⟩ and |1⟩, but it can also be in a “superposition,” that is, in both states simultaneously. Whenever the state of a qubit is measured, it randomly collapses into one of its basis states. Qubits can be “entangled” with one another: an unobservable connection can be imposed on them such that any change to one of several qubits causes the others to change in concert with it.
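To make the measurement rule concrete, here is a toy single-qubit simulation in Python (an illustrative sketch, not a quantum SDK; entanglement is not modeled): the state holds two amplitudes, and measurement collapses it to 0 or 1 with probabilities given by the squared amplitudes.

```python
import random

# Toy simulation of a single qubit: the state is a pair of real amplitudes
# (a, b) for |0> and |1>, normalized so that a^2 + b^2 = 1.

def measure(state, rng=random):
    """Collapse the superposition: outcome 0 with probability a^2, else 1."""
    a, b = state
    return 0 if rng.random() < a * a else 1

# An equal superposition, like the one produced by a Hadamard gate:
plus = (2 ** -0.5, 2 ** -0.5)

random.seed(0)
counts = [0, 0]
for _ in range(10000):
    counts[measure(plus)] += 1
print(counts)  # roughly half 0s and half 1s
```

A qubit prepared purely in |0⟩, i.e. the state `(1.0, 0.0)`, always measures 0; only the superposition produces the random 50/50 split.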

- How is a quantum computer related to artificial intelligence?

- AI uses machine learning: it learns from history. This happens by trial and error. It takes one example from the history, predicts incorrectly, and corrects itself; then another example, predicts, and corrects again if it is wrong. And so on for a thousand attempts. Ten thousand. A hundred thousand. A million or ten million. It needs many attempts to tune itself until it arrives at an algorithm that makes correct forecasts. I believe quantum computers, using optimization methods, will make AI learn faster, so that it does not have to make so many attempts and try a million times to reach the correct result. A quantum computer will allow it to reach a good level of prediction very quickly.
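That trial-and-error loop can be sketched in a few lines (a minimal illustration with a made-up one-parameter model and toy data, not any HPE system): guess, measure the error on one example from the history, correct, and repeat.

```python
# Minimal sketch of learning by trial and error: guess, measure the error,
# correct, repeat. The "history" is pairs (x, y) generated by the rule y = 3x.
history = [(1.0, 3.0), (2.0, 6.0), (3.0, 9.0)]

w = 0.0    # the model's single parameter, initially a bad guess
lr = 0.05  # how strongly each mistake corrects the parameter

for attempt in range(1000):        # many attempts, as in the interview
    for x, y in history:
        error = w * x - y          # predict, then see how wrong we were
        w -= lr * error * x        # correct in proportion to the mistake

print(round(w, 3))  # converges to 3.0, the rule hidden in the history
```

The many small corrections are exactly the expensive part that, in the answer above, an optimization speed-up would shrink.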

Blockchain and swarm intelligence

- How are blockchain technologies used in enterprises?

- AI and blockchain are very closely related. We believe that it is not blockchain itself but the technology underlying it that will be important for the edge. Since data will flow to the edge, you will want to do as much as possible there to save the computing power of the cloud. Imagine you have a million high-definition cameras. You cannot send the streams from a million cameras to the cloud. You will have to put computers at the edge that are smart enough to decide: “I do not need to send this; I will send only that.” But then you need smart computers. We believe the ability to connect multiple edge computers into one group, one “swarm,” for swarm learning will become important. This is related to swarm intelligence; the two are interconnected.

An exact definition of swarm intelligence has not yet been formulated. Swarm intelligence describes the collective behavior of a decentralized, self-organizing system. Swarm intelligence systems typically consist of many agents (boids) that interact locally with one another and with the environment. The ideas behind such behavior usually come from nature, especially from biological systems. Each boid follows very simple rules. Although there is no centralized control system telling each of them what to do, local and somewhat random interactions give rise to intelligent group behavior that no individual boid controls. In general, a swarm intelligence system is a multi-agent system with self-organizing behavior that, as a whole, exhibits some form of reasonable behavior.

- Our swarm learning method works like this. Each hospital trains on its own data in isolation: it does not share the data, it shares only the results of its training. The other hospitals do the same. The entire exchange is coordinated through blockchain technology. We are sure this is needed, because we want all the edge devices to work independently, yet as a whole.
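The shape of that exchange can be sketched as follows (a simplified Python illustration with made-up numbers; a plain function stands in for the blockchain coordination, so only the data flow is shown): each hospital fits a model on its private data, and only the learned parameter leaves the premises.

```python
# Each hospital fits its own model locally and never shares raw patient data;
# only the learned parameter is shared and merged with the others' results.

def train_locally(private_data):
    # Toy "training": least-squares estimate of the slope w in y = w * x,
    # computed from local data only.
    num = sum(x * y for x, y in private_data)
    den = sum(x * x for x, _ in private_data)
    return num / den

def merge(parameters):
    # Stand-in for the blockchain-coordinated merge (here: a simple average).
    return sum(parameters) / len(parameters)

hospital_a = [(1.0, 2.1), (2.0, 3.9)]  # private, stays at hospital A
hospital_b = [(1.0, 1.9), (3.0, 6.1)]  # private, stays at hospital B

shared = [train_locally(hospital_a), train_locally(hospital_b)]
global_w = merge(shared)
print(round(global_w, 2))  # the swarm's combined estimate, about 2.0
```

Note that `merge` receives only the two trained parameters, never the `hospital_a` or `hospital_b` records; that separation is the whole point of the scheme.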

We don't want centralized management, because a swarm has none. A swarm of bees has a queen in the hive, but she gives no instructions while the swarm is in flight; the bees coordinate themselves. Only when they return to the hive do they communicate with the queen, tend to her, and so on. While they are in the swarm, they learn and coordinate their actions among themselves. That is how the swarm lives. But how do you coordinate it without a leader? Blockchain. That is why blockchain is important for the edge. If a single leader coordinates the swarm and that leader drops out, the whole swarm stops working; the bees have to look for another leader. In a blockchain there is no leader.

- What can you say about swarm intelligence technologies? Is the analogy with neural networks appropriate here?

- A swarm is exactly like a neural network. Each individual bee, or each server at the edge, has its own neural network. Each hospital in the swarm has its own separately trained neural network, and blockchain allows that training to be shared across all the hospitals. So each bee, hospital, or edge computer has its own neural network, and when they share what they have learned, bee to bee, they use blockchain. As a result they use both: the neural network for self-learning, and the blockchain for sharing with others.

“Responsibility for the Earth attracts young engineers”

- Today, corporations pay special attention to environmental concerns. What measures does HPE take in its work to care for the environment?

- This is an important topic. First, we as a company bear a responsibility for the Earth. Secondly, many young engineers want to join a company that feels such a responsibility; I think this new generation does have a trend toward greater consciousness, and we want to attract young engineers. And thirdly, it is simply the right thing to do.

We have two large refurbishment centers, in the USA and in Scotland. By rough estimates, over the past year we bought back, refurbished, and resold 99% of the old equipment we received, totaling $3 million. From the remainder we extract most of the raw materials, such as silver and gold, and reuse them. Only a very small share, about 0.3%, is thrown away.

The second area is working with customers on environmental protection. One of my favorite examples is an application from our client Salling Group designed to combat food waste. Today, about 2,000 supermarkets are connected to it. Suppose, for example, that stores are about to throw out 26,912 food items because their shelf life has expired. By selling such products at a deep discount, retailers can increase their profits by 10%, and customers get goods at a low price.

Another area is clean energy. An enormous amount of carbon dioxide is produced in the world because people need energy. We are working very closely with the ITER project (the International Thermonuclear Experimental Reactor) to try to use nuclear fusion for energy production. The difficulty of fusion lies in confining the plasma in the magnetic field that circulates inside the tokamak (a toroidal chamber with magnetic coils - HiTech). We provide a supercomputer to calculate the optimal structure of the tokamak's magnetic field in order to keep the plasma stable.