Mikhail Tsvetkov is Intel's Technical Director in Russia. He has worked in electronics for over 15 years.
Hearing Aid Battery Sensors
- What are Intel's main directions of development?
- Today, Intel is a data-centric company. On the one hand, we arrived at this status from the microelectronics industry: our fabs have not gone anywhere, and Intel is still one of the leading suppliers of the semiconductor foundation of the modern digital world. On the other hand, we have outgrown the status of a mere microprocessor manufacturer and become a global creator of components for the entire digital infrastructure, from the IoT devices used to collect primary data to the most powerful data centers where that data lives, is processed, and is transformed from numbers into knowledge. So we address all the key tasks along this path of data evolution: collection, storage, and transmission, both wired and wireless; we have a large technology portfolio in 4G/LTE and 5G cellular and in optical channels.
For example, one of the most promising technologies is Intel® Silicon Photonics, which will expand high-speed channels and make them widely available in the near future. And, of course, processing elements: the good old Intel CPU, in both the server and client segments, remains the most versatile and popular computing engine for a wide range of tasks. Plus the crucial area of data storage. Intel now produces many SSDs, from consumer SATA SSDs to state-of-the-art NVMe SSDs for data centers, including ones built on the fundamentally new physics of 3D XPoint. And we have not even touched on autonomous driving yet.
- Do you work on that?
- Personally, no, but we have a separate unit, Intel Autonomous Driving. Intel is watching this area very closely and working on it actively.
- Do you develop infrastructures end to end, that is, both data collection (the sensors) and processing? Are these systems built for specific industries?
- No, deploying a specific IoT infrastructure is integration work, and Intel rarely acts as an integrator. We are technology developers. For example, we make transceivers, chips for Bluetooth and Wi-Fi connectivity; most laptops carry one of our Wi-Fi or Bluetooth chips. By developing these protocols, we transfer technology from the IT world to the industrial world.
Photo: Anton Karliner / Haytek
For example, our colleagues from Intel IT ran a very interesting pilot at one of our factories: deploying a wireless network of 150 sensors that monitored equipment, pressure, and the presence of various gas impurities in the air. It was a semiconductor fab, which uses a large number of chemical components. The pilot proved the high efficiency of Bluetooth Low Energy (BLE), a topology for short ranges, about 15 m from the receiver, even in a room as difficult as a production hall. According to our IT service's internal estimates, the cost of this network was only 10% of that of classic wired sensors, once you include cabling and maintaining wired infrastructure in a building already in operation.
The infrastructure deployed there was as follows: two IoT gateways, essentially Intel PCs with Intel Bluetooth and Wi-Fi modules, stood in a large factory hall, and wireless sensors were hung around them. The gateways were connected to the Ethernet network by cable and via Wi-Fi. Interference between different wireless standards is possible because they use the same frequency band: BLE and Wi-Fi both operate at 2.4 GHz. But unlike other protocol families, for example IEEE 802.15.4, whose coexistence with Wi-Fi is not implemented very well, Bluetooth and Wi-Fi combine harmoniously: they share the frequency resource efficiently and are resistant to each other's interference. Most importantly, over the year and a half of testing, the system achieved 99% communication reliability with the sensors, and its stability was very predictable. If a sensor did not work, it failed immediately, because it had been placed incorrectly, for example behind a column or too far away. But if the geometry allowed a connection to be established, the sensor functioned properly and the link was reliable.
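The coexistence point can be illustrated with simple channel arithmetic. This is my own sketch based on the public Bluetooth and 802.11 channel plans, not on anything specific to the pilot: BLE's three advertising channels (37, 38, 39) sit at 2402, 2426, and 2480 MHz, at or beyond the edges of the usual non-overlapping Wi-Fi channels 1, 6, and 11.

```python
# Sketch: where BLE advertising channels sit relative to 2.4 GHz Wi-Fi.
# Channel plans come from the public Bluetooth Core and IEEE 802.11 specs.

WIFI_CHANNELS = [1, 6, 11]                 # the usual non-overlapping set
BLE_ADV = {37: 2402, 38: 2426, 39: 2480}   # BLE advertising channels, MHz

def wifi_center(n):
    """Center frequency (MHz) of 2.4 GHz Wi-Fi channel n."""
    return 2407 + 5 * n

def min_offset(freq_mhz):
    """Distance (MHz) from freq_mhz to the nearest common Wi-Fi channel center."""
    return min(abs(freq_mhz - wifi_center(n)) for n in WIFI_CHANNELS)

for ch, f in BLE_ADV.items():
    # A 20 MHz Wi-Fi channel extends ~10 MHz either side of its center,
    # so an offset of 10 MHz or more keeps the BLE channel at the band edge.
    print(f"BLE adv channel {ch} @ {f} MHz: {min_offset(f)} MHz from nearest Wi-Fi center")
```

For data transfer, BLE additionally hops adaptively across its 37 data channels, avoiding ones where it sees interference; that adaptive hopping is a large part of why BLE and Wi-Fi share the 2.4 GHz band more gracefully than fixed-channel 802.15.4 networks.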
The sensors showed they could live on a 620 mAh battery for 452 days. That is good, but it is not the limit: a 620 mAh battery is a hearing-aid cell, while an AA cell, for example, already holds around 2,000 mAh.
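Those numbers imply a tiny average current draw. As a back-of-the-envelope check (my own arithmetic from the figures quoted above, not data from the pilot):

```python
# Back-of-the-envelope battery arithmetic from the figures in the interview:
# a 620 mAh hearing-aid cell lasted 452 days.

def avg_current_ma(capacity_mah, days):
    """Average current (mA) implied by draining capacity_mah over `days` days."""
    return capacity_mah / (days * 24)   # mAh divided by hours gives mA

def runtime_days(capacity_mah, current_ma):
    """Days a cell of capacity_mah lasts at a constant current_ma draw."""
    return capacity_mah / current_ma / 24

draw = avg_current_ma(620, 452)         # ≈ 0.057 mA, i.e. about 57 µA
aa = runtime_days(2000, draw)           # same draw from a ~2000 mAh AA cell

print(f"average draw: {draw * 1000:.0f} µA")
print(f"projected AA runtime: {aa:.0f} days, about {aa / 365:.1f} years")
```

The scaling is linear, so the ~2,000 mAh AA cell mentioned above would stretch the same sensor to roughly four years, ignoring self-discharge, which does become significant on such timescales.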
Kettles as sources of non-trivial information
- Is R&D in Russia engaged with IoT in some way?
- IoT is not a single spherical thing in a vacuum; it is part of the data life cycle, an automated generator of data. Humanity generates data by uploading photos and typing text, but this way of obtaining information does not give a complete picture of the world. To analyze the world in much greater detail, automation is needed. The natural progression of any business is automation, and to automate data collection, an infrastructure of sensors is deployed.
I once said that the best IoT sensor is a video camera. A video stream is an extremely rich information source and, most importantly, it is intuitive for a person. If you consider IoT separately from the overall data-centric concept, in most cases it is not very interesting.
The ability to turn on a kettle from a cell phone is a fine option, but it belongs more to the category of extra features of household appliances than to the Internet of Things. But the ability to analyze information from a million kettles can yield completely new, non-trivial knowledge: how the load on the power grid changes, how people drink tea in the mornings, or that most residents of houses with gas stoves prefer to boil water electrically and pay extra for it.
- In industrial IoT it is clear who owns the data. But if we talk, say, about kettles, about household IoT, then who will own the data collected from personal devices?
- I think in each particular case it will be determined by the contract the person signs directly with the operator of their data.
- The device manufacturer?
- Not necessarily. The provider of the service the person is connected to may be the device manufacturer, an Internet company, or even a separate startup. In any case, the person, as the subject of decision-making (the latest changes in legislation show this), will have the right to manage their data and express their decision in a form binding on the operator. The service provider will be required to follow that decision.
Photo: Anton Karliner / Haytek
The data question splits into two parts: the physical and technological organization of data acquisition, and the social and legal side. The social and legal part lies more in the domain of the state and the person themselves, while we, as a technology company, should simply provide a convenient and cost-effective way to implement whatever decision is made.
Putting an observer in front of a wall of 24 screens is simply cruel
- Will data collection be mostly wireless?
- The trend now is the move to wireless technology. Telemetry itself is a field of automation that has been well known for half a century. The RS-485 family of serial interfaces, and its successor, Ethernet, are not a new story. But the scale of these systems was constrained by factors such as the need to lay cable. Cable routing is a serious task that requires planning as early as a building's construction stage. Simply coming in and installing 100 wired sensors is very difficult. I am not saying it is impossible, but it is extremely difficult. The appearance of cheap, interference-resistant sensors with long battery life can turn quantity into a new quality. Once sensors cross a certain threshold by becoming wireless, they will become as natural an attribute of any space as lighting is now.
RS-485 (Recommended Standard 485) is a physical-layer standard for an asynchronous interface. The standard gained great popularity and became the basis for a whole family of industrial networks widely used in industrial automation.
EIA previously labeled all its standards with the prefix "RS" (Recommended Standard). Many engineers continue to use this designation, but EIA/TIA officially replaced "RS" with "EIA/TIA" to make the origin of its standards easier to identify.
An interesting feature: the development of IoT recalls the law of development of semiconductor engineering. In the beginning, when there is no market yet, individual chips come out in pilot mode and are extremely expensive, because development costs a lot of money. But with the advent of mass production and growth in the number of chips manufactured, the per-unit price falls. Thus, following Moore's law, the revolutionary development of technology allowed a new world of personal computers to emerge, with a microprocessor priced below $1,000. What happened in the 80s and 90s is now happening in the world of IoT. When the cost of components and of a complete IoT system crosses the threshold of mass, explosive adoption, it will become profitable for manufacturers to invest in developing new systems, because they will see the market, and users will be able to effectively automate all aspects of their lives.
- When will this happen?
- This is already happening. The video-surveillance segment is now growing very rapidly, and not only in security but also in the form of AI: good, intelligent video surveillance with scene recognition, counting people in queues, traffic analysis. In industry, for example, video surveillance has almost completely replaced quality control on production lines. It is no longer necessary to force a person to stare continuously at blanks flying past on a conveyor in order to spot defects. Many interesting things are happening in this area, and the right question immediately arises: what do we do with this flood of information? Existing classic data-processing tools no longer help. Again, you cannot put an observer in front of a wall of 24 screens and demand that he continuously concentrate and extract information from those streams. It is simply cruel.
AI is not a supernova either; the idea of "intellect on silicon" has been circulating periodically since the 50s. I myself caught the wave of 2000, when I wrote a term paper on implementing neural networks on FPGAs. But at that moment the ground was not ready for rapid growth, for a qualitative leap: there was neither enough data nor productive enough hardware. Even Kolmogorov investigated questions of AI. He said he saw no mathematical obstacles to creating full-fledged living beings built entirely on digital information-processing mechanisms.
Andrei Nikolaevich Kolmogorov was a Soviet mathematician, one of the greatest mathematicians of the 20th century.
Kolmogorov was one of the founders of modern probability theory; he obtained fundamental results in topology, geometry, mathematical logic, classical mechanics, turbulence theory, the theory of algorithmic complexity, and functional analysis.
Photo: Anton Karliner / Haytek
But the performance of a 1960s computer was not enough to run a practically useful neural network. Only in the second half of the 2010s did the performance of general-purpose computers reach the threshold required to run multi-layer neural networks with millions of parameters. And, most importantly, the Internet had accumulated enough information for large, public, semantically labeled data sets such as ImageNet to appear. And there you have it, a revolutionary leap: the AlexNet network showed object-recognition accuracy on ImageNet comparable to that of a human. And we are accustomed to living with human error rates.
“Soon the 3GPP committee will be renamed the 5GPP committee”
- Intel also works on 5G. At what stage is that work now?
- The specification is being formalized now. The first deployments will appear around the world closer to the second half of 2019, with widespread rollout in 2020. What is good about 5G? It addresses three key tasks at once: efficient collection of relevant data, its transfer, and its processing. The first is mass data transfer, including heavy video streams. The second is low latency: IoT is not only telemetry but also signals to actuators, and in controlling mechanical objects and in real-time computation, time intervals are measured in milliseconds; such tight delay bounds are not provided for in existing systems. One of the 5G working subgroups deals with guaranteed command-delivery time. And the third is the explosive growth of connected devices. In LTE the base-station capacity is relatively small, and connecting tens of thousands of users exceeds the capabilities of today's 4G technologies, so 5G is also actively developing toward greater subscriber capacity, so that operators can cheaply connect low-power, low-traffic sensor networks.
- What are you developing in this context?
- We develop modems. Intel is a manufacturer of good 4G and 3G modems, and now 5G modems; the recently introduced XMM 8160 5G modem is being prepared for worldwide use. Standardization work is under way within 3GPP, the committee that develops cellular-communication specifications; there is a joke that 3GPP will soon be renamed 5GPP. The committee includes our colleagues from Nizhny Novgorod, and we are actively involved in developing the standard. But the best contribution is creating products.
Galloping electrons, qubits, and thousandths of a kelvin
- Continuing the topic of data and its growth: do you see any limit to the development of data storage?
- So far, no limit is visible. It is now realistic to talk about a petabyte of storage in a 1U server; that is practically our tomorrow, if not already our today. Speaking more globally, I am afraid to make pessimistic forecasts, because throughout our 50-year history all we have done is refute the skeptics and move further ahead. At the same time, looking to the future, Intel is working on quantum computing and, together with academic institutions, has now reached 49 qubits.
- In Russia?
- No, in Europe, together with the Dutch research center QuTech. Very non-trivial problems are being solved there: keeping qubits in a stable state at temperatures that differ from absolute zero by only a fraction of a degree. We are also researching new architectures, such as neuromorphic computing. Today's artificial-neural-network models on processors only imitate the work of the neurons of the living world; physically, they are matrix multiplications on digital multipliers. A neuromorphic chip, by contrast, emulates the physics of a neuron, and Intel has built another digital, but asynchronous, chip for implementing such models.
- Quantum computing at IBM, for example, is based on superconductivity. Do you have a similar technology?
- We are exploring different effects. There are now about six approaches on which people are trying to build a quantum supercomputer. Intel uses a spin qubit, which is stable even at a temperature of 1 kelvin, quite warm compared to superconducting qubits.
Photo: Anton Karliner / Haytek
- Stable for a few milliseconds?
- Yes, a few milliseconds. Theorists say a quantum computer will be able to show practically applicable results starting from about a thousand qubits. But is 49 qubits really so few? For example, when the world's first bipolar memory chip, created by Intel in 1969, appeared, its capacity was only 64 bits. But it launched a rapid evolution, and literally a year later a 1024-bit DRAM chip was created. The process was launched; the technology got its start in life. In quantum computing, a great deal of work is now going on in parallel on the theoretical side: people are looking for problems that can in principle be solved faster than on traditional computer architectures.
No one conducts clinical trials without computing resources anymore
- Intel is also involved in digital healthcare. You even launched products, the Basis Peak watch, which was withdrawn in 2016.
- That was not so much healthcare as the fitness industry. Healthcare, with all its requirements and tasks, is a separate area, and we work with it actively, precisely in terms of developing infrastructure and data-processing technologies. Medicine has always been a very high-tech and data-intensive field of human activity, and now that it has become possible to automate the collection and processing of information, analytical, data-driven medicine is developing rapidly.
We must give doctors their due: they have always worked very well with statistics. Now AI has been introduced for image analysis. A neural network cannot make a diagnosis, but it can serve as an advisory tool for the doctor. Collecting information and statistics across hospitals, across health systems within a country and around the world, provides a huge amount of material for analysis. Clinical trials of new drugs are a large and difficult part of medical research. You cannot expect the result to be reproduced in 100% of cases; results are always statistical in nature, so you always have to look for correlations and understand where the true dependence is and where the special case is. And here, I think, no one conducts clinical trials without computing resources anymore.
- You mentioned many different obstacles that need to be addressed in the field of data. What is the hardest part of developing in this direction now? What is missing?
- Many people love to complain that they are missing something. I am trying right now to think of something to complain about, and I cannot. There is a huge amount of work in every direction, and the main thing that is missing is time.