To become climate neutral by 2050, the European Union has launched two ambitious programs: the Green Deal and DigitalEurope. As a key component of their implementation, researchers have launched the Destination Earth initiative, which aims to create a digital twin of the Earth.
Observation data will be continuously fed into the digital twin to make the digital model of the Earth more accurate, to track its evolution, and to predict possible future trajectories of change. But in addition to the observational data commonly used to model weather and climate, the researchers also want to integrate new data on relevant human activities into the model. The new model of the Earth system will represent practically all processes on the planet's surface as realistically as possible, including human influence on the management of water, food, and energy resources, as well as the processes of the physical system itself.
The digital twin is intended to be an information system that develops and tests scenarios that demonstrate more sustainable development and thus better inform policy.
"For example, if you plan to build a two-meter dam in the Netherlands, I can look at the data in my digital twin and check whether the dam will still protect against the extreme events expected in 2050," says Peter Bauer, Deputy Director of Research at the European Centre for Medium-Range Weather Forecasts (ECMWF) and co-initiator of Destination Earth.
The digital twin will also be used for strategic planning of fresh water and food supplies or wind and solar power plants.
The researchers point to the steady development of weather models since the 1940s. Meteorologists were the first to model physical processes on the world's largest computers. Today's weather and climate models are therefore ideal testbeds for identifying entirely new ways to use supercomputers efficiently, with benefits for many other scientific disciplines.
In the past, weather and climate modeling used different approaches to simulating the Earth system. While climate models represent a very wide range of physical processes, they usually omit the small-scale processes that accurate weather forecasts require; weather models, in turn, cover fewer processes. The digital twin will unite both areas and allow the complex processes of the entire Earth system to be simulated at high resolution. But to do this, the codes of the simulation programs must be adapted to new technologies that promise much higher computing power.
With the computers and algorithms available today, such complex simulations can hardly be performed at the planned extremely high resolution of one kilometer, because for decades code development has stagnated from a computer-science perspective. Climate research long benefited from performance gains that came with each new generation of processors, without having to overhaul its programs. This free performance boost with each new processor generation stopped about 10 years ago. As a result, modern programs can often use only 5% of the peak performance of conventional processors.
To achieve the necessary improvements, the scientists emphasize the need for co-design, that is, the joint and simultaneous development of hardware and algorithms, which the research team has successfully demonstrated over the past ten years. They propose paying special attention to generic data structures, optimized spatial discretization of the computational grid, and optimization of time-step lengths. The scientists also want to decouple the code that solves a scientific problem from the code that performs the computation optimally on the corresponding system architecture. This more flexible program structure will enable faster and more efficient switching to future architectures.
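The decoupling idea can be illustrated with a minimal sketch: the scientific kernel is written once, with no knowledge of where it runs, and a separate backend object decides how to execute it. All names here are illustrative, not taken from any real weather or climate framework.

```python
# Sketch: separating the scientific description of a computation from the
# backend that executes it. Names and structure are illustrative only.

def diffusion_step(u, alpha):
    """Science code: one explicit diffusion step on a 1-D periodic grid.
    Expressed purely in terms of neighbour access, with no knowledge of
    how or where the loop is executed."""
    n = len(u)
    return [u[i] + alpha * (u[(i - 1) % n] - 2 * u[i] + u[(i + 1) % n])
            for i in range(n)]

class SerialBackend:
    """Backend code: runs the kernel step by step on the CPU.
    A GPU or distributed backend would replace only this class,
    leaving the science code above untouched."""
    def run(self, kernel, state, steps, **params):
        for _ in range(steps):
            state = kernel(state, **params)
        return state

backend = SerialBackend()
u = [0.0, 0.0, 1.0, 0.0, 0.0]
u = backend.run(diffusion_step, u, steps=1, alpha=0.1)
print(u)  # the diffusion step conserves the total: sum(u) stays ~1.0
```

Frameworks built for weather codes follow this pattern at scale, generating architecture-specific loops from a single high-level stencil definition.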
The authors also see great potential in artificial intelligence. It can be used, for example, for data assimilation and the processing of observational data, for representing uncertain physical processes in the models, and for data compression. AI can thus speed up the simulations and filter out the most important information from large volumes of data. In addition, the researchers suggest that machine learning not only makes the computations more efficient, but can also help describe physical processes more accurately.
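One of those uses, representing an uncertain or unresolved process with a learned model, can be sketched in miniature. Here a toy least-squares fit stands in for a neural-network parameterization; the "expensive process" and all names are invented for illustration.

```python
# Sketch: emulating an unresolved ("subgrid") process with a learned model.
# A closed-form least-squares fit plays the role a neural network would in
# a real model; everything here is a toy illustration.

def fit_linear(xs, ys):
    """Fit y ~ a*x + b by ordinary least squares (closed form)."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    b = my - a * mx
    return a, b

# "Truth": samples of an expensive process, here behaving like 2*wind + 1.
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [2 * x + 1 for x in xs]

a, b = fit_linear(xs, ys)
# The cheap fitted emulator can now stand in for the expensive process
# inside the larger simulation.
print(a, b)  # recovers coefficients close to 2 and 1
```

The payoff is the same in the full-scale setting: once trained, the emulator is far cheaper to evaluate than the process it replaces.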
The scientists regard their strategy paper as a starting point on the path to a digital twin of the Earth. Among the computer architectures available today and expected in the near future, supercomputers based on graphics processing units (GPUs) appear to be the most promising option. The researchers estimate that a full-scale digital twin would require a system with approximately 20,000 GPUs, consuming approximately 20 megawatts of power. For both economic and environmental reasons, such a computer should operate at a location where CO2-neutral electricity is available in sufficient quantity.
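The scale of that estimate is easier to grasp with some back-of-envelope arithmetic on the two published figures; the derived per-GPU and annual numbers below are simple consequences of those figures, not additional data from the paper.

```python
# Back-of-envelope arithmetic on the researchers' estimate
# (about 20,000 GPUs drawing about 20 MW in total).

gpus = 20_000
power_mw = 20.0

# Average electrical draw per GPU, including cooling and other overheads.
power_per_gpu_kw = power_mw * 1000 / gpus

# Energy consumed by a full year of continuous operation (8760 hours).
annual_energy_gwh = power_mw * 8760 / 1000

print(power_per_gpu_kw)   # 1.0  (kW per GPU)
print(annual_energy_gwh)  # 175.2  (GWh per year)
```

At roughly 175 GWh per year, the case for siting the machine near abundant CO2-neutral electricity is clear.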