Exascale

HPC4E Project information

Why is exascale HPC necessary for the energy industry?

New energy sources that are currently untapped may become crucial in the mid-term. Intensive numerical simulations and prototyping are needed to assess their real value and improve their throughput. The impact of exascale HPC and data-intensive algorithms on the energy industry is well established in the U.S. Department of Energy (DOE) document “Synergistic Challenges in Data-Intensive Science and Exascale Computing”.

  1. Wind energy will become an increasingly important part of the energy supply mix, but we must learn how to enable a more elastic production-demand response. For the wind energy generation industry, HPC is a must: the competitiveness of wind farms can be guaranteed only with accurate wind resource assessment, farm design and short-term micro-scale wind simulations to forecast daily power production. In off-shore wind farms, additional engineering problems related to mooring and anchorage mechanisms increase the HPC needs. The Global Energy Assessment report identifies the fundamental knowledge barriers to further progress in wind energy as “scientists’ understanding of atmospheric flows, unsteady aerodynamics and stall, turbine dynamics and stability, and turbine wake flows and related array effects”. Using computational fluid dynamics (CFD) large-eddy simulation (LES) models to analyse atmospheric flow in a wind farm, capturing turbine wakes and array effects, requires exascale HPC systems. This is the first energy industry problem addressed by this project.
     
  2. Biogas, i.e. biomass-derived fuel produced by anaerobic digestion of organic wastes, is attractive because of its wide availability, renewability, reduction of CO2 emissions, and contribution to the diversification of energy supply and to rural development. Moreover, biogas production does not compete with feed and food feedstock. However, its use in practical systems is still limited, since the complex fuel composition can lead to unpredictable combustion performance and instabilities in industrial combustors. Combustion is a well-established source of exascale problems, and becomes even more relevant when complex fuels are employed. Hazardous combustion phenomena are associated with significant variations in fuel composition, which change the flame speed, heat release rate, local fuel consumption rate, pollutant formation and, most importantly, the flame stability mechanisms. Therefore, exascale computing with high-fidelity numerical simulations is fundamental to investigating the combustion characteristics of new fuels in practical combustors. The importance of high-fidelity numerical simulation in combustion research is underscored in the DOE report on Basic Energy Needs for Clean and Efficient Combustion of 21st Century Transportation Fuels, which identified a single overarching grand challenge: to develop a “validated, predictive, multi-scale, combustion modeling capability to optimize the design and operation of evolving fuels”. The next generation of exascale HPC systems will be able to run combustion simulations in parameter regimes relevant to industrial applications using alternative fuels, which is required to design efficient furnaces, engines, clean-burning vehicles and power plants. This is the second energy industry problem addressed by this project.
     
  3. Hydrocarbons are still the main energy source, and they have become hard to replace in some aspects critical to everyday life (e.g. transportation). Their usage releases carbon dioxide into the atmosphere, but their exploration and production have environmental costs too. Exploration costs are dominated by the massive drilling costs as well as by the expense of financial and environmental insurance. In the following we only discuss conventional hydrocarbon resources, as opposed to hydrofracturing and shale gas plays, which are far less energy efficient and more environmentally damaging. Nowadays, one of the main HPC consumers is the oil & gas (O&G) energy industry. Companies such as ENI and TOTAL have their own top-20 supercomputers devoted to geophysical exploration and reservoir modelling, and in March 2015 the Norwegian company PGS installed a 5-petaflops system for geophysical exploration. O&G is thus the only industry to boast such large supercomputers for private use at present. Geert Wenes, Senior Practice Leader at Cray, recently wrote that “a perfect storm in seismic processing requirements is ensuring that the O&G industry will be an early adopter of exascale computing technologies”. This perfect storm is the computational requirement arising from full wave-form modelling and inversion of seismic and electromagnetic data. By taking into account the complete physics of waves in the subsurface, imaging tools are able to reveal information about the Earth’s interior with unprecedented quality. Nevertheless, honouring the actual wave physics has a high cost in terms of computational intensity, which can only be met by exascale HPC systems. This is the third energy industry problem addressed by this project.
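The wake and array effects named in item 1 are the reason simple per-turbine engineering models are giving way to LES. As a point of contrast with those exascale simulations, the classical Jensen (Park) engineering wake model collapses a wake to a single algebraic velocity deficit. A minimal sketch (not part of the project's codes; all parameter values are illustrative):

```python
import math

def jensen_wake_deficit(ct, x, rotor_radius, k=0.05):
    """Fractional velocity deficit a distance x downwind of a turbine
    (Jensen/Park model).  ct: thrust coefficient, rotor_radius in metres,
    k: wake decay constant (illustrative onshore-like value)."""
    return (1.0 - math.sqrt(1.0 - ct)) / (1.0 + k * x / rotor_radius) ** 2

# Wind speed seen by a turbine 500 m behind another one (illustrative values)
u_free = 10.0                                   # free-stream wind speed, m/s
deficit = jensen_wake_deficit(ct=0.8, x=500.0, rotor_radius=40.0)
u_waked = u_free * (1.0 - deficit)              # reduced speed in the wake
```

Such algebraic models ignore atmospheric turbulence and wake interaction, which is precisely the physics the LES approach described above resolves.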
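A rough sense of why the combustion problem in item 2 demands exascale machines comes from the classical Kolmogorov estimate for direct numerical simulation (DNS) of turbulence: the number of 3D grid points grows roughly as Re^(9/4), and the total work (points times time steps) roughly as Re^3. The back-of-the-envelope sketch below (Reynolds numbers purely illustrative) shows how moving from laboratory to industrial regimes multiplies the cost by orders of magnitude:

```python
# Back-of-the-envelope DNS cost scaling from the Kolmogorov estimate:
# 3D grid points ~ Re^(9/4); total work ~ Re^3.  Regime values illustrative.
def dns_grid_points(re):
    return re ** (9 / 4)

lab_flame = dns_grid_points(1e4)     # laboratory-scale turbulent flame
industrial = dns_grid_points(1e6)    # industrial-combustor regime
growth = industrial / lab_flame      # grid-point increase between regimes
```

A hundredfold increase in Reynolds number thus costs roughly 10^4.5 times more grid points, before chemistry is even added.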
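The full wave-form modelling at the heart of item 3 repeatedly time-steps a discretised wave equation over the whole subsurface model. A toy 1D acoustic finite-difference kernel gives the flavour of that inner loop (grid, time step and velocity values are illustrative; production seismic codes are 3D, high order and run across thousands of nodes):

```python
import numpy as np

# Toy 1D acoustic wave propagation by second-order finite differences.
nx, nt = 300, 600
dx, dt = 10.0, 1e-3              # grid spacing (m), time step (s) -- illustrative
c = np.full(nx, 2000.0)          # constant velocity model, m/s (CFL number 0.2)
p_prev = np.zeros(nx)
p = np.zeros(nx)
p[nx // 2] = 1.0                 # impulsive source in the middle of the model

for _ in range(nt):              # leapfrog time stepping of the wavefield
    lap = np.zeros(nx)
    lap[1:-1] = (p[2:] - 2.0 * p[1:-1] + p[:-2]) / dx**2
    p_next = 2.0 * p - p_prev + (c * dt) ** 2 * lap
    p_prev, p = p, p_next
```

Full wave-form inversion runs kernels like this (forward and adjoint) thousands of times per survey, which is what drives the exascale requirement quoted above.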