Geophysical exploration for finding and monitoring hydrocarbon reservoirs relies heavily on processing large amounts of data.
The sheer volume of data alone could make O&G exploration and exploitation a computationally intensive case. It is, however, the computational intensity of modern data-processing tools that makes geophysical imaging an HPC grand challenge. The huge cost of data acquisition and of drilling in ever more challenging locations is quickly being counterbalanced by comparatively cheap computing infrastructure. For this reason, a growing number of O&G companies are acquiring their own petascale computing systems dedicated to geophysical imaging (and reservoir modelling) alone (top500.org). The key point is that accurate imaging of the subsurface reduces geological uncertainty and thereby lowers drilling failure rates. In this way, the hundred-million-dollar drilling effort involved in establishing a deep-water platform becomes a far less risky investment. Furthermore, the environmental impact of drilling is reduced when each borehole is more likely to hit the reservoir.