In Pursuit of Better Models and Simulations, Oil Industry Looks to the Math Sciences
January 2, 2002
Figure 1. Grid and structure for a heterogeneous salt dome model. The oil reservoir in this example has many wells and a highly heterogeneous permeability field. The discretization contains a large number of hexahedral elements. The use of unstructured and nonconforming meshes, by better capturing the physics, would improve the accuracy of the simulation.
Opening the Arctic National Wildlife Refuge for oil exploration is a hot-button environmental issue. It's also an issue that involves mathematics---more than might at first meet the eye. According to the oil companies, technological advances now in use will enable them to locate and drill for oil in environmentally sound ways that will have minimal impact on the ANWR. The use of applied mathematics has been an integral part of those advances.
In this article, we consider some of the ways in which mathematics is used in the oil industry, where money-making decisions are made every day. How can more oil be extracted from a given reservoir in an economically feasible manner? How should the wells be drilled? What is the geologic structure of a given oil field? All of these questions can be answered with the help of applied mathematics.
As politicians debate the worthiness of finding new sources of energy, John Killough and other like-minded researchers have devoted their careers to actually finding energy resources. Killough received his PhD in the mathematical sciences from Rice University, under the supervision of Mary F. Wheeler, director of the Center for Subsurface Modeling at the University of Texas, Austin. Killough, who has taught (chemical engineering at the University of Houston) and worked in industry (Exxon Production Research and Arco Production Research), now holds the newly created position of Research Fellow at Landmark Graphics Corporation. He has done pioneering work in the application of supercomputing and parallelization to reservoir simulation, and he has continued to work with Wheeler and has also collaborated with Don Peaceman on challenging petroleum problems.
Using applied mathematics, oil industry researchers are combining the methods of geologists, geophysicists, and modelers to optimize production plans and reduce costs. Improved seismic grids, better interpolation between sparse data points, and research in multiscaling techniques have all proved helpful in resolving the problems inherent in the simulation of geologic structures. Visualization techniques that aid in determining well placement and drilling rates rely on accurate models and realistic discretizations, which implies that high-performance computing is a necessity.
With Killough's help, we offer here a look at the state of the art in modeling and simulation in the oil industry, along with an industry "wish list" of improvements to which mathematical scientists are in a good position to contribute.
Appropriate gridding schemes are of vital importance for accurate simulations of the behavior of any physical system. For an oil reservoir, a full model requires three grids: a seismic grid, a geologic (or geophysical) grid, and a reservoir grid.
The seismic grid describes the shape of the several interconnected geological flow units that constitute a reservoir. Once this global earth structure has been established, geologic measurements, such as logging (laboratory analysis of a core sample) and well tests (determinations of pressure and temperature at the drilling site), give more precise information about rock properties and heterogeneities like pinchouts (sharp contrasts in permeability between rock layers).
Two essential properties defined on a geologic grid are porosity and permeability. Porosity is simply the ratio of pore volume to total volume; if porosity is too low, the field is not worth exploiting and it would be pointless to simulate the exploitation. Permeability measures the ease with which fluid flows through the porous medium. Porosity and permeability are not necessarily correlated, and low permeability values can likewise indicate that exploiting a field would be futile.
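These two rock properties are simple enough to state in a few lines of code. The sketch below, with hypothetical values for the rock and fluid properties, computes a porosity and a single-phase Darcy flow rate; the function names and numbers are illustrative, not drawn from any simulator.

```python
# Porosity and a single-phase Darcy flow rate, computed with hypothetical
# rock and fluid properties; function names and values are illustrative.

def porosity(pore_volume, total_volume):
    """Porosity: the ratio of pore volume to total (bulk) volume."""
    return pore_volume / total_volume

def darcy_flow_rate(k, mu, area, dp, length):
    """Darcy's law for single-phase flow: q = k * A * dp / (mu * L),
    with permeability k (m^2), viscosity mu (Pa s), pressure drop dp (Pa)."""
    return k * area * dp / (mu * length)

phi = porosity(pore_volume=0.06, total_volume=0.30)
q = darcy_flow_rate(k=1e-13, mu=1e-3, area=1.0, dp=1e5, length=10.0)
```

A field screening decision would compare quantities like these against economic thresholds before any simulation is attempted.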
Once these attributes have been established, the mathematical model, the partial differential equations characterizing fluid flow and mass transfer, can be discretized on the reservoir grid. Oil engineers are satisfied with the quality of current seismic and geophysical grids, and with the amounts of data extracted from those grids. With the reservoir grid, however, three main difficulties arise.
The first is a multiscale issue: The reservoir grid is much more refined than the seismic and geologic grids. A typical reservoir grid cell measures 10 × 10 × 3 meters, compared with 100 × 100 × 10 meters for a seismic grid cell. Currently, either geocellular or geostatistical modeling is used to project the coarse-scale data onto the fine-scale models. Both approaches have pros and cons: Geocellular modeling interpolates poorly between scattered data points, for instance, while geostatistical data may be far from reality. Resolution of the scaling issues is of primary importance, because a poor treatment of the scales can degrade the accuracy of the numerical solution.
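One simple, classical averaging rule illustrates why the scaling issue is delicate. In the sketch below (a toy calculation, not the geocellular or geostatistical workflows used in practice), a stack of fine-grid layer permeabilities is replaced by a single coarse-cell value; arithmetic averaging is appropriate for flow along the layers, harmonic averaging for flow across them, and the two can differ widely.

```python
# Replacing a stack of fine-grid layer permeabilities with one coarse-cell
# value (hypothetical layer values, in millidarcies). Arithmetic averaging
# suits flow along the layers, harmonic averaging flow across them.

def arithmetic_mean(perms):
    return sum(perms) / len(perms)

def harmonic_mean(perms):
    return len(perms) / sum(1.0 / k for k in perms)

layers = [100.0, 200.0, 1.0, 150.0]   # one thin, nearly impermeable layer
k_along = arithmetic_mean(layers)     # effective permeability along layers
k_across = harmonic_mean(layers)      # effective permeability across layers
# The 1-mD layer dominates k_across: exactly the pinchout effect that a
# coarse cell carrying a single averaged value can misrepresent.
```

Here the two averages differ by a factor of nearly thirty, which is why a coarse cell cannot carry a single permeability without losing information the fine grid must recover.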
If the many different types of reservoirs are to be successfully modeled, the reservoir grid should also be able to resolve complex geometries, faults, channels, and deviated wells. Hexahedral grids, the current standard in the oil industry, do not satisfactorily resolve these features (see Figure 1). More flexible grids, such as perpendicular-bisector (PEBI) grids, have been introduced in an attempt to address these issues, but generating PEBI grids can be quite complicated in three dimensions, and they still do not resolve sloping faults. The current trend is toward tetrahedral elements, which can handle more complicated geometries.
Another requirement for the reservoir grid is the ability to handle local grid refinement. In parts of the domain where faults, wells, or pinchouts occur, the grid may need to be highly refined, while a coarser grid may be sufficient elsewhere. The computational time required for a simulation is much lower for grids with local refinement capabilities than for uniformly fine grids.
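The mechanism behind local refinement can be sketched with a quadtree in two dimensions. The refinement rule below, refining any cell whose bounding box comes within a given radius of a well, is a simple stand-in for the richer criteria a real reservoir gridder would use; the well location and parameters are hypothetical.

```python
# Quadtree refinement on the unit square: refine any cell whose bounding
# box comes within `radius` of a (hypothetical) well, up to `max_depth`.

def refine(cell, wells, radius, max_depth):
    """cell = (x, y, size, depth); returns the leaf cells of the quadtree."""
    x, y, size, depth = cell

    def dist(wx, wy):
        # Distance from the well (wx, wy) to this cell's bounding box.
        dx = max(x - wx, 0.0, wx - (x + size))
        dy = max(y - wy, 0.0, wy - (y + size))
        return (dx * dx + dy * dy) ** 0.5

    if depth >= max_depth or all(dist(wx, wy) >= radius for wx, wy in wells):
        return [cell]
    half = size / 2
    leaves = []
    for ox in (0.0, half):
        for oy in (0.0, half):
            leaves += refine((x + ox, y + oy, half, depth + 1),
                             wells, radius, max_depth)
    return leaves

wells = [(0.25, 0.25)]
leaves = refine((0.0, 0.0, 1.0, 0), wells, radius=0.2, max_depth=4)
# Far fewer cells than the 256 a uniformly refined 16 x 16 grid would need.
```

The savings grow quickly in three dimensions, where a uniformly fine grid multiplies the cell count by eight at every level.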
Grids alone cannot produce effective reservoir simulations, of course. Intertwined with grid elements are the discretization methods used to approximate such physical quantities as pressure, flow rates, and mass balances. Current industrial reservoir simulators support finite differences, popular methods that are efficient when used on regular structured grids. These methods can become unstable on unstructured meshes, however, and they are not ideal for complex geometries.
What is needed is attention to the design of numerical schemes that can produce more accurate reservoir simulations on very general unstructured reservoir grids, while still maintaining the conservation properties of the finite difference methods. One promising discretization strategy is the discontinuous Galerkin method. John Killough is among the researchers interested in these finite element methods, which have several appealing properties: They are element-wise conservative, they support local approximations of high order, they are robust and nonoscillatory in the presence of steep gradients, and they can be implemented on unstructured and nonmatching grids.
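The lowest-order member of the discontinuous Galerkin family, piecewise constants with upwind fluxes, coincides with a first-order finite volume scheme and already exhibits the element-wise conservation property. The toy one-dimensional advection code below is illustrative only, not a reservoir simulator.

```python
# Lowest-order discontinuous Galerkin (piecewise constants with upwind
# fluxes, i.e., first-order finite volume) for u_t + a u_x = 0 on a
# periodic 1D domain. With CFL = 0.5, each update is an average of
# neighboring cells, so the scheme is conservative and nonoscillatory.

n, a = 100, 1.0
dx = 1.0 / n
dt = 0.5 * dx / a                      # CFL-stable time step
u = [1.0 if 0.3 <= (i + 0.5) * dx <= 0.5 else 0.0 for i in range(n)]
mass0 = sum(u) * dx                    # initial total mass

for _ in range(200):
    flux = [a * u[i - 1] for i in range(n)]   # upwind flux at left face of cell i
    u = [u[i] - dt / dx * (flux[(i + 1) % n] - flux[i]) for i in range(n)]

mass = sum(u) * dx
# The fluxes telescope: whatever leaves one element enters its neighbor,
# so total mass is conserved even as the sharp front smears.
```

The same flux-balance structure carries over to higher polynomial orders and to unstructured grids, which is what makes the method attractive for reservoir work.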
Being higher-order methods, discontinuous Galerkin methods are more costly than standard cell-centered finite differences. "We are willing to pay the price for a more accurate simulation, since mistakes can be costly," Killough says. The main challenge facing the oil industry, he believes, is the discretization of governing partial differential equations on a grid that aligns with faults and deviated wells, and that can be locally refined.
The underlying physical models are perhaps the most mathematically sound aspect of reservoir simulation. According to Killough, reservoir engineers are satisfied with existing models and are now interested in using these models inside optimization loops to obtain better parameter estimates. The ultimate goal is to use the models and estimates to match field production histories, thereby enabling oil producers to better predict future behavior.
Because the objective functions being minimized can be flat, with multiple local minima, many optimization methods are useless in the absence of good gridding techniques, and missing the true minimum can mean the loss of millions of dollars. Many oil industry researchers are working to develop optimization strategies for these problems, and many efforts are under way to resolve the inherent ill-posedness of the inverse problems by incorporating the "right" set of static data. Most of these approaches have high computational costs, with the result that many of the model parameters are simply ignored; in ongoing efforts, researchers are attempting to determine the best subset of parameters to use.
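A standard, if crude, defense against multiple local minima is multistart local search: run a cheap local method from many starting points and keep the best result. The sketch below uses a hypothetical one-dimensional objective as a stand-in for a history-matching misfit; nothing in it comes from an actual reservoir model.

```python
# Multistart local search on a hypothetical multimodal objective, a
# stand-in for a history-matching misfit with several local minima.
from math import sin

def misfit(x):
    # Toy objective: oscillation plus a weak quadratic trend; the global
    # minimum sits near x = 2.2, flanked by many shallower local minima.
    return sin(5.0 * x) + 0.1 * (x - 2.0) ** 2

def local_search(f, x, step=0.4, tol=1e-6):
    """Crude derivative-free descent: step left or right, shrink on failure."""
    fx = f(x)
    while step > tol:
        moved = False
        for cand in (x - step, x + step):
            fc = f(cand)
            if fc < fx:
                x, fx, moved = cand, fc, True
                break
        if not moved:
            step /= 2
    return x, fx

starts = [-2.0 + 0.4 * i for i in range(21)]       # coarse sweep of starts
results = [local_search(misfit, s) for s in starts]
best_x, best_f = min(results, key=lambda r: r[1])
# A single search from an unlucky start stalls in a shallow basin;
# the multistart sweep recovers the deep minimum near x = 2.2.
```

Each local search is independent of the others, so a sweep like this parallelizes trivially; the expensive part in practice is that each objective evaluation is itself a full reservoir simulation.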
Well characterizations are another important component of reservoir simulations. Current capabilities permit accurate modeling of both vertical and horizontal wells, although improved models are needed for highly deviated wells. Deviated wells cut through strata that vary in structure and composition as they meander from the surface facility to the source. Modeling such wells, and reconciling the discretizations of the wells with those of the surface facilities, is yet another problem to be solved.
Capturing the right flow mechanics within a reservoir well and in the near-well areas is necessary for an accurate description of the behavior of multibranch wells. The behavior of some variables, such as saturation, may not be important in areas relatively far from the wells, where the grid can accordingly be coarser.
The gridding techniques, discretization methods, model development, and visualization are only as good as the computing power available. Killough points out that motivation for advances in high-performance computing has never been lacking. The demands of reservoir engineers to increase the complexity of their models have grown with the availability of computing power. Taking advantage of innovative parallel techniques, engineers have improved their models from only tens of reservoir cells in the 1960s to tens of millions of cells today. Among the techniques that have contributed to these successes are domain decomposition, efficient message-passing strategies, and enhanced linear equation solvers.
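Domain decomposition, the first of the techniques just mentioned, can be sketched on a toy problem. Below, a one-dimensional Poisson equation is solved by alternating Schwarz iteration on two overlapping subdomains; each subdomain solve is a small tridiagonal system that, in a parallel simulator, would run on its own processor. The problem and grid sizes are illustrative, not drawn from any production code.

```python
# Alternating Schwarz iteration on two overlapping subdomains for the toy
# problem -u'' = 1 on (0, 1), u(0) = u(1) = 0 (exact solution x(1-x)/2).

def solve_subdomain(n, h2, left, right):
    """Thomas algorithm for -u[i-1] + 2u[i] - u[i+1] = h2 on n points,
    with boundary values `left`, `right` folded into the right-hand side."""
    b = [2.0] * n
    d = [h2] * n
    d[0] += left
    d[-1] += right
    for i in range(1, n):              # forward elimination
        b[i] -= 1.0 / b[i - 1]
        d[i] += d[i - 1] / b[i - 1]
    u = [0.0] * n
    u[-1] = d[-1] / b[-1]
    for i in range(n - 2, -1, -1):     # back substitution
        u[i] = (d[i] + u[i + 1]) / b[i]
    return u

N = 19                                  # interior points on the full domain
h = 1.0 / (N + 1)
u = [0.0] * N
for _ in range(30):
    # Subdomain 1: points 0..11, right boundary taken from point 12.
    u[0:12] = solve_subdomain(12, h * h, 0.0, u[12])
    # Subdomain 2: points 7..18, left boundary from the fresh value at 6.
    u[7:19] = solve_subdomain(12, h * h, u[6], 0.0)

u_mid = u[9]                            # grid point at x = 0.5
# The overlap (points 7..11) lets information pass between subdomains;
# the iterates converge geometrically to the exact value 0.125 at x = 0.5.
```

In a real simulator the subdomain solves run concurrently, with the overlap values exchanged by message passing, which is where the efficient message-passing strategies and load balancing mentioned above come into play.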
Excellent parallel performance is obtained not only on conventional, sophisticated parallel Unix machines, but also on clusters of PCs. Obtaining a cluster of PCs can be relatively inexpensive. As Killough points out, "the price/performance comparisons for PCs and PC clusters is already significantly better than the Unix counterparts." Given the explosion of information available to reservoir simulators, however, he adds that further improvements in high-performance computing are clearly needed.
Other issues require ongoing attention as well. Improved parallel linear equation solvers are needed, especially for fully implicit formulations; tubing hydraulics, surface facilities, and pipeline networks should be better incorporated into the overall parallelization process; and static and dynamic load balancing should be optimized for full exploitation of any parallel simulation.
Research in many areas of applied mathematics can help to ensure adequate, stable supplies of energy. The problems in the oil industry are all interconnected: The grids are only as good as the discretization methods; the choice of discretization method sometimes depends on the amount of computing power available; and accurate predictions of future behavior rely on good optimization techniques as well as good gridding, discretizations, and models. Applied mathematics can play an important role in the extraction of energy in an environmentally sound and efficient manner, and can contribute to the design of the next generation of reservoir simulators.
The authors thank John Killough and Abby Kaighin for their help in writing this article. For readers wishing more information, the Society of Petroleum Engineers (www.spe.org) is a good resource.
Lea Jenkins and Béatrice Rivière are research associates at the Center for Subsurface Modeling, TICAM, University of Texas, Austin.