
As for multi-scale, the discipline involves the integration of data spanning a wide range of scales, from nanometer (electron microscope), to centimeter (cuttings and core samples), to decimeter (well log), to meter (seismic) scale. Spatially, most of these data are acquired within a small portion of one or several wells, and geophysical data provide the capability to extrapolate away from the wells, albeit with lower resolution. Due to uncertainties in the data, rapid variations in the subsurface, and sparse sampling, multi-scale integration can be a challenging task. There is a good discussion of the “SURE Challenge” in the book, where the author addresses the above-mentioned challenges of integrating a multitude of data sets with different Scale, Uncertainty, Resolution, and Environment. It is suggested that different AI and data analytics techniques may be best equipped to handle the SURE Challenge.

      The second component can be categorized as input data quality (informally, ‘garbage in, garbage out’). Even the most cutting-edge workflow cannot succeed without high-quality input data. Worse, poor data may lead to the misinterpretation that a workflow is not good or appropriate, when in fact the input data was the culprit. The input data journey starts with data acquisition, proceeds to data processing, and ends with data interpretation and integration. One of the pitfalls along the way is for an interpreter to simply obtain the data without being aware of its ‘history’. An example would be applying amplitude-based seismic analysis to data that has undergone non-amplitude-preserving processing (Automatic Gain Control, or AGC, being a simple example). The same can happen with, say, well-log or production data. The good news is that, over time, data quality in every SRC-related discipline has been improving, thanks not only to better tools but also to more frequent data acquisition during the life of a reservoir. Further, over time we have learned to build much better processing tools that provide high-quality data for the integration component. The net result has been an improved ability to conduct integrated studies and deliver quantitative products. One example near to my heart is the joint seismic inversion of PP reflected waves with PS (or converted) reflected waves from a reservoir. We have seen that, with improved acquisition and processing, joint PP/PS inversion can substantially improve pre-stack seismic inversion, providing a stable S-impedance as well as a P-impedance; together these can yield valuable information such as formation properties, porosity, Total Organic Carbon (TOC), fluid types, and time-lapse reservoir pressure and saturation changes over the life of the reservoir. Such improvements are taking place in all the subsurface disciplines thanks to modern acquisition and more diverse data of higher quality.
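      To make the idea concrete, a generic way to pose such a joint inversion (a conceptual sketch only, with placeholder symbols, not the formulation of any particular chapter) is as a single least-squares objective in which the converted-wave data add independent constraints on the S-impedance:

\[
\min_{\mathbf{m}} \; \big\lVert \mathbf{d}_{PP} - \mathbf{G}_{PP}\,\mathbf{m} \big\rVert^{2}
\;+\; \lambda\, \big\lVert \mathbf{d}_{PS} - \mathbf{G}_{PS}\,\mathbf{m} \big\rVert^{2}
\;+\; \mu\, R(\mathbf{m}),
\qquad
\mathbf{m} = \big(\ln I_P,\; \ln I_S,\; \ln \rho\big),
\]

where \(\mathbf{d}_{PP}\) and \(\mathbf{d}_{PS}\) are the angle-dependent PP and PS amplitudes, \(\mathbf{G}_{PP}\) and \(\mathbf{G}_{PS}\) are linearized forward operators (for example, of Aki-Richards type), \(\lambda\) balances the two data sets, and \(R\) is a regularization term.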

       Dr. Ali Tura

      Professor of Geophysics

      Co-director of Reservoir Characterization Project (RCP)

      Colorado School of Mines,

      Denver, Colorado

      Preface

      Reservoir characterization is an important step in the exploration, development, monitoring, and management of a reservoir, as well as in optimizing production and planning for post-primary-production decisions. Upon completion of the preliminary task of reservoir characterization, and as we continue to produce from the reservoir or use different methods to stimulate it, many of its properties change. This requires updating the reservoir model, which brings up the concept of dynamic reservoir characterization. To achieve this goal, we incorporate the newly acquired petrophysical, seismic, microseismic, and production data. The updated model is a better representation of the current status of the reservoir. Both static reservoir properties, such as porosity, permeability, and facies type, and dynamic reservoir properties, such as pressure, fluid saturation, and temperature, need to be updated as more field data become available.

      Among the reasons for focusing on reservoir characterization is the fact that, based on estimates by experts, more than 95% of the world’s oil production in the 21st century will come from existing fields. This will require significant improvements in the current recovery rates of less than 50% in most reservoirs. Improved secondary and tertiary recovery of oil and gas, enabled by better understanding and monitoring of the reservoir, will be an important element of the much-needed increase in the recovery factor. Increased production will be made possible only through effective dynamic reservoir characterization.

      We need to recognize the fact that reservoir characterization is a multidisciplinary field. It attempts to describe petroleum deposits and the nature of the rocks that contain hydrocarbons using a variety of data types. Reservoir characterization relies on expertise from petroleum engineers, geologists, geochemists, petrophysicists, and, of course, geophysicists. The integration of information from these fields, with the aid of advanced data analysis techniques as well as artificial intelligence (AI) based methods, will make our reservoir models more accurate and the updating process much faster.

      Part 2 deals with general issues in reservoir characterization and anomaly detection. It comprises seven chapters on related topics: (1) comparison between estimated shear-wave velocity and elastic modulus at in-situ pressure conditions, (2) anomaly detection, (3) geochemical analysis for the characterization of carbonate source-derived hydrocarbons, (4) MWD mud pulse telemetry, (5) use of Monte Carlo clustering to detect geologic anomalies, (6) gas-sand predictors using dissimilarity analysis, and (7) fluid flow tests distorted by wellbore storage effects. Part 3 is dedicated to the detection of reservoir permeability, one of the most important reservoir properties. Three different techniques are covered here. Two of them involve the use of different machine learning techniques to predict permeability, namely exponential/multiplicative and Monte Carlo/committee machines. The other chapter discusses geoscience criteria for identifying high gas-permeability zones.
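      For readers unfamiliar with the term, a committee machine simply combines the predictions of several independently trained models. The short sketch below is illustrative only, with hypothetical feature names and synthetic data rather than anything taken from the book's chapters; it shows the idea using a few scikit-learn regressors whose outputs are averaged to predict permeability from well-log inputs.

# Illustrative committee-machine sketch (hypothetical data and features,
# not code from the book): average the predictions of several regressors
# trained on well-log features to estimate permeability.
import numpy as np
from sklearn.ensemble import RandomForestRegressor, GradientBoostingRegressor
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Hypothetical training set: columns = [porosity, gamma ray, bulk density];
# target = log10(permeability in mD).
X_train = rng.random((200, 3))
y_train = 2.0 * X_train[:, 0] - 0.5 * X_train[:, 1] + rng.normal(0.0, 0.1, 200)

# The "committee": several different learners trained on the same data.
members = [
    RandomForestRegressor(n_estimators=100, random_state=0),
    GradientBoostingRegressor(random_state=0),
    MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0),
]
for member in members:
    member.fit(X_train, y_train)

# Committee output = simple average of the member predictions.
X_new = rng.random((5, 3))
committee_prediction = np.mean([m.predict(X_new) for m in members], axis=0)
print(committee_prediction)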

      One of the reasons for reservoir characterization is to assess the recoverable reserves in the reservoir. Part 4 addresses reserves evaluation and decision-making issues. The first chapter of this part discusses a foundation for science-based decision making, using data from the Gulf of Mexico. The next chapter in Part 4 investigates decline trends in a reservoir using bootstrap and Monte Carlo modeling. This part concludes with a typical production, reserves, and valuation method used in an oil and gas company.
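      As a rough illustration of the bootstrap idea in this context (a generic sketch under simple assumptions, not the workflow of that chapter), one can fit a decline model to a well's production history, resample the fit residuals many times, refit each synthetic history, and read the spread of the refitted decline rates as an uncertainty estimate:

# Generic bootstrap-on-decline-curve sketch (synthetic data; not the book's
# workflow). Fit an exponential decline q(t) = qi * exp(-D * t), then
# bootstrap the residuals to get a distribution of decline rates D.
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(60)  # months on production
q = 1000.0 * np.exp(-0.03 * t) * (1.0 + rng.normal(0.0, 0.05, t.size))  # noisy rates

def fit_exponential_decline(time, rate):
    # Linear least-squares fit of ln(rate) = ln(qi) - D * time.
    slope, intercept = np.polyfit(time, np.log(rate), 1)
    return np.exp(intercept), -slope  # (qi, D)

qi_hat, d_hat = fit_exponential_decline(t, q)
residuals = np.log(q) - (np.log(qi_hat) - d_hat * t)

# Bootstrap: resample residuals, rebuild synthetic histories, refit D.
boot_d = []
for _ in range(1000):
    synthetic = np.exp(np.log(qi_hat) - d_hat * t + rng.choice(residuals, t.size))
    boot_d.append(fit_exponential_decline(t, synthetic)[1])

low, high = np.percentile(boot_d, [5, 95])
print(f"D = {d_hat:.4f} per month, 90% bootstrap interval: [{low:.4f}, {high:.4f}]")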

      Given the tremendous success in the development of and production from shale reservoirs over the last two decades, Part 5 is dedicated to unconventional reservoirs. The chapters in Part 5 include: (1) Optimization of Gas-Drilling in Unconventional Tight-Sand Reservoirs, (2) Predicting the Fluid Temperature Profile in Drilling Gas Hydrates Reservoirs,
