A Review of Integrated Description and Evaluation of Reservoirs based on Seismic, Core and Well Log Data – Case Study of Bornu Basin

DOI: 10.17577/IJERTV8IS050003


*Abubakar Bello, Aliyu D. Bida, Mukhtar Habib
Department of Mineral & Petroleum Resources Engineering, Kaduna Polytechnic, Kaduna, Nigeria

Abstract:- Reservoir description and evaluation are very important to reservoir engineers, and they can be achieved when data coming from different disciplines are combined to generate a model that is representative of the reservoir being studied and that can be used to define the most viable development strategy for the field from both an economic and a technical outlook. The integration of seismic surveys, geology, core sample analysis and well logs is truly essential, but it requires specific approaches and procedures for generating and calibrating a reservoir model capable of supporting all phases of oil and gas upstream operations. The results obtained from this approach have shown great advantages in terms of improved model quality and flexibility, reduced working time and the generation of a single final model that can be adapted to describe and evaluate the reservoir. Thus, an integrated approach is necessary for reservoir modeling purposes, and it is beneficial for describing the nature of a reservoir as well as for evaluating other reservoirs with similar properties.

Key words: Description, Evaluation, Reservoir Studies, Dynamic Behavior, Static Model, Dynamic Simulations, Petrophysical Parameters, Hydrocarbon Reservoirs.

INTRODUCTION

The goal of a reservoir study is to understand and describe the dynamic behavior of a hydrocarbon reservoir by properly integrating all the available geological, geophysical, petrophysical and engineering information so as to predict the future performance of the system under different development and production strategies. For that purpose, it is common practice to rely on a reservoir model that can handle and process a large amount of data. This model is generated to accurately reproduce the structural and petrophysical properties of the hydrocarbon-bearing formation and to describe the fluid dynamics taking place within the reservoir. Ideally, the same model should be further extended to account for the rock mechanical properties, so as to calculate the stresses and deformations induced by operating the reservoir. In this way, all relevant aspects (static and dynamic) would be incorporated into one comprehensive model, by which not only single phenomena but also their mutual interactions, as they occur in the reservoir, could be investigated for forecasting purposes and economic evaluations. However, a typical reservoir study can be very complex because it requires the integration of several disciplines, each having a different perspective, each governed by different sets of equations and parameters, and often focused on a different problem scale. Furthermore, a subsurface body can only be characterized indirectly (e.g., through seismic methods), meaning that no direct observation or measurement of the whole reservoir is possible, or by direct investigation of very limited portions of it (e.g., at exploration wells).

Fig.1: Traditional Approach for Reservoir Modeling (Modified from Cosentino, 2001)

As a consequence, uncertainties, often related to the structural and geological complexity of the upper layers of the Earth's crust where reservoirs are found, cannot be eliminated and need to be dealt with. Historically, the workflow followed for many years by geoscientists and engineers looked very similar to that presented in Fig. 1. Basically, each subject in a reservoir study was managed independently. The results were handed from one specialist to the other without any active interconnection or systematic information exchange. Each discipline involved in the construction of the reservoir model had to provide data with the highest possible accuracy in order to minimize the overall uncertainties. This work process was based on the conviction that if the results provided by each discipline were accurate, the uncertainty affecting the final model would be reduced. However, this approach showed several limitations, especially when inconsistencies arose during data processing and interpretation. In these situations, a thorough and consistent re-evaluation of all model parameters was required. In recent years, a generally improved awareness of environmental issues and the need to enhance recovery from a large number of oil and gas fields around the world have demanded a new reservoir management practice. At the same time, significant advances in technology and computer science have been achieved, potentially allowing data sharing and a facilitated transfer of hard and soft information among different disciplines. Thus, the market was prompted to provide highly sophisticated tools for studying and simulating the behavior of hydrocarbon reservoirs.

The need for more accurate modeling, with a higher level of detail so as to capture most of the reservoir geological and geomechanical features and to describe the complex interactions among rocks, fluids and wells, is currently leading to the creation of software packages that incorporate all the subsurface disciplines and provide a common project environment for petroleum geoscientists and engineers. In this approach, subsequent adjustments to maintain a coherent reservoir representation and modeling are eased by the fact that all specialists have access to updated data and results in real time (Rocca, V. and F. Verga, 2008).

The interaction between the various specialists involved in the construction of a reservoir model can produce significant changes to the final model, depending on the scale of the problem they are looking at. For example, a new definition of the geological structure can heavily affect the whole reservoir model, while a re-evaluation of the porosity of a single facies would influence only the fluid amount or distribution in the model. The understanding and modeling of coupled phenomena also provide new insights into the system behavior. When fluid dynamic and geomechanical issues are solved together, the deformations induced by pressure depletion due to production and, in turn, the impact of rock compaction on fluid flow are accounted for. This implies that the model is more sophisticated and that the relevant parameters must be defined and calibrated accordingly. Overall, the recognition of the importance of the reciprocal influences among different disciplines and the progressively enhanced ability to actually implement integration has led, in time, to a substantially new approach to reservoir modeling. This advanced workflow can truly provide better-quality reservoir studies, but it also demands improved competences and skills.

Model Descriptions

  1. Description of the Reservoir Modeling Workflow: The construction of a complete reservoir model requires the integration and coupling of two basic models, each one describing specific reservoir characteristics in detail, namely the static and the dynamic model.

  2. Static Reservoir Modeling: The static model of a reservoir can be considered the final product of the structural, stratigraphic and lithological modeling activities. Each of the above modeling parts can be developed according to its own workflow, but a deep integration among them is necessary in order to generate a calibrated static model.

    The workflow for setting up the static model always begins with the creation of a structural model, which includes all the geophysical, geological and well information needed to reproduce the top and bottom maps of the reservoir layers and to identify the presence of faults, if any. First, the available geophysical data are imported and quality checked. Usually, 2D seismic sections that cover the portion of the subsoil where the reservoir is located are available; however, modern acquisition techniques nowadays provide 3D high-resolution seismic datasets. If coupled with a good sedimentological understanding of the area, they permit identification of the geological trends and extraction of a large variety of seismically derived lithological and petrophysical properties. The seismic data are most commonly expressed in Two-Way Travel Time (TWT) of the seismic rays from the seismic datum, which is usually the sea level, to the subsurface formations. Then, all the features derived from the seismic datasets are converted from the time domain to the depth domain by using an appropriate velocity model. Based on the interpretation of the seismic acquisitions in combination with well log data, the surfaces that correspond to the tops and bottoms of the reservoir layers can be defined and the construction of the structural model begins. When analyzing the seismic data, faults are also recognized and mapped, to be used at a later time during the construction of the model grid.
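As an illustration of the time-to-depth step, the short sketch below converts a picked horizon time (TWT) to depth through a simple layered interval-velocity model; the velocities, layer boundaries and picked time are hypothetical values, not taken from the case study.

```python
# Minimal sketch of time-to-depth conversion for an interpreted horizon,
# assuming a simple layer-cake interval-velocity model (hypothetical values).

def twt_to_depth(twt_ms, layers):
    """Convert a two-way travel time (ms) below the seismic datum to depth (m).

    layers: list of (layer_base_twt_ms, interval_velocity_m_per_s) tuples,
            ordered from the datum downwards.
    """
    t_target = twt_ms / 2000.0   # one-way time in seconds
    depth, t_top = 0.0, 0.0
    for base_twt_ms, v in layers:
        t_base = base_twt_ms / 2000.0
        if t_target <= t_base:
            return depth + v * (t_target - t_top)
        depth += v * (t_base - t_top)
        t_top = t_base
    return depth + v * (t_target - t_top)   # extrapolate with the deepest velocity

# Hypothetical velocity model and picked top-reservoir time
velocity_model = [(800.0, 1800.0), (1600.0, 2400.0), (2600.0, 3200.0)]
print(twt_to_depth(1450.0, velocity_model))   # -> 1500.0 m below the datum
```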

    The most important part in the construction of a structural model is perhaps the fault modeling process. The role of the faults in the compartmentalization of the field, and the accuracy with which faults are mapped in the model, can have a direct impact on the way fluids can flow through the porous medium. Hence, they can severely affect the results of the dynamic simulations that are used to define the production strategy of the field. In order to achieve the highest possible accuracy, all available seismic and well data should be combined. Discontinuities of the seismic signal can be interpreted as faults, but only well data can provide direct evidence that a fault has been intercepted by the well.

    Stratigraphic modeling is the part of the workflow that deals with the description of the internal structure of the model. Zones and layers that best describe the different levels of the reservoir are defined. Modern stratigraphic interpretation also takes the principles of sequence stratigraphy into account. Sequence stratigraphy helps in identifying and predicting the geometries of the various geological bodies based on the sea-level changes that cause the deposition of different sedimentation patterns. The stratigraphic correlations are then migrated into a 3D static model as a series of units (beds) with a varying areal continuity throughout the field. The continuity of the sedimentary bodies is a key issue because it will eventually control the flow patterns when modeling the dynamic reservoir behavior.

    In a typical numerical reservoir modeling approach, the volume of interest (i.e., the reservoir) is divided into elements called blocks (or cells). Each block is assigned values of the local petrophysical properties, obtained from the geological and geophysical studies. In a static model the grid is generally Cartesian, thus the cells should all have a regular shape. The block dimensions are usually small in the horizontal plane (the side can be some 20 to 50 meters) so as to allow an accurate description of the structural and geological features. The vertical discretization is imposed so as to honor the stratigraphic sequence encountered at the wells, but it can be further refined where needed or if the reservoir comprises one or more thick stratigraphic units.
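The grid construction described above can be sketched as follows; the areal dimensions, block sizes and layer thicknesses are purely illustrative assumptions.

```python
import numpy as np

# Minimal sketch of a regular Cartesian, block-centered grid for a static model.
# All dimensions are hypothetical.
nx, ny = 100, 80                        # number of blocks areally
dx, dy = 50.0, 50.0                     # horizontal block sides (m), typically 20-50 m
layer_thickness = [4.0, 6.0, 3.0, 8.0]  # vertical discretization honoring well tops (m)
nz = len(layer_thickness)

x_centres = (np.arange(nx) + 0.5) * dx  # block-centre x coordinates
y_centres = (np.arange(ny) + 0.5) * dy  # block-centre y coordinates
layer_tops = np.concatenate(([0.0], np.cumsum(layer_thickness)[:-1]))

# One value per block, to be filled later by facies and petrophysical modeling
porosity = np.full((nx, ny, nz), np.nan)
print(porosity.shape, layer_tops)
```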

    The last part of the static modeling workflow is the assignment of appropriate lithological (facies modeling) and petrophysical properties to each block. Facies can be described as lithological units that include a series of geological characteristics, and they can be considered an elementary building block of the reservoir model. During facies modeling, the grid cells are classified into a usually limited number of facies that can subsequently be used for the tuning of the static model. The facies distribution can be performed using a variety of statistical approaches; with these, the attempt is made to rely on objective (or less subjective) rules to distribute the information recorded at the wells throughout the entire reservoir.

    Petrophysical modeling consists in assigning the petrophysical parameters to the model grid blocks. Fluid saturations and porosity are the most important parameters that control the amount of hydrocarbons stored inside the reservoir; permeability dictates the ease with which they can flow and thus eventually be produced. The values of the petrophysical parameters usually derive from well and core data, but their distribution in the model is controlled by deterministic or statistical methods. Possible facies distributions created using a stochastic approach are presented in Fig. 2. In recent decades, geostatistics has become a valuable tool with which the areal distribution of the petrophysical properties can be generated in a statistically and geologically representative manner.

    Fig. 2: Example of Three Realizations of a Fully Stochastic Facies Distribution in a Static Model.
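As a very simplified illustration of how stochastic realizations such as those in Fig. 2 can be generated, the sketch below draws facies codes cell by cell from global facies proportions observed at the wells (hypothetical values); actual workflows additionally honor well locations and spatial correlation, e.g., through variogram-based sequential indicator simulation.

```python
import numpy as np

# Minimal sketch: equiprobable facies realizations drawn from global facies
# proportions derived from well-log classification (hypothetical values).
rng = np.random.default_rng(seed=42)

facies_codes = np.array([0, 1, 2])                  # e.g. 0=shale, 1=sand, 2=silt
facies_proportions = np.array([0.5, 0.35, 0.15])    # assumed well-derived proportions

nx, ny, nz = 100, 80, 4
n_realizations = 3

realizations = [
    rng.choice(facies_codes, size=(nx, ny, nz), p=facies_proportions)
    for _ in range(n_realizations)
]

# Each realization honors the global proportions but differs cell by cell,
# which is what allows the uncertainty in the facies distribution to be explored.
for i, r in enumerate(realizations):
    print(f"realization {i}: sand fraction = {(r == 1).mean():.3f}")
```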

    Usually, geological models are constructed using very fine grids that are not suitable for dynamic simulation purposes. Although the more detailed the model, the more accurate the description of fluid flow phenomena, a good balance between accuracy and speed of computation is generally sought. Therefore, a coarser grid is generated from the static model and exported to the dynamic model. This implies that all the properties in the grid need to be upscaled, and the problem arises of how the properties of very large grids (e.g., millions of cells) should be transposed to much smaller grids (tens or hundreds of thousands of cells). A number of analytical and numerical techniques have been proposed to calculate an average value used to populate the cells of the simulation grid (Christie, 1996; Carlson, 2003). Depending on the petrophysical parameter (e.g., porosity or permeability) that needs to be upscaled, a different approach should be used. The selection of the most adequate upscaling method mainly depends on the variance and distribution of the property values and is of crucial importance, since all the simulation results are obviously affected by the characteristics of the final reservoir model. Sensitivity studies are highly recommended in order to evaluate the most suitable upscaling procedure for the case under investigation (Cosentino, 2001).
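The averaging options mentioned above can be illustrated with a minimal sketch of the three classical averages often considered when upscaling fine-grid permeability into one coarse simulation block (hypothetical values in millidarcy).

```python
import numpy as np

# Minimal sketch: candidate averages for upscaling fine-grid permeability
# values into one coarse simulation block (hypothetical values, mD).
fine_perm = np.array([120.0, 85.0, 300.0, 15.0, 60.0, 210.0])

arithmetic = fine_perm.mean()                          # upper bound, flow parallel to layers
harmonic = len(fine_perm) / np.sum(1.0 / fine_perm)    # lower bound, flow across layers
geometric = np.exp(np.log(fine_perm).mean())           # often used for random heterogeneity

print(f"arithmetic = {arithmetic:.1f} mD")
print(f"harmonic   = {harmonic:.1f} mD")
print(f"geometric  = {geometric:.1f} mD")
```

Porosity, being a volumetric quantity, is normally upscaled with a volume-weighted arithmetic average, whereas permeability is direction dependent: the arithmetic average suits flow parallel to the layering, the harmonic average suits flow across it, and flow-based numerical upscaling is often preferred for strongly heterogeneous cases.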

  3. Dynamic Reservoir Modeling: The objective of dynamic reservoir modeling is to build a numerical model able to simulate the dynamic behavior of a given hydrocarbon reservoir. The purposes of the model, once built and calibrated, are various: to estimate system parameters, to forecast the field productivity under different development scenarios and to learn more about specific phenomena.

    Among the different techniques available in the market to study hydrocarbon reservoirs, 3D numerical modeling represents one of the most widespread and powerful approaches for reservoir simulation in the petroleum industry. As previously discussed, the model grid constitutes the geometrical discretization of the reservoir and is built on the basis of the structural maps (top, bottom, shape, thickness). The model blocks are then connected through flow equations describing the fluid flow mechanisms.
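A minimal sketch of how two neighboring blocks are "connected through flow equations" is given below: single-phase Darcy flow across the shared face, with a transmissibility obtained from the two half-block resistances in series. SI units are used, no simulator unit-conversion constants are included, and all block properties are hypothetical.

```python
# Minimal sketch: single-phase flow rate between two neighboring grid blocks,
# as used (conceptually) in finite-difference simulators. SI units, hypothetical data.

def interblock_transmissibility(area, dx1, k1, dx2, k2):
    """Transmissibility (m^3) between two blocks sharing a face of the given area."""
    return area / (dx1 / (2.0 * k1) + dx2 / (2.0 * k2))

def interblock_rate(area, dx1, k1, dx2, k2, mu, p1, p2):
    """Volumetric rate (m^3/s) from block 1 to block 2, pressures in Pa, viscosity in Pa.s."""
    t = interblock_transmissibility(area, dx1, k1, dx2, k2)
    return t / mu * (p1 - p2)

mD = 9.869e-16                              # 1 millidarcy in m^2
q = interblock_rate(area=50.0 * 4.0,        # face area: 50 m x 4 m
                    dx1=50.0, k1=150 * mD,
                    dx2=50.0, k2=80 * mD,
                    mu=1e-3,                # 1 cP in Pa.s
                    p1=25.0e6, p2=24.8e6)   # 200 kPa pressure difference
print(f"q = {q * 86400:.1f} m^3/day")
```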

    Fig. 3: Schematic Representation of the Dynamic Modeling Workflow

    The components of a dynamic reservoir modeling workflow are the static model, the PVT data of the fluids, the rock-fluid interaction properties, the equilibration data (i.e., initial conditions), the well data and the production history. A schematic of the reservoir dynamic modeling workflow is displayed in Fig. 3.

    The basic workflow consists of five distinct steps:

    1. Data acquisition;

    2. Model design;

    3. Initialization;

    4. History matching; and

    5. Forecast.

The first step of the workflow is data acquisition, i.e., the gathering of the available data and the quality control of each piece of information. The design of a simulation model is influenced by the type of process to be modeled, the complexity of the fluid-mechanics problem, the objectives of the study, the quality of the reservoir data, and the time and budget constraints. The most common simulators are immiscible black-oil programs; the simulation of more complex processes requires the use of special-purpose simulators, often supported by peripheral programs (Mattax and Dalton, 1990).

The initialization phase consists in assigning the initial saturation and pressure distributions and in double-checking the hydrocarbon volumetric evaluations performed with the static model and through material balance techniques. In the history matching phase the model is calibrated against the available measured pressure and production data by modifying the input parameters. Once the model is properly calibrated, productivity and recovery forecasts are performed for different field development scenarios.
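The volumetric cross-check performed at initialization can be sketched with the standard volumetric expression for the stock-tank oil initially in place; all input figures below are hypothetical.

```python
# Minimal sketch: volumetric estimate of stock-tank oil initially in place,
# the kind of figure cross-checked between the static model and the simulator
# initialization. All inputs are hypothetical.

def stoiip_stock_tank_m3(grv_m3, ntg, porosity, sw, bo):
    """STOIIP = GRV * N/G * phi * (1 - Sw) / Bo  (stock-tank m^3)."""
    return grv_m3 * ntg * porosity * (1.0 - sw) / bo

grv = 250.0e6   # gross rock volume, m^3
ntg = 0.75      # net-to-gross ratio
phi = 0.22      # average porosity
sw  = 0.30      # average water saturation
bo  = 1.25      # oil formation volume factor, rm3/sm3

n = stoiip_stock_tank_m3(grv, ntg, phi, sw, bo)
print(f"STOIIP ~ {n / 1e6:.1f} million sm3 ({n * 6.2898 / 1e6:.0f} MMstb)")
```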

The main input data for a dynamic reservoir model come from different sources. Well logs typically provide porosity and water saturation values along the well trajectory, while RFTs and MDTs measure the formation pressure profile versus depth, which is crucial for initializing the model. Laboratory routine analyses on cores can provide information about horizontal and vertical permeabilities; special core analyses are performed to obtain capillary pressures and relative permeability curves. Fluid samples are collected and analyzed in laboratories to obtain the PVT fluid properties. Well testing is a common and powerful tool to obtain reliable estimates of the well productivity, of the permeability of the formation and of possible heterogeneities within the test drainage area. The principal input parameters of a dynamic reservoir model can be classified according to the following scheme (a schematic grouping of these inputs in code is sketched after the list):

  1. Petrophysical data: absolute/relative permeability, porosity, water saturation, net-to-gross ratio, capillary pressure;

  2. PVT data: oil properties (density, formation volume factor, gas-oil solution ratio, viscosity, saturation pressure), gas properties (gas gravity, compressibility factor, formation volume factor, viscosity) and water properties (density, formation volume factor, viscosity, compressibility);

  3. Reservoir data: depth of the fluid contacts, initial pressure at a given depth (datum), temperature and aquifer parameters;

  4. Production data: production/injection fluid rates, bottom hole and tubing head flowing pressure measurements, static bottom hole pressure values;

  5. Completion data: well productivity and injectivity index, wellbore diameter, skin factor (i.e., permeability reduction in the near wellbore due to drilling and completion mud invasion);

  6. Well and/or field constraints: target (maximum) production/injection rates, maximum water rate, maximum gas-oil ratio, minimum flowing bottom hole and minimum tubing head pressure;

  7. Economic requirements: minimum oil and gas production rates, maximum water production rate.
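The grouping sketched below mirrors the classification above as a simple set of data containers; the field names, units and structure are illustrative assumptions and far from exhaustive.

```python
from dataclasses import dataclass, field
from typing import Dict, List

# Minimal sketch: grouping dynamic-model input data along the classification above.
# All field names and units are hypothetical simplifications.

@dataclass
class PetrophysicalData:
    porosity: float
    water_saturation: float
    net_to_gross: float
    permeability_md: float

@dataclass
class PVTData:
    oil_density_kg_m3: float
    oil_fvf: float                  # formation volume factor, rm3/sm3
    solution_gor: float             # sm3/sm3
    oil_viscosity_cp: float
    saturation_pressure_bar: float

@dataclass
class ReservoirData:
    datum_depth_m: float
    initial_pressure_bar: float
    temperature_c: float
    owc_depth_m: float              # oil-water contact depth

@dataclass
class WellConstraints:
    max_oil_rate_sm3_d: float
    max_gor: float
    min_bhp_bar: float

@dataclass
class DynamicModelInputs:
    petrophysics: PetrophysicalData
    pvt: PVTData
    reservoir: ReservoirData
    constraints: Dict[str, WellConstraints] = field(default_factory=dict)
    production_history: List[dict] = field(default_factory=list)
```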

MODEL CALIBRATIONS

  1. Static stand-alone calibration: During the construction of the static model, as new data are progressively added, it is necessary to re-calibrate the static model before it becomes ready for simulation purposes. Data from a new seismic section or from a new well can lead to the reconstruction of the structural model or to a re-evaluation of the petrophysical parameters and their distribution in the geological grid. The high complexity and heterogeneity that characterize the majority of reservoirs demand the incorporation of all available information. Fault modeling is one of the parts of the geological workflow that most frequently gets re-assessed when new seismic or well data (e.g., new well tops) are acquired. Faults contribute to the compartmentalization of the field and, hence, strongly affect the fluid flow in the reservoir, but they also have an impact on the shape of the cells, potentially leading to grid anomalies (e.g., spikes, wedges). The combination of new seismic and well data can lead to modifications of the geometry of the geological zones from which the reservoir is produced. This can have an immediate effect on the hydrocarbon volume, to which in turn the economics of the field are directly connected. Since the geological model and property distribution define the basic skeleton on which the dynamic and the geomechanical models are then built, it is very important that the static model be well calibrated and coherent with all the available data.

  2. Dynamic stand-alone calibration: History matching is a complex procedure, strongly dependent on the quality and amount of available data, the particular reservoir being studied, the resources allocated to the project and, eventually, the experience and personal attitude of the engineers working on the model (Cosentino, 2001). Several limitations and critical factors are typically associated with the history matching process, the most important one being the non-uniqueness of the solution. A second critical aspect of the process is the iterative nature of the history match. In a typical reservoir study the history match requires the modification of several parameters of a completely different nature. Normally, some parameters are related to the static-geological modeling, whereas other parameters are dynamic. In the dynamic modeling process all modifications should be shared with the other professionals of the group, in order to ensure consistency. Uncontrolled adjustments of model parameters can easily and quickly render useless the efforts of the whole team.

    Typically, the simplest and most traditional approach to history matching is the stand-alone calibration of the dynamic model. The structural and geological model is generated independently and beforehand. Once a properly defined static model has been set up, it is employed to define a dynamic model, which is subsequently modified and calibrated by acting on some parameters. In this kind of approach, the static model is usually not modified, let alone challenged, and all the adjustments are performed in the dynamic modeling environment.

    Even if it is not possible to define a standard procedure for the history matching process, some general steps can be identified. The first stage in a calibration process is to define the critical parameters (i.e., those affected by a high degree of uncertainty) and the key wells (i.e., wells with typical production behavior and a long production history) to be tuned in order to obtain a satisfactory match.

    Two steps are crucial in the calibration process: the pressure match and the saturation match. The pressure match requires the calibration of the global energy balance in the reservoir. The global pressure level of the field is first adjusted by modifying the pore volumes occupied by the different fluids (oil, gas and aquifer), the formation compressibility and the permeability on a field scale. In a second stage the individual well behavior is matched through local variations of the same parameters. Permeability is generally the principal reservoir variable to modify in order to improve the pressure match. Saturation history matching is usually carried out after the pressure match, with the aim of reproducing the reservoir fluid distribution, both in terms of the arrival time of water/gas at the wells and of the evolution of the associated production profiles after breakthrough. Again, the match should be focused first on the adjustment of the global field performance and then on the behavior of the individual wells. Relative permeability curves represent the key matching factor in this stage of the history matching process.
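A quantity often monitored while iterating on the pressure match is a simple mismatch measure between observed and simulated pressures per key well; the sketch below uses a root-mean-square error with hypothetical observed and simulated values.

```python
import numpy as np

# Minimal sketch: root-mean-square pressure mismatch per well, the kind of
# objective quantity monitored (or minimized) during history matching.
# Observed and simulated values are hypothetical.

def rms_mismatch(observed, simulated):
    observed, simulated = np.asarray(observed), np.asarray(simulated)
    return float(np.sqrt(np.mean((observed - simulated) ** 2)))

observed_p = {                        # static bottom-hole pressures, bar
    "WELL-1": [320.0, 305.0, 292.0, 281.0],
    "WELL-2": [318.0, 300.0, 288.0, 279.0],
}
simulated_p = {
    "WELL-1": [322.0, 309.0, 290.0, 284.0],
    "WELL-2": [315.0, 302.0, 291.0, 275.0],
}

for well in observed_p:
    err = rms_mismatch(observed_p[well], simulated_p[well])
    print(f"{well}: RMS pressure mismatch = {err:.1f} bar")

# A global misfit (e.g., a weighted sum over the key wells) is reduced iteratively
# by adjusting permeability, pore volumes, aquifer strength, etc.
```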

    The history matching phase can be considered successful when the model is able to reproduce the historical dynamic behavior of the reservoir. It is not crucial to obtain a very accurate match for every single well; it is important that the model is able to capture the main production mechanisms governing the field behavior, so that the model can be effectively employed for its real purpose, i.e., the development and production forecast scenarios.

    DISCUSSIONS

    Integrated Calibration (static-dynamic models): A major limitation of the stand-alone approach to reservoir modeling is the limited exchange of information among the different specialists (typically geologists and reservoir engineers) involved in the reservoir study. The exchange can be truly effective and advantageous only if the different phases of the study are fully integrated and if the activities are performed in parallel with proper timing.

    Traditionally, the static modeling in a reservoir study is performed separately by a group of geologists and simulation experts. The modeling workflow ends with the computation of the fluids initially in place. All subsequent modifications performed in the dynamic modeling phase are rarely integrated into the original static modeling. A more effective approach, which has recently been adopted in many reservoir studies, is to continue the static modeling phase, and the exchange of information, throughout the whole reservoir modeling process. Following this approach, the results of the dynamic analysis and modeling can be directly employed in the static modeling phase in order to better constrain the workflow and obtain more reliable results. The static and dynamic modeling phases thus represent two distinct but closely interrelated parts of the whole process and can be considered concluded only when the integrated model has been fully reviewed and reconciled.

    The integrated process can be time consuming and requires openness to modifications at all levels of the workflow: for instance, during the process it may be necessary to change the structure of the reservoir or the formation layering, requiring a re-processing of most steps of the static and dynamic modeling. Notwithstanding these disadvantages and drawbacks, an integrated workflow allows a significantly improved picture of the reservoir to be obtained, all the data to be better handled and, above all, a high level of consistency between the different phases of the study to be ensured. In order to ensure the success of an integrated approach, the traditional sequential planning of the activities must be replaced by an integrated, parallel planning (Saleri, 1993), which allows the time frames associated with the various contributing activities to overlap, facilitates exchange and integration, and reduces potential delays (every actor remains active for the whole duration of the process).

    CONCLUSION

    This review has demonstrated the need for an integrated approach in the construction of hydrocarbon reservoir models. A truly integrated workflow leads to an overall improvement of the reservoir model from both the static and the dynamic point of view. The updates, revisions and modifications proposed at each step, including the progressive adjustment of the model parameters in the calibration phase, are shared among the different specialists, and coherency is inherently ensured.

    During dynamic modeling, the engineers can provide the geologists with valuable information about the hydraulic connectivity among the geological bodies or through faults intersecting the reservoir. They can also offer feedback on the petrophysical parameters and their distributions based on the calibration of the global energy balance of the field, as the global pressure level is adjusted by modifying the pore volumes occupied by the different fluids.

    ACKNOWLEDGMENTS

    We are grateful to the Tertiary Education Trust Fund (TETFUND) and the management of Kaduna Polytechnic, as this work is part of its sponsored projects through a research grant. We are equally indebted to the Department of Petroleum Resources (DPR) for granting permission and right of entry to Integrated Data Services Limited (IDSL), a subsidiary of the Nigerian National Petroleum Corporation (NNPC), for the provision of the data.

    REFERENCES:

    1. Cosentino, L., 2001. Integrated Reservoir Studies. Editions Technip, ISBN: 2-7108-0797-1, pp: 310.

    2. Mattax, C.C. and R.L. Dalton, 1990. Reservoir Simulation. Henry L. Doherty Memorial Fund of AIME, SPE.

    3. Rocca, V., 2009. Development of a fully coupled approach for evaluation of wellbore stability in hydrocarbon reservoirs. Am. J. Environ. Sci., 5: 781-790. ISSN: 1553-345X.

    4. Rocca, V. and F. Verga, 2008. Stability modelling applied to wellbore design. Geam. 3: 5-12. ISSN: 1121-9041

    5. Saleri, N.G., 1993. Reservoir performance forecasting: acceleration by parallel planning. JPT, 45: 652-657.
