Come to booth #600 to learn more about our Microseismic Monitoring Solutions and subscriber arrays in Canada.
Meet our Science and Development Team
Book a 1-on-1 presentation with Dario Baturan or Sepideh Karimi
- Subscriber Array
- Machine Learning
- Risk Management
- Induced Seismicity Management Dashboard demo
Abstracts for Oral Presentations
Performance evaluation of a real-time induced seismicity management tool
Monday, May 13, 9:25-9:50 AM
Induced Seismicity – Modeling and Case Studies
Minimizing the risk of seismic activity induced by hydraulic fracturing is a high priority for oil and gas operations. The majority of the regulatory traffic light protocols introduced to date are reactive and based on staged magnitude thresholds. Operators are required to establish operational protocols designed to minimize the likelihood of large magnitude events and are in some instances mandated to deploy high-resolution seismic monitoring arrays. The ultimate goal of these seismic networks and their data products, beyond simple regulatory compliance, is to provide operators with a near real-time measure of induced seismicity (IS) risk and an indication of the effectiveness of the implemented mitigation protocols. One such approach uses high-resolution seismic data products to derive maximum magnitude (Mmax) and seismicity forecasting models in near real-time, allowing operational parameters to be adjusted to reduce the probability of a felt or damaging event.
In this study, we present the learnings from a practical implementation of a model-based risk management system using three forecasting methodologies for hydraulic fracture operations. The ability of this system to predict seismicity is validated via real-time monitoring and playback on over 50 diverse datasets. The results show that the estimated seismicity agrees well with observed seismicity in the majority of cases, that multiple models produce very similar results, and that injected volume has limited impact on seismicity forecasts. The study also highlights the limitations of this approach when a large event occurs early in the sequence. One of the most important takeaways is the impact that seismic data quality has on system performance: high-quality data recorded by a local array, combined with advanced processing techniques designed to generate “research grade” seismic catalogues automatically in near real-time, is a key requirement. This development also serves as an excellent example of collaboration between industry (data acquisition and array deployment), academia (model development), and service providers (data processing advancements and implementation) to understand and manage the induced seismicity phenomenon.
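The abstract does not disclose the internals of the three forecasting methodologies. As a hedged illustration only of the general idea behind magnitude forecasting from a live catalogue, the sketch below extrapolates a Gutenberg-Richter magnitude-frequency relation to the probability of exceeding a felt-magnitude threshold; the function name, parameter values, and Poisson occurrence assumption are our own and are not the models evaluated in the study.

```python
import math

def prob_exceeding(n_events, mc, b, m_target):
    """Gutenberg-Richter extrapolation (illustrative, not the study's models).

    Given n_events observed at or above the catalogue completeness
    magnitude mc, and a b-value fit to that catalogue, the expected
    number of events with magnitude >= m_target is
        N(m_target) = n_events * 10 ** (-b * (m_target - mc)).
    Assuming Poisson occurrence, the exceedance probability is
    1 - exp(-N).
    """
    expected = n_events * 10 ** (-b * (m_target - mc))
    return 1.0 - math.exp(-expected)

# Hypothetical example: 400 events above Mc = 0.0 with b = 1.0;
# probability of at least one M >= 3.0 event under these assumptions.
p = prob_exceeding(400, 0.0, 1.0, 3.0)
```

Updating such an estimate as the real-time catalogue grows is one way a dashboard can turn raw event counts into an evolving risk indicator.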
Semblance-weighted stack for improved microseismic monitoring
Wednesday, May 15, 9:25-9:50 AM
Telus 101 - 102
Surface microseismic monitoring can be challenging due to both strong surface noise and the weak signals generated by hydraulic fractures. It has therefore become common practice to deploy very large numbers of geophones to enable detection of very small events, primarily through stacking techniques. The cost of acquiring and processing such large datasets can be prohibitive, however, and may not be the most practical design for all monitoring applications. While more sensors theoretically improve the signal-to-noise ratio by √N, where N is the number of measurements, this assumes prior removal of coherent noise sources. Large, densely sampled patches allow coherent surface-wave noise to be removed with an F-K filter. We demonstrate that semblance-weighted stacking yields similar noise attenuation and improves the signal-to-noise ratio of arrivals. We also show that array design impacts the signal-to-noise ratio of arrivals and that a hexagonal array provides greater uplift than linear arrays for an equal channel count.
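The abstract does not give the authors' implementation, but the principle of a semblance-weighted stack can be sketched in a few lines: the linear stack suppresses incoherent noise by roughly √N, and multiplying it by sliding-window semblance (which approaches 1 for coherent arrivals and 1/N for incoherent noise) suppresses it further. The function below is a minimal NumPy sketch under those assumptions, on move-out-corrected synthetic traces; it is not the processing flow presented in the talk.

```python
import numpy as np

def semblance_weighted_stack(traces, win=25):
    """Weight the linear stack by sliding-window semblance.

    traces : (n_receivers, n_samples) array of move-out-corrected traces.
    Semblance = (sum of traces)^2 / (N * sum of squared traces),
    smoothed over a window of `win` samples; it is near 1 where the
    traces are coherent and near 1/N where they are not.
    """
    n, _ = traces.shape
    stack = traces.sum(axis=0)                                   # linear stack
    num = np.convolve(stack ** 2, np.ones(win), mode="same")
    den = n * np.convolve((traces ** 2).sum(axis=0), np.ones(win), mode="same")
    semb = np.where(den > 0, num / den, 0.0)
    return semb * stack / n

# Synthetic check: one coherent wavelet buried in incoherent noise.
rng = np.random.default_rng(0)
t = np.arange(500)
wavelet = np.exp(-((t - 250) / 10.0) ** 2)                       # coherent arrival
traces = wavelet + 0.5 * rng.standard_normal((64, 500))

plain = traces.mean(axis=0)
weighted = semblance_weighted_stack(traces)
# Off-arrival noise is attenuated well beyond the plain stack,
# while the arrival near sample 250 survives.
```

The same weighting applies regardless of array geometry, which is what makes it a fair basis for comparing hexagonal and linear layouts at equal channel count.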