We look forward to seeing you in Vienna at the EGU annual meeting

Date: April 7–12, 2019

Come to our booth #54 to see the new solution that will revolutionize portable monitoring campaigns.

 

Pegasus is an intuitive and versatile system covering the full spectrum of portable applications, from long-term broadband to full-waveform, full-wavefield imaging. Pegasus opens up the next chapter in Large-N science.

Pegasus Digitizer

 

  • Lightweight, compact & modular
  • Flexible power options
  • Easy to deploy and maintain
  • Complete data collection, handling and management

 

Poster Presentations 

Initial Test Results for Trillium Slim Borehole 120, A New Small-Diameter High-Performance Seismometer
Presenting author: Bruce Townsend

Day: Wednesday, April 10th
Time: 16:15–18:00
Place: Hall X2

 

Pegasus, New Very Low Power 4 Channel Broadband Quality Digitizer for Dense Autonomous Research Arrays
Presenting author: Peter Devanney

Day: Tuesday, April 9th
Time: 14:00–15:45
Place: Hall X1

Oral Presentations 

Performance of a Real-Time Machine Learning Based Seismic Catalog Generator Over a One Year Period in Production
Presenting author: Sepideh Karimi

Day: Monday, April 8th
Time: 11:30–11:45
Place: Room L3

The application of real-time induced seismicity forecasting as a risk management system
Presenting author: Sepideh Karimi

Day: Thursday, April 11th
Time: 12:00–12:15
Place: Room L2

 

3D velocity model optimization for enhanced absolute event locations: Application to Duvernay Subscription Array in western Canada 
Presenting author: Sepideh Karimi

Day: Tuesday, April 9th
Time: 11:00–11:30
Place: Room 2.32

 

 

 

Abstracts

Initial Test Results for Trillium Slim Borehole 120, A New Small-Diameter High-Performance Seismometer

 

The new Trillium Slim Borehole 120 seismometer is a 104 mm diameter instrument with Trillium 120QA/PH-class performance and significantly improved SWaP (size, weight, and power). It is designed for smaller holes down to 4.5” (115 mm) diameter, in shallow or deep deployments, using a simple passive holelock or sand installation. The small diameter permits deployment in existing small boreholes and facilitates construction of new, lower-cost boreholes, minimizing disturbance of the surrounding rock and improving instrument coupling.

We present initial test results for this instrument in small- and large-diameter cased holes, direct burial, and side-by-side pier testing.

 

High performance with simpler logistics makes the Trillium Slim Borehole well suited to many applications, including new higher-density arrays for full waveform analysis and detection of earthquake gravity signals. In conclusion, we present a proposed array design for optimal measurement of earthquake gravity signals on a regional scale.

 

Pegasus, New Very Low Power 4 Channel Broadband Quality Digitizer for Dense Autonomous Research Arrays

 

Researchers needing high-resolution broadband waveforms have lacked station density because of limited system availability, logistical and capital costs, the time required to deploy stations, and the complexity of the systems. New posthole-type broadband sensors and direct-burial techniques have lowered the deployment time of denser broadband arrays while delivering observatory-grade data in most environments. The newest geophone-based seismic recording systems require far less logistics and are inexpensive, but they lack response below 1 Hz, are generally not deployed for the length of time a broadband station is, and generally cannot support a broadband sensor.

 

We present Pegasus, a new very low power and economical digitizer available from Nanometrics. The power required is less than 200 mW for recording three channels with a duty-cycled GNSS timing system. It is very small (<14.5 cm³), lightweight (<0.5 kg), and IP68-rated for immersion, with a robust enclosure for autonomous operations in all terrestrial environments.

 

Installation and servicing are simplified by applications and features that make harvesting the data fast, verified, and reliable, while field station operational review is completed quickly and with certainty. Coupled with Nanometrics' updated low-power version of the direct-burial Horizon broadband sensor, the total station power is less than 0.5 W. A fourth 24-bit channel can be used for a complementary high-resolution geophysical sensor such as an infrasound sensor, tiltmeter, or absolute pressure sensor. All metadata are automatically created on the digitizer, producing complete StationXML metadata compliant with FDSN standards.
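
As a rough illustration of what a sub-0.5 W station draw implies for autonomous deployments, the short calculation below estimates endurance for a hypothetical battery pack; the capacity, voltage, and derating figures are assumptions for the example, not Pegasus specifications.

    # Rough power-budget sketch for an autonomous broadband station.
    # All battery figures below are illustrative assumptions, not product specs.
    station_power_w = 0.5          # digitizer + low-power sensor (upper bound from the text)
    battery_capacity_ah = 100.0    # hypothetical 100 Ah battery pack
    battery_voltage_v = 12.0       # nominal pack voltage
    usable_fraction = 0.8          # derate for temperature and depth-of-discharge limits

    usable_energy_wh = battery_capacity_ah * battery_voltage_v * usable_fraction
    endurance_days = usable_energy_wh / station_power_w / 24.0
    print(f"Estimated endurance: {endurance_days:.0f} days")   # ~80 days under these assumptions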

 

Performance of a Real-Time Machine Learning Based Seismic Catalog Generator Over a One Year Period in Production

 

Accurate and complete seismic event catalogs generated in real time provide valuable insight into, among other things, induced seismic risk management and public safety strategies. However, due to the vast amount of seismic data being collected, the construction of such catalogs is traditionally labor intensive; hence, automated processes have been developed to reduce the manual workload involved in catalog production. Many machine-learning-oriented approaches have been proposed; however, their performance is commonly reviewed against a static seismic catalog. As machine learning algorithms can be prone to overfitting, the ability to generalize for use in a real-time system is critical.

 

In this study, we focus on the temporal stability of Feature Weighted Beamforming (FWB), which has been applied to over 15 networks over a one-year period in a production environment. Performance is measured by comparing the automatically generated catalog with the corresponding analyst-reviewed catalog. We present detailed results from an induced seismic monitoring array over the Duvernay Formation in western Canada (Duvernay Subscriber Array, DSA), as well as higher-level statistics from other seismic networks. Initial results from the DSA show that, compared to standard STA/LTA picking with subsequent association, FWB reduced the number of false positives by 75% without loss of sensitivity; it also reduced the average difference in event location between automatic and manually picked solutions by 82%. As with the DSA, for all networks that included a large variety of training data, FWB demonstrated consistent detection of all real seismic events compared to a sensitive STA/LTA picker and associator in terms of system sensitivity and location accuracy. Our investigation confirmed that the average difference between automated event locations output by FWB and the analyst-reviewed solutions is consistent over time. New clusters of seismic activity not seen during training are also correctly detected and located. We also discuss cautions for the use of FWB when provided a limited training data set.
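
As a minimal sketch of how such a catalog comparison can be scored (the matching tolerance, toy catalogs, and field layout below are assumptions, not the production FWB evaluation code), automatic detections can be matched to analyst-reviewed events in time and tallied for false positives and location differences:

    import math

    # Hypothetical catalogs: each event is (origin_time_s, x_km, y_km).
    automatic = [(10.0, 1.2, 3.4), (55.0, 2.0, 2.1), (400.0, 9.9, 9.9)]
    reviewed  = [(10.3, 1.0, 3.5), (54.6, 2.2, 2.0)]

    MATCH_WINDOW_S = 2.0  # assumed origin-time tolerance for declaring a match

    false_positives = 0
    location_errors_km = []
    for t_a, x_a, y_a in automatic:
        match = next((ev for ev in reviewed if abs(ev[0] - t_a) <= MATCH_WINDOW_S), None)
        if match is None:
            false_positives += 1
        else:
            location_errors_km.append(math.hypot(x_a - match[1], y_a - match[2]))

    mean_error = sum(location_errors_km) / len(location_errors_km)
    print(f"false positives: {false_positives}, mean epicentral difference: {mean_error:.2f} km")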

 

The application of real-time induced seismicity forecasting as a risk management system

 

Stress changes due to subsurface fluid injection in oil and gas operations may induce seismic activity. The size and distribution of such events are a function of local geology, in-situ stress conditions, and treatment parameters. Not all fluid injection operations exhibit seismicity, but when they do, it is vital to monitor the ongoing induced seismic activity to evaluate the invoked risk mitigation plans. The majority of regulatory traffic light protocols introduced to date are based on staged magnitude thresholds, which increases the need to estimate the largest magnitude event that may occur during operations. Forecasting the maximum magnitude in real time is therefore of significant interest to many operators, allowing them to proactively optimize and adjust their stimulation plans in a way that prevents regulatory shutdowns.
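
To illustrate the staged-threshold logic behind such traffic light protocols, a minimal decision rule is sketched below; the magnitude thresholds and actions are placeholders, not any particular regulator's values.

    def traffic_light(observed_max_magnitude: float,
                      yellow_threshold: float = 2.0,
                      red_threshold: float = 4.0) -> str:
        """Return the protocol stage for the largest event observed so far.

        Thresholds are illustrative placeholders; real protocols are set by regulators
        and typically also prescribe specific mitigation actions at each stage.
        """
        if observed_max_magnitude >= red_threshold:
            return "red: suspend injection"
        if observed_max_magnitude >= yellow_threshold:
            return "yellow: modify operations and increase monitoring"
        return "green: continue operations"

    print(traffic_light(2.7))  # -> "yellow: modify operations and increase monitoring"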

 

In this study, we discuss different published seismicity forecasting models and evaluate their performance by playing back 30+ diverse datasets acquired during hydraulic fracturing operations to simulate real-time monitoring conditions. We use three prediction models to estimate the maximum magnitude and one model to evaluate the number of events stronger than a given threshold magnitude. Our findings show that, in general, maximum magnitude estimates from the different models are nearly identical and in good agreement with the observed seismicity. We show that, over time, the forecasts lose their sensitivity to the injection volume. The study also highlights the limitations of this approach when a large event occurs in the early stages of a sequence. One of the most important takeaways is the impact that the quality of seismic data has on system performance. High-quality data recorded by a local array, combined with advanced processing techniques designed to generate “research grade” seismic catalogues automatically in near real time, is a key requirement.
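
As one concrete example of a published volume-based forecasting model (the abstract does not name the specific models evaluated), the McGarr (2014) bound limits the maximum seismic moment by the cumulative injected volume; the shear modulus and injection volume in the sketch below are illustrative values only.

    import math

    def mcgarr_max_magnitude(injected_volume_m3: float, shear_modulus_pa: float = 3.0e10) -> float:
        """Upper-bound moment magnitude from cumulative injected volume (McGarr, 2014).

        M0_max = G * dV, then Mw = (2/3) * (log10(M0) - 9.1).
        The 30 GPa shear modulus is a typical crustal value assumed for illustration.
        """
        m0_max = shear_modulus_pa * injected_volume_m3   # maximum seismic moment, N*m
        return (2.0 / 3.0) * (math.log10(m0_max) - 9.1)

    # Example: 20,000 m^3 of injected fluid (illustrative volume)
    print(f"Mw_max = {mcgarr_max_magnitude(2.0e4):.1f}")   # ~3.8 for these assumptions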

 

3D velocity model optimization for enhanced absolute event locations: Application to Duvernay Subscription Array in western Canada 

 

Absolute hypocentral locations form the basis for characterizing natural and induced seismicity. The accuracy of these locations plays a crucial role in subsequent hazard estimation, risk management, performance measurement, and operational optimization.

 

Apart from acquisition geometry and phase picking errors, velocity model errors are one of the main sources of event location inaccuracy. One should differentiate between precision and accuracy. Precision relates to error ellipsoids, origin times and depth uncertainties, while accuracy describes how close the located events are to the “actual” locations. It should be noted that on regional-scale networks with numerous stations where higher inaccuracies are expected, event locations are relatively less sensitive to unmodeled heterogeneities in the assumed velocity model. In contrast, on local arrays, commonly used for induced seismic monitoring, these uncertainties can significantly affect locations and subsequent analyses. In this study, we present a methodology for constructing and further calibrating/optimizing a 3D velocity model to improve the accuracy and precision of event locations recorded in the Duvernay Subscription Array (DSA) in Alberta, Canada.

 

In the first step, we build an initial 3D velocity model by interpolating numerous P- and S-phase sonic logs from nearby wells, using a combination of linear triangulation and nearest-neighbor algorithms. We interpolate such that the model is geologically constrained by the structural horizon surfaces and formation tops obtained from surface seismic and seismic-to-well ties. This results in a more meaningful model in terms of its agreement with the Earth's actual subsurface velocities and their physical complexity. We perform a cross-validation-driven outlier removal procedure prior to interpolation to remove poor data and limit unrealistic velocity contrasts. Next, we smooth the obtained model using an elliptical, inverse-distance-weighted exponential function to further reduce sharp, non-physical velocity contrasts in all directions.
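
A minimal sketch of the kind of anisotropic smoothing described above is given below; the grid spacing, smoothing radii, and decay constant are placeholder values, and the production implementation certainly differs.

    import numpy as np

    def smooth_velocity_model(v, dx, dy, dz, rx=2000.0, ry=2000.0, rz=200.0, decay=2.0):
        """Smooth a 3D velocity grid with an elliptical, exponentially decaying
        inverse-distance weight. Radii (m) differ laterally vs. vertically so that
        sharp non-physical contrasts are damped anisotropically.
        All radii and the decay constant are illustrative assumptions.
        """
        nx, ny, nz = v.shape
        # Half-widths of the smoothing stencil in grid cells.
        hx, hy, hz = int(rx // dx), int(ry // dy), int(rz // dz)
        # Normalized elliptical distances over the stencil.
        ix, iy, iz = np.mgrid[-hx:hx + 1, -hy:hy + 1, -hz:hz + 1]
        r = np.sqrt((ix * dx / rx) ** 2 + (iy * dy / ry) ** 2 + (iz * dz / rz) ** 2)
        w = np.exp(-decay * r) * (r <= 1.0)        # exponential taper, cut at the ellipsoid
        out = np.empty_like(v)
        for i in range(nx):
            for j in range(ny):
                for k in range(nz):
                    # Clip the stencil at the grid edges.
                    sl = (slice(max(i - hx, 0), min(i + hx + 1, nx)),
                          slice(max(j - hy, 0), min(j + hy + 1, ny)),
                          slice(max(k - hz, 0), min(k + hz + 1, nz)))
                    wl = w[hx - (i - sl[0].start): hx + (sl[0].stop - i),
                           hy - (j - sl[1].start): hy + (sl[1].stop - j),
                           hz - (k - sl[2].start): hz + (sl[2].stop - k)]
                    out[i, j, k] = np.sum(wl * v[sl]) / np.sum(wl)
        return out

    # Example on a small random grid (grid spacings in metres are illustrative)
    vp = 3000.0 + 500.0 * np.random.rand(20, 20, 30)
    vp_smooth = smooth_velocity_model(vp, dx=500.0, dy=500.0, dz=50.0)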

 

In addition to the 3D velocity model, we calculate station static corrections at the station locations and take them into account through a grid-search location algorithm. These parameters are used to relocate the existing event catalog of the DSA network, which had initially been located using a simple 1D velocity model for western Alberta. The relocated events show higher precision, as they form tighter clusters with reduced RMS residuals and lower station phase residuals. They also show higher accuracy, as they agree better with common events in well-constrained microseismic catalogs. As expected for surface monitoring, the velocity model optimization affects the depths of the events more strongly than their lateral positions, which are more stable. We also suggest that adding extra stations for better azimuthal coverage of the different event clusters can further lower the location uncertainty caused by configuration bias, and also allows for redundancy, thereby lowering the location uncertainty caused by arrival-time picking errors.
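
As a schematic of how station statics can enter a grid-search locator, the sketch below uses a constant-velocity travel-time calculator and made-up station coordinates in place of the real 3D model and network geometry.

    import itertools
    import numpy as np

    # Hypothetical station coordinates (km) and P-wave static corrections (s).
    stations = {"STA1": (0.0, 0.0, 0.0), "STA2": (10.0, 0.0, 0.0), "STA3": (0.0, 10.0, 0.0)}
    statics = {"STA1": 0.05, "STA2": -0.02, "STA3": 0.00}
    vp = 6.0  # km/s, stand-in for travel times computed from the 3D model

    def travel_time(src, sta):
        """Straight-ray travel time in a constant-velocity medium (placeholder for 3D ray tracing)."""
        return np.linalg.norm(np.asarray(src) - np.asarray(sta)) / vp

    def locate(picks, grid_step=0.5, extent=15.0, depth_max=8.0):
        """Grid search over candidate hypocentres; the unknown origin time is removed by demeaning."""
        best, best_rms = None, np.inf
        xs = np.arange(0.0, extent + grid_step, grid_step)
        zs = np.arange(0.0, depth_max + grid_step, grid_step)
        for x, y, z in itertools.product(xs, xs, zs):
            pred = np.array([travel_time((x, y, z), stations[s]) + statics[s] for s in picks])
            obs = np.array([picks[s] for s in picks])
            resid = obs - pred
            resid -= resid.mean()                 # absorb the unknown origin time
            rms = np.sqrt(np.mean(resid ** 2))
            if rms < best_rms:
                best, best_rms = (x, y, z), rms
        return best, best_rms

    # Synthetic arrival times (s after an arbitrary reference) for a hypothetical event.
    picks = {"STA1": 1.30, "STA2": 1.65, "STA3": 1.95}
    print(locate(picks))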