We look forward to seeing you in Seattle at the SSA annual meeting

Date

Meet our presenters

We will be presenting and chairing sessions on a wide range of topics, from the new low-power portable ecosystem to using machine learning to generate automatic seismic catalogs. Please see the details and abstracts below. If you want to learn more about one of the topics listed below but you're not attending the SSA meeting, please contact us about our webinar program.

Presenters

Oral Presentations

 

Absolute hypocentral location improvements with 3D velocity model optimization: Application to Duvernay, western Canada 

Presenting author: Sepideh Karimi

 

Day: Wednesday, April 24

Time: 8:30 am

Place: Cascade I

Increased detections through array design and processing

Presenting author: Sepideh Karimi

 

Day: Friday, April 26

Time: 9:15 am

Place: Cascade I

 

New High Resolution Very Low Powered Broadband Digitizer System, Pegasus

Presenting author: Bruce Townsend

 

Day: Wednesday, April 24

Time: 11:15 am

Place: Grand Crescent

The application of high-quality seismic catalogs in forecasting induced seismicity: A risk management system

Presenting author: Sepideh Karimi

 

Day: Friday, April 26

Time: 11:15 am

Place: Cascade I

 

Optimizing Borehole Station and Array Performance, Enabled by the Trillium Slim Borehole 120 Seismometer

Presenting author: Geoffrey Bainbridge

 

Day: Wednesday, April 24

Time: 3:00 pm

Place: Grand Crescent

 

 

Poster Presentations

Multi-Sensored Small Diameter Cased Borehole for EEW - Turn an EEW Station Into a Greater Capability Long Term Observatory and Monitoring Solution

Presenting author: Tim Parker

 

Day: Wednesday, April 24

Time: 2:15 pm - 3:30 pm

Place: Fifth Avenue

Operational Real-Time Automatic Seismic Catalog Generator Utilizing Machine Learning: Performance Review Over a One Year Period in Production

Presenting author: Sepideh Karimi

 

Day: Wednesday, April 24

Time: 3:30 pm - 4:15 pm

Place: Grand Ballroom

 

Chaired Sessions

 

 

Abstracts

Oral Presentations

 

Optimizing Borehole Station and Array Performance, Enabled by the Trillium Slim Borehole 120 Seismometer

Presenting author: Geoffrey Bainbridge

 

Downhole installation offers the potential for best seismic performance but also greater uncertainties compared to a near-surface direct bury or vault installation.  This talk surveys some typical pitfalls, solutions, and future directions for cased borehole stations based on Nanometrics’ experience. 

 

We present a heuristic model of the installation as a series of material interfaces, each of which is a potential source of mechanical noise and affects the system transfer function. The single most important factor is the location of the seismometer relative to the interface between loose soil and solid rock. There is also a series of connections from the seismometer to a holelock or sand, to the casing, to cement, then to soil or rock, and up to the wellhead via the cables and casing. Each of these interfaces can affect performance, which we illustrate with data and recommendations of best practices.

 

We also present Nanometrics’ new borehole instrument: the Trillium Slim Borehole 120 seismometer, a 4.1 inch (104 mm) diameter instrument with Trillium 120QA/PH-class performance and significantly improved SWaP (size, weight and power). It is designed for smaller holes down to 4.5 inch (115 mm) diameter, in shallow or deep deployments, using a simple passive holelock or sand installation. The small diameter permits deployment in existing small boreholes and facilitates construction of new, lower-cost boreholes, minimizing disturbance of the surrounding rock and improving instrument coupling.

 

High performance and simpler logistics make the Trillium Slim Borehole well suited to many applications, including new higher-density arrays for full waveform analysis and detection of earthquake gravity signals. In conclusion, we present a proposed array design for optimal measurement of earthquake gravity signals on a regional scale.

 

New High Resolution Very Low Powered Broadband Digitizer System, Pegasus

Presenting author: Bruce Townsend

 

The seismic research community has been looking for solutions and equipment for temporary array studies in broadband and mixed-sensor deployments, including Large-N science. Nanometrics is addressing this need with the introduction of the Pegasus digital recorder and its supporting ecosystem, designed to provide high-resolution recording in temporary networks and remote environments. Data is recovered science-ready, and Pegasus is compatible not only with broadband sensors but also with geophones and various other types of instrumentation.

 

The exceptionally low power consumption of the Pegasus digital recorder significantly reduces battery requirements. Overall station size is less than 1.25 liters in volume, it is lightweight at less than 0.5 kg, and it is IP68 rated for immersion. The enclosure is robust for autonomous operation in all terrestrial environments, enabling efficient deployment of more stations for longer periods. Installation and servicing are simplified with applications and features that make data harvesting fast, verified and reliable, while field station operational review is completed quickly and with certainty. Stand-alone power is less than 200 mW for three channels and a power-cycled GNSS timing system. Coupled with Nanometrics Trillium direct-bury broadband sensors, the lowest possible total station power is less than 400 mW. A fourth high-resolution channel can be used for complementary geophysical sensors, such as an infrasound sensor, tiltmeter or absolute pressure sensor. The Pegasus digitizer also works with other conventional analog-output seismic sensors, such as geophones and accelerometers, and records in standard community data storage formats that are essentially archive-ready, leveraging the research community's archive services. Data is complete and ready to process, with miniSEED waveforms, StationXML metadata compliant with FDSN standards, and comprehensive project audit information. For remote operational awareness and logistical planning, there is a very low power state-of-health (SOH) telemetry option.

 

The application of high-quality seismic catalogs in forecasting induced seismicity: A risk management system

Presenting author: Sepideh Karimi

 

In this study, we discuss different published seismicity forecasting models and present lessons learned from a practical implementation of three published forecasting models in a risk management system for hydraulic fracturing operations. The seismicity prediction performance of the system is validated via real-time monitoring and playback of over 30 diverse datasets. The results show that the estimated seismicity agrees well with observed seismicity in the majority of cases, that multiple models produce very similar results, and that the injected volume has limited impact on seismicity forecasts. The study also highlights the limitations of this approach when a large event occurs early in the sequence. One of the most important takeaways is the impact that the quality of seismic data has on system performance. High-quality data recorded by a local array, combined with advanced processing techniques designed to generate “research grade” seismic catalogs automatically in near real-time, is a key requirement. This development also serves as an excellent example of collaboration between industry (data acquisition and array deployment), academia (model development), and service providers (data processing advancements and implementation) to understand and manage the induced seismicity phenomenon.
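The abstract does not name the three forecasting models implemented. As a hedged illustration of the class of published, injection-driven forecasting models it refers to, the sketch below evaluates a seismogenic-index-style rate model (after Shapiro and co-workers); the index value and b-value are arbitrary placeholders, not parameters from the study:

```python
def expected_events(injected_volume_m3, sigma=-2.0, b=1.0, mag=2.0):
    """Seismogenic-index-style forecast: expected number of induced
    events with magnitude >= mag for a net injected volume (m^3).

    N(>=M) = dV * 10**(sigma - b*M); sigma (seismogenic index) and
    b (Gutenberg-Richter slope) here are illustrative values only."""
    return injected_volume_m3 * 10.0 ** (sigma - b * mag)

n_m2 = expected_events(5e4)            # hypothetical 50,000 m^3 treatment
n_m3 = expected_events(5e4, mag=3.0)   # tenfold fewer M>=3 events when b=1
```

In a real-time system the index and b-value would be refit continuously on the evolving catalog, which is exactly where the catalog quality emphasized in the abstract enters.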

 

Increased detections through array design and processing

Presenting author: Sepideh Karimi

 

Induced and microseismic monitoring is often hindered by the low signal-to-noise ratio (SNR) of the arrivals, due both to the often weak sources and to strong surface noise. Within the oil and gas industry it has become common practice to deploy very large numbers of geophones to allow detection of weak seismic events, primarily through stacking techniques. The cost of acquiring and processing such large datasets can be prohibitive, particularly for academic and government institutions, so this may not be a practical design for all applications. While more sensors will theoretically improve the signal-to-noise ratio by √N, where N is the number of measurements, this assumes prior removal of coherent noise sources. Large, densely sampled patches remove coherent surface-wave noise through application of an F-K (or similar) filter. This approach is only valid when the wavefield is well sampled, often requiring a prohibitive number of instruments. We demonstrate that using a small array and semblance-weighted stacking we can achieve similar results to the large dense-patch design.
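The √N scaling for uncorrelated noise can be checked numerically; a minimal sketch on synthetic traces (toy signal and noise levels, not data from the study):

```python
import numpy as np

rng = np.random.default_rng(0)
n_samples = 2000
t = np.linspace(0.0, 1.0, n_samples)
signal = np.sin(2.0 * np.pi * 5.0 * t)  # toy 5 Hz arrival

def stacked_snr(n_traces, noise_std=2.0):
    """SNR of the mean stack of n_traces noisy copies of the signal.

    With uncorrelated noise, averaging reduces the noise amplitude by
    sqrt(n_traces) while preserving the coherent signal."""
    traces = signal + rng.normal(0.0, noise_std, (n_traces, n_samples))
    residual = traces.mean(axis=0) - signal
    return float(np.sqrt(np.mean(signal**2) / np.mean(residual**2)))

snr_1, snr_64 = stacked_snr(1), stacked_snr(64)  # ratio approaches sqrt(64) = 8
```

The assumption of uncorrelated noise is exactly what fails in the presence of coherent surface waves, which is the abstract's point.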

 

Our data set consists of two arrays that recorded a hydraulic fracture treatment in western Canada. The first consists of 25 dense patches with 96 recorded channels in a grid; each channel comprises a string of 12 1C geophones. The second array has eight hexagonal subarrays of 3C geophones. Using subsets of the large patch data, we show that semblance-weighted stacking can achieve a √N increase in SNR, suggesting that the coherent noise within the data has been successfully attenuated. We also show that the SNR gain from applying semblance-weighted stacking to the hexagonal subarrays falls on the same √N trend as the patch data. We conclude that we can attenuate the coherent noise using only a six-station hexagonal array and semblance-weighted stacking. Thus, a limited number of sensors in an easily acquired array design, together with intelligent processing, can increase detections to provide a more complete catalog.
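The core of semblance-weighted stacking is compact; a minimal sketch on synthetic, moveout-corrected traces (the window length and weighting form are illustrative choices, not the parameters used in the study):

```python
import numpy as np

def semblance_weighted_stack(traces, win=25):
    """Mean stack scaled by sliding-window semblance.

    traces: (n_traces, n_samples) array, already moveout-corrected.
    Semblance is ~1 where energy is coherent across traces and
    ~1/n_traces where it is incoherent, so the weighting suppresses
    noise that a plain linear stack would pass through."""
    n, _ = traces.shape
    stack = traces.sum(axis=0)
    kernel = np.ones(win)
    num = np.convolve(stack**2, kernel, mode="same")
    den = n * np.convolve((traces**2).sum(axis=0), kernel, mode="same")
    semblance = num / np.maximum(den, 1e-12)
    return semblance * stack / n

# toy check: a coherent pulse on six traces buried in noise
rng = np.random.default_rng(1)
t = np.arange(500)
pulse = np.exp(-0.5 * ((t - 250) / 5.0) ** 2)
data = pulse + rng.normal(0.0, 1.0, (6, 500))
out = semblance_weighted_stack(data)
peak = int(np.argmax(np.abs(out)))  # lands at or very near sample 250
```

The six-trace toy mirrors the six-station hexagonal subarray case: the semblance weight drops incoherent energy toward 1/N while leaving the aligned pulse largely intact.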

 

Absolute hypocentral location improvements with 3D velocity model optimization: Application to Duvernay, western Canada 

Presenting author: Sepideh Karimi

 

The reliability of absolute hypocentral locations is of paramount importance, as they form the basis for characterizing natural and induced seismicity. Apart from acquisition geometry and phase-picking errors, velocity model errors are the main source of event location uncertainty, especially for the local arrays commonly used for induced seismic monitoring. We present a methodology for constructing and further calibrating/optimizing a 3D velocity model to reduce uncertainty in the locations of events recorded by the Duvernay Subscription Array (DSA) in Alberta, Canada. The method involves building an initial 3D velocity model by interpolating numerous P- and S-phase sonic logs from nearby wells. The model is constrained by structural horizon surfaces and formation tops obtained from surface seismic to ensure agreement with the Earth's actual subsurface velocities and their physical complexity. A combination of cross-validation-driven outlier removal and a smoothing operator, via an elliptical inverse-distance-weighted exponential function, helps remove poor data and limit unrealistic velocity contrasts. We also calculate station statics and include them in a grid-search location algorithm utilizing the new 3D model to relocate the existing event catalog of the DSA network, which had initially been located using a simple 1D velocity model. The relocated events show higher precision, forming tighter clusters with reduced RMS residuals and lower station phase residuals. They also show higher accuracy, agreeing better with common events in well-constrained microseismic catalogs. As expected for surface monitoring, the velocity model optimization affects the depths of the events more strongly than their lateral positions, which are more stable. The azimuthal coverage and redundancy from the addition of potential extra stations can further lower the location uncertainty caused by configuration bias and arrival-time picking errors.
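The grid-search location step with station statics can be sketched as follows. This toy substitutes a homogeneous velocity for the optimized 3D model, and all geometry and picks are synthetic, so it illustrates the search structure only:

```python
import numpy as np

def grid_search_locate(stations, picks, statics, v=3.5, step=0.5):
    """Toy grid-search hypocenter location.

    stations: (n, 3) station coordinates in km; picks: (n,) observed
    arrival times in s; statics: (n,) per-station time corrections in s.
    Travel times here use a constant velocity v (km/s); the real
    workflow would look them up in the optimized 3D model. The unknown
    origin time is absorbed by demeaning the residuals."""
    xs = np.arange(-5.0, 5.0 + step, step)
    zs = np.arange(0.0, 6.0 + step, step)
    best_xyz, best_rms = None, np.inf
    for x in xs:
        for y in xs:
            for z in zs:
                tt = np.linalg.norm(stations - np.array([x, y, z]), axis=1) / v
                res = picks - (tt + statics)
                res = res - res.mean()          # absorb origin time
                rms = float(np.sqrt(np.mean(res**2)))
                if rms < best_rms:
                    best_xyz, best_rms = (x, y, z), rms
    return best_xyz, best_rms

# synthetic check: noise-free picks from a source at (1, -2, 3) km
rng = np.random.default_rng(2)
sta = rng.uniform(-4.0, 4.0, (8, 3))
sta[:, 2] = 0.0                                 # surface stations
src = np.array([1.0, -2.0, 3.0])
t_obs = np.linalg.norm(sta - src, axis=1) / 3.5
loc, rms = grid_search_locate(sta, t_obs, np.zeros(8))
```

Because the statics enter only as additive time terms, the same search can simply be rerun after each statics update, which is how the relocation loop in the abstract can iterate.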

 

Poster Presentations

 

Multi-Sensored Small Diameter Cased Borehole for EEW - Turn an EEW Station Into a Greater Capability Long Term Observatory and Monitoring Solution

Presenting author: Bruce Townsend

As the requirements and science for improved EEW evolve, a more capable infrastructure would enable greater monitoring capabilities. We propose deeper grouted casings, installed using borehole best practices, to ensure improved coupling for lower-noise recording at both high and low frequencies. Casing emplacement should be a one-day operation, and a subset of stations could be used to densify geodetic arrays with only slight modifications, using the wellhead as a monument supporting the antenna, as has been demonstrated at some PBO borehole stations. Stations using a new slimline T120PH and a dual-sensor Cascadia in a single cased hole add large dynamic range, resiliency and low-noise recording that would enable prompt earthquake gravity signal observations along with higher sensitivity for local earthquake recording. Dry cased holes are the standard for long-term geophysical observatories and a better investment when all the associated costs of operating EEW observatories are considered, recognizing that these networks are in their infancy in the evolution of hazard-monitoring best practices.

 

Operational Real-Time Automatic Seismic Catalog Generator Utilizing Machine Learning: Performance Review Over a One Year Period in Production

Presenting author: Sepideh Karimi

 

Real-time seismic event catalogs that are accurate and complete provide valuable input to, among other things, public safety strategies and induced seismic risk management. The construction of such catalogs is traditionally labor intensive, so automated processes have been developed to reduce the manual workload involved in catalog production. Many machine-learning-oriented approaches have been proposed; however, their performance is commonly evaluated against a static seismic catalog. Because machine learning algorithms can be prone to overfitting, the ability to generalize for use in a real-time system is critical.

In this study, we focus on the temporal stability of a real-time automatic seismic catalog generator algorithm (Feature Weighted Beamforming, FWB) that has been applied to over 15 networks over a one-year period in a production environment. We present detailed results from an induced seismic monitoring array over the Duvernay Formation (Duvernay Subscriber Array, DSA), as well as higher-level statistics on other seismic networks. Initial results from the DSA, in comparison to standard STA/LTA picking and association, show that FWB reduced the number of false positives by 75% without loss of sensitivity, and reduced the average difference in event location between automatic and manually picked solutions by 82%. As with the DSA, for all networks whose training data included a large variety of events, FWB matched a sensitive STA/LTA pick associator in system sensitivity and location accuracy while consistently detecting all real seismic events. We confirmed that the average difference between automated event locations output by FWB and analyst-reviewed solutions is consistent over time. New clusters of seismic activity not seen during training are also correctly detected and located. We also discuss cautions for the use of FWB when provided a limited training data set.

 

Chaired Sessions

 

Machine Learning in Seismology

 

With machine learning algorithms driving significant developments and breakthroughs across a range of fields and applications, this advanced processing capability presents a compelling opportunity for seismology. The potential to greatly accelerate the processing of complex seismological data, and to improve its accuracy, will yield numerous advances in the understanding of seismic activity and may redefine what is achievable in the Earth sciences.

We encourage session contributors to present their research and innovative approaches to geophysical studies using machine learning algorithms and processes. We welcome all submissions relevant to applications and developments of machine learning and artificial intelligence in seismology.

 

New Approaches to Geophysical Research Using Dense Mixed Sensor and Broadband Seismology Arrays

 

Along with broadband seismic sensors, researchers are now deploying complementary geophysical instruments, such as high- and low-gain seismic velocity sensors, accelerometers, infrasound, GNSS and magnetotelluric instruments, in dense and sparse arrays. Some of these techniques require longer-term deployments than others, so contributors are encouraged to discuss these constraints when considering these types of combined-instrument observations.

 

We encourage session contributors to present their research, motivation and innovative approaches to geophysical studies using mixed-sensor deployment techniques for earth and climate science observations, including broadband and geophone velocity sensors, accelerometers, tilt, pressure and infrasound sensors, along with geodetic and magnetotelluric instruments. Additionally, we ask contributors to describe the results and challenges of this multidisciplinary approach to geophysical studies with these types of research arrays.

 

Evolving Best Practices for Station Buildout in EEW and New Permanent Networks

 

On the US West Coast, the U.S. Geological Survey and its partner institutions, the University of California, Berkeley, Caltech, the University of Oregon and the University of Washington, are focusing on completing the build-out of the infrastructure for initial implementation of ShakeAlert Earthquake Early Warning (EEW) in the United States. Over the next few years, the number of EEW-capable seismic stations must at least double from today's ~850 contributing stations. This effort requires planning regarding station density and type, as well as complex logistics including siting, legal and environmental permitting, and equipment specification and delivery. Other important considerations include data quality and latency, continuous monitoring systems, and delivery of alerts. This session invites contributions from any EEW system operator on all these topics related to EEW build-out, including case studies for current and planned seismic networks as well as new ideas for developing EEW deployments and collaborations with contributing networks that are novel in their design or approach to handling these issues.

 

Injection-induced Seismicity     

 

Induced seismicity related to oil and gas production has been a growing concern in the last few years. Although the majority of wastewater disposal and hydraulic fracturing operations do not generate seismicity or large-magnitude events, there have been a few reports of damaging earthquakes in North America, which have led to increased demand for appropriate risk assessment and management of induced seismicity and for the development of effective risk mitigation strategies.
 

Detailed geological and geophysical analysis of seismic survey data can characterize pre-existing structures. Drilling and completion programs can then be designed to minimize the likelihood of activating proximal faults and of large-magnitude event occurrence. However, as the data recorded to date in North America have shown, most of the activated faults are unmapped, and the conditions (in-situ stress state, fault proximity, area and fault orientation) at each oil and gas field are highly variable. This highlights the need to understand, evaluate and manage the seismic risk in near real-time.

Injection-induced seismicity and its associated risks have been the subject of many studies; however, there are still many questions to answer, particularly on the development of effective risk mitigation strategies. In this session, we welcome contributions on geomechanics, numerical modeling, case studies, induced seismicity forecasting and risk assessment techniques.