SSA 2016 - Nanometrics Event Schedule - Booth # 6

Join Nanometrics in Reno, Nevada from April 19 to 22 for the Seismological Society of America’s 2016 annual meeting. The event promises to be a stimulating exchange of research on a wide range of topics with colleagues from all over the world, including oral and poster presentations from Nanometrics researchers. 
 
 

Wednesday, April 20

Session: Induced Seismicity

              
11:30 am          Oral presentation
  “A Ground Motion Prediction Equation for Induced Earthquakes in Oklahoma”
  Presenter: Emrah Yenier
  Room: Tuscany A
  ABSTRACT

 

2:45 pm     Poster presentation
  “Prediction of Earthquake Ground Motions in Western Alberta”
  Presenter: Emrah Yenier
  Poster #49 / Tuscany F
  ABSTRACT

 

Thursday, April 21

 
Session: Induced Seismicity Monitoring: What is Really Needed?
 
     
8:30 am    Oral presentation
  “Challenges and Strategies for Monitoring Induced Seismicity”
  Presenter: Sepideh Karimi
  Room: Tuscany 1 / 2
  ABSTRACT
 
 
Session: Advancements in Network Operations and Station Design
 
 
8:30 am        Poster presentation
  “Feasibility of Tilt Measurement Using Seismometer Mass Position Data”
  Presenter: Geoff Bainbridge
  Poster #31 / Tuscany F
  ABSTRACT
 

Friday, April 22

 
Session: NGA-East: Research Results and Ground-Motion Characterization Products for Central and Eastern North America
 
 
11:00 am    Oral presentation
  “Prediction Equation for Central and Eastern North America Based on a Regionally Adjustable Generic Ground Motion Model”
  Presenter: Emrah Yenier
  Room: Tuscany 7 / 8
  ABSTRACT
 
Session: Active Tectonics, Faults and Large Earthquakes
 
11:00 am    Poster presentation
  “Sources of Latency and Associated Design Trade-Offs in Earthquake Early Warning Systems”
  Presenter: Geoffrey Bainbridge
  Poster #7 / Tuscany F
  ABSTRACT

 

ABSTRACTS

 

A Ground Motion Prediction Equation for Induced Earthquakes in Oklahoma

 
Emrah Yenier (1), Gail M. Atkinson (1) & Danielle D. F. Sumy (2)
(1) Western University, London, ON, Canada (2) Incorporated Research Institutions for Seismology, Washington DC, USA
 
Seismic activity in Oklahoma has substantially increased within the last decade. Deep injection of wastewater from hydrocarbon production is believed to be responsible for the evolving seismicity in the region. The growing seismicity in Oklahoma has raised concerns regarding the hazard associated with induced seismicity. Estimation of ground motions that can be produced by induced earthquakes is key to determining hazard contributions from induced seismicity.
In this study, we aim to develop a ground motion prediction equation (GMPE) for induced earthquakes in Oklahoma. Available ground motion data from induced seismicity are insufficient to develop a robust empirical GMPE for moderate-to-large magnitudes (M > 5). In order to obtain a predictive model applicable for a wide range of magnitudes, we adopt a regionally-adjustable generic GMPE whose parameters have been calibrated to the rich empirical data in California. We investigate the region-specific source and attenuation attributes of induced events, using ground motions obtained from the 2011 Prague, Oklahoma earthquake sequence. We examine the spatial and temporal variation of stress parameters determined from ground motions of induced events, and compare their values to those obtained from naturally occurring earthquakes, to gain insights into the source characteristics of induced events in the region. We adjust the generic GMPE using the regional model parameters and calibration factor calculated from empirical data. The adjusted GMPE provides predictions for average horizontal-component response spectra and peak motions that may be produced by induced earthquakes in Oklahoma.
 

Prediction of Earthquake Ground Motions in Western Alberta

 
Emrah Yenier (1), Dario Baturan (1), Andrew Law (1) & Gail M. Atkinson (2)
(1) Nanometrics Inc., Ottawa, ON, Canada (2) Western University, London, ON, Canada
 
We develop a ground-motion prediction equation (GMPE) for earthquakes in western Alberta, where hazard contributions from induced seismicity are of particular interest. We investigate the regional source and attenuation attributes using peak ground motions and response spectra from recorded seismic events. We supplement the seismic data with ground motions obtained from mining/quarry blasts in the region, to gain insights into the regional attenuation over a wide distance range. The available empirical data are too limited to derive a robust predictive model in the magnitude range of engineering interest (M > 4). We therefore adopt a regionally-adjustable generic GMPE (Yenier and Atkinson, 2015 BSSA), with parameters that have been calibrated to the rich empirical data in California, to ensure seismologically robust predictions for moderate-to-large magnitudes. We modify the model parameters of the generic GMPE based on the source and attenuation attributes observed in western Alberta, and determine an empirical calibration factor that accounts for the overall differences between the generic model and the empirical data in the region. This provides a hybrid GMPE that is fully adjusted for observed motions in western Alberta and is applicable over wide ranges of magnitude and distance.
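The adjust-then-calibrate idea in this abstract can be sketched numerically. Everything below is a toy illustration: the functional form, the magnitude-scaling term and the coefficients (b, gamma) are placeholders for exposition, not the Yenier-Atkinson model or its derived Alberta values.

```python
import numpy as np

# Toy generic GMPE in log space:
#   ln Y = F_M(M) + b * ln(R) + gamma * R + C
# b     : geometrical-spreading rate (regionally adjustable)
# gamma : anelastic-attenuation coefficient (regionally adjustable)
# C     : empirical calibration factor fit to regional observations

def generic_gmpe_ln(M, R, b=-1.3, gamma=-0.004, C=0.0):
    """ln(ground motion) from a toy generic model; all coefficients are placeholders."""
    F_M = 1.2 * M                      # placeholder magnitude scaling
    return F_M + b * np.log(R) + gamma * R + C

def calibration_factor(M_obs, R_obs, lnY_obs, b_region, gamma_region):
    """Empirical calibration factor: mean residual between observed motions
    and the generic model evaluated with region-specific b and gamma."""
    pred = generic_gmpe_ln(M_obs, R_obs, b=b_region, gamma=gamma_region)
    return float(np.mean(lnY_obs - pred))

# Synthetic "observations" offset from the generic model by 0.3 ln units:
M = np.array([3.0, 3.5, 4.0])
R = np.array([20.0, 50.0, 100.0])
lnY_obs = generic_gmpe_ln(M, R) + 0.3

C = calibration_factor(M, R, lnY_obs, b_region=-1.3, gamma_region=-0.004)
print(round(C, 6))  # recovers the 0.3 offset
```

The design point is that only b, gamma and C change per region; the magnitude scaling inherited from the data-rich calibration region is left intact, which is what makes the model usable at magnitudes the regional data do not cover.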
 

Challenges and Strategies for Monitoring Induced Seismicity

 
Dario Baturan, Sepideh Karimi & Emrah Yenier
Nanometrics Inc., Kanata, ON, Canada
 
Between 2013 and 2015, a number of seismic events characterized as induced, with magnitudes above M3.0, were recorded in British Columbia, Alberta, Ohio and Oklahoma. Following increased public awareness and media scrutiny, many jurisdictions have put in place protocols to mitigate risks associated with induced seismicity. Most of the regulations introduced to date mandate the deployment of real-time seismic monitoring networks as drivers of operational traffic light systems. Using some of the regulatory protocols now in place, we address best-practice strategies for induced seismicity monitoring. How many stations are needed to meet the monitoring mandate, and what should their geographical distribution be? How many stations could be inoperative before the network no longer meets its monitoring mandate? With a number of sensing technologies to choose from, including seismometers, accelerometers and geophones, which one provides the best combination of self-noise, clip level and frequency response to cover the seismic event magnitude and epicentral distance range? As most current “traffic light” protocol thresholds are based on magnitudes, which magnitude scale should be used? The networks initially deployed to manage risk associated with induced seismicity can provide additional benefits. Generated data sets can be used to assist operators in optimizing completion operations, identifying or refining knowledge of geological structures, estimating the direction of in-situ stresses, monitoring critical infrastructure, and developing regional attenuation relationships for more accurate ground motion and magnitude estimates.

Feasibility of Tilt Measurement Using Seismometer Mass Position Data

 
Geoffrey Bainbridge, Sepideh Karimi, Emrah Yenier & Andrew Moores
Nanometrics Inc., Ottawa, ON, Canada
 
Force feedback seismometers provide mass position outputs, which represent the time-averaged feedback force applied to each inertial mass in order to cancel external forces and keep the mass balanced at its center point. These external forces are primarily due to tilt and temperature. In a symmetric triaxial seismometer, tilt and temperature effects can be distinguished because temperature affects all three axes equally, whereas tilt causes a different force on each axis. This study analyzes the resolution of tilt and temperature signals that can be obtained from a force-feedback seismometer, and the potential applicability of this data to applications such as volcano monitoring and cap rock integrity monitoring. The synergy of a combined seismic, tilt, and temperature instrument is also considered.
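The separation the abstract describes can be illustrated with a minimal sketch. This shows the principle only, not Nanometrics' algorithm, and assumes small tilt angles and identical gains on the three symmetric axes: a temperature change appears as a common-mode shift on all three mass positions, while tilt appears as a differential pattern.

```python
import numpy as np

# Principle sketch: split symmetric-triaxial mass-position data into a
# common-mode part (temperature-like, equal on all axes) and a
# differential part (tilt-like, different on each axis).

def split_common_differential(u, v, w):
    """u, v, w: mass-position traces from the three symmetric axes."""
    common = (u + v + w) / 3.0                            # temperature-like
    return common, (u - common, v - common, w - common)   # tilt-like residuals

# A uniform offset on all axes is attributed entirely to the common mode:
u = v = w = np.array([0.0, 0.5, 1.0])
common, diffs = split_common_differential(u, v, w)
print(common.tolist())                    # [0.0, 0.5, 1.0]
print([d.tolist() for d in diffs])        # three zero vectors
```

Any real implementation would additionally need per-axis gain calibration and a model mapping the differential pattern to tilt angles about the horizontal axes; those details are beyond this sketch.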

 

Prediction Equation for Central and Eastern North America Based on a Regionally Adjustable Generic Ground Motion Model

 
E. Yenier & Gail M. Atkinson
Western University, London, ON, Canada
 
Limited ground-motion observations in central and eastern North America (CENA) create challenges in terms of developing a regional ground motion prediction equation (GMPE) for a wide range of magnitudes and distances using conventional empirical methods. We tackle this problem by adjusting a generic GMPE model based on the observed source and attenuation attributes in CENA.
The basis of the generic GMPE is an equivalent point-source simulation model whose parameters have been calibrated to empirical data in California. We use simulated motions to determine the decoupled effects of basic source and attenuation parameters on ground motion amplitudes. The generic GMPE is defined as a function of magnitude, distance, stress parameter, geometrical spreading rate and anelastic attenuation, for a reference NEHRP B/C boundary site condition. We also include an empirical calibration factor to account for residual effects that are different or missing in simulations compared to observed motions in the target region. This provides a “plug-and-play” GMPE that can be adjusted for use in any region by modifying a few key model parameters based on the observed ground motions.
We calibrate the generic GMPE for CENA using the regional source and attenuation parameters, as well as the empirical calibration factor determined from the NGA-East ground-motion database. We infer a magnitude- and depth-dependent stress parameter model based on the values obtained from study events. The developed GMPE provides median predictions of ground motions in CENA for average horizontal-component peak ground motions and 5%-damped pseudo spectral acceleration (T ≤ 10 s), for wide ranges of magnitude (M3-M8) and distance (< 600 km).
 

Sources of Latency and Associated Design Trade-Offs in Earthquake Early Warning Systems

 
Chris Cordahi, David Easton, Tim Hayman & Ross MacCharles
Nanometrics, Ottawa, ON, Canada
 
Low latency is a key contributor to the success of an Earthquake Early Warning (EEW) system. There are several points where latency is introduced between the instant in time that a digitizer produces a set of samples across its analog sensor channel inputs and the point at which the corresponding data reaches its destination for EEW analysis outside the instrumentation and networking domains. Typically long distances separate data sources from the location at which analysis is performed. These points of latency arise out of software, mathematical, and networking as well as physical constraints imposed upon the digitizer and associated communication systems. System designs must account for tradeoffs between latency and resource (CPU) utilization, which has an effect on power consumption, and communication network bandwidth. Designers of seismological instrumentation used for EEW deployments must keep these trade-offs in mind and make clever implementation choices to minimize delay. System integrators and network operators must be fully aware of latency and its contributors in order to make the right configuration choices when commissioning EEW systems to ensure the lowest possible latency without compromising the accuracy of the early warning data product. We illustrate the tradeoffs being made at the identified latency points using an analysis of a typical deployment of a digitizer streaming live seismic data to a central site utilizing a Very Small Aperture Terminal (VSAT) communication system.
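One of the latency contributors discussed here, packetization delay, trades directly against communication bandwidth: smaller packets leave the digitizer sooner but spend a larger fraction of each transmission on headers. A toy calculation (all figures hypothetical, not any specific digitizer's numbers) makes the trade-off concrete:

```python
# Illustrative numbers only: packetization latency vs. header overhead.

def packetization_latency_s(samples_per_packet, sample_rate_hz):
    """Worst-case wait: the first sample in a packet sits this long before send."""
    return samples_per_packet / sample_rate_hz

def header_overhead(samples_per_packet, bytes_per_sample=4, header_bytes=64):
    """Fraction of each packet consumed by (hypothetical) header bytes."""
    payload = samples_per_packet * bytes_per_sample
    return header_bytes / (header_bytes + payload)

# 100 sps channel: shrinking the packet cuts fill time but inflates overhead.
for n in (10, 100, 500):
    print(f"{n:4d} samples/packet: {packetization_latency_s(n, 100.0):5.2f} s fill, "
          f"{header_overhead(n):5.1%} header overhead")
```

On a bandwidth-constrained link such as VSAT, the overhead column translates directly into cost and channel capacity, which is why EEW configurations cannot simply minimize packet size.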
 
 
 
 
 
Published Date: Monday, April 18, 2016

EGU 2016 - Nanometrics Event Schedule - Booth # G20

 
Join Nanometrics in Vienna, Austria from April 17 to 22 for the General Assembly 2016 of the European Geosciences Union (EGU). Scientists from all nations will be presenting their research and discussing new ideas in all areas of geosciences.  Nanometrics researchers will be presenting their latest research in the areas of polar direct-bury installation, ocean-bottom instrumentation, tilt and latency in early warning systems as well as a full short course on direct bury techniques and best practices.
 
 

Wednesday, April 20

                   
10:30-12:00          Short Course
  SC64: “Direct Burial and the Advance of Broadband Seismic Studies”
  Convener: Tim Parker
  Room 2.97
  MORE INFO

 

17:00                   Nanometrics Customer Appreciation Event
  Join us at booth #G20 to toast the success of our valued customers!
 

Thursday, April 21

 
Session: GD8.3/EMRP4.9/SM7.6: Exploring crust and mantle under the oceans: innovative instrumentation, processing and interpretation of ocean-bottom data (co-organized)
 
     
 
17:30-17:45   Oral presentation
  EGU2016-11915: “LOBSTER - The Next Generation”
  Presenter: Arne Schwenk
  Room L3
  ABSTRACT
 
 
Session: SM7.2/G6.2: New developments in seismic and geodetic instrumentation (co-organized)
 
 
17:30-19:00                   Poster presentation
  X1.263: “Potential Applications of an Integrated Seismic, Tilt and Temperature Instrument”
  Presenter: Tim Parker
  Hall X1
  ABSTRACT
 
 
Session: CR2.2: Glacier monitoring from in-situ and remotely sensed observations
 
17:30-19:00   Poster presentation
  X3.231: “Direct Burial Broadband Seismic Instrumentation that is Rugged and Tilt Tolerant for Polar Environments”
  Presenter: Tim Parker
  Hall X3
  ABSTRACT
 
 

On-booth poster paper

 
Stop by our booth to see “Sources of latency and associated design trade-offs in earthquake early warning systems” by David Easton et al., a poster being presented at the 2016 Seismological Society of America Annual Meeting. David will be on hand to discuss his findings.
 
 
 

ABSTRACTS

 

SC64:  “Direct Burial and the Advance of Broadband Seismic Studies”

Tim Parker
Nanometrics Inc, Ottawa, Canada
 
Please join us for an open workshop on direct-bury broadband seismic sensor deployments. There will be presentations on direct-bury deployments and techniques and results from case studies along with discussions of new community-driven initiatives to collect standardized state-of-health and metadata on portable deployments.
 
The agenda starts with a few talks from some of the geophysical facilities and individuals who are exploring these techniques. There will also be time for attendees to provide 5-minute (or less) flash presentations on their experiences (please bring yours!) and a question-and-answer period on any topics related to temporary, experiment-driven broadband deployments. We’ll conclude with a conversation on how to continue advancing best practices in portable broadband sensor deployment.
 
 

LOBSTER - The Next Generation

Arne Schwenk
K.U.M. GmbH, Kiel, Germany
 
K.U.M. GmbH has designed and manufactured ocean-bottom seismometers since 1997. During the last three years we designed a new instrument, which is presented here. Higher resolution, higher accuracy and lower power consumption have led to a unique instrument, the world’s smallest broadband long-term OBS. Key data: 32 bit, 143 dB, 300 mW, 120 s, 200 kg deployment weight, the size of half a pallet.
 

Potential Applications of an Integrated Seismic, Tilt, and Temperature Instrument

Geoffrey Bainbridge, Tim Parker, Sepideh Karimi, and Peter Devanney
Nanometrics Inc, Ottawa, Canada
 
Force feedback seismometers provide mass position outputs, which represent the time-averaged feedback force applied to each inertial mass in order to cancel external forces and keep the mass balanced at its center point. These external forces are primarily due to tilt and temperature. In a symmetric triaxial seismometer, tilt and temperature effects can be distinguished because temperature affects all three axes equally, whereas tilt causes a different force on each axis. This study analyzes the resolution of tilt and temperature signals that can be obtained from a force-feedback seismometer, and the potential applicability of this data to applications such as volcano monitoring and cap rock integrity monitoring. The synergy of a combined seismic, tilt, and temperature instrument is also considered.

 

Direct Burial Broadband Seismic Instrumentation that is Rugged and Tilt Tolerant for Polar Environments

Tim Parker (1), Paul Winberry (2), Audrey Huerta (2), Geoff Bainbridge (1), and Peter Devanney (1)
(1) Nanometrics, (2) Central Washington University
 
The integrated broadband Meridian Posthole and Compact seismic systems have been engineered and tested for extreme polar environments. Ten percent of the Earth’s surface is covered in glacial ice, and the dynamics of these environments are a strategic concern for all. The development of these systems was driven by researchers needing to densify observations in ice-covered regions with difficult and limited logistics. Funding from an NSF MRI award (GEOICE) and investment from the vendor enabled researchers to write the specifications for a hybrid family of instruments that can operate autonomously at -55 °C with very little power: 1 W for the Meridian Compact system and 1.5 W for the Meridian 120PH. Tilt tolerance in unstable ice conditions was a concern, and these instruments have a range of up to +/-5 degrees. The form factor, extreme temperature tolerance and power load of the instruments have reduced the bulk of a complete station by half and greatly simplified installation, allowing more instruments to be deployed with limited support and a lighter logistical load. These systems are in their second year of testing in Antarctica at South Pole Station and McMurdo, and the investment has encouraged other instrument and power-system vendors to offer polar-rated equipment, including telemetry for ancillary support.
 
 

Sources of Latency and Associated Design Trade-offs in Earthquake Early Warning Systems

Chris Cordahi, David Easton, Tim Hayman & Ross MacCharles
Nanometrics, Ottawa, ON, Canada
 
Low latency is a key contributor to the success of an Earthquake Early Warning (EEW) system. There are several points where latency is introduced between the instant in time that a digitizer produces a set of samples across its analog sensor channel inputs and the point at which the corresponding data reaches its destination for EEW analysis outside the instrumentation and networking domains. Typically long distances separate data sources from the location at which analysis is performed. These points of latency arise out of software, mathematical, and networking as well as physical constraints imposed upon the digitizer and associated communication systems. System designs must account for tradeoffs between latency and resource (CPU) utilization, which has an effect on power consumption, and communication network bandwidth. Designers of seismological instrumentation used for EEW deployments must keep these trade-offs in mind and make clever implementation choices to minimize delay. System integrators and network operators must be fully aware of latency and its contributors in order to make the right configuration choices when commissioning EEW systems to ensure the lowest possible latency without compromising the accuracy of the early warning data product. We illustrate the tradeoffs being made at the identified latency points using an analysis of a typical deployment of a digitizer streaming live seismic data to a central site utilizing a Very Small Aperture Terminal (VSAT) communication system.
 
 
 
 
 
 
Published Date: Thursday, April 14, 2016

Natural Resources Canada sends seismologist to record McAdam quakes

Courtesy of CBC News, February 11, 2016
 
 
Two seismologists from Ottawa have set up measuring equipment in the village of McAdam, which is still being rattled by earthquakes eight days after the swarm of tremors began.  Calvin Andrews, a technologist with Natural Resources Canada, installed the first of four aftershock deployment kits in the basement of the village high school Thursday morning.
 
"This took me about 15 minutes to set up and we're already streaming data into Ottawa now," he said. "They're very sensitive." 
 
He and seismologist Stephen Halchuk are installing these kits because the nearest permanent seismometer, 40 kilometres away in St. George, provides less-than-accurate data on the recent quakes.
 
Over the past eight days there have been dozens of small quakes under the village, measuring up to magnitude 3.3, strong enough to crack windows.
 
"I know residents aren't glad to hear it but I'm glad to hear that there were a couple of events last night, because I was afraid that as soon as we got permission to come down the earthquake stops," said Halchuk.
 
That's what happened last time Natural Resources Canada installed the devices in McAdam, he said, after a similar swarm in 2012. This time, he hopes to learn a lot more.
 
"It's likely a fault, but it's very small," he said. 
 
"There's no plate boundary here. There's not two plates interacting like you have on the west coast of North America, so we can't point to a definite source and say that's what's causing the events."
 
Halchuk says the data won't help predict when a quake is coming, but it should give a better idea about the likelihood of any larger events in the future.
 
Natural Resources Canada says the four seismometers will be kept in McAdam for at least a few months.
 
 
Published Date: Wednesday, March 2, 2016

GeoConvention 2016 - Presentations Schedule - Booth # 311

 

Nanometrics at GeoConvention

 

Oral Presentations 

 

Title: A Local Magnitude (ML) Formula for Western Alberta

Presenter: Emrah Yenier

Time & Location: Telus Room 104-106, 11:05 - 11:30 hrs on Wednesday, March 9th

Session: Induced Seismicity in the WCSB I 

 

Title: High Quality Induced Seismic Monitoring: Strategies and Applications

Presenter: Dario Baturan

Time & Location: Telus Room 104-106, 15:00 - 15:25 hrs on Wednesday, March 9th

Session: Induced Seismicity in the WCSB II

 

On-booth presentation on demand

 

Title: Induced Seismic Monitoring Challenges

Presenter: Dario Baturan

Learn more about: 

  • Monitoring injection wells and frac operations in real time
  • Mitigating risk before regulatory action
  • Obtaining enhanced research grade data sets

 

Title: Frac Monitoring with Sparse Surface Networks - A Case Study

Presenter: Dario Baturan

Learn more about: 

  • Monitoring the effectiveness of hydraulic fracture operations
  • Inferring fracture mechanics during hydraulic completions
  • Delineating geologic structures near the well and determining the in-situ stress regime

 

 

Abstracts 

 
 "A Local Magnitude (ML) Formula for Western Alberta"  

Wednesday, March 9th | Telus Room 104-106 | Session: Induced Seismicity in the WCSB I | 11:05-11:30 hrs

We examine the distance attenuation of peak Wood-Anderson (WA) amplitudes obtained from earthquakes in western Alberta, to develop a regionally-calibrated local magnitude (ML) equation. A comparison of WA amplitudes from earthquakes and mining/quarry blasts in the region shows that both event types decay at similar rates with distance and exhibit a considerable Moho-bounce effect between 100 km and 220 km. Motivated by this observation, we merge the two amplitude datasets and model the regional attenuation using a trilinear model to account for the observed shape of attenuation. We also determine a site correction for each station, which enables determination of ML for a reference site condition. The derived ML equation yields unbiased magnitude estimates with distance in western Alberta, and attains systematically lower values than existing magnitudes computed from standard ML models.
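The standard local magnitude form is ML = log10(A_WA) + (-log10 A0(r)) + S, where -log10 A0(r) is the distance-attenuation term and S is a station site correction. A hypothetical sketch of the trilinear idea follows, with hinges at the Moho-bounce distances the abstract identifies (100 km and 220 km); the slopes and constant are placeholders, not the derived Alberta coefficients.

```python
import numpy as np

R1, R2 = 100.0, 220.0                  # hinge distances (km), per the abstract

def minus_log_A0(r_km, b1=1.1, b2=0.0, b3=1.6, c=0.5):
    """Continuous trilinear -log10(A0) attenuation term (toy coefficients)."""
    r = np.asarray(r_km, dtype=float)
    s1 = b1 * np.log10(np.minimum(r, R1))          # decay below 100 km
    s2 = b2 * np.log10(np.clip(r, R1, R2) / R1)    # flat Moho-bounce zone
    s3 = b3 * np.log10(np.maximum(r, R2) / R2)     # decay beyond 220 km
    return s1 + s2 + s3 + c

def ml(amp_wa_mm, r_km, site_corr=0.0):
    """Local magnitude from a peak Wood-Anderson amplitude in mm (toy model)."""
    return np.log10(amp_wa_mm) + minus_log_A0(r_km) + site_corr
```

Setting the middle-segment slope to zero models amplitudes as roughly constant across 100-220 km, the qualitative Moho-bounce behavior the abstract describes; the per-station `site_corr` term is what reduces ML to the reference site condition.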

 

 "High Quality Induced Seismic Monitoring: Strategies and Applications"  

Wednesday, March 9th | Telus Room 104-106 | Session: Induced Seismicity in the WCSB II | 15:00-15:25 hrs

We examine the importance of designing induced seismic monitoring (ISM) networks to provide the richest data sets possible, with benefits to operators beyond basic regulatory compliance. Using examples from operational Montney and Duvernay ISM networks, we highlight the impact of network performance modeling, data sharing and instrumentation on the quality of generated data products and event catalogs. We illustrate the application of enhanced data sets in evaluating the effectiveness of yellow-traffic-light-initiated mitigation techniques and improving the accuracy of scientific research outputs.

 
 
Published Date: Wednesday, March 2, 2016

Nanometrics Trillium 240 seismometer helps LIGO detect gravitational waves

Nanometrics’ Trillium seismometer technology was an important component in the Laser Interferometer Gravitational-wave Observatory (LIGO) installations that detected gravitational waves for the first time ever. Scientists announced this major scientific achievement yesterday, noting that the first gravitational waves seen at the twin LIGO installations, located in Livingston, Louisiana, and Hanford, Washington, were recorded September 14th, 2015, having been detected independently 7 milliseconds apart.

This story begins 1.3 billion years ago, in a galaxy far, far away...

Two black holes that had been spinning around each other in a gradually decaying orbit finally collided, converting enormous mass into energy radiated out as gravitational waves. The waves have been spreading out ever since, finally reaching us last year. Nobody would have noticed, except that back in 1992 the LIGO project was launched to set up observatories with 4 km long L-shaped laser interferometers operating in a near-perfect vacuum. The original detectors weren't sensitive enough, but "Advanced LIGO" was developed to upgrade their capabilities, and it went online just six weeks before this first event was detected.

[Images: LIGO vacuum pod containing a Trillium 240 broadband seismometer; the Trillium 240 very broadband seismometer used for the LIGO gravitational wave discovery]

LIGO researchers approached Nanometrics in March 2008, needing highly sensitive broadband seismometers that could measure the ever-present vibrations of the earth, so that the vibration effects could be subtracted out of the interferometer to help unmask fainter signals that would otherwise be drowned out. Nanometrics worked with LIGO to qualify and develop a custom version of the Trillium 240 broadband seismometer, originally designed for measuring distant earthquakes. LIGO had a specialized list of requirements for this seismometer. It had to be suitable for deployment inside the huge vacuum chamber, and so needed to be as small as possible. It had to be extremely reliable and never need servicing, as a failure could mean months of downtime while the vacuum in the huge LIGO chamber was lost and then pumped back down again. LIGO also needed 50 seismometers delivered by early 2010. LIGO ultimately selected the Trillium 240 seismometer to instrument their vacuum chambers, not only because of the good performance shown in early tests at Stanford, but also for its proven history of reliability in the field.

"More specifically, the fifteen Trillium 240 seismometers in each observatory help keep the LIGO mirrors and lasers from being disturbed by vibrations. The seismometers measure the most minuscule ground motion, and other LIGO equipment feeds these signals back to help stabilize the detectors.  It is a similar concept to noise-cancelling headphones that pick up room noise and create a reverse signal into your ear to make it quieter," explains Bruce Townsend, VP of Products, Nanometrics, "This makes it possible for the scientists to measure the extremely tiny motions due to gravitational waves."

When asked about the importance of the role the Trillium 240 seismometer played, Dr. Brian Lantz, Senior Research Scientist, Stanford University (LIGO) stated, "They are in vacuum in the instrument. They are a critical piece of the active seismic isolation we use. And they were (and still are) great!"

"We are thrilled that the Trillium 240 seismometer has played a role in enabling this significant scientific discovery," said Neil Spriggs, CEO, Nanometrics, "and we are proud of the contribution Nanometrics made to help enable the LIGO success."

For more information, go to: LIGO http://www.ligo.org/
 

About Nanometrics, Inc:

Nanometrics is dedicated to providing an open and honest scientific approach to Induced Seismicity Monitoring that has been honed by over 30 years of experience developing innovative seismic technology. 

Nanometrics has designed and deployed real-time seismological networks on every continent and in every climate.  As the world leader in seismological networks, Nanometrics offers expertise in event detection, event location, magnitude calculation, real-time data transmission and processing, rapid event notification, and seismic emission tomography (SET).  Utilizing proprietary broadband node instrumentation and networking technology, the team at Nanometrics provides customized networks to facilitate real-time induced seismicity monitoring, hydraulic fracture monitoring and wastewater monitoring services.

Nanometrics is an award-winning company providing monitoring solutions and equipment for studying man-made and natural seismicity. Headquartered in Ottawa, Ontario, with offices and representatives worldwide, Nanometrics has over 30 years’ experience delivering solutions to customers across the globe. Nanometrics’ real-time and portable systems are utilized by the world’s leading scientific institutions, universities and major corporations. Our pedigree is founded on precision instrumentation, network technology and software applications for seismological and environmental research. We specialize in collecting and analyzing critical real-time data for global, regional and local seismic networks. We deliver world-class network design, installation and training services throughout the globe in a safety-conscious environment.

Strategic Intelligence fueled by Science

For more information, contact: Mairi Miller, Marketing Manager, Nanometrics Inc.

www.nanometrics.ca | mairimiller@nanometrics.ca | +1 613-592-6776 ext. 288

 

Published Date: Friday, February 12, 2016
