
GeoWorld December 2011

Issue link: https://read.dmtmag.com/i/50311


benefit. No longer must planners wait a quarter or a year to collect new data for analysis.

The energy industry, for example, has long made use of sensor-based data. Supervisory Control and Data Acquisition (SCADA) has a history dating back decades. Previously considered a legacy system, SCADA has become the foundation for next-generation sensor and control systems such as Substation Automation and Distribution Automation, allowing for real-time operation of smart devices such as capacitors for power quality or reclosers for reliability. Volt-Var Optimization allows a utility to reduce voltage levels from the traditional 220V without affecting power quality, yielding energy conservation without impacting customers. Fault Localization, Isolation and Service Restoration allows utility systems to rapidly locate a fault on the network, enabling automated switching that minimizes the number of customers impacted. There has been considerable hype around the "Smart Grid," but, at its core, the Smart Grid comprises sensor deployments returning high-quality, high-frequency data that can enable automated command and control.

Getting Complex

When it comes to managing and analyzing sensor data, remember that measurements close to each other are more related than measurements farther away: spatial autocorrelation in action. This is intuitive; the temperature 10 miles away is more like the temperature where we are than the temperature 1,000 miles away. Data-mining applications that use such sensor data require an intrinsic understanding of how spatial autocorrelation can affect results. Statisticians know that the first key step in any statistical analysis is to eliminate correlation among variables as much as possible; this also is true of spatial autocorrelation. There are opportunities here for geospatial practitioners well versed in geostatistical techniques.
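The spatial autocorrelation described above can be quantified with a global statistic such as Moran's I. The following is a minimal sketch in plain Python, not a method from the article: the sensor readings, the distance cutoff, and the binary distance-band weighting are all illustrative assumptions.

```python
import math

# Hypothetical sensor readings: (x, y, value) tuples forming two
# spatial clusters, one cool (~20) and one warm (~24).
readings = [
    (0.0, 0.0, 20.1), (1.0, 0.0, 20.4), (0.0, 1.0, 19.8),
    (5.0, 5.0, 24.0), (6.0, 5.0, 24.3), (5.0, 6.0, 23.7),
]

def morans_i(points, cutoff=2.0):
    """Moran's I with a binary distance-band weight matrix:
    w_ij = 1 when points i and j lie within `cutoff` of each other."""
    n = len(points)
    mean = sum(p[2] for p in points) / n
    dev = [p[2] - mean for p in points]  # deviations from the mean
    num = wsum = 0.0
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            d = math.hypot(points[i][0] - points[j][0],
                           points[i][1] - points[j][1])
            if d <= cutoff:
                wsum += 1.0
                num += dev[i] * dev[j]
    denom = sum(v * v for v in dev)
    return (n / wsum) * (num / denom)

print(morans_i(readings))  # near +1 for these clustered readings
```

Values near +1 indicate strong positive spatial autocorrelation (nearby sensors report similar values); values near the small negative expectation of a random pattern indicate no spatial structure.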
From a data-management perspective, GIS is an ideal system to manage sensor data, which have spatial and temporal location as a fundamental component of each measurement. There may be fewer "green fields" for GIS professionals to deploy first-time GIS installations, but there certainly are opportunities for these same professionals to enhance existing installations to optimize the storage and analysis of sensor data. Data-scrubbing and cleansing efforts for dynamic data are like the data-collection and scrubbing efforts for static data on which many geotechnologists began their careers.

Figure: Spatial and temporal proximity of simple events facilitates a roll-up to complex events.

Raw sensor data yield information about simple events (e.g., a power-quality measurement, a car passing at a certain speed, etc.). However, such events frequently roll up to more complex events, where a cascade of events are interrelated. A utility outage, for example, typically is preceded by additional sensor data indicating that a fault event occurred (e.g., a phasor measurement unit reports a fault, followed by a recloser reporting that it has tripped open and, finally, a series of smart meters indicating that they have lost power). These events all are related to each other, part of a complex event. Complex-event processing software can determine that such simple events roll up to a single complex event, and the spatial and temporal proximity of the simple events is one of the foundational pieces of information. Geospatial knowledge is a necessary ingredient in complex-event processing.

Sensors are inherently geospatial, and that means opportunities for geospatial professionals are growing as the number of deployed sensors grows.

Erik Shepard is principal of Waterbridge Consulting; e-mail: erik@waterbridge.biz.
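The outage cascade the article describes can be sketched as a small complex-event-processing roll-up: group simple events into complex events by spatial and temporal proximity. This is an illustrative toy, not any particular CEP product's API; the event kinds, coordinates, and thresholds are made up, and real systems use far more sophisticated pattern matching.

```python
from dataclasses import dataclass

# Hypothetical simple events from grid sensors; all fields illustrative.
@dataclass
class Event:
    kind: str   # e.g. "pmu_fault", "recloser_open", "meter_outage"
    x: float    # sensor location, arbitrary planar coordinates
    y: float
    t: float    # timestamp in seconds

def roll_up(events, dist=5.0, window=60.0):
    """Chain simple events into complex events when each new event is
    within `dist` and `window` of a cluster's most recent event."""
    clusters = []
    for ev in sorted(events, key=lambda e: e.t):  # process in time order
        for cluster in clusters:
            last = cluster[-1]
            near = abs(ev.x - last.x) <= dist and abs(ev.y - last.y) <= dist
            recent = ev.t - last.t <= window
            if near and recent:
                cluster.append(ev)
                break
        else:
            clusters.append([ev])  # no nearby recent cluster: start one
    return clusters

# The cascade from the article: fault report, recloser trip, lost meters,
# plus one unrelated event far away.
events = [
    Event("pmu_fault", 10.0, 10.0, 0.0),
    Event("recloser_open", 11.0, 10.0, 2.0),
    Event("meter_outage", 12.0, 11.0, 5.0),
    Event("meter_outage", 13.0, 9.0, 6.0),
    Event("meter_outage", 500.0, 500.0, 3.0),  # distant, unrelated
]

complex_events = roll_up(events)
print(len(complex_events))  # prints 2: the cascade, plus the outlier
```

The four nearby events in the cascade collapse into one complex event, while the distant meter outage stands alone, mirroring how spatial and temporal proximity underpins the roll-up.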
