Underground utility damage during construction is a major safety problem and a drag on economies in many countries.  The technology that currently represents best practice has been in use for decades and is slow, unreliable under certain conditions, hazardous for the operator, expensive, and dependent on trained, skilled operators.  There are signs that this is changing. ImpulseRadar, a company based in Malå, Sweden, has developed a unique ground penetrating radar (GPR) technology which uses real-time sampling (RTS) to gather data thousands of times faster than a conventional GPR.  This is the first technology advance I am aware of that enables a commercial product to capture GPR scans of underground utilities at highway speeds of up to 130 km/hr.

Introduction

The Common Ground Alliance estimates that there are 400,000 cases of utility damage during construction annually in the U.S. According to the Federal Highway Administration (FHWA), underground utility conflicts and relocations are a major cause of project delays during road construction. A study by the Pennsylvania State University found $21 in cost savings for every dollar invested in improved location information about underground infrastructure.

Not knowing where underground infrastructure is located has engendered what is estimated to be a $10 billion per year industry in the U.S. Every construction project requires locating underground utilities prior to and during construction.

The most accurate way to determine the location of underground infrastructure is to expose it by carefully digging a hole and then bringing in a survey team to survey the location.  But this is time consuming, expensive, and can be hazardous.

An increasingly viable alternative is various types of remote sensing. Currently, best practice for locating underground infrastructure is walking the site or right of way with electromagnetic induction (EMI) wands or ground penetrating radar (GPR) pushcarts. This is slow and can be extremely hazardous.  It can also have major indirect and social costs such as lane closures on busy transportation routes.

Underground reality capture at up to 130 km/hr 

We are beginning to see some promising advances in remote sensing technology for detecting underground utilities. Capturing ground penetrating radar scans of below-ground infrastructure at roadway speeds would be an important step toward efficiently and safely creating 3D maps of the underground.  Recently I had the opportunity to chat with Matthew Wolf of ImpulseRadar, a company based in Malå, Sweden, that has developed a unique ground penetrating radar (GPR) technology which uses real-time sampling (RTS) to gather data thousands of times faster than a conventional GPR.

The new Raptor® GPR array is designed to be fitted to a survey vehicle and supports an arrangement of up to 18 channels.  The real-time sampling (RTS) technology implemented by ImpulseRadar enables very fast collection of GPR data at speeds in excess of 130 km/hr at 5 cm point intervals. This is much faster than conventional GPR systems, which typically operate below 15 km/hr, and it enables the Raptor® array to collect 3D GPR scans at posted highway speed limits. High precision positioning of the data can be achieved with RTK GPS, or with robotic total stations in areas of poor GPS coverage. Talon® acquisition software displays data from all channels and depicts the position of the array and the swath of 3D GPR coverage on a moving map in real time while surveying.  In addition, capturing GPR data does not require highly trained personnel.  Of course, post-processing the GPR scans to create 3D maps of the below-ground infrastructure still requires qualified, skilled personnel.  Post-processed 3D data can be converted to line data for export to CAD for the depiction of utilities in a 3D survey utility/topo map.
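To get a sense of why the sampling speed matters, here is a back-of-the-envelope calculation of the trace rate an array would need to sustain. This is my own arithmetic using only the figures quoted above (130 km/hr, 5 cm point interval, 18 channels), not ImpulseRadar specifications:

```python
# Back-of-the-envelope estimate (my own assumptions, not ImpulseRadar specs):
# how many GPR traces per second are needed to keep a 5 cm point interval
# at highway speed across an 18-channel array.

def required_trace_rate(speed_kmh: float, point_interval_m: float, channels: int) -> float:
    """Traces per second the array must record to maintain the point interval."""
    speed_ms = speed_kmh * 1000 / 3600          # convert km/h to m/s
    traces_per_channel = speed_ms / point_interval_m
    return traces_per_channel * channels

rate = required_trace_rate(speed_kmh=130, point_interval_m=0.05, channels=18)
print(f"~{rate:,.0f} traces/s across the array ({rate/18:,.0f} per channel)")
# ~13,000 traces/s total, ~720 per channel -- far beyond what conventional
# sequential-sampling GPR systems typically deliver.
```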

Real world applications

A recent project involved mapping utilities at 36 rail grade crossings in 3D. The project was for a new commuter rail expansion along an existing rail right of way. The scans resulted in 3D maps of utilities that were already shown on existing records as well as identification of additional utilities that had not been recorded on maps. The higher quality subsurface utility information reduced the need for unnecessary vacuum excavation test holes, which cost an estimated $1,000 per hole. Test holes are currently confirming utilities about 90% of the time in very dense corridors, demonstrating the value of this approach by eliminating expensive "dry holes" and reducing the risk of utility damage during excavation.

Using ImpulseRadar's GPR technology, GEL Solutions LLC and David Evans and Associates Inc implemented a multi-sensor system that enables them to perform above- and below-ground 3D surveys.  The system combines LiDAR, photogrammetry, and the Raptor array. The LiDAR, the photo cameras, and the GPR can be mounted on a vehicle to capture data at posted speed limits.  In a pilot project in Redlands, California, above- and below-ground scans were captured over the same corridor and post-processed to create a 3D model of the infrastructure that lies above and below the surface. Ground-painted markouts showing the positions of the underground utilities from conventional EM locate methods, visible in the above-ground LiDAR/photo scans, were extracted and georeferenced to features such as poles, pull-boxes, and so on.

Technical breakthrough

The only competing GPR initiatives of which I am aware that attempt to achieve the goal of operating at roadway speeds are two initiatives reported in 2018. T2 Utility Engineers, based in Whitby, Ontario, reported commercially using a Hexagon IDS GeoRadar Stream EM multi-channel ground penetrating radar (GPR) array towed at 10-12 km/hr to capture subsurface data. In a separate initiative, DGT Associates in Mississauga, Ontario reported a successful proof of concept in which a Siteco rig combining a Faro mobile laser scanner and Sensors and Software GPR arrays collected data simultaneously above and below ground at roadway speeds of 80 to 90 km/hr. But ImpulseRadar's RTS technology, with much higher sampling rates, appears to be an important technical breakthrough that achieves GPR reality capture of the underground at high speed in a commercial product.


Currently autonomous vehicles, with some exceptions, rely on on-board sensors for the detailed aspects of navigation. But many believe that high-precision maps, which contain significantly more detailed information and true ground-absolute accuracy than current road maps, will be required. These high-precision 3D maps will be developed specifically for the self-driving vehicle market and will include detailed inventories of all stationary physical assets related to roadways such as road lanes, road edges, shoulders, dividers, traffic signals, signage, paint markings, poles, and all other critical data needed for the safe navigation of roadways and intersections by autonomous vehicles.  Referred to as high definition (HD) maps, they will record the location of physical highway assets to centimeters.

These HD maps have two applications in the autonomous vehicle market.  First, they can be used by autonomous vehicles during actual driving to augment on-board sensors with near real-time, highly accurate data about the highway and surroundings where the vehicle is driving.  Secondly, because self-driving vehicles would need to travel billions of miles in the physical world to demonstrate a significant safety improvement over human drivers, simulation has become an essential part of the development of autonomous vehicles, and HD maps provide the detailed virtual road environments those simulations require.

At the 2016 SPAR3D conference I had a chance to chat with Ron Singh, then Chief of Surveys at the Oregon Department of Transportation, about high definition (HD) maps of highway systems.  High definition means accurate to centimeters, and Ron believed that HD highway maps would be required to reduce the risk and costs of highway construction.  But he also believed that autonomous vehicles would require this information.  Today autonomous vehicles rely completely on on-board sensors, but many believe that for safe operation they will require other information that can only come from external HD maps of highways and roads.

At the Year in Infrastructure 2017 (YII2017) conference in Singapore, Sharad Oberoi of Sanborn gave an insightful overview of the state of the art for automated feature extraction from combined aerial and mobile scans of highways and adjacent areas. Sanborn develops high precision maps for the autonomous vehicle market.  Sanborn has developed proprietary mapping technology that leverages aerial imagery, aerial lidar data, and mobile lidar data to create standardized, high-precision 3D base-maps focusing specifically on the self-driving vehicle market.

Recently in the UK, Zenzic has commissioned the Ordnance Survey (OS) to explore the geospatial data requirements for HD maps, considering the lifecycle of testing and the data interoperability requirements, to enable UK plc to provide an exemplar test facility ecosystem and set the foundations for operational deployments.  HD maps for autonomous vehicles are more complex than maps used for simple driver-based navigation systems. Mapping must be highly accurate (better than 5 cm resolution) and needs to contain a minimum set of road information, such as lane-level geometry and information relating to street furniture. This study was conducted with the support of software simulation companies, the Met Office and the British Standards Institution (BSI), with the aim of understanding the data requirements, gaps and sources regarding geospatial data for self-driving vehicle testing.  A report, Geodata report - analysis and recommendations for self-driving vehicle testing, has been released.  It addresses why self-driving vehicles require real-time, high-precision maps. With high-definition maps that are updated in real time, a self-driving vehicle is able to reference the position of other road users against what it already knows to be there. The maps also provide a back-up in situations where the vehicle's sensors are unreliable, such as poor weather conditions like snow, heavy rain, or sun reflecting off a wet road.

Its key conclusions are that geospatial data is fundamental for autonomous vehicles. Maps can provide an important trusted baseline where on-board sensor feeds are unreliable.  To provide the framework required for the testing and driving of self-driving vehicles, current mapping specifications will need to be enhanced to include relevant street-level features, with better than 5 cm resolution, more extensive information about street furniture, and interoperability with other sources of data.  An authenticated, authoritative single source of geospatial data accessible to autonomous vehicles is required. Standards relating to geospatial data in the context of autonomous vehicles are also required; this is on the Open Geospatial Consortium's radar. Real-time updates to mapping through sensor data feeds will also be essential.  Finally, it makes some recommendations about the development of data formats (LAS 1.2 or LAZ, OBJ, OpenDrive, and ESRI shapefile), data hosting, governance, minimum safe requirements and standards.


I thought it would be worthwhile to include here the plenary talk I gave at GeoIgnite 2019 in Ottawa about the growing evidence of the benefits of an integrated BIM + geospatial approach to full-lifecycle construction.

The UK government and others have stated clearly that they expect the largest benefits of BIM to flow from a full-lifecycle approach to construction.  As construction companies who have taken on design, build, operate and/or maintain projects have found, for full life-cycle construction projects an approach that combines BIM + geospatial significantly improves outcomes. Until the last year or two evidence of the benefits of BIM + geospatial from real world projects has not been available. Now we are beginning to see substantive data that offer evidence of the benefits of an integrated BIM+geospatial full lifecycle approach for construction projects.

Some owners have begun procuring using a design, build, maintain, and/or operate model.  Engineering and construction companies who have undertaken these projects have reported significant business benefits from adopting an integrated BIM+geospatial approach to construction. Further evidence of the benefits of this approach has recently been provided by the owner of a major $1.3 billion PPP construction project: Nagpur Metro has estimated that it expects total savings in excess of US$225 million over the 25-year lifetime of a design-build-operate-maintain project.

In Canada construction contributes $137 billion, about 7%, to GDP annually. Construction industries in advanced economies worldwide have a serious problem. The McKinsey Global Institute estimates that the world will need to spend $57 trillion on infrastructure through 2030 to keep up with global GDP growth. This will require huge amounts of private investment from pension funds, insurance companies and sovereign wealth funds. Attracting this money will require improvements in construction productivity, but in many of the world's advanced economies construction productivity has been flat for the past 40 years. McKinsey reports that large construction projects typically take 20 percent longer to finish than scheduled and are up to 80 percent over budget. McKinsey & Company suggests that the construction industry is ripe for disruption, and two of the technologies that it believes will be key in that anticipated transformation are geospatial and BIM.

The UK Government as part of its building information modeling (BIM) initiative has said repeatedly that it expects the big payoff of a digital model to come during operations and maintenance, which typically represents 80% of the cost of a facility.  But to date there has been little uptake of this approach in the facilities management (FM) sector. A recent survey of the FM industry found that while 92% had heard about BIM and 84% agreed that BIM has the potential to deliver added value to FM, over two thirds said that the FM sector is not prepared for BIM.

The early adopters of a BIM+geospatial approach are companies who take on design, build, and maintain and/or operate projects, variously referred to as DBFM in the Netherlands, BOT in Southeast Asia, and PPP in Canada and India. Among these companies are some well-known names in global construction: EllisDon, Parsons Brinckerhoff, Atkins Global, Arcadis, BAM and AECOM. These major construction firms have realized that BIM+geospatial integration provides greater value for full-lifecycle projects.

For example, in Canada public–private partnerships (PPP) have been remarkably successful for building and maintaining infrastructure. Fifteen years ago EllisDon began taking on PPP projects and quickly recognized that an integrated BIM+geospatial approach simplified capturing and maintaining data over the full lifecycle of a project. As a result, integrated BIM + geospatial has become a best practice on P3 projects at EllisDon.

Owners have found that BIM+geospatial integration provides greater value to projects that involve not just design and construction but also operations and maintenance. A leader in this space, Rijkswaterstaat, the Dutch transportation authority, began offering design-build-finance-maintain (DBFM) projects a number of years ago, which has motivated private Dutch engineering and construction companies to adopt an integrated geospatial+BIM approach to construction. The firm Royal BAM Group nv/BAM Infraconsult adopted integrated BIM + geospatial because of market developments including more complex construction assignments and an increasing demand from customers for service provision throughout the entire life cycle of a project. This includes reality capture using LiDAR scanning at the beginning of the project, before design, to capture an accurate representation of the location where construction will take place. During the design and build phases of the project everything captured is georeferenced so that it can be migrated to an integrated GIS+FM system for maintenance activities.

The Crossrail project in London, which was the largest engineering project in Europe at the time, adopted a full-lifecycle BIM+geospatial approach with targets of 20% savings on design and construction and 40% savings during operations and maintenance.

In Southeast Asia, Malaysian company PESTECH won a BOT (build-operate-transfer) substation expansion project to design, build and operate the substation for 25 years. PESTECH adopted an integrated BIM+geospatial approach which included reality capture of the existing substation at the beginning of the project using LiDAR and georeferencing everything, including engineering drawings, digital photos, point clouds, BIM models, and other construction documents. The benefits realized as a result of this approach included reduced design time, fewer site visits, reduced land requirements and material waste, and, most importantly for keeping the project on budget and on schedule, no change orders.

In China a geospatial+BIM approach was adopted for the Miaoshan 220kV Secondary Transformer Substation project, a large indoor substation in Dongxihu District, Wuhan City, Hubei Province. The project integrated digital tools for construction, site preparation, mechanical, electrical, and protection. The 3D digital modeling approach enabled collaborative design of an indoor substation with restricted space and multiple voltages in a very congested urban area with minimal impact on the existing buildings and infrastructure, all within a tight time frame. One of the important benefits of the approach was that it minimized the impact of the substation on the existing infrastructure. Since the transformer substation is located in downtown Wuhan where buildings are dense, reality capture (laser scanning) was used to model the area surrounding the substation. This helped to ensure the substation site design did not conflict with surrounding buildings. To enable combining the model, the point cloud data, the digital terrain model and other external data, everything was geolocated (Bentley calls this geo-coordinated). Geo-coordination, a term coined by Bentley, enables engineers and facilities managers to find things that are tagged, related, or close to a building element, and it makes it possible to navigate the information environment, not only 3D design models (virtual reality), but also 3D models in a real-world context (augmented reality). For this project it was fundamental not only for integrating the substation design in its real-world context, but also for making it possible to reuse the BIM model and associated data during operation and maintenance.

In the U.S. AECOM, which is a US$18.2 billion a year firm in the construction sector and has been ranked for eight years running #1 in Engineering News Record‘s “Top 500 Design Firms”, uses BIM + GIS on design, build, finance and operate (DBFO) projects around the world. AECOM has applied this approach to the external campus of Denver International Airport, and to lease management at Orlando, Hong Kong, and South West Florida international airports. AECOM has found that the advantage of an integrated BIM+GIS approach based on a centralized integration of information is that it allows the client to make strategic decisions during the design, build and operate phases of the construction life-cycle.

The first project I have come across that has attempted to quantify total savings attributable to an integrated BIM+geospatial full life-cycle approach to construction is the Nagpur Metro project. This is a US$1.3 billion project underway in Nagpur, and it appears to be the first project in Asia to integrate a digital twin with an asset management system to eliminate information loss about assets during design and construction. Since the Nagpur system implements a full-lifecycle approach to project management, the location of each of the 500,000 assets comprising the system is recorded, making it possible to click on an asset in SAP and be shown the location of the asset in a 3D map. Final deliverables are digital models rather than paper drawings, providing a basis for digitalizing operations and maintenance. The benefits of a 3D BIM+geospatial approach have been projected based on a 25-year lifetime for the project. It is estimated that this will result in US$400,000 in savings during plan, design and build. But the big benefit of an integrated BIM+geospatial approach is expected during the 25-year maintain and operate phase of the project. It is expected that operating manpower requirements will be reduced by 20%. The really big payoff is an estimated savings of US$222 million over the lifetime of the project.

Reflecting this trend toward integrated BIM+geospatial, there has been a massive consolidation over the past decade of the major hardware and software vendors providing geospatial, BIM and reality capture equipment and software. Hexagon AB, founded in 1975, began major acquisitions in the 1990s which included Leica Geosystems and Intergraph. It has now reached $3.49 billion in annual revenue. Trimble, founded in 1978, also embarked on major acquisitions including SketchUp and Tekla and is now a $2.7 billion construction equipment and software company. More recently Topcon (Japan) acquired Sokkia and ClearEdge and has achieved $1.2 billion in annual revenue. Perhaps most significantly as an indicator of this trend in the industry, Autodesk ($2.46 billion) and ESRI ($1.1 billion), the 800-pound gorillas in BIM and GIS respectively, have announced a partnership to improve interoperability between the AEC and geospatial worlds. Finally, Bentley ($700 million) and Siemens ($105 billion) are working closely on a number of construction projects.

As a growing number of owners see the advantages of a full life-cycle approach in construction projects and begin to change their procurement practices, construction companies are changing their business processes to optimize facility maintenance and operation.  The leading edge of this trend is AEC companies that have taken on the challenge of design, build, operate and/or maintain projects. They have reported that they have realized significant benefits from an integrated BIM+geospatial full lifecycle approach to construction. Now it appears that the word is getting out. Standards organizations in both the AEC and geospatial worlds are making progress on BIM+geospatial interoperability. Major software vendors Autodesk and ESRI have announced an agreement to partner for greater interoperability between their products. The open source community is also addressing the issue of BIM+geospatial interoperability. In the past the quantified advantages of an integrated BIM+geospatial approach have remained for the most part company internal, but we are now beginning to see data from real world projects that offer quantitative evidence for the benefits of an integrated BIM+geospatial full lifecycle approach for construction projects.


Canada's first synthetic aperture radar satellite was launched in 1995. Since then Radarsat-2 was launched in 2007 and continues to operate.  In June of this year the RADARSAT Constellation Mission (RCM) was launched. At the GeoIgnite conference Sergey Samsonov of the Canada Centre for Remote Sensing (NRCan) explained that, in addition to technical improvements such as a revisit time of four days and data latency for some applications as low as 10 minutes, RCM data will be made available at no cost.  This is a different business model from Radarsat-1 and Radarsat-2 and is intended to reduce entry costs for developers and entrepreneurs.  The RCM data will be available through the Earth Observation Data Management System.  In addition, earlier this year the Canadian Space Agency and the Canada Centre for Mapping and Earth Observation made RADARSAT-1 images of Earth available to researchers, industry and the public at no cost. The 36,500 images are available through the Earth Observation Data Management System.

One of the primary benefits of radar is that it is not affected by cloud cover.  Radarsat-2 and RCM are capable of one meter resolution, although five meters is more typical.  In addition, the Radarsat satellites are capable of using interferometry to measure centimeter-level movements and deformations of the Earth's surface in the horizontal and vertical directions.  In his talk Sergey Samsonov showed some examples from Radarsat-2 of subsidence in Vancouver, Seattle, Mexico City, Alberta, and other locations detected by radar interferometry.  I have blogged about an application, Network Alert, developed by Planetek that uses satellite radar interferometry to detect ground subsidence and to alert municipal water network operators of possible water leaks.

The new constellation dramatically improves the ability to monitor changes on the Earth's surface. The revisit time for the new constellation is much shorter: while the previous satellites had a revisit time of 24 days, the new RCM satellites will revisit the same location every four days, with greater frequency in Canada's far north. The time between data acquisition and the data being available with the RCM constellation is remarkable. The data latency for the new RCM constellation depends on the type of data. For ships in Canadian waters it is 10 minutes. For other maritime surveillance it is 30 minutes. For Canadian and global disaster management it is 2 hours - and remember it is not affected by cloud cover.  For ecosystem monitoring applications the data will be available in 24 hours. Pre-defined recurrent observation scenarios (standard coverages) will be made available to users in advance of acquisition.

The RCM is designed for three main areas: maritime surveillance (ice, surface wind, oil pollution and ship monitoring), disaster management and environmental monitoring. While the mission design initially focused on maritime security requirements, land security, particularly in the Arctic, will be dramatically enhanced. In addition, economic growth is also being targeted.  Developers and entrepreneurs are being encouraged by an open data policy to develop a wide range of applications in Canada and internationally.


Deep learning is increasingly finding practical applications using geospatial data, specifically satellite imagery. At the GeoIgnite conference in Ottawa, Nicolas Martinez of Statistics Canada described a project scheduled to kick off in July to use machine learning to identify housing starts across Canada.  The goal of the project is to improve the accuracy and coverage of the housing starts survey, specifically to fill data gaps for smaller remote and Indigenous communities which are generally excluded from the current survey program.

Canada Mortgage and Housing Corporation (CMHC) currently employs a significant number of people at their head office, in addition to field agents across the country who physically confirm residential starts and completions.  The objective of the project is to apply convolutional neural network (CNN) machine learning to satellite imagery to extract information about housing starts and completions. The imagery will be preprocessed to prepare it for use in developing a machine learning model.  The largest challenge will be developing training and testing imagery datasets.

One of the earliest widely publicized applications of deep learning to photo imagery was work by Geoffrey Hinton at Google, which used it to distinguish dogs and other objects in photos that people uploaded.  More recent applications to satellite imagery have targeted land use and agricultural identification. For example, remotely sensed satellite imagery can be used to create a model that differentiates between corn and potatoes, using factors that can be calculated from the imagery such as normalized difference vegetation index (NDVI) (min, max, mean), texture (min, max, mean), vegetation height, and geometric factors such as orientation.  I blogged about how open source code and publicly available training data have been applied to track deforestation and reforestation in Mato Grosso, a state in the central Amazon region of Brazil. A study in the Netherlands used a CNN to identify blocked waterways from overflight imagery with a success rate of 97%.
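As a concrete illustration of the simplest of these factors, NDVI is computed per pixel from the red and near-infrared bands. Here is a minimal sketch with NumPy; the band arrays are hypothetical values, not data from any of the studies mentioned above:

```python
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """Normalized difference vegetation index, computed per pixel."""
    nir = nir.astype("float32")
    red = red.astype("float32")
    return (nir - red) / (nir + red + 1e-6)   # small epsilon avoids divide-by-zero

# Hypothetical example: two small band chips from a satellite scene
nir_band = np.array([[0.6, 0.5, 0.7], [0.4, 0.6, 0.5], [0.7, 0.8, 0.6]])
red_band = np.array([[0.2, 0.3, 0.1], [0.3, 0.2, 0.2], [0.1, 0.1, 0.2]])

index = ndvi(nir_band, red_band)
# The min, max, and mean of this index per field are the kinds of features mentioned above
print(index.min(), index.max(), index.mean())
```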

An important application of satellite imagery is identifying built structures. I have blogged about how satellite imagery has been used to identify buildings and transportation networks by applying a neural network model together with OpenStreetMap layers and high resolution Worldview multispectral imagery.  NVIDIA has demonstrated the ability to automate detection of many road networks using deep learning algorithms and multi-spectral high resolution imagery.  Another application of CNN and imagery in the Netherlands was able to differentiate residential roofs with dormer windows from those without.

The essential key to effective application of deep learning is good training and test data - ground-truthed data that involves, for example, someone on the ground identifying whether the fields seen in the imagery are corn, potatoes, or something else. There are publicly available training datasets that can be used to train satellite imagery CNN models. For example, I blogged about the System for Terrestrial Ecosystem Parameterization (STEP) dataset, which has 2,000 manually labeled sites covering 17 different land cover types scattered across all continents. A large publicly available training dataset derived from Sentinel-2 imagery contains 30,000 polygons of land use training data for ten classes of land use: annual crop, forest, herbaceous vegetation, highway, industrial, pastures, permanent crop, residential, river, and sea or lake.
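To make the CNN idea concrete, here is a minimal sketch of the kind of model that could be trained on labeled image chips such as the ten-class Sentinel-2 dataset described above. The chip size, band count, and variable names are illustrative assumptions, not details of the Statistics Canada project:

```python
import tensorflow as tf
from tensorflow.keras import layers, models

# Assumed chip geometry: 64x64 pixel chips with 3 bands, 10 land-use classes
NUM_CLASSES = 10

model = models.Sequential([
    layers.Input(shape=(64, 64, 3)),
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu"),
    layers.Flatten(),
    layers.Dense(64, activation="relu"),
    layers.Dense(NUM_CLASSES, activation="softmax"),
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# train_chips / train_labels would come from a ground-truthed dataset such as the
# Sentinel-2 land-use polygons mentioned above (hypothetical variable names):
# model.fit(train_chips, train_labels, validation_split=0.2, epochs=10)
```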

The Statistics Canada project is scheduled to begin in July, with initial work on a proof of concept and CNN testing to begin in August and September.  The communities chosen for the initial phase are Kitchener-Cambridge-Waterloo, Red Deer, and Iqaluit.  The first phase is scheduled to be completed by winter 2019.  The project plans to use TensorFlow and Keras, open source Python libraries for applying machine learning.  Artificial intelligence (AI) has often suffered from inflated expectations, complex code, heavy processing requirements, and not very practical applications. However, a growing body of practical applications in the geospatial domain shows that, thanks to the immense processing power available in the cloud, machine learning can now generate practical and useful results fairly easily.  As Chris Holmes pointed out in his talk at FOSS4GNA, the challenge for Statistics Canada will be to develop reliable and publicly available training and test datasets to enable deep learning models to be created for a broader range of applications using the huge volumes of satellite imagery and other geospatial data that are now available.


At the GeoIgnite conference in Ottawa, Wade Larsen of Urthecast described what may be a killer application for satellite imagery captured daily. Farmers in the US over-irrigate by 25% per year. A recent study has demonstrated that monitoring vegetation growth on a daily basis can save farmers $50,000 per irrigation pivot in water per year. With 260,000 pivots operating in the US, this amounts to billions of dollars of savings every year.

Urthecast has been known for having two cameras (Iris and Theia) on the International Space Station and two Deimos earth observation satellites. Last year Urthecast announced the acquisition of Geosys Technology, a software company that processes earth observation imagery for agricultural applications.  Their latest initiative is called UrtheDaily, with the objective of capturing nine bands of imagery from six satellites covering the Earth's entire landmass every day. The nine bands are identical to Sentinel-2's visual and near-infrared bands, and the cameras are cross-calibrated with Sentinel-2 to assure data compatibility. The six satellites enable a daily revisit at higher latitudes and slightly less frequent coverage in tropical areas, and imagery is captured at the same time of day and from the same altitude every day.  About 150 terabytes will be acquired and downloaded every day. The imagery has five meter resolution (GSD) and represents scientific-grade data quality that will be available in the cloud for user access within hours of acquisition. UrtheDaily is designed to support machine learning and AI-ready geo-analytics applications on a global scale. The UrtheDaily platform will provide the ability to discover data, run user algorithms, and deliver data to customers.
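As a rough sanity check on the quoted daily data volume, covering the Earth's landmass at 5 m GSD in nine bands produces pixel counts on the order of the figure above. This is my own arithmetic with an assumed sample bit depth, not an UrtheCast specification:

```python
# Rough sanity check of the quoted ~150 TB/day (assumed values, not UrtheCast figures)
land_area_m2 = 149e12          # Earth's land area, ~149 million km^2
gsd_m = 5                      # ground sample distance
bands = 9
bytes_per_sample = 2           # assuming ~12-16 bit quantization stored in 2 bytes

pixels = land_area_m2 / (gsd_m ** 2)
raw_bytes = pixels * bands * bytes_per_sample
print(f"~{raw_bytes / 1e12:.0f} TB of raw samples per day")
# ~107 TB of raw samples -- the same order of magnitude as the quoted ~150 TB,
# before scene overlap and downlink overhead are counted.
```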

UrtheCast reports that it has already signed contracts, primarily in the agriculture sector. An example of a potential killer app in agriculture is monitoring irrigation. Farmers in the U.S. overwater by 25%. A four-year study in Texas by HydroBio, which assessed the value of frequent monitoring of plant health and growth using satellite data, concluded that to be effective this requires high-frequency (daily) monitoring with highly calibrated sensors. With its resolution UrtheDaily will be able to monitor vegetation for individual pivots on a daily basis in order to optimize water utilization. It is estimated that this could save farmers $50,000 per pivot per year. (A pivot typically irrigates 120 acres.) There are 260,000 pivot irrigation systems in the U.S., which translates into the potential to save farmers billions of dollars per year.
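The headline number is simple arithmetic on the figures quoted above, and is clearly an upper bound since not every pivot is over-irrigated:

```python
# Rough upper-bound estimate based on the figures quoted above
savings_per_pivot = 50_000        # USD per pivot per year (HydroBio estimate)
pivots_in_us = 260_000            # pivot irrigation systems in the U.S.

potential_savings = savings_per_pivot * pivots_in_us
print(f"${potential_savings / 1e9:.0f} billion per year")   # ~$13 billion
```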

To date UrtheCast reports that it has signed $350 million worth of contracts in the U.S., Russia, India and other countries. It plans to launch six satellites in 2021 and become operational in 2022.


Chattanooga's power company, the Electric Power Board (EPB), is one of the most advanced in North America with respect to adopting smart grid technology. At HXGNLive in Las Vegas, Ken Jones reported quantified benefits that EPB has found from its investment in smart grid technology. These include a 65% reduction in customer outages, a 52% reduction in outage minutes, and savings of $23 million in restoring outages after a recent severe storm.

About five years ago I first heard about the remarkable transformation of the power grid in Chattanooga, Tennessee, a city with a population of about 180,000. With the help of the Department of Energy and other partners the local power company, the Electric Power Board (EPB), saw the smart grid not only as benefiting the utility and improving the quality of life of its customers, but also as a driver of economic development. Early on EPB decided to deploy one of the country's highest capacity fiber-optic networks to improve power quality, reliability, customer service and energy efficiency. Chattanooga is one of the few municipalities anywhere with 10 Gbit/sec internet speeds for both urban and rural customers. Currently over 50% of EPB customers subscribe to the fiber service.  EPB has estimated that the incremental economic and social benefits of the high-capacity backbone for Chattanooga lie between $865.3 million and $1.3 billion. Furthermore, somewhere between 2,832 and 5,228 jobs have been created linked to EPB's infrastructure investment.

In addition to the fiber backbone, EPB has installed 1,400 automatic switches, an AMI management system, a distribution management system (DMS), and support for demand management (DM) and distributed energy resources (DER).

At HxGNLive in Las Vegas Ken Jones of EPB presented some of the very impressive quantified benefits that EPB has realized as a result of these investments in fiber and smart grid technology.

Reliability

Reduction in customer outages: 65%
Reduction in outage minutes: 52%

Annual operating savings

Meter reading: $2.0 million
Field services: $0.7 million
Demand charges: $3.0 million

Environmental benefits (CO2 emissions reduction)

Reduced truck mileage: 400 tons/year
Demand management kWh reduction: 1,000 tons/year
Power factor improvement: 2,000 tons/year

In addition EPB has developed a solar farm of 4,400 panels generating about 2 million kWh/year. Anyone can buy into this starting at just $5/month. Ken gave examples of local firms that completely offset their power usage with solar power.

One of the interesting steps that EPB has introduced to help customers monitor their power usage is a mobile app MyEPB. The app provides real-time energy usage and usage comparison by day, week, and month, real-time outage reporting for the EPB service territory, support for customers to report outages, and billing alerts, for example, notify me when my monthly bill reaches $50. To date there have been 30,000 downloads of MyEPB.

Progress of outage restoration after a recent storm showing areas automatically restored (violet) and manually restored (green)

An area where the investment in smart grid technology has realized a major savings is resilience. Ken presented a map showing the areas in the EPB service territory where power was automatically restored by the automated distribution switches after a recent severe summer storm. The outage and restoration statistics were equally impressive and showed that automation resulted in an estimated savings of $23 million for this storm.

                           Without automation   With automation   Difference       Improvement
Customers with an outage   72,622               32,043            40,579           55.88%
Cost of all outages        $69.3 million        $46.1 million     $23.2 million    33.48%
Outage minutes             16,986,240           12,059,524        4,926,716        29.00%
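The improvement column is just the difference expressed as a fraction of the without-automation figure, which is easy to verify:

```python
# Verify the improvement percentages in the storm-restoration table above
rows = {
    "Customers with an outage": (72_622, 32_043),
    "Cost of all outages ($M)": (69.3, 46.1),
    "Outage minutes":           (16_986_240, 12_059_524),
}

for name, (without_auto, with_auto) in rows.items():
    improvement = (without_auto - with_auto) / without_auto * 100
    print(f"{name}: {improvement:.2f}% improvement")
# 55.88%, 33.48%, 29.00% -- matching the table
```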

One of the important steps that the EPB has taken with its partners is to implement a microgrid testbed at the Chattanooga airport comprising a solar farm, batteries and supporting infrastructure.  This is one of the few airports anywhere that is able to function completely off the grid.

These quantified results provide tangible evidence of the benefits of smart grid technology deployment.  I would recommend that any power utility operating in a small to medium-sized city seriously consider the Chattanooga experience.

Traditional output from a GPR scanner showing hyperbolas reflected from underground utilities

The most exciting technology announcement at HXGNLive is a new ground penetrating radar (GPR) solution that dramatically lowers the bar for GPR scanning, enabling surveyors and other professionals who have hesitated to get into this lucrative market because of the complexity of interpreting GPR scans to take the leap.  Before this announcement today by Agata Fischer of Leica Geosystems, non-geotechnical professionals including surveyors were put off by GPR scans consisting of images with hyperbolas showing the reflections of RF waves from underground objects.  This left GPR pretty much to highly trained geotechnical and other underground engineering professionals.

As I discovered at a recent talk I gave to the Alberta Land Surveyors Association, there is a lot of interest among the surveying community in expanding their professional offerings beyond traditional above-ground surveys to include below-ground surveys.  Two years ago the one-button BLK360 opened up engineering-grade laser scanning to a much broader professional audience.  Leica Geosystems is betting that, with a similar focus on simplicity, the DSX device and DXplore software announced today are going to open GPR scanning to a much broader professional community, in particular surveyors.

Output from the DSX with DXplore software showing tomographic image of detected underground utilities

The key features of the DSX GPR device and DXplore software announced today are simplicity of use and interpretation, detection results that can be relied on, immediate 2D and 3D underground utility maps, and interoperability with CAD and other engineering products.  The DSX workflow is very simple: import available as-builts for the area as DXF or other CAD-format files, define a grid for the area to be surveyed, walk the grid with the DSX device with real-time feedback on the DSX's display, show the tomographic display (no hyperbolas!) of what was detected, manually identify suspected utilities on the display, let the software process the captured GPR images to confirm the utility pipe or cable, and export a 2D or 3D vector file in a CAD-compatible format.  Very simple.

But too simple for experienced GPR practitioners. I have talked to very experienced GPR people who don't find the DSX interesting for them.  They point out things they find lacking, such as tomographic images at different depths and scanning at more than one frequency.  But for people who don't have GPR experience, believe that underground surveying represents a significant business opportunity, and already use or know Leica Geosystems total stations or other survey equipment, this is a way to get their toes in the water.  Furthermore, I fully expect that Leica will not stop here but will continue to develop easy-to-use GPR software for detecting and mapping underground infrastructure, including support for the multi-channel Leica IDS GeoRadar arrays that are being applied to very sophisticated mapping of underground infrastructure.


Are there things that an owner, agency, or municipal government responsible for an airport, industrial campus, or town can do to reduce the risk of underground utility damage, the associated disruptions to operations and business, and the attendant danger for construction workers and the public?  At this year's Geo Business in London I had a chance to chat with Andy Rhoades, who was Head of Service Protection Knowledge & Information Management Technical Standards and Assurance at Heathrow Airport from 2001 to 2017.  During the first ten years of Andy's tenure at Heathrow the annual number of service strikes (underground utility damage) declined by a factor of six, and it has continued to decline even as the amount of construction has grown.

Airports are like cities except that they have a greater density of underground utilities and more types of underground equipment: communications cables, aeronautical ground lighting cables, gas mains, low voltage electric power cables, high voltage electric power cables, storm water mains, sanitary waste water mains, potable water mains, grey water mains, fuel mains, fire fighting water mains, and a variety of underground structures. In addition, striking a main in an airport carries with it a larger risk than in an urban area because of the incoming and outgoing aircraft and the large number of people in a concentrated area.  I have previously blogged about what Andy was doing at Heathrow based on his talk at Geospatial World Forum 2014.

Andy introduced a seven-step process, from early design to handover, for excavations at Heathrow that continues to be followed there.  One of the important features is the concept of exclusion zones.  Based on PAS 128 quality levels, it details what type of excavation can be safely used within 0 to 3 meters of a utility.  These include hand digging (foot pressure only), powered hand tools, vacuum or hydraulic extraction, and powered excavation.

Another requirement is that all people involved in detecting and mapping underground infrastructure must be trained.  Heathrow has helped to develop National Vocational Qualifications (NVQs) relevant to underground asset management. NVQs are work-based awards in England, Wales and Northern Ireland that are achieved through assessment and training. To achieve an NVQ, candidates must prove that they have the competence to carry out their job to the required standard. NVQs are based on National Occupational Standards that describe the 'competencies' expected in any given job. In addition, in the UK there is a formal structure, the Qualifications and Credit Framework (QCF).  Training levels that Andy mentions are QCF Utility Mapping Technician, QCF Level 3 Utility Mapping Surveyor, and QCF Level 5 Senior Utility Mapping Surveyor.

Asset damage compared to the amount of construction activity at Heathrow.  The graph also shows that the number of incidents halved between 2015 and 2016 when predictive algorithms were used.

Based on his experience at Heathrow, for his consulting practice Andy has produced two manuals that document his more than 15 years of experience at Heathrow and 9 years at Thames Water, London's water company.  Services and Buried Infrastructure Protection Standard is a recommended standard for owners of large numbers of underground assets such as airports, towns, universities, industrial campuses, and other facilities.  The second is a Client's Guide to Utility Survey that gives owners/clients a guide to what they should require from an underground utility survey.  Between them these manuals contain a wealth of practical information for owners/clients who want to reduce the risk of underground utility damage during excavation.  They also contain examples of what has happened when the guidelines were not followed, for example, an air traffic control tower losing communications when a backhoe operator decided to take out a big concrete slab which happened to contain the fiber cables carrying the airport's communications.  He also provides best practices for surveys in rural areas, suburban areas, busy urban areas, and congested city areas and airports.

One of the very interesting sections in the Guide covers different technologies for underground utility detection including GPR, EML (Cat and Genny), acoustic pipe locators, using a dye to trace drains, earth resistance, gyroscopic pipe locators, infrared imaging, magnetometry, metal detectors, RFID detectors, and vibro-acoustics.  There are some good rules of thumb, for example: underground utilities can be detected through concrete; utilities can't be detected with GPR through salty water (with electrolytes) but can be through fresh water (without electrolytes); and underground detection generally is not very accurate for depth without expert processing of the raw data.  The radio waves used in GPR travel at the speed of light through air, at roughly two thirds the speed of light in the ground (on average, depending on ground makeup), and at roughly one third the speed of light in water, so assuming the wrong velocity can result in serious errors in the reported depths of detected infrastructure.
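A short sketch of how the depth error arises: GPR reports depth as the wave velocity times half the two-way travel time, so assuming the wrong medium velocity scales every reported depth by the ratio of the velocities. The velocities below are the rough rules of thumb quoted above and the travel time is hypothetical, purely for illustration:

```python
# Illustration of GPR depth error when the assumed wave velocity is wrong.
# Velocities are the rough rules of thumb quoted above, not measured values.
C = 0.3                     # speed of light, metres per nanosecond

def reported_depth(two_way_time_ns: float, velocity_m_per_ns: float) -> float:
    """GPR depth estimate: velocity times one-way (half the two-way) travel time."""
    return velocity_m_per_ns * two_way_time_ns / 2

t = 20.0                                    # hypothetical two-way travel time, ns
true_depth = reported_depth(t, C * 2 / 3)   # target actually in average ground (~2/3 c)
wrong_depth = reported_depth(t, C / 3)      # processed assuming water-saturated ground (~1/3 c)

print(f"true ~{true_depth:.1f} m, reported ~{wrong_depth:.1f} m")
# true ~2.0 m, reported ~1.0 m -- a factor-of-two error from the velocity assumption alone
```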

He also discusses how the commonly used techniques GPR and EML can generate unreliable data, including ghost utilities.  EML in particular has problems when utilities are close to each other or cross each other.  The result is that the reported location can be off and, in the worst case, EML can report three utilities when in fact there are only two, or two where there is only one.  EML can also report an S-shaped warp in a cable where it crosses another utility.  He has a very detailed decision tree that can be followed to improve the confidence level in detected underground utilities by taking into account the frequencies used, ground conditions, depth, separation between utilities, type of utility material (plastic, metal, PVC, optical cable, ceramic), water content, temperature, and air content.

Airports are worst cases because of the number and density of underground utilities.  Recognizing that the reliability of underground utility detection and mapping is only about 70% when using EML alone, it is essential that anyone involved in owning, managing, locating, and mapping underground utilities, and in reducing the risk of utility damage during construction, follow procedures like those defined by Andy at Heathrow and now offered through his own company Buried Asset Protection Ltd (Andy_Rhoades@BAPC.uk.com).

Currently Andy is leading the update of the British quality standard for underground infrastructure PAS 128.


Back in 2016 Gartner predicted that by 2020 the largest electric power company would be an Uber-like behemoth that would not own assets but would simply manage energy suppliers and energy consumers in an open market.

At that time I blogged about three large Las Vegas casinos' plans to leave the NV Energy utility grid.  MGM Resorts left the grid in 2016 even though required to pay $86.9 million in exit fees to do so.  This trend has accelerated.  In 2018, 10 companies began efforts to leave Nevada’s utility monopoly, NV Energy.

Regulators in some jurisdictions have recognized that the traditional business model for electric power no longer works in this age of decentralized power generation.  New York's REV initiative and Texas' ERCOT deregulation initiative have become models for other states for modernizing power generation, transmission, distribution, and retail.  The central tenet of modern utility regulation is to restrict traditional utilities to owning and maintaining power transmission and distribution lines.  Matching power generation with consumption is handled through an open, unregulated transactive energy market.  Open markets tend to encourage decentralized, renewable energy generation.

In 1999, the Texas Legislature began restructuring the state’s electric industry which allowed consumers to begin choosing their retail electricity provider. As of Dec. 31, 2001, deregulation of the ERCOT electricity markets required investor-owned utilities to unbundle their operations. The provision of service to end-use retail customers became competitive. There are over 100 retail electricity providers of which more than a quarter offer a 100% renewable plan to their customers as an option. Since 2002, approximately 85% of commercial and industrial consumers have switched power providers at least once.

In Illinois in 1997 the Customer Choice Act mandated that state utilities could no longer own generation facilities. During the Mandatory Transition Period 1997–2007, utilities were required to sell their electricity generation assets to other energy companies. Utilities no longer sold power and became responsible for delivering electricity only. Since 2002 in the deregulated market, customers were permitted to buy electricity from competing retail electric providers. In 2006 the Retail Electric Competition Act encouraged residential and small business customers to switch to alternative electric providers.

In July 1999, Ohio restructured its energy market to give consumers choice with their energy provider. The law took effect on January 1, 2001. Customers could now choose to buy energy from retail electric suppliers instead of automatically receiving it from the utility company (primarily AEP Ohio, Dayton Power & Light, Duke Energy, and FirstEnergy) in their area. In May 2008, each incumbent utility was required to shed its power generation operations and become a purely electric distribution utility.

All of the big technology players (Google, Apple, Amazon, Facebook) are already indirectly disrupting the electric power generation industry with commitments to billions of dollars of renewable power generation, which have driven over $15 billion in investment in wind and solar power generation.

The latest battleground is Virginia, where Costco's and Walmart's requests to leave Dominion Energy's grid have been rejected by the Virginia State Corporation Commission (SCC) Division of Public Utility Regulation. Kroger, Harris Teeter, Target, Albertsons, and Cox Communications all currently have exit applications pending at the commission.

The Virginia Energy Reform Coalition (VERC) includes policy experts from across the ideological spectrum and aims at breaking the monopoly held by two utilities, Dominion Energy and Appalachian Power, in favour of a deregulated energy market.  VERC cites ERCOT as an inspiration for restricting utilities to owning transmission and distribution lines and no longer selling electric power.

I remember back in 2013 John Wellinghoff, then Chairman of FERC, saying that these changes in the electric power industry are driven by power consumers - industrial, commercial and consumer.  "People are going to continue to drive towards having these kinds of technologies available to them. And once that happens through the technologies and the entrepreneurial spirit we are seeing with these companies coming in, I just don't see how we can continue with the same model we have had for the last 100 or 150 years...You need whatever you do need - it may be a wind system providing power remotely, it may be a battery system for storage, it may be a local generation system, solar PV or natural gas, or a combined heat and power system. But whatever it is, it is what the consumer needs at the cost level that is appropriate and the reliability level they think is appropriate based upon their choices."

Consumers wanting to be in control of their energy, corporations wanting to reduce their power costs, the worldwide drive for more sustainable energy, and dropping wind and solar energy costs are all making the traditional business model unsustainable.  Regulators who, like those in New York and Texas, see the future as a transactive energy market with traditional utilities providing the infrastructure are prepared for that future.  Others are going to experience increasing pressure from consumers and corporations who want the freedom and flexibility of going off the grid.
