Detection of electric utility features and vegetation encroachments from satellite images using deep learning

  • 🔬 Data Science
  • 🥠 Deep Learning and Object Detection

Introduction

This sample notebook demonstrates how to efficiently map electric utility features and trees in satellite imagery and identify possible locations of vegetation encroachment. Satellite imagery combined with machine learning enables cost-effective management of electric grids. The workflow consists of four major operations:

  • Building and extracting training data for electric utility and trees using ArcGIS Pro
  • Training a deep learning model i.e. RetinaNet using arcgis.learn
  • Model inferencing at scale using ArcGIS Pro
  • Proximity analysis between detected objects (electric utility and trees) feature layers using ArcGIS Pro

Example of electric utility object detection, i.e. transmission towers, distribution towers, and sub-stations

Necessary imports and connecting to your GIS

import arcgis
from arcgis import GIS
from arcgis.learn import RetinaNet, prepare_data
gis = GIS("home")

Part 1 - Train the model and detect electric utility features

An electric utility feature layer consisting of manually labelled features will be used to define the location and label of each feature.

electric_train_data = gis.content.get('8e703649c43041c1bb4985b16788aa44')
electric_train_data
ElectricUtility_data
Data for detection of electric utility Feature Layer Collection by api_data_owner
Last Modified: August 21, 2020
0 comments, 10 views

Export electric utility training data for deep learning

Training samples for electric utilities were manually labelled for Hartford, Connecticut. The training data consists of three classes, i.e. transmission towers, distribution towers, and sub-stations.

Training data can be exported using the Export Training Data For Deep Learning tool, available in ArcGIS Pro and ArcGIS Image Server:

  • Input Raster: Imagery
  • Input Feature Class or Classified Raster: feature layer with labelled polygon
  • Class Value Field: field in the attributes containing class
  • Tile Size X & Tile Size Y: 256
  • Stride X & Stride Y: 128
  • Reference System: Map space
  • Meta Data Format: Pascal VOC (Visual Object Class)
  • Environments: Set optimum Cell Size, Processing Extent
arcpy.ia.ExportTrainingDataForDeepLearning("Imagery", r"C:\ElectricUtility_deepLearn\Train_chip_lablel_data", "Training_electricObjects", "TIFF", 256, 256, 128, 128, "ONLY_TILES_WITH_FEATURES", "PASCAL_VOC_rectangles", 0, "Classvalue", 0, None, 0)

After filling in all the details and running the Export Training Data For Deep Learning tool, code like the above is generated and executed. This creates the training data, i.e. image chips (.tif) and labels (.xml), along with the necessary supporting files in the specified folder, ready to be used in the upcoming steps.
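As a quick sanity check, the contents of the exported folder can be inspected from Python. This is only a sketch: it assumes the output location used in the snippet above and the images and labels sub-folders that the Pascal VOC metadata format typically produces.

## Count exported image chips and Pascal VOC label files (adjust the path to your export folder)
from pathlib import Path

export_dir = Path(r"C:\ElectricUtility_deepLearn\Train_chip_lablel_data")
image_chips = list((export_dir / "images").glob("*.tif"))
label_files = list((export_dir / "labels").glob("*.xml"))
print(f"{len(image_chips)} image chips, {len(label_files)} label files")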

Train the model

We will select and train a model using the arcgis.learn module in the ArcGIS API for Python, which provides the deep learning tools and capabilities needed for this study. As electric utility features are generally small and varied in appearance, the RetinaNet model is used: it is one of the best one-stage object detection models and works especially well on small, dense objects.
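For context, a key reason RetinaNet copes well with small, dense objects is its focal loss [2], which down-weights easy, well-classified (mostly background) examples so that training focuses on the hard ones. In the usual notation, with p_t the predicted probability of the true class, alpha_t a class-balancing weight, and gamma >= 0 the focusing parameter:

$$FL(p_t) = -\alpha_t \, (1 - p_t)^{\gamma} \log(p_t)$$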

Prepare data

We will specify the path to our training data and a few hyperparameters. prepare_data() also applies transformations and data augmentations to the training data, which enables us to train a better model with a limited dataset.

  • path: path of the folder containing the training data.
  • class_mapping: allows specifying text labels for the classes.
  • batch_size: number of images the model trains on in each step within an epoch; it depends directly on the memory of your graphics card. A batch size of 20 worked for us on an 11 GB GPU.

This function returns a fast.ai databunch, which will be used further to train the model.

training_data = gis.content.get('01ca39eab00e4780b0517af11d971b31')
training_data
detection_of_electric_utility_features_and_vegetation_encroachments_from_satellite_images_using_deep_learning
Image Collection by api_data_owner
Last Modified: May 26, 2021
0 comments, 0 views
import os
import zipfile
from pathlib import Path

filepath = training_data.download(file_name=training_data.name)
with zipfile.ZipFile(filepath, 'r') as zip_ref:
    zip_ref.extractall(Path(filepath).parent)
data_path = Path(os.path.splitext(filepath)[0])
## Load the Data ##
data = prepare_data(data_path, class_mapping = {1:'Dist_tower',2:'Trans_tower',3:'Station'}, batch_size=8)
## Check the classes in the loaded data ##
data.classes
['background', 'Dist_tower', 'Station', 'Trans_tower']

Visualize a few samples from your training data

  • rows: number of rows we want to see in the results
## Visualize random training samples from the data ##
data.show_batch(rows = 3, alpha=1)
<Figure size 864x864 with 9 Axes>

The imagery chips have bounding boxes marked out for each electric utility feature type.

Load RetinaNet model architecture

The code below instantiates a RetinaNet model with a pre-trained ResNet backbone (other supported backbones can also be specified). The model returns class labels and bounding boxes for the detected electric utility objects in the imagery.

## Load the model architecture with the default ResNet backbone (resnet50)
retinanet_model = RetinaNet(data)

Tuning for optimal learning rate

The learning rate is one of the most important hyperparameters in model training: a value that is too high or too low may prevent the model from converging or learning at all. arcgis.learn leverages fast.ai’s learning rate finder to find an optimum learning rate for training models. We can use the lr_find() method to find a learning rate at which we can train a robust model reasonably fast.

## Tune the learning rate

lr = retinanet_model.lr_find()
print(lr)
<Figure size 432x288 with 1 Axes>
0.0005754399373371565

Based on the learning rate finder, lr = 0.0005754399373371565, which we can pass to fit() to train our model. If no learning rate is specified, fit() internally runs the learning rate finder and uses the suggested value.
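As a minimal illustration of that second option, fit() can also be called without the lr argument, in which case it runs the learning rate finder internally (this is an alternative to the call in the next section, not an additional step):

## Alternative: let fit() pick the learning rate internally
# retinanet_model.fit(epochs=20)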

Fit the model on the data

To train the model, we use the fit() method. To start, we will use 20 epochs to train our model. Epoch defines how many times the model is exposed to the entire training set.

retinanet_model.fit(epochs=20, lr=lr)
epoch  train_loss  valid_loss  time
0      0.846465    1.182061    00:26
1      0.827709    1.187886    00:28
2      0.789538    1.291654    00:29
3      0.807579    1.748760    00:28
4      0.773127    1.466059    00:28
5      0.808186    1.484374    00:28
6      0.771362    1.185080    00:28
7      0.779812    1.150587    00:30
8      0.764523    1.531075    00:29
9      0.755624    1.369292    00:30
10     0.726151    1.072011    00:28
11     0.711415    1.330717    00:31
12     0.673898    0.989091    00:29
13     0.651246    1.023157    00:31
14     0.631041    0.999267    00:31
15     0.610763    0.904418    00:30
16     0.603650    0.917977    00:31
17     0.580327    0.935522    00:29
18     0.550418    0.920154    00:31
19     0.551397    0.905265    00:29

The data was split into training and validation sets in the prepare_data step. fit() starts the training process and reports the losses on both sets. The losses help in assessing the generalizing capability of the model and in guarding against overfitting. Once a considerable decrease in the losses is observed, the model can be saved for further training or inference.
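Beyond the loss curves, the per-class average precision on the validation set gives a more interpretable measure of detection quality. A minimal sketch using the average_precision_score() method available on arcgis.learn detection models (the detection threshold here is illustrative):

## Per-class average precision on the validation set (illustrative threshold)
retinanet_model.average_precision_score(detect_thresh=0.35, mean=False)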

Unfreeze and fine tuning (optional)

Frozen backbone layers pretrained on ImageNet help accelerate the training of the network. Unfreezing these layers allows the complete network architecture to be fine-tuned on our own data, leading to better generalization.

This step is optional; it can be executed as required, or we can continue directly with saving the electric utility detection model.

Unfreeze model

retinanet_model.unfreeze()

Optimal learning rate

## Find the optimal learning rate for the model with the unfrozen backbone
lr = retinanet_model.lr_find()
<Figure size 432x288 with 1 Axes>

Fit model on the data

## Fine-tune for around 10 epochs
retinanet_model.fit(epochs=10,lr=lr)
epoch  train_loss  valid_loss  time
0      0.488662    0.860779    00:28
1      0.473957    0.880009    00:29
2      0.463113    0.829445    00:29
3      0.475784    0.821723    00:30
4      0.472733    0.803522    00:32
5      0.464535    0.758372    00:31
6      0.466740    0.747868    00:31
7      0.445377    0.724598    00:31
8      0.420540    0.710459    00:33
9      0.410309    0.708189    00:32

Save the electric utility detection model

We will save the trained model as a 'Deep Learning Package' ('.dlpk' format), the standard format used to deploy deep learning models on the ArcGIS platform. We will use the save() method to save the trained model. By default, it is saved to the 'models' sub-folder within our training data folder.

retinanet_model.save("Retinanet_electric_model_e20")

Load an intermediate model to train it further

To retrain a saved model, we can load it again using the code below and train it further.

retinanet_model.load("Retinanet_model.pth")

Visualize results in validation set

It is a good practice to compare the results of the model vis-à-vis the ground truth. The code below picks random samples and shows the ground truth and model predictions side by side. This enables us to preview the results of the model within the notebook.

retinanet_model.show_results(rows=8, thresh=0.35)
<Figure size 576x2304 with 16 Axes>

Part 2 - Train the model and detect trees

The same workflow, from exporting training data to detecting objects, is followed for tree detection. After training, the model is saved for inference in the next step or for further training.

Export trees training data for deep learning

Training samples for trees were manually labelled for Hartford, Connecticut. The training data consists of a single class, i.e. trees.

Training data can be exported using the Export Training Data For Deep Learning tool, available in ArcGIS Pro and ArcGIS Image Server:

  • Input Raster: Imagery
  • Input Feature Class or Classified Raster: feature layer with labelled polygon
  • Class Value Field: field in the attributes containing class i.e tree
  • Tile Size X & Tile Size Y: 256
  • Stride X & Stride Y: 128
  • Reference System: Map space
  • Meta Data Format: Pascal VOC (Visual Object Class)
  • Environments: Set optimum Cell Size, Processing Extent
electric_train_data = gis.content.get('81ef094891e042ccb7f0742b34805f25')
electric_train_data
ElectricUtility_trees_data
Data for detection of tree for electric utility sample notebook Feature Layer Collection by api_data_owner
Last Modified: August 21, 2020
0 comments, 1 views

Prepare data

## Point this path to the folder containing the exported tree training chips
## (assumed here to be within the extracted download; adjust if your tree data lives elsewhere)
tree_data_path = Path(os.path.splitext(filepath)[0])
tree_data = prepare_data(tree_data_path, batch_size=4)

Visualize a few samples from your trees training data

tree_data.show_batch(rows = 2)
<Figure size 576x576 with 4 Axes>

Load RetinaNet model

## Load the model architecture with resnet152 backbone 

retinanet_tree_model = RetinaNet(tree_data, backbone='resnet152')

Finding optimal learning rate

lr = retinanet_tree_model.lr_find()
print(lr)
<Figure size 432x288 with 1 Axes>
0.0004786300923226385

Fit the model on the tree data

retinanet_tree_model.fit(epochs=10, lr=lr, checkpoint=False)
epoch  train_loss  valid_loss  time
0      0.798556    0.989334    02:28
1      0.639205    0.868663    04:55
2      0.577479    0.926313    02:56
3      0.552374    0.747742    02:41
4      0.559570    0.700519    02:48
5      0.451596    0.721531    02:40
6      0.498987    0.642637    02:29
7      0.447196    0.810380    02:26
8      0.421564    0.712664    02:27
9      0.429169    0.698360    02:26

Visualize results in validation tree dataset

retinanet_tree_model.show_results(rows=8, thresh=0.3)
<Figure size 576x2304 with 16 Axes>

Save the tree detection model

retinanet_tree_model.save("Retinanet_tree_model_e10")

Part 3 - Deploy model and detect electric utility features & trees at scale

We will use the saved models to detect objects with the 'Detect Objects Using Deep Learning' tool, available in both ArcGIS Pro and ArcGIS Enterprise. For this sample, we will use high-resolution satellite imagery to detect electric utility features. Detection is run with both models, i.e. the electric utility model and the tree model, producing two separate feature layers.

  • Input Raster : Imagery
  • Output Detect Objects : Detected_Results
  • Model Definition : Retinanet_electric_model_e20.emd or Retinanet_tree_model_e10.emd
  • padding : The 'Input Raster' is tiled, and the deep learning model runs on each tile separately before producing the final 'detected objects feature class'. This may lead to unwanted artifacts along the edges of each tile, as the model has little context there to detect objects accurately. Padding, as the name suggests, supplies some extra information along the tile edges, which helps the model predict better.
  • threshold : 0.5
  • nms_overlap : 0.1
  • Cell Size : Should be close to the value with which we trained the model; we specified that at the Export Training Data step.

arcpy.ia.DetectObjectsUsingDeepLearning(in_raster="Imagery", out_detected_objects=r"DetectedObjects", in_model_definition=r"\\models\Retinanet_model_e400\Retinanet_model_e400.emd", arguments="padding 56;batch_size 4;threshold 0.5", run_nms="NMS", confidence_score_field="Confidence", class_value_field="Class", max_overlap_ratio=0.1, processing_mode="PROCESS_AS_MOSAICKED_IMAGE")

Detect Objects Using Deep Learning returns a feature class that can be further refined using a definition query and the Non Maximum Suppression tool.
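For instance, low-confidence detections can be screened out with an attribute selection on the Confidence field, the scripted equivalent of a definition query on the layer; the layer name and threshold below are only illustrative:

## Keep only detections above an illustrative confidence value
arcpy.management.SelectLayerByAttribute("DetectedObjects", "NEW_SELECTION", "Confidence > 60")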

Detected electric utility feature layer

Detected trees feature layer

Part 4 - Near analysis to find possible vegetation encroachment near electric utility features

After model inference on the imagery, the detected objects, i.e. electric utility features and trees, are saved as separate feature layers. The Near analysis tool in ArcGIS Pro is used to calculate the distance and additional proximity information between the input features (electric utility) and the closest feature in another layer or feature class (trees).

  • Input Features : feature layer from detect object tool for electric utility
  • Near Features : feature layer from detect object tool for trees
  • Search radius : required distance or range of search
  • Location : check location parameter checkbox

Here, the tool finds locations where trees are within 5 m of electric utility features, flagging possible vegetation-related outages. The input features gain two additional attributes, the x (NEAR_X) and y (NEAR_Y) coordinates of the closest location on the near feature.
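The equivalent geoprocessing call would look roughly like the sketch below; the layer names are placeholders for the two detection outputs, and the 5 m search radius matches the analysis described above:

## Near analysis: distance from each detected utility feature to the closest detected tree
arcpy.analysis.Near(in_features="Detected_ElectricUtility",
                    near_features="Detected_Trees",
                    search_radius="5 Meters",
                    location="LOCATION")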

Detected objects, i.e. electric utility features and trees, with markers representing the proximity of trees to utility installations

The green and red bounding boxes are trees and electric utility features, respectively. The red markers show electric utility objects within 5 m of a tree, i.e. possible locations of vegetation-related outages, while the yellow markers show objects at a safe distance. We have published the outputs from this sample as a hosted feature layer.

Conclusion

The models available with arcgis.learn were able to detect and map electric utility features at scale in the imagery. Further training the models with larger and better datasets can improve the results at a country scale.

The overlay of information from this workflow can assist the electric utility industry in cost-effective and efficient management of the electric grid. Data science can help us derive insights from data, but communicating those insights is perhaps just as important, if not more so. We used ArcGIS Pro to publish the results as a hosted feature layer, which can be viewed in a web map.

Web-map of detected objects with encroachment grid locations

References

[1] Tsung-Yi Lin, Piotr Dollár, Ross Girshick, Kaiming He, Bharath Hariharan: “Feature Pyramid Networks for Object Detection”, 2016; arXiv:1612.03144, http://arxiv.org/abs/1612.03144.

[2] Tsung-Yi Lin, Priya Goyal, Ross Girshick, Kaiming He: “Focal Loss for Dense Object Detection”, 2017; arXiv:1708.02002, http://arxiv.org/abs/1708.02002.
