
ArcGIS API for Python


Feature Categorization using Satellite Imagery and Deep Learning

  • 🔬 Data Science
  • 🥠 Deep Learning and Feature Classifier

Introduction and methodology

This sample notebook demonstrates the use of deep learning capabilities in ArcGIS to perform feature categorization. Specifically, we are going to perform automated damage assessment of homes after the devastating Woolsey fires. This is a critical task in damage claim processing, and using deep learning can speed up the process and make it more efficient. The workflow consists of three major steps: (1) extract training data, (2) train a deep learning feature classifier model, and (3) make inferences using the model.

Figure 1. Feature classification example


Figure 2. Methodology

Part 1 - Export training data for deep learning

To export training data for feature categorization, we need two inputs:

  • An input raster that contains the spectral bands,
  • A feature class that defines the location (e.g. outline or bounding box) and label of each building.

Import ArcGIS API for Python and get connected to your GIS

In [1]:
from arcgis import GIS
import arcgis
from arcgis import learn
from arcgis.features.use_proximity import create_buffers
from arcgis.raster import analytics
from arcgis.features.analysis import join_features
from arcgis.learn import prepare_data, FeatureClassifier, classify_objects, Model, list_models
arcgis.env.verbose = True
In [3]:
gis = GIS(url='', username="arcgis_python", password="amazing_arcgis_123")

Prepare data that will be used for training data export

A building footprints feature layer will be used to define the location and label of each building.

In [3]:
building_footprints = gis.content.search('buildings_woolsey', item_type='Feature Layer Collection')[0]
buildings_woolsey (Feature Layer Collection) by api_data_owner
Last Modified: June 25, 2020
0 comments, 0 views

We will buffer the building footprints layer by 100 m using create_buffers. With a 100 m buffer, the exported training data will include the surroundings of each house, which helps the model learn the difference between damaged and undamaged houses.

In [4]:
building_buffer = create_buffers(building_footprints, distances=[100], units='Meters')
Feature Layer Collection by api_data_owner
Last Modified: June 25, 2020
0 comments, 2 views

Aerial imagery of West Malibu will be used as the input raster containing the spectral bands. This raster will be used for exporting the training data and for inferencing the results.

In [5]:
gis2 = GIS("")
In [6]:
input_raster = gis2.content.search("111318_USAA_W_Malibu")[0]
Map Image Layer by romeroma
Last Modified: August 30, 2019
0 comments, 171 views

Specify a folder name in the raster store that will be used to store our training data

In [7]:
ds = analytics.get_datastores(gis=gis)
<DatastoreManager for>
In [8]:
ds.search()
[<Datastore title:"/fileShares/ListDatastoreContent" type:"folder">,
 <Datastore title:"/rasterStores/RasterDataStore" type:"rasterStore">]
In [9]:
rasterstore = ds.get("/rasterStores/LocalRasterStore")
<Datastore title:"/rasterStores/LocalRasterStore" type:"rasterStore">
In [10]:
samplefolder = "feature_classifier_sample"
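
As a minimal sketch, the exported chips will land under this folder inside the raster store. The path-joining convention below is an illustrative assumption (the export tool is typically given just the folder name, with the raster store resolved separately):

```python
# Hypothetical illustration: combine the raster store path with the sample
# folder name to see where the training data will be written.
rasterstore_path = "/rasterStores/LocalRasterStore"
samplefolder = "feature_classifier_sample"
output_location = f"{rasterstore_path}/{samplefolder}"
print(output_location)  # → /rasterStores/LocalRasterStore/feature_classifier_sample
```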

Export training data using arcgis.learn

We are now ready to export training data using the export_training_data() method in the arcgis.learn module. In addition to the feature class, raster layer, and output folder, we also need to specify a few other parameters such as tile_size (size of the image chips), stride_size (distance to move each time when creating the next image chip), chip_format (TIFF, PNG, or JPEG), and metadata_format (how we are going to store those training labels). Note that unlike UNet and object detection, the metadata format is set to Labeled_Tiles here. More detail can be found here.
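
The parameters described above can be sketched as follows. The tile_size and stride_size values here are illustrative assumptions, not values from this notebook; metadata_format follows the text (Labeled_Tiles for feature classification):

```python
# Sketch of the export_training_data parameter set described above.
# tile_size and stride_size values are illustrative assumptions.
export_params = {
    "input_class_data": "building_buffer",   # buffered footprints layer
    "chip_format": "TIFF",                   # TIFF, PNG, or JPEG
    "tile_size": {"x": 400, "y": 400},       # size of each image chip
    "stride_size": {"x": 0, "y": 0},         # distance moved between chips
    "metadata_format": "Labeled_Tiles",      # one label per tile
    "output_location": "feature_classifier_sample",
}

# The actual call would pass the layer/item objects found earlier, e.g.:
# from arcgis.learn import export_training_data
# export_training_data(input_raster=input_raster.layers[0],
#                      **export_params, gis=gis)
```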

Depending on the size of your data, tile and stride size, and computing resources, this operation can take a while. In our experiments, this took 15 minutes to 2 hours. Also, do not re-run it if you have already run it once, unless you would like to update the settings.

We will export the training data for a small sub-region of our study area, while the whole study area will be used for inferencing. We will create a map widget, zoom in to the western corner of our study area, and get the extent of the zoomed-in map. We will use this extent in the export training data function.

In [11]:
# add the building_buffer layer in the webmap
m1 = gis.map('Malibu')
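
Once zoomed in, the widget's extent property supplies the bounding box for the export. A minimal sketch of how that extent feeds the export call's context parameter (the coordinate values below are placeholders for illustration, not real coordinates from this notebook):

```python
# In the notebook this dict would come from the widget: extent = m1.extent
# The numbers here are placeholder values standing in for a zoomed-in view.
extent = {
    "spatialReference": {"latestWkid": 3857, "wkid": 102100},
    "xmin": -13189000.0, "ymin": 4050000.0,
    "xmax": -13186000.0, "ymax": 4052000.0,
}

# The extent is then passed to export_training_data via the context parameter:
context = {"extent": extent}
```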