Feature Categorization using Satellite Imagery and Deep Learning¶
Introduction¶
This guide demonstrates the use of deep learning capabilities in ArcGIS to perform feature categorization. Specifically, we will perform automated damage assessment of homes after the devastating Woolsey Fire. This is a critical task in damage claim processing, and deep learning can make it faster and more efficient. The workflow consists of three major steps: (1) export training data, (2) train a deep learning feature classifier model, and (3) run inference using the trained model.
Part 1 - export training data for deep learning¶
To export training data for feature categorization, we need two inputs:
- An input raster that contains the spectral bands,
- A feature class that defines the location (e.g., outline or bounding box) and label of each building.
Import ArcGIS API for Python and get connected to your GIS¶
from arcgis.gis import GIS
gis = GIS("home")
Prepare data that will be used for training data export¶
First, let's get the feature class that defines the location and label of each building.
building_label = gis.content.search("damage_labelselection_Buffer_100", item_type='Feature Layer Collection')[0]
building_label
Now let's get the input raster that contains the spectral bands.
gis2 = GIS("https://ivt.maps.arcgis.com")
input_raster_layer = gis2.content.search("111318_USAA_W_Malibu")[0]
input_raster_layer
Specify a folder name in the raster store that will be used to store our training data¶
from arcgis.raster import analytics
ds = analytics.get_datastores(gis=gis)
ds
ds.search()
rasterstore = ds.get("/rasterStores/LocalRasterStore")
rasterstore
samplefolder = "feature_classifier_sample"
samplefolder
Export training data using arcgis.learn¶
We are now ready to export training data using the export_training_data() method in the arcgis.learn module. In addition to the feature class, raster layer, and output folder, we also need to specify a few other parameters, such as tile_size (size of the image chips), stride_size (distance to move each time when creating the next image chip), chip_format (TIFF, PNG, or JPEG), and metadata_format (how the training labels are stored). Note that, unlike U-Net and object detection workflows, the metadata format here is set to Labeled_Tiles. More detail can be found in the arcgis.learn API reference.
Depending on the size of your data, the tile and stride sizes, and your computing resources, this operation can take a while. In our experiments, it took between 15 minutes and 2 hours. Also, do not re-run it once it has completed, unless you would like to change the settings.
import arcgis
from arcgis import learn
arcgis.env.verbose = True
export = learn.export_training_data(input_raster=input_raster_layer,
                                    output_location=samplefolder,
                                    input_class_data=building_label,
                                    classvalue_field="classValue",
                                    chip_format="PNG",
                                    tile_size={"x": 600, "y": 600},
                                    stride_size={"x": 0, "y": 0},
                                    metadata_format="Labeled_Tiles",
                                    context={"startIndex": 0, "exportAllTiles": False, "cellSize": 0.1},
                                    gis=gis)
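To build intuition for the tile-size parameters: with the Labeled_Tiles metadata format, each labeled feature yields one image chip centered on that feature, and the chip's footprint in map units is tile_size times cellSize, here 600 px × 0.1 m = 60 m per side. A minimal illustrative sketch (not part of the ArcGIS API; the helper function and centroid coordinates are hypothetical):

```python
# Illustrative sketch only -- not the ArcGIS implementation.
# With Labeled_Tiles, each labeled feature produces one image chip
# centered on that feature. The chip footprint in map units is
# tile_size (pixels) * cellSize (map units per pixel).

def chip_extent(centroid, tile_size=600, cell_size=0.1):
    """Return (xmin, ymin, xmax, ymax) of the chip around a centroid."""
    half = tile_size * cell_size / 2  # 600 px * 0.1 m / 2 = 30 m
    x, y = centroid
    return (x - half, y - half, x + half, y + half)

# A hypothetical building centroid in projected coordinates (metres):
print(chip_extent((300000.0, 3770000.0)))
# -> (299970.0, 3769970.0, 300030.0, 3770030.0)
```

This is also why the cellSize of 0.1 in the context parameter matters: it fixes the ground resolution of each 600-pixel chip.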
Part 2 - model training¶
If you've already completed Part 1, you should have the training chips. Please change the path below to your own exported training data folder, which contains the "images" and "labels" folders.
from arcgis.learn import prepare_data, FeatureClassifier
data_path = r'to_your_data_folder'
data = prepare_data(data_path, {1:'Damaged', 0:'Undamaged'}, chip_size=600, batch_size=16)
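The dictionary passed to prepare_data maps the numeric class values written during export (from the classValue field) to readable class names, and chip_size matches the 600-pixel tiles exported in Part 1. A small sketch of how that mapping reads, independent of the ArcGIS API:

```python
# The second argument to prepare_data maps numeric class values
# (from the "classValue" field used during export) to class names.
class_mapping = {1: 'Damaged', 0: 'Undamaged'}

# A chip whose exported metadata carries class value 0 is treated as:
print(class_mapping[0])
# -> Undamaged
```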
Visualize training data¶
To get a sense of what the training data looks like, the show_batch() method randomly picks a few training chips and visualizes them.
data.show_batch()