
Classify Objects Using Deep Learning


Classify Objects Using Deep Learning diagram

The ClassifyObjectsUsingDeepLearning task classifies objects based on overlaid imagery data using the designated deep learning model and generates a feature service with a newly assigned label for each object.

As of 10.8, you must license your ArcGIS Server as an ArcGIS Image Server to use this resource.

Request parameters



inputRaster

The portal item ID, image service URL, cloud raster dataset, or shared raster dataset that will be classified. At least one type of input must be provided in the JSON object. If multiple inputs are given, the itemId takes priority.

Syntax: A JSON object that describes the input raster.

inputRaster={"itemId": <portal item id>}

inputRaster={"url": <image service url>}

inputRaster={"uri": <cloud raster uri or shared data path>}

inputRaster={"serviceProperties":{"name":"testrasteranalysis","serviceUrl":"https://<server name>/server/rest/services/Hosted/testrasteranalysis/ImageServer"},"itemProperties":{"itemId":"8cfbd3ec25584d0d8f4ed23b8ff7c43b", "folderId":"sdfwerfbd3ec25584d0d8f4"}}



inputFeatures

The feature service layer that contains the points, polylines, or polygons that identify the location of each object to be classified and labeled. The layer index must be included in the feature service URL.

Syntax: A JSON object that describes the input feature service layer.

inputFeatures={"url": <feature service url>}

inputFeatures={"uri": <shared data path>}

inputFeatures={"serviceProperties":{"name":"testrasteranalysis","serviceUrl":"https://<server name>/server/rest/services/Hosted/testrasteranalysis/FeatureServer/0"},"itemProperties":{"itemId":"8cfbd3ec25584d0d8f4ed23b8ff7c43b", "folderId":"sdfwerfbd3ec25584d0d8f4"}}


Example:

inputFeatures={"url": "https://myserver/arcgis/rest/services/Hosted/testrasteranalysis/FeatureServer/0"}



outputFeatureClass

The output hosted feature service properties. If the hosted feature service already exists, the portal item ID or service URL can be given to the service tool, and the output path of the generated feature class will be used to update the existing service definition. The service tool can also generate a new hosted feature service with the given service properties.

The output hosted feature service is stored and shared on the hosting server.

outputFeatureClass={"itemId": <portal item id>}

outputFeatureClass={"url": <hosted feature service url>}

outputFeatureClass={"uri": <feature class local output path>}

outputFeatureClass={"serviceProperties":{"name":"testrasteranalysis","serviceUrl":"https://<server name>/server/rest/services/Hosted/testrasteranalysis/FeatureServer"},"itemProperties":{"itemId":"8cfbd3ec25584d0d8fed23b8ff7c43b", "folderId":"sdfwerfbd3ec25584d0d8f4"}}



model

The input model can be a deep learning model package (.dlpk) item uploaded to your portal, an .emd file, or the entire JSON string of the model definition.

Example for portal item:

model={"itemId": "x2u130909jcvojzkeeraedf"}
model={"url": "https://<portal name>/portal/sharing/rest/content/items/x2u130909jcvojzkeeraedf"}

Example for .emd file:

model={"uri": "\\\\sharedstorage\\sharefolder\\ClassifyHouseDamage.emd"}

Example for JSON:

model={
  "Framework": "Keras",
  "ModelConfiguration": {"Name": "KerasClassifier"},
  "ModelFile": "Damage_Classification_Model.h5",
  "ModelType": "ObjectClassification",
  "ImageHeight": 256,
  "ImageWidth": 256,
  "ExtractBands": [0,1,2],
  "CropSizeFixed": 1,
  "BlackenAroundFeature": 1,
  "ImageSpaceUsed": "MAP_SPACE",
  "Classes": [
    {
      "Value": 0,
      "Name": "undamaged",
      "Color": [255, 255, 0]
    },
    {
      "Value": 1,
      "Name": "damaged",
      "Color": [0, 255, 255]
    }
  ]
}

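When the full model definition is passed inline, the JSON above becomes the value of the model parameter. The following Python sketch builds that string from an in-memory dictionary mirroring the Keras example above (the definition itself is taken verbatim from the example):

```python
import json

# EMD-style model definition, mirroring the Keras example above.
model_definition = {
    "Framework": "Keras",
    "ModelConfiguration": {"Name": "KerasClassifier"},
    "ModelFile": "Damage_Classification_Model.h5",
    "ModelType": "ObjectClassification",
    "ImageHeight": 256,
    "ImageWidth": 256,
    "ExtractBands": [0, 1, 2],
    "CropSizeFixed": 1,
    "BlackenAroundFeature": 1,
    "ImageSpaceUsed": "MAP_SPACE",
    "Classes": [
        {"Value": 0, "Name": "undamaged", "Color": [255, 255, 0]},
        {"Value": 1, "Name": "damaged", "Color": [0, 255, 255]},
    ],
}

# The request parameter value is the serialized JSON string, i.e. model=<this string>.
model_param = json.dumps(model_definition)
```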



modelArguments

The name-value pairs of arguments that can be customized by the clients.


modelArguments={"name1": "value1", "name2": "value2"}



classLabelField

The name of the field that will contain the classification label in the output feature service.

Example:

classLabelField=ClassLabel





processAllRasterItems

Specifies how raster items in an image service will be processed.

  • true—All raster items in the image service will be processed as separate images.
  • false—All raster items in the image service will be mosaicked together and processed. This is the default.





context

Environment settings that affect task execution. This task has four settings:

  • Extent (extent)—A bounding box that defines the analysis area.
  • Cell Size (cellSize)—The output raster will have the resolution specified by cell size.
  • Processor Type (processorType)—The specified processor (CPU or GPU) will be used for the analysis.
  • Parallel Processing Factor (parallelProcessingFactor)—The specified number or percentage of processes will be used for the analysis.


context={"cellSize": "20", "processorType": "CPU", "parallelProcessingFactor": "4"}


f

The response format. The default response format is html.

Values: html | json

Example usage

The following is a sample request URL for ClassifyObjectsUsingDeepLearning:
inputRaster={"url":""}&inputFeatures={"url":""}&outputFeatureClass={"serviceProperties":{"name":"test10210453"}}&model={"itemId": "d8d3902b41854529a907ad9f42af5a06"}&modelArguments={"padding": "0", "batch_size": "16"}&classLabelField=ClassLabel&processAllRasterItems=false&context={"extent": {"xmin": -13160539.4563053,"ymin": 3998752.62631951,"xmax": -13160427.5538234,"ymax": 3998824.51069532,"spatialReference": {"wkid": 3857}}, "processorType": "CPU", "parallelProcessingFactor": 2}&f=json

The following is a sample POST request for ClassifyObjectsUsingDeepLearning:

POST /webadaptor/rest/services/System/RasterAnalysisTools/GPServer/ClassifyObjectsUsingDeepLearning HTTP/1.1
Content-Type: application/x-www-form-urlencoded
Content-Length: []

inputRaster={"url":""}&inputFeatures={"url":""}&outputFeatureClass={"serviceProperties": {"name":"test10210453"}}&model={"itemId": "d8d3902b41854529a907ad9f42af5a06"}&modelArguments={"padding": "0", "batch_size": "16"}&classLabelField=ClassLabel&processAllRasterItems=false&context={"extent": {"xmin": -13160539.4563053,"ymin": 3998752.62631951,"xmax": -13160427.5538234,"ymax": 3998824.51069532,"spatialReference": {"wkid": 3857}},"processorType": "CPU", "parallelProcessingFactor": 2}&f=json

Both of the requests above use the following parameters and values:

model={"itemId": "d8d3902b41854529a907ad9f42af5a06"}
modelArguments={"padding": "0", "batch_size": "16"}
context={"extent": {"xmin": -13160539.4563053,"ymin": 3998752.62631951,"xmax": -13160427.5538234,"ymax": 3998824.51069532,"spatialReference": {"wkid": 3857}}, "processorType": "CPU", "parallelProcessingFactor": 2}
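These parameters can also be assembled programmatically before submission. The following Python sketch builds the form-encoded body shown in the sample requests; the input URLs are left empty as in the sample, and the endpoint in the closing comment is a placeholder:

```python
import json

def build_classify_params(model_item_id, service_name, extent):
    """Assemble the form parameters shown above; complex values are JSON strings."""
    return {
        "inputRaster": json.dumps({"url": ""}),    # input URL elided in the sample
        "inputFeatures": json.dumps({"url": ""}),  # input URL elided in the sample
        "outputFeatureClass": json.dumps(
            {"serviceProperties": {"name": service_name}}
        ),
        "model": json.dumps({"itemId": model_item_id}),
        "modelArguments": json.dumps({"padding": "0", "batch_size": "16"}),
        "classLabelField": "ClassLabel",
        "processAllRasterItems": "false",
        "context": json.dumps({
            "extent": extent,
            "processorType": "CPU",
            "parallelProcessingFactor": 2,
        }),
        "f": "json",
    }

extent = {
    "xmin": -13160539.4563053, "ymin": 3998752.62631951,
    "xmax": -13160427.5538234, "ymax": 3998824.51069532,
    "spatialReference": {"wkid": 3857},
}
params = build_classify_params("d8d3902b41854529a907ad9f42af5a06",
                               "test10210453", extent)
# The dict can then be form-encoded and POSTed to the task endpoint,
# e.g. with the requests library: requests.post(task_url, data=params)
```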


Response

When you submit a request, the task assigns a unique job ID to the transaction.

Syntax:


{ "jobId": "<unique job identifier>", "jobStatus": "<job status>" }

After the initial request is submitted, you can use the jobId to periodically check the status of the job and its messages, as described in Checking job status. Once the job has successfully completed, use the jobId to retrieve the results. To track the status, you can make a request of the following form:

https://<raster analysis tools url>/ClassifyObjectsUsingDeepLearning/jobs/<jobId>

When the status of the job request is esriJobSucceeded, you can access the results of the analysis by making a request of the following form:

https://<raster analysis tools url>/ClassifyObjectsUsingDeepLearning/jobs/<jobId>/results/outObjects
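The status-checking step above amounts to polling the job resource until it reaches a terminal state. A minimal Python sketch (the task URL is a placeholder, and error handling is omitted):

```python
import json
import time
import urllib.request

# Terminal states for an ArcGIS GPServer asynchronous job.
TERMINAL_STATES = {"esriJobSucceeded", "esriJobFailed",
                   "esriJobCancelled", "esriJobTimedOut"}

def is_terminal(job_status):
    """Return True once the job has finished, successfully or not."""
    return job_status in TERMINAL_STATES

def poll_job(task_url, job_id, interval=10):
    """Poll <task_url>/jobs/<job_id> until the job reaches a terminal state.
    task_url stands in for the ClassifyObjectsUsingDeepLearning task URL."""
    while True:
        with urllib.request.urlopen(f"{task_url}/jobs/{job_id}?f=json") as resp:
            status = json.load(resp)
        if is_terminal(status.get("jobStatus", "")):
            return status
        time.sleep(interval)
```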
JSON Response example

The response returns the outObjects output parameter, which has properties for parameter name, data type, and value. The content of value is the output feature layer itemId and the feature service URL.

{
  "paramName": "outObjects",
  "dataType": "GPFeatureRecordSetLayer",
  "value": {
    "itemId": "f121390b85ef419790479fc75b493efd",
    "url": "https://<server name>/arcgis/rest/services/Hosted/<service name>/FeatureServer"
  }
}
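A client can read the two fields out of this response as follows; a minimal Python sketch, where the URL is a placeholder standing in for the templated value above:

```python
import json

# Sample response body; the url is a placeholder for the templated value above.
response_text = '''{
  "paramName": "outObjects",
  "dataType": "GPFeatureRecordSetLayer",
  "value": {
    "itemId": "f121390b85ef419790479fc75b493efd",
    "url": "https://example.org/arcgis/rest/services/Hosted/objects/FeatureServer"
  }
}'''

result = json.loads(response_text)
item_id = result["value"]["itemId"]    # portal item ID of the output feature layer
layer_url = result["value"]["url"]     # hosted feature service URL
```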