
Classify Pixels Using Deep Learning

Description

Classify Pixels Using Deep Learning diagram

The ClassifyPixelsUsingDeepLearning operation classifies pixels in imagery using a designated deep learning model and generates an image service for the classified raster.

Request parameters

inputRaster

(Required)

The image that will be classified. This can be specified as the portal item ID, image service URL, cloud raster dataset, shared raster dataset, or raster dataset or image collection in the data store. At least one type of input must be provided in the JSON object. If multiple inputs are provided, the itemId value takes priority.

Syntax: A JSON object that describes the input raster.

Example:

//Portal Item ID
inputRaster={"itemId": <portal item id>}

//Image Service URL
inputRaster={"url": <image service url>}

//Cloud raster or shared data path URI
inputRaster={"uri": <cloud raster uri or shared data path>}

//Service Properties
inputRaster={"serviceProperties":{"name":"testrasteranalysis",
  "serviceUrl":"https://<server name>/server/rest/services/Hosted/testrasteranalysis/ImageServer"},
  "itemProperties":{"itemId":"8cfbd3ec25584d0d8f4ed23b8ff7c43b", "folderId":"sdfwerfbd3ec25584d0d8f4"}}

//Data store URI
inputRaster={"uri":"/rasterStores/rasterstorename/A/B/C"}
or
inputRaster={"uri":"/fileShares/filesharedatastorename/A/B/C"}
or
inputRaster={"uri":"/cloudStores/cloudstorename/A/B/C"}

outputClassifiedRaster

(Required)

The output hosted image service properties. If the hosted image service is an existing service, the portal item ID or service URL can be used for the service tool. The output path of the raster dataset generated in the raster store will be used to update the existing service definition. The service tool can also generate a new hosted image service with the given service properties.

The output hosted image service is stored in the raster store and shared on either the raster analysis image server or image hosting image server, depending on the enterprise configuration.

If the inputRaster is an image collection in the data store or a mosaic dataset, and processAllRasterItems is set to true, the output hosted image service is created from all classified rasters.

Syntax: A JSON object that describes the output image service.

Example:

//Portal Item ID
outputClassifiedRaster={"itemId": <portal item id>}

//Image Service URL
outputClassifiedRaster={"url": <image service url}

//Cloud raster or shared data path URI
outputClassifiedRaster={"uri": <cloud raster uri or shared data path>}

//Service Properties
{"serviceProperties":{"name":"testrasteranalysis",
  "serviceUrl":"https://<server name>/server/rest/services/Hosted/testrasteranalysis/ImageServer"},
  "itemProperties":{"itemId":"8cfbd3ec25584d0d8f4ed23b8ff7c43b", "folderId":"sdfwerfbd3ec25584d0d8f4"}}

model

(Required)

The deep learning model used to classify pixels. This can be specified as the deep learning model portal item ID, as an .emd or .dlpk file, or as the entire JSON string of the model definition.

Syntax: A JSON object that describes the model.

Example:


//Portal Item
model={"itemId": "x2u130909jcvojzkeeraedf"}
model={"url": "https://<portal name>/portal/sharing/rest/content/items/x2u130909jcvojzkeeraedf"}

//.emd or .dlpk file
model={"uri": "\\\\sharedstorage\\sharefolder\\LandcoverClassification.emd"}
model={"uri": "\\\\sharedstorage\\sharefolder\\LandcoverClassification.dlpk"}
model={"uri": "/rasterStores/rasterstorename/A/B/LandcoverClassification.emd"}
model={"uri": "/rasterStores/rasterstorename/A/B/LandcoverClassification.dlpk"}

//.emd or .dlpk file stored in raster store with file share type
model = {"uri": "/fileShares/filesharedatastorename/A/B/ClassifyHouseDamage.emd"}
model={"uri": "/fileShares/filesharedatastorename/A/B/model.dlpk"}

Example of the model definition as a JSON string:


{"Framework": "TensorFlow","ModelConfiguration": "DeepLab","ModelFile":"frozen_inference_graph.pb","ModelType":"ImageClassification","ImageHeight":513,"ImageWidth":513,"ExtractBands":[0,1,2],"Classes":[{"Value": 0,"Name": "Evergreen Forest","Color": [0, 51, 0]},{"Value": 1,"Name": "Grassland/Herbaceous","Color": [241, 185, 137]},{"Value": 2,"Name": "Bare Land","Color": [236, 236, 0]},{"Value": 3,"Name": "Open Water","Color": [0, 0, 117]},{"Value": 4,"Name": "Scrub/Shrub","Color": [102, 102, 0]},{"Value": 5,"Name": "Impervious Surface","Color": [236, 236, 236]}]}

modelArguments

The name-value pairs of arguments that can be customized by clients.

Syntax: A JSON object that describes the name-value pairs of arguments.

Example:

modelArguments={"name1": "value1", "name2": "value2"}

processAllRasterItems

Specifies how raster items in an image service will be processed. If set to true, all raster items in the image service will be processed as separate images. If set to false, all raster items in the image service will be mosaicked together and processed; this is the default.

Values: true | false

context

Contains additional settings that affect task execution.

This parameter has the following settings:

  • Extent (extent)—A bounding box that defines the analysis area.
  • Output Spatial Reference (outSR)—The output raster will be projected into the output spatial reference.
  • Snap Raster (snapRaster)—The output raster will have its cells aligned with the specified snap raster.
  • Cell Size (cellSize)—The output raster will have the resolution specified by the cell size.
  • Processor Type (processorType)—Processing will occur using the server computer's CPU or GPU.
  • Recycle Interval Of Processing Workers (recycleProcessingWorker)—The number of image sections to be processed before stopping the process and starting new worker processes.
  • Parallel Processing Factor (parallelProcessingFactor)—The specified number or percentage of processes will be used for the analysis.
  • Number of Retries On Failures (retryOnFailures)—The number of retries that will be attempted when a worker process fails.
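
As an illustration, a context object that combines several of these settings might look like the following. The coordinate values, spatial references, cell size, and processing factor are placeholders, and the exact structure of individual settings may vary:

context={"extent": {"xmin": -13160539.6, "ymin": 3998752.8, "xmax": -13132466.9, "ymax": 4021357.4,
  "spatialReference": {"wkid": 3857}},
  "outSR": {"wkid": 3857},
  "cellSize": 10,
  "processorType": "GPU",
  "parallelProcessingFactor": "50%"}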
f

The response format. The default response format is html.

Values: html | json

Response

When you submit a request, the task assigns a unique job ID for the transaction.

Syntax:

{
"jobId": "<unique job identifier>",
"jobStatus": "<job status>"
}

After the initial request is submitted, you can use jobId to periodically review the status of the job and messages as described in Checking job status. Once the job has successfully completed, use jobId to retrieve the results. To track the status, you can make a request of the following form:

https://<raster analysis tools url>/ClassifyPixelsUsingDeepLearning/jobs/<jobId>

When the status of the job request is esriJobSucceeded, you can access the results of the analysis by making a request of the following form:

https://<raster analysis url>/ClassifyPixelsUsingDeepLearning/jobs/<jobId>/results/outRaster
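
As a rough sketch of this workflow, the following Python snippet polls the job status endpoint and, once the job succeeds, retrieves the outRaster result. The URL, job ID, and token are placeholders, and the requests library is assumed to be available:

import time
import requests

# Placeholder values; substitute your raster analysis tools URL, job ID, and token.
base_url = "https://<raster analysis tools url>/ClassifyPixelsUsingDeepLearning"
job_id = "<jobId>"
token = "<access token>"

# Poll the job status endpoint until the job reaches a terminal state.
while True:
    status = requests.get(
        f"{base_url}/jobs/{job_id}",
        params={"f": "json", "token": token},
    ).json()
    if status["jobStatus"] not in ("esriJobSubmitted", "esriJobWaiting", "esriJobExecuting"):
        break
    time.sleep(10)

# When the job succeeds, fetch the outRaster result described above.
if status["jobStatus"] == "esriJobSucceeded":
    result = requests.get(
        f"{base_url}/jobs/{job_id}/results/outRaster",
        params={"f": "json", "token": token},
    ).json()
    print(result["value"]["url"])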

Example usage

A sample request URL for ClassifyPixelsUsingDeepLearning is below.

https://services.myserver.com/arcgis/rest/services/System/RasterAnalysisTools/GPServer/ClassifyPixelsUsingDeepLearning
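
For illustration, a job could be submitted to this task with a POST request to its submitJob endpoint, the standard endpoint for asynchronous geoprocessing tasks. The parameter values below are placeholders; the output service name and model arguments are hypothetical:

import requests

# Placeholder task URL; submitJob is appended to the sample request URL above.
task_url = ("https://services.myserver.com/arcgis/rest/services/System/"
            "RasterAnalysisTools/GPServer/ClassifyPixelsUsingDeepLearning/submitJob")

params = {
    "inputRaster": '{"itemId": "<portal item id>"}',
    "outputClassifiedRaster": '{"serviceProperties": {"name": "classified_landcover"}}',
    "model": '{"itemId": "<deep learning model item id>"}',
    "modelArguments": '{"padding": "100", "batch_size": "4"}',
    "f": "json",
    "token": "<access token>",
}

# The response contains the jobId and jobStatus described in the Response section.
response = requests.post(task_url, data=params).json()
print(response["jobId"], response["jobStatus"])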

JSON Response example

The response returns the outRaster output parameter, which has properties for parameter name, data type, and value. The value contains the item ID of the output raster dataset and the URL of the image service.

{
    "paramName": "outRaster",
    "dataType": "GPString",
    "value": {
        "itemId": "f121390b85ef419790479fc75b493efd", 
        "url": "https://<server name>/arcgis/rest/services/Hosted/<service name>/ImageServer"
    }
}
