
Detect Change Using Deep Learning service

Description

The DetectChangeUsingDeepLearning task runs a trained deep learning model to detect change between two rasters.

License:
You must license ArcGIS Server as ArcGIS Image Server to use this resource.

Request parameters

Parameter | Details
fromRaster

(Required)

The input for the previous raster. The input raster can be the portal item ID, the image service URL, a cloud raster dataset, or a shared raster dataset.

Syntax: A JSON object describing the from raster.

Example:


//Portal item ID
{"itemId": <portal item id>} 

//Image service URL
{"url": <image service url>}

//Cloud raster URI/shared data path
{"uri": <cloud raster uri or shared data path>}
toRaster

(Required)

The input for the recent raster. The input raster can be the portal item ID, the image service URL, a cloud raster dataset, or a shared raster dataset.

Syntax: A JSON object describing the to raster.

Example:


//Portal item ID
{"itemId": <portal item id>} 

//Image service URL
{"url": <image service url>}

//Cloud raster URI/shared data path
{"uri": <cloud raster uri or shared data path>}
outputClassifiedRaster

(Required)

The output raster that shows the change between the from and to rasters.

You can specify the name, or you can create an empty image service using the Portal Admin Sharing API and use the returned JSON object as input to this parameter.

Syntax: A JSON object that describes the name of the output raster.

Example:


{"serviceProperties":{"name": "output_classified_raster"}}
modelDefinition

(Required)

The Esri model definition (EMD) that contains the trained deep learning model information. The value can be an EMD JSON file, a JSON string, or the portal item ID of a deep learning model package (.dlpk). A JSON string is useful when this task is invoked directly on the server, because the string can be passed in place of an uploaded .emd file.

Syntax: A JSON object or string that describes the input Esri Model Definition file.

Example:


{
  "Framework": "TensorFlow",
  "ModelConfiguration": "ObjectDetectionAPI",
  "ModelFile": ".\\frozen_inference_graph.pb",
  "ModelType": "ObjectDetection",
  "InferenceFunction": ".\\CustomObjectDetector.py",
  "ImageHeight": 850,
  "ImageWidth": 850,
  "ExtractBands": [0, 1, 2],
  "Classes": [
    {
      "Value": 0,
      "Name": "Tree",
      "Color": [0, 255, 0]
    }
  ]
}
arguments

(Optional)

Lists additional deep learning parameters and arguments for experiments and refinement, such as a confidence threshold for adjusting sensitivity.

Syntax: A JSON object that describes the arguments.

Example:


{"name1": "value1", "name2": "value2"}
context

(Optional)

Contains additional settings that affect task operation. This task has the following settings:

  • Extent (extent)—A bounding box that defines the analysis area.
  • Output Spatial Reference (outSR)—The output raster will be projected into the output spatial reference.
  • Snap Raster (snapRaster)—The output raster will have its cells aligned with the specified snap raster.
  • Cell Size (cellSize)—The output raster will have the resolution specified by cell size.
  • Parallel Processing Factor (parallelProcessingFactor)—The specified number or percentage of processes that will be used for the analysis.

Example:

context={"cellSize": "20", "parallelProcessingFactor": "4"}
f

The response format. The default response format is html.

Values: html | json
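As a sketch of how these parameters fit together, the following Python snippet assembles a submitJob request body using only the standard library. The analysis URL, token, and item IDs are placeholders, not real values; each raster and service parameter is a JSON object serialized to a string, as shown in the examples above.

```python
import json
from urllib.parse import urlencode

# Hypothetical values; replace with your portal's raster analysis URL and a valid token.
ANALYSIS_URL = "https://myserver.domain.com/server/rest/services/System/RasterAnalysisTools/GPServer"
TOKEN = "<your token>"

# Each complex parameter is a JSON object, passed as a serialized string.
params = {
    "fromRaster": json.dumps({"itemId": "itemid-of-previous-raster"}),
    "toRaster": json.dumps({"itemId": "itemid-of-recent-raster"}),
    "outputClassifiedRaster": json.dumps(
        {"serviceProperties": {"name": "output_classified_raster"}}
    ),
    "modelDefinition": json.dumps({"itemId": "itemid-of-dlpk-model"}),
    "context": json.dumps({"cellSize": "20", "parallelProcessingFactor": "4"}),
    "f": "json",
    "token": TOKEN,
}

# URL-encoded POST body for the submitJob endpoint.
submit_url = f"{ANALYSIS_URL}/DetectChangeUsingDeepLearning/submitJob"
body = urlencode(params)
```

Sending `body` as a POST request to `submit_url` (for example with `urllib.request` or `requests`) returns the job ID described in the next section.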

Response

When you submit a request, the task assigns a unique job ID for the transaction.

Syntax:

{ "jobId": "<unique job identifier>", "jobStatus": "<job status>" }

After the initial request is submitted, you can use the jobId to periodically check the status of the job and messages, as described in Check job status. Once the job has successfully completed, use the jobId to retrieve the results. To track the status, you can make a request of the following form:

https://<raster analysis url>/DetectChangeUsingDeepLearning/jobs/<jobId>
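The status check above is typically run in a loop until the job reaches a terminal state. The sketch below is one way to structure that polling; `fetch_status` stands in for whatever HTTP call retrieves the job status JSON (it is an assumed callable, not part of the API), and the terminal state names are the standard ArcGIS REST job states.

```python
import time

# Terminal job states defined by the ArcGIS REST API.
DONE_STATES = {"esriJobSucceeded", "esriJobFailed", "esriJobCancelled", "esriJobTimedOut"}

def wait_for_job(fetch_status, interval=5.0, max_polls=120):
    """Poll a job until it reaches a terminal state.

    fetch_status: a callable returning the job's status JSON as a dict,
    e.g. a wrapper around GET .../DetectChangeUsingDeepLearning/jobs/<jobId>?f=json
    """
    for _ in range(max_polls):
        status = fetch_status()
        if status.get("jobStatus") in DONE_STATES:
            return status
        time.sleep(interval)
    raise TimeoutError("job did not finish within the polling budget")

# Stubbed status sequence standing in for real HTTP responses.
_responses = iter([
    {"jobStatus": "esriJobExecuting"},
    {"jobStatus": "esriJobSucceeded"},
])
result = wait_for_job(lambda: next(_responses), interval=0.0)
```

In practice, `fetch_status` would issue the GET request shown above and parse the JSON response.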

Access results

When the status of the job request is esriJobSucceeded, you can access the results of the analysis by making a request of the following form:

https://<raster analysis url>/DetectChangeUsingDeepLearning/jobs/<jobId>/results/outputRaster?token=<your token>&f=json

Parameter | Description
outputRaster

The output raster itemId value and URL.

Example:

{"url": "https://<raster analysis url>/DetectChangeUsingDeepLearning/jobs/<jobId>/results/outputRaster"}

The result has properties for parameter name, data type, and value. The content of the value is always the output raster dataset's itemId value and image service URL.


{
 "paramName": "outputRaster",
 "dataType": "GPString",
 "value": {
  "itemId": "c267610d0feb4370bf38cc6e2c4ac261",
  "url": "https://<server name>/arcgis/rest/services/Hosted/<service name>/ImageServer"
 }
}
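Extracting the output service from that result payload is straightforward. The following sketch parses the sample JSON shown above; the itemId and URL are the documentation's placeholder values, not a real service.

```python
import json

# Sample results payload matching the structure shown above.
raw = """
{
 "paramName": "outputRaster",
 "dataType": "GPString",
 "value": {
  "itemId": "c267610d0feb4370bf38cc6e2c4ac261",
  "url": "https://<server name>/arcgis/rest/services/Hosted/<service name>/ImageServer"
 }
}
"""

result = json.loads(raw)
item_id = result["value"]["itemId"]      # portal item ID of the output raster
service_url = result["value"]["url"]     # image service URL of the output raster
```

The `url` value can then be used directly as an image service endpoint, or the `itemId` can be looked up in the portal.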