
Compute Sensor Model

Description

Compute Sensor Model diagram

The ComputeSensorModel operation is a service that computes the bundle block adjustment for the image collection and applies the frame transformation to the images. It also generates the control point, solution, solution points, and flight path tables, though these tables are not published as portal items.

License:

As of ArcGIS 10.5, you must license your ArcGIS Server as an ArcGIS Image Server to use this resource.

Request parameters

Parameter | Details
imageCollection

The image collection (mosaic dataset) name or URL. The image service must exist before calling this service to compute the sensor model.

Syntax: A JSON object that supports one of three keys: itemId, url, or uri. These keys are case sensitive.

Example:

{"itemId": "<portal item id>"}
{"url": "<image service url>"}
{"uri": "<cloud raster uri or shared data path>"}
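The three forms above are mutually exclusive ways of identifying the same collection. As a minimal sketch (the item ID, URL, and path below are placeholders, not real resources), the parameter value is the JSON-serialized object:

```python
import json

# Any ONE of these forms identifies the image collection; keys are case sensitive.
by_item = {"itemId": "<portal item id>"}
by_url = {"url": "https://<server>/arcgis/rest/services/<name>/ImageServer"}
by_uri = {"uri": "<cloud raster uri or shared data path>"}

# The imageCollection request parameter is the JSON-serialized object.
image_collection_param = json.dumps(by_item)
print(image_collection_param)
```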

mode

(Optional)

The bundle block adjustment mode keyword. It is used when the image collection type is UAV/UAS or UNKNOWN and the block adjustment status is Raw or Quick. The following modes are supported:

  • Quick: Computes tie points and adjustments at one-eighth of the source imagery resolution.
  • Full: Adjusts the images using the Quick mode solution at the full resolution of the source imagery.
  • Refine: Computes tie points and adjustments of the source imagery at full resolution.
locationAccuracy

(Optional)

This parameter specifies the GPS location accuracy level of the source images. The following options determine how far the tool searches for neighboring matching images when calculating tie points and block adjustments:

  • High: GPS accuracy is 0 to 10 meters; the tool uses a maximum of 4 by 3 images.
  • Medium: GPS accuracy is 10 to 20 meters; the tool uses a maximum of 4 by 6 images.
  • Low: GPS accuracy is 20 to 50 meters; the tool uses a maximum of 4 by 12 images.
context

(Optional)

Contains additional settings that affect how the task runs. The following settings are supported:

  • parallelProcessingFactor: The specified number or percentage of processes will be used for the analysis. The default value is “50%”.
  • computeCandidate: Indicates whether Compute Mosaic Candidates will run inside the service task. Default value is False.
  • maxOverlap: Specifies the maximum area overlap for running the Compute Mosaic Candidates tool inside the task. The default value is 0.6.
  • maxLoss: Specifies the maximum area loss allowed for running the Compute Mosaic Candidates tool inside the task. The default value is 0.05.
  • initPointResolution: Specifies the initial tie point resolution for running the Compute Camera Model tool inside the task. The default value is 8.0.
  • maxResidual: Specifies the maximum residual for running the Compute Block Adjustment and Compute Camera Model tools inside the task. The default value is 5.0.
  • adjustOptions: Specifies the adjustment options for running the Compute Block Adjustment tool inside the task. The default value is empty.
  • pointSimilarity: Specifies the similarity for running the Compute Tie Points tool inside the task. The default value is MEDIUM.
  • pointDensity: Specifies the point density for running the Compute Tie Points tool inside the task. The default value is MEDIUM.
  • pointDistribution: Specifies the point distribution for running the Compute Tie Points tool inside the task. The default value is RANDOM.
  • polygonMask: Specifies the input mask for running the Compute Tie Points tool inside the task. Default value is empty.
  • regenTiepoints: Indicates whether Compute Tie Points will rerun inside the service task if tie points feature class exists. The default value is True.

{
  "computeCandidate": false,
  "maxOverlap": 0.6,
  "maxLoss": 0.05
}
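When building the request programmatically, the context settings above can be assembled as a dictionary and serialized to a JSON string. A minimal sketch using the documented defaults (the values shown are the defaults from the table, not recommendations):

```python
import json

# Context settings with their documented defaults; override as needed.
context = {
    "parallelProcessingFactor": "50%",
    "computeCandidate": False,   # serialized as JSON false
    "maxOverlap": 0.6,
    "maxLoss": 0.05,
    "pointSimilarity": "MEDIUM",
    "pointDensity": "MEDIUM",
    "pointDistribution": "RANDOM",
}

# The context request parameter is the JSON-serialized object.
context_param = json.dumps(context)
```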
f

The response format. The default response format is html.

Values: html | json

Note:
The ComputeSensorModel operation automatically determines the mode parameter based on the image collection type. For UAV and drone images, the client can set the adjustment mode to Quick, Full, or Refine. For aerial images, the mode is always Full and point accuracy is honored. For satellite images, only the RPC adjustment mode is supported.

Response

When you submit a request, the task assigns a unique job ID for the transaction.

Syntax:

{
"jobId": "<unique job identifier>",
"jobStatus": "<job status>"
}

After the initial request is submitted, you can use the jobId to periodically check the status of the job and messages as described in Checking job status. Once the job has successfully completed, you use the jobId to retrieve the results. To track the status, you can make a request of the following form:

https://<orthomapping tools url>/ComputeSensorModel/jobs/<jobId>

Once the job is complete, the response returns the result parameter, which contains the URL of the output image collection. The result can be accessed by making a request of the following form:

{"url": "https://<orthomapping tools url>/ComputeSensorModel/jobs/<jobId>/results/result"}

When the status of the job request is esriJobSucceeded, you can access the results of the analysis by making a request of the following form:

https://<orthomapping tools url>/ComputeSensorModel/jobs/<jobId>/results/result?token=<your token>&f=json
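The poll-then-fetch workflow above can be sketched in Python with the standard library. This is a minimal sketch, not an official client: the tools URL, job ID, and token are placeholders for your deployment's values, and only the URL shapes come from this document.

```python
import json
import time
from urllib.request import urlopen

def job_status_url(tools_url, job_id, token):
    # Status endpoint for a submitted ComputeSensorModel job.
    return f"{tools_url}/ComputeSensorModel/jobs/{job_id}?f=json&token={token}"

def job_result_url(tools_url, job_id, token):
    # Result endpoint, valid once jobStatus is esriJobSucceeded.
    return f"{tools_url}/ComputeSensorModel/jobs/{job_id}/results/result?f=json&token={token}"

def wait_for_result(tools_url, job_id, token, interval=10):
    # Poll the job status periodically, then return the parsed result JSON.
    while True:
        with urlopen(job_status_url(tools_url, job_id, token)) as resp:
            info = json.load(resp)
        status = info["jobStatus"]
        if status == "esriJobSucceeded":
            with urlopen(job_result_url(tools_url, job_id, token)) as resp:
                return json.load(resp)
        if status in ("esriJobFailed", "esriJobCancelled"):
            raise RuntimeError(f"Job {job_id} ended with status {status}")
        time.sleep(interval)
```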

Example usage

The following is a sample URL for ComputeSensorModel:

https://services.myserver.com/arcgis/rest/services/System/OrthomappingTools/GPServer/ComputeSensorModel/submitJob

JSON Request example

The following is a sample request:

imageCollection={"itemId": "1780d648db3545bba8661ad98d824a4"}&
mode=QUICK&
locationAccuracy=High
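The request above can be form-encoded and POSTed to the submitJob endpoint. A minimal sketch, assuming placeholder item ID and token values (substitute your own server, item, and credentials):

```python
import json
from urllib.parse import urlencode

# Placeholder values; replace with your image collection item ID and token.
params = {
    "imageCollection": json.dumps({"itemId": "<image collection item id>"}),
    "mode": "QUICK",
    "locationAccuracy": "High",
    "f": "json",
    "token": "<your token>",
}
body = urlencode(params)
# POST `body` with Content-Type application/x-www-form-urlencoded to:
# https://services.myserver.com/arcgis/rest/services/System/OrthomappingTools/GPServer/ComputeSensorModel/submitJob
print(body)
```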

JSON Response example

The result has properties for parameter name, data type, and value.

{
  "paramName": "result",
  "dataType": "GPString",
  "value": {
    "url": "https://<server name>/arcgis/rest/services/Hosted/<service name>/ImageServer"
  }
}