Compute Sensor Model

URL:
https://<root>/System/OrthomappingTools/GPServer/ComputeSensorModel
Methods:
GET
Version Introduced:
10.6.1

Description

Compute Sensor Model diagram

The ComputeSensorModel operation is a service that computes the bundle block adjustment for the image collection and applies the frame transformation to the images. It also generates the control point, solution, solution points, and flight path tables, though these tables are not published as portal items.

Request parameters

Parameter details

imageCollection

The image collection (mosaic dataset) name or URL. The image service must exist before this operation is called.

Syntax: The value is a JSON object that supports one of three keys: itemId, url, or uri. These keys are case sensitive.

Example:

{"itemId": "<portal item id>"}
{"url": "<image service url>"}
{"uri": "<cloud raster uri or shared data path>"}

mode

(Optional)

The bundle block adjustment mode keyword. It is used when the image collection type is UAV/UAS or UNKNOWN and the block adjustment status is Raw or Quick. The following modes are supported:

  • Quick: Computes tie points and adjustments at one-eighth of the source imagery resolution.
  • Full: Adjusts the images using the Quick mode solution at the full resolution of the source imagery.
  • Refine: Computes tie points and adjustments of the source imagery at full resolution.

locationAccuracy

(Optional)

Specifies the GPS location accuracy level of the source imagery. The following options determine how far the tool searches for neighboring matching images when calculating tie points and block adjustments:

  • High: GPS accuracy is 0 to 10 meters, and the tool uses a maximum of 4 by 3 images.
  • Medium: GPS accuracy is 10 to 20 meters, and the tool uses a maximum of 4 by 6 images.
  • Low: GPS accuracy is 20 to 50 meters, and the tool uses a maximum of 4 by 12 images.

context

Contains additional settings for the request. These additional settings include:

  • parallelProcessingFactor: The specified number or percentage of processes to use for the analysis. The default value is "50%".
  • computeCandidate: Indicates whether Compute Mosaic Candidates will run inside the service task. The default value is False.
  • maxOverlap: Specifies the maximum area overlap for running the Compute Mosaic Candidates tool inside the task. The default value is 0.6.
  • maxLoss: Specifies the maximum area loss allowed for running the Compute Mosaic Candidates tool inside the task. The default value is 0.05.
  • initPointResolution: Specifies the initial tie point resolution for running the Compute Camera Model tool inside the task. The default value is 8.0.
  • maxResidual: Specifies the maximum residual for running the Compute Block Adjustment and Compute Camera Model tools inside the task. The default value is 5.0.
  • adjustOptions: Specifies the adjustment options for running the Compute Block Adjustment tool inside the task. The default value is empty.
  • pointSimilarity: Specifies the similarity for running the Compute Tie Points tool inside the task. The default value is MEDIUM.
  • pointDensity: Specifies the point density for running the Compute Tie Points tool inside the task. The default value is MEDIUM.
  • pointDistribution: Specifies the point distribution for running the Compute Tie Points tool inside the task. The default value is RANDOM.
  • polygonMask: Specifies the input mask for running the Compute Tie Points tool inside the task. The default value is empty.
  • regenTiepoints: Indicates whether Compute Tie Points will rerun inside the service task if the tie points feature class already exists. The default value is True.
Example:

{
  "computeCandidate": false,
  "maxOverlap": 0.6,
  "maxLoss": 0.05
}

f

The response format. The default response format is html.

Values: html | json

Response

When you submit a request, the task assigns a unique job ID for the transaction.

Syntax:

{
  "jobId": "<unique job identifier>",
  "jobStatus": "<job status>"
}

After the initial request is submitted, you can use jobId to periodically review the status of the job and messages as described in Checking job status. Once the job has successfully completed, use jobId to retrieve the results. To track the status, you can make a request of the following form:

https://<orthomapping tools url>/ComputeSensorModel/jobs/<jobId>
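The status check above can be sketched as a simple polling loop. This is a minimal illustration, not an official client: the tools URL, job ID, and token are placeholders, and the helper names are ours.

```python
# Hypothetical sketch: poll a ComputeSensorModel job until it reaches a
# terminal state. Placeholders (tools_url, job_id, token) are not real values.
import json
import time
import urllib.request

# Terminal job states reported by ArcGIS geoprocessing services.
TERMINAL_STATES = {"esriJobSucceeded", "esriJobFailed", "esriJobCancelled"}


def is_terminal(job_status):
    """Return True when the job has finished, successfully or not."""
    return job_status in TERMINAL_STATES


def poll_job(tools_url, job_id, token, interval_seconds=10):
    """Request the job status periodically and return the final job info."""
    status_url = f"{tools_url}/ComputeSensorModel/jobs/{job_id}?token={token}&f=json"
    while True:
        with urllib.request.urlopen(status_url) as response:
            job_info = json.load(response)
        if is_terminal(job_info["jobStatus"]):
            return job_info
        time.sleep(interval_seconds)
```

A short sleep between requests avoids hammering the service; adjust the interval to the expected adjustment time for your image collection.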

Once the job is complete, the response returns the result parameter, which contains the output image collection URL. It can be accessed by making a request of the following form:

{"url": "https://<orthomapping tools url>/ComputeSensorModel/jobs/<jobId>/results/result"}

When the status of the job request is esriJobSucceeded, you can access the results of the analysis by making a request of the following form:

https://<orthomapping tools url>/ComputeSensorModel/jobs/<jobId>/results/result?token=<your token>&f=json

Example usage

The following is a sample URL for ComputeSensorModel:

https://services.myserver.com/arcgis/rest/services/System/OrthomappingTools/GPServer/ComputeSensorModel/submitJob

JSON Request example


imageCollection={"itemId": "1780d648db3545bba8661ad98d824a4"}&
mode=QUICK&
locationAccuracy=High
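The request above can be assembled programmatically before POSTing it to the submitJob URL. This is a hedged sketch using only the parameters documented on this page; the helper name is ours, and the item ID is the sample value from the request example.

```python
# Hypothetical sketch: build the form parameters for a ComputeSensorModel
# submitJob request. The helper name and defaults are illustrative only.
import json


def build_submit_params(item_id, mode="QUICK", location_accuracy="High"):
    """Build the form parameters for a submitJob request."""
    return {
        # imageCollection is a JSON object serialized into the form field.
        "imageCollection": json.dumps({"itemId": item_id}),
        "mode": mode,
        "locationAccuracy": location_accuracy,
        "f": "json",  # request a JSON response instead of the default HTML
    }


params = build_submit_params("1780d648db3545bba8661ad98d824a4")
# POST these parameters to:
# https://<orthomapping tools url>/ComputeSensorModel/submitJob
```

URL-encode the parameters (for example with urllib.parse.urlencode) when sending the actual request.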

JSON Response example

The result has properties for parameter name, data type, and value.

{
  "paramName": "result",
  "dataType": "GPString",
  "value": {
    "url": "https://<server name>/arcgis/rest/services/Hosted/<service name>/ImageServer"
  }
}
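A client typically only needs the hosted image service URL nested inside the value property. The following sketch extracts it from a payload shaped like the response above; the sample dictionary mirrors that example, and the helper name is ours.

```python
# Hypothetical helper: pull the output image collection URL out of a
# completed job's result payload. The sample mirrors the documented response.
sample_result = {
    "paramName": "result",
    "dataType": "GPString",
    "value": {
        "url": "https://<server name>/arcgis/rest/services/Hosted/<service name>/ImageServer"
    },
}


def extract_image_collection_url(result_json):
    """Return the hosted image service URL from the result parameter."""
    return result_json["value"]["url"]


url = extract_image_collection_url(sample_result)
```

The returned URL points at the adjusted image collection's ImageServer endpoint, which can then be used in subsequent orthomapping requests.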
