Programmatically accessing analysis services

This document describes how to programmatically access and use analysis services. The sample code in this document uses Python. If you are programming in JavaScript, start by reading the JavaScript API document Working with Analysis Widgets. These JavaScript widgets and associated classes allow easy access to the tasks within the Analysis Service.

The general workflow for programmatically submitting Analysis Service tasks to ArcGIS Online is as follows.

  1. Get token-based authentication from the portal host or www.arcgis.com.
  2. Get the Analysis Service URL from the portal host or www.arcgis.com, parsing the json response into a dictionary that can be queried for the URL.
  3. Submit your job.
  4. Monitor the job status.
  5. Get job results.

The above workflow is presented as a series of Python routines, each building on the previous one. At the end of this document, the code sample accompanying analysis_job_results() knits all the routines together and shows how to use the Create Buffers task.

1 Get the token

Using ArcGIS Online requires authentication, and a valid, active username and password is needed to obtain a token. Once a token has been issued by ArcGIS Online, you can access specific resources, submit requests, submit analysis service jobs, and obtain results and status messages from analysis service jobs.

This approach uses only Python built-in modules. The portal_url object can be either the Organization Portal assigned with the ArcGIS Online subscription (for example, http://companyname.maps.arcgis.com) or www.arcgis.com, if it is accessible from the organization.
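
For example, the portal_url used throughout these samples is the portal's sharing/rest endpoint. A minimal sketch, assuming the hypothetical organization URL shown above:

host_url = "https://companyname.maps.arcgis.com"   # hypothetical organization URL
# host_url = "https://www.arcgis.com"              # or ArcGIS Online directly
portal_url = "{}/sharing/rest".format(host_url)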

The Python code sample below shows how to obtain a token using an ArcGIS Online username and password. Since working with ArcGIS Online programmatically requires many HTTP requests and responses, this sample includes a utility function, submit_request(), which takes the HTTP request as a parameter and returns the json response data as a Python dictionary.

import urllib
import urllib2
import httplib
import time
import json
import contextlib

def submit_request(request):
    """ Returns the response from an HTTP request in json format."""
    with contextlib.closing(urllib2.urlopen(request)) as response:
        job_info = json.load(response)
        return job_info

def get_token(portal_url, username, password):
    """ Returns an authentication token for use in ArcGIS Online."""

    # Set the username and password parameters before
    #  getting the token.
    #
    params = {"username": username,
              "password": password,
              "referer": "http://www.arcgis.com",
              "f": "json"}

    token_url = "{}/generateToken".format(portal_url)
    request = urllib2.Request(token_url, urllib.urlencode(params))
    token_response = submit_request(request)
    if "token" in token_response:
        print("Getting token...")
        token = token_response.get("token")
        return token
    else:
        # If the portal reports that the request must be made over HTTPS,
        # retry against the https portal URL.
        #
        if "error" in token_response:
            error_mess = token_response.get("error", {}).get("message")
            if "This request needs to be made over https." in error_mess:
                portal_url = portal_url.replace("http://", "https://")
                token = get_token(portal_url, username, password)
                return token
            else:
                raise Exception("Portal error: {} ".format(error_mess))

See the Generate Token documentation for more information on the generateToken request.
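
A successful generateToken response is a small json object. A sketch of what get_token() parses, with placeholder values, might look like the following:

# Hypothetical response returned by submit_request(); all values are placeholders.
token_response = {
    "token": "piPq7sNhL4...",     # opaque token string passed on later requests
    "expires": 1546380000000,     # expiration time in milliseconds since epoch
    "ssl": True
}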

2 Get the Analysis Service URL

You use the token returned in the above code sample to access resources on ArcGIS Online, including submitting a job through the Spatial Analysis service. Using the token, you can programmatically obtain the URL needed to submit a spatial analysis job request through the ArcGIS REST API. The spatial analysis URL for submitting a job is not static and can change based on the user and the ArcGIS Online for Organizations account associated with the username. The Python code sample below shows how to obtain the request URL using token-based authentication.

def get_analysis_url(portal_url, token):
    """ Returns Analysis URL from AGOL for running analysis services."""

    print("Getting Analysis URL...")
    portals_self_url = "{}/portals/self?f=json&token={}".format(portal_url, token)
    request = urllib2.Request(portals_self_url)
    portal_response = submit_request(request)

    # Parse the dictionary from the json data response to get Analysis URL.
    #
    if "helperServices" in portal_response:
        helper_services = portal_response.get("helperServices")
        if "analysis" in helper_services:
            analysis_service = helper_services.get("analysis")
            if "url" in analysis_service:
                analysis_url = analysis_service.get("url")
                return analysis_url
    # Raise if the Analysis URL could not be found in the response.
    raise Exception("Unable to obtain Analysis URL.")

See the Portal Self documentation for more information on the portal_url/self request.
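
Only a small part of the portals/self response is needed here. A trimmed sketch of the json that get_analysis_url() navigates, with a placeholder URL:

# Hypothetical, trimmed portals/self response; only the keys read above are shown.
portal_response = {
    "helperServices": {
        "analysis": {
            "url": "https://analysis.arcgis.com/arcgis/rest/services/tasks/GPServer"
        }
    }
}
analysis_url = portal_response["helperServices"]["analysis"]["url"]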

3 Submit a job (analysis task)

The analysis_url returned from the above code sample is the URL needed to submit jobs programmatically through the Spatial Analysis service. The Python code sample below shows how to submit an analysis job through the ArcGIS REST API using any of the available Analysis Service tasks:

def analysis_job(analysis_url, task, token, params):
    """ Submits an Analysis job and returns the job URL for monitoring the job
        status in addition to the json response data for the submitted job."""

    # Unpack the Analysis job parameters as a dictionary and add token and
    # formatting parameters to the dictionary. The dictionary is used in the
    # HTTP POST request. Headers are also added as a dictionary to be included
    # with the POST.
    #
    print("Submitting analysis job...")

    params["f"] = "json"
    params["token"] = token
    headers = {"Referer":"http://www.arcgis.com"}
    task_url = "{}/{}".format(analysis_url, task)
    submit_url = "{}/submitJob?".format(task_url)
    request = urllib2.Request(submit_url, urllib.urlencode(params), headers)
    analysis_response = submit_request(request)
    if analysis_response:
        # Print the response from submitting the Analysis job.
        #
        print(analysis_response)
        return task_url, analysis_response
    else:
        raise Exception("Unable to submit analysis job.")

As an example, to run the Create Buffers task, the task argument would be "CreateBuffers" and params would be a dictionary with values for the inputLayer, distances, units, dissolveType, and outputName keys, as in the sketch below. (The complete example at the end of this document shows the same call in context.)
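
A minimal sketch of such a call, assuming a hypothetical feature service URL and output service name:

# Hypothetical input; replace with the URL of the feature service to buffer.
feature_service = "http://<path_to_feature_service_to_buffer>"
params = {"inputLayer": {"url": feature_service},
          "distances": [20, 35, 50],
          "units": "Miles",
          "dissolveType": "Dissolve",
          "outputName": {"serviceProperties": {"name": "CreateBuffers_20_35_50_feature_service"}}}
task_url, job_info = analysis_job(analysis_url, "CreateBuffers", token, params)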

4 Monitor the job

The analysis_job() function above returns a URL for the task that can be used to monitor the job status, and a dictionary containing information about the job that can be used to access the job results. The Python code sample below shows how to monitor the job submitted through REST.

def analysis_job_status(task_url, job_info, token):
    """ Tracks the status of the submitted Analysis job."""

    if "jobId" in job_info:
        # Get the id of the Analysis job to track the status.
        #
        job_id = job_info.get("jobId")
        job_url = "{}/jobs/{}?f=json&token={}".format(task_url, job_id, token)
        request = urllib2.Request(job_url)
        job_response = submit_request(request)

        # Query and report the Analysis job status.
        #
        if "jobStatus" in job_response:
            while not job_response.get("jobStatus") == "esriJobSucceeded":
                time.sleep(5)
                request = urllib2.Request(job_url)
                job_response = submit_request(request)
                print(job_response)

                if job_response.get("jobStatus") == "esriJobFailed":
                    raise Exception("Job failed.")
                elif job_response.get("jobStatus") == "esriJobCancelled":
                    raise Exception("Job cancelled.")
                elif job_response.get("jobStatus") == "esriJobTimedOut":
                    raise Exception("Job timed out.")

            if "results" in job_response:
                return job_response
        else:
            raise Exception("No job results.")
    else:
        raise Exception("No job url.")

The analysis_job_status() function polls the job URL for the status of the submitted job and keeps sending requests until the status is esriJobSucceeded. The sleep between requests throttles the polling so that large jobs are not queried too frequently. An exception is raised if the status of the job is esriJobFailed, esriJobCancelled, or esriJobTimedOut. A list of the jobStatus values can be found in Check job status.
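
While the job is running, each poll returns a json object carrying the jobId and the current jobStatus; once the job finishes, a results entry is added. A hedged sketch of the two kinds of responses analysis_job_status() works with:

# Hypothetical responses; the jobId and result parameter name are placeholders.
running_response = {"jobId": "j4fa1db29338f4359a14a4b766a36d3e2",
                    "jobStatus": "esriJobExecuting"}

succeeded_response = {"jobId": "j4fa1db29338f4359a14a4b766a36d3e2",
                      "jobStatus": "esriJobSucceeded",
                      "results": {"bufferLayer": {"paramUrl": "results/bufferLayer"}}}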

5 Get the results of the job

Once the job succeeds, the json data from the job response is returned by the analysis_job_status() function above. The dictionary created from that response can then be used as input to the analysis_job_results() function below to get additional information about the result.

def analysis_job_results(task_url, job_info, token):
    """ Use the job result json to get information about the feature service
        created from the Analysis job."""

    # Get the paramUrl to get information about the Analysis job results.
    #
    if "jobId" in job_info:
        job_id = job_info.get("jobId")
        if "results" in job_info:
            results = job_info.get("results")
            result_values = {}
            for key in results.keys():
                param_value = results[key]
                if "paramUrl" in param_value:
                    param_url = param_value.get("paramUrl")
                    result_url = "{}/jobs/{}/{}?token={}&f=json".format(task_url,
                                                                        job_id,
                                                                        param_url,
                                                                        token)
                    request = urllib2.Request(result_url)
                    param_result = submit_request(request)
                    job_value = param_result.get("value")
                    result_values[key] = job_value
            return result_values
        else:
            raise Exception("Unable to get analysis job results.")
    else:
        raise Exception("Unable to get analysis job results.")


if __name__ == '__main__':
        httplib.HTTPConnection._http_vsn = 10
        httplib.HTTPConnection._http_vsn_str = 'HTTP/1.0'

        username = "username"
        password = "password"
        host_url = "http://www.arcgis.com"
        portal_url = "{}/sharing/rest".format(host_url)
        token = get_token(portal_url, username, password)
        analysis_url = get_analysis_url(portal_url, token)
        feature_service = "http://<path_to_feature_service_to_buffer>"
        task = "CreateBuffers"
        output_service = "CreateBuffers_20_35_50_feature_service"
        params = {"inputLayer": {"url":feature_service},
                  "distances": [20, 35, 50],
                  "units": "Miles",
                  "dissolveType": "Dissolve",
                  "outputName": {"serviceProperties":{"name": output_service}}}

        task_url, job_info = analysis_job(analysis_url, task, token, params)
        job_info = analysis_job_status(task_url, job_info, token)
        job_values = analysis_job_results(task_url, job_info, token)

Using the param_url along with the task_url, analysis_job_results() returns a dictionary of result values for the job, from which information about the new service created by the CreateBuffers task, such as its URL and item ID, can be obtained.
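
For a CreateBuffers run, the returned dictionary is keyed by the task's output parameter name, and its value carries details of the new feature service. A hedged sketch with placeholder values:

# Hypothetical result; the output parameter name, URL, and item ID are placeholders.
job_values = {"bufferLayer": {"url": "https://services.arcgis.com/<org_id>/arcgis/rest/services/CreateBuffers_20_35_50_feature_service/FeatureServer/0",
                              "itemId": "0123456789abcdef0123456789abcdef"}}
print(job_values["bufferLayer"]["url"])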

Python 3 code sample

The code sample below implements all of the routines above in Python 3. The main differences come from changes to the urllib module between Python 2 and Python 3, as the short comparison below shows.
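
For example, where the Python 2 sample builds a POST request directly from urlencode(), the Python 3 sample uses urllib.parse and must encode the form data to bytes:

# Python 2
request = urllib2.Request(submit_url, urllib.urlencode(params), headers)

# Python 3: urlencode() moved to urllib.parse, and the POST body must be bytes.
data_encoded = urllib.parse.urlencode(params).encode("utf-8")
request = urllib.request.Request(submit_url, data=data_encoded, headers=headers)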

import urllib.request
import urllib.parse
import urllib.error
import http.client
import time
import json
import contextlib

def submit_request(request):
    """ Returns the response from an HTTP request in json format."""
    with contextlib.closing(urllib.request.urlopen(request)) as response:
        content = response.read()
        content_decoded = content.decode("utf-8")
        job_info = json.loads(content_decoded)
        return job_info


def get_token(portal_url, username, password):
    """ Returns an authentication token for use in ArcGIS Online."""

    # Set the username and password parameters before
    #  getting the token.
    #
    params = {"username": username,
              "password": password,
              "referer": "http://www.arcgis.com",
              "f": "json"}

    token_url = "{}/generateToken".format(portal_url)
    data = urllib.parse.urlencode(params)
    data_encoded = data.encode("utf-8")
    request = urllib.request.Request(token_url, data=data_encoded)
    token_response = submit_request(request)
    if "token" in token_response:
        print("Getting token...")
        token = token_response.get("token")
        return token
    else:
        # If the portal reports that the request must be made over HTTPS,
        # retry against the https portal URL.
        #
        if "error" in token_response:
            error_mess = token_response.get("error", {}).get("message")
            if "This request needs to be made over https." in error_mess:
                portal_url = portal_url.replace("http://", "https://")
                token = get_token(portal_url, username, password)
                return token
            else:
                raise Exception("Portal error: {} ".format(error_mess))


def get_analysis_url(portal_url, token):
    """ Returns Analysis URL from AGOL for running analysis services."""

    print("Getting Analysis URL...")
    portals_self_url = "{}/portals/self?f=json&token={}".format(portal_url, token)
    request = urllib.request.Request(portals_self_url)
    portal_response = submit_request(request)

    # Parse the dictionary from the json data response to get Analysis URL.
    #
    if "helperServices" in portal_response:
        helper_services = portal_response.get("helperServices")
        if "analysis" in helper_services:
            analysis_service = helper_services.get("analysis")
            if "url" in analysis_service:
                analysis_url = analysis_service.get("url")
                return analysis_url
    # Raise if the Analysis URL could not be found in the response.
    raise Exception("Unable to obtain Analysis URL.")


def analysis_job(analysis_url, task, token, params):
    """ Submits an Analysis job and returns the job URL for monitoring the job
        status in addition to the json response data for the submitted job."""

    # Unpack the Analysis job parameters as a dictionary and add token and
    # formatting parameters to the dictionary. The dictionary is used in the
    # HTTP POST request. Headers are also added as a dictionary to be included
    # with the POST.
    #
    print("Submitting analysis job...")

    params["f"] = "json"
    params["token"] = token
    headers = {"Referer":"http://www.arcgis.com"}
    task_url = "{}/{}".format(analysis_url, task)
    submit_url = "{}/submitJob?".format(task_url)
    data = urllib.parse.urlencode(params)
    data_encoded = data.encode("utf-8")
    request = urllib.request.Request(submit_url, data=data_encoded, headers=headers)
    analysis_response = submit_request(request)
    if analysis_response:
        # Print the response from submitting the Analysis job.
        #
        print(analysis_response)
        return task_url, analysis_response
    else:
        raise Exception("Unable to submit analysis job.")


def analysis_job_status(task_url, job_info, token):
    """ Tracks the status of the submitted Analysis job."""

    if "jobId" in job_info:
        # Get the id of the Analysis job to track the status.
        #
        job_id = job_info.get("jobId")
        job_url = "{}/jobs/{}?f=json&token={}".format(task_url, job_id, token)
        request = urllib.request.Request(job_url)
        job_response = submit_request(request)

        # Query and report the Analysis job status.
        #
        if "jobStatus" in job_response:
            while not job_response.get("jobStatus") == "esriJobSucceeded":
                time.sleep(5)
                request = urllib.request.Request(job_url)
                job_response = submit_request(request)
                print(job_response)

                if job_response.get("jobStatus") == "esriJobFailed":
                    raise Exception("Job failed.")
                elif job_response.get("jobStatus") == "esriJobCancelled":
                    raise Exception("Job cancelled.")
                elif job_response.get("jobStatus") == "esriJobTimedOut":
                    raise Exception("Job timed out.")

            if "results" in job_response:
                return job_response
        else:
            raise Exception("No job results.")
    else:
        raise Exception("No job url.")


def analysis_job_results(task_url, job_info, token):
    """ Use the job result json to get information about the feature service
        created from the Analysis job."""

    # Get the paramUrl to get information about the Analysis job results.
    #
    if "jobId" in job_info:
        job_id = job_info.get("jobId")
        if "results" in job_info:
            results = job_info.get("results")
            result_values = {}
            for key in list(results.keys()):
                param_value = results[key]
                if "paramUrl" in param_value:
                    param_url = param_value.get("paramUrl")
                    result_url = "{}/jobs/{}/{}?token={}&f=json".format(task_url,
                                                                        job_id,
                                                                        param_url,
                                                                        token)
                    request = urllib.request.Request(result_url)
                    param_result = submit_request(request)
                    job_value = param_result.get("value")
                    result_values[key] = job_value
            return result_values
        else:
            raise Exception("Unable to get analysis job results.")
    else:
        raise Exception("Unable to get analysis job results.")


if __name__ == '__main__':
        http.client.HTTPConnection._http_vsn = 10
        http.client.HTTPConnection._http_vsn_str = 'HTTP/1.0'

        username = "username"
        password = "password"
        host_url = "http://www.arcgis.com"
        portal_url = "{}/sharing/rest".format(host_url)
        token = get_token(portal_url, username, password)
        analysis_url = get_analysis_url(portal_url, token)
        feature_service = "http://<path_to_feature_service_to_buffer>"
        task = "CreateBuffers"
        output_service = "CreateBuffers_20_35_50_feature_service6"
        params = {"inputLayer": {"url": feature_service},
                  "distances": [20, 35, 50],
                  "units": "Miles",
                  "dissolveType": "Dissolve",
                  "outputName": {"serviceProperties": {"name": output_service}}}

        task_url, job_info = analysis_job(analysis_url, task, token, params)
        job_info = analysis_job_status(task_url, job_info, token)
        job_values = analysis_job_results(task_url, job_info, token)
