
Authorization

ArcGIS GeoAnalytics for Microsoft Fabric must be authorized with a valid license before running any function or tool. You can authorize the module with a GeoAnalytics for Microsoft Fabric username and password or an API key provided by Esri. If the module is not authorized, functions and tools will fail to run with the following error: com.esri.geoanalytics.internal.AuthError: Not authorized.

You can authorize GeoAnalytics for Microsoft Fabric in one of two ways:

  • By calling geoanalytics_fabric.auth() in a PySpark notebook or by calling .auth(), .authWithKey(), or .authWithCredFile() in a Scala notebook.
  • By setting a Spark configuration property in the Fabric notebook before importing the GeoAnalytics for Microsoft Fabric module.

Authorization methods

You can authorize GeoAnalytics for Microsoft Fabric in a PySpark or Scala notebook. In a PySpark notebook, import the module and call the authorization function geoanalytics_fabric.auth(). In Scala, import the module and call one of the authorization methods: .auth(), .authWithKey(), or .authWithCredFile().

Username and password

Provide a username and password for an active GeoAnalytics for Microsoft Fabric subscription. This securely authorizes GeoAnalytics for Microsoft Fabric over the internet using OAuth 2.0.

Python
import geoanalytics_fabric
geoanalytics_fabric.auth(username="User1", password="p@ssw0rd")

API key

Provide an API key for an active GeoAnalytics for Microsoft Fabric subscription. This also securely authorizes GeoAnalytics for Microsoft Fabric over the internet using OAuth 2.0.

Python
import geoanalytics_fabric
geoanalytics_fabric.auth(api_key="AAPTxy8BH1VEsoebNVZXo8HurN-2tn61d0FQhPnODbRy4BoRx6c9QdIuMnUT...")

Credentials file

Provide the path to a credentials file containing the username and password for an active GeoAnalytics for Microsoft Fabric subscription. Alternatively, the file could contain an API key. In either case, using a credentials file securely authorizes GeoAnalytics for Microsoft Fabric over the internet using OAuth 2.0.

The credentials file can be accessed using its File API path from a OneLake lakehouse connected to the Fabric notebook.

Python
import geoanalytics_fabric
geoanalytics_fabric.auth(cred_file=r"/lakehouse/default/Files/.../creds.txt")

The following is an example of the contents of a credentials file when using a username and password:

username User1
password p@ssw0rd

The following is an example of the contents of a credentials file when using an API key:

apikey AAPTxy8BH1VEsoebNVZXo8HurN-2tn61d0FQhPnODbRy4BoRx6c9QdIuMnUT...
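
If you need to create a credentials file from the notebook itself, one option is to write it into the Files area of the attached default lakehouse with standard Python file I/O and then pass its path to the authorization function. The sketch below is illustrative only: it assumes a default lakehouse is attached and mounted at /lakehouse/default, and the folder and file names are hypothetical.

import os
import geoanalytics_fabric

# Hypothetical File API path in the attached default lakehouse.
creds_path = "/lakehouse/default/Files/geoanalytics/creds.txt"

# Write the username and password lines in the format shown above.
os.makedirs(os.path.dirname(creds_path), exist_ok=True)
with open(creds_path, "w") as f:
    f.write("username User1\n")
    f.write("password p@ssw0rd\n")

# Authorize using the file that was just written.
geoanalytics_fabric.auth(cred_file=creds_path)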

Spark configuration property

You can also authorize GeoAnalytics for Microsoft Fabric by setting the geoanalytics.auth.cred.file Spark configuration property to the path of a credentials file before importing the GeoAnalytics for Microsoft Fabric module. The credentials file can be accessed using its File API path from a OneLake lakehouse connected to the Fabric notebook.

The credentials file contains the username and password for an active GeoAnalytics for Microsoft Fabric subscription. For example:

username User1
password p@ssw0rd

Alternatively, the file can contain an API key for an active GeoAnalytics for Microsoft Fabric subscription. For example:

apikey AAPTxy8BH1VEsoebNVZXo8HurN-2tn61d0FQhPnODbRy4BoRx6c9QdIuMnUT...

In either case, using a credentials file will securely authorize GeoAnalytics for Microsoft Fabric over the internet using OAuth 2.0.

Python
spark.conf.set("geoanalytics.auth.cred.file", r"/lakehouse/default/Files/.../creds.txt")
import geoanalytics_fabric
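
Because the property must be in place before the module is imported, a quick way to catch a typo in the property name or path is to read the value back with spark.conf.get before the import. A minimal sketch, using a hypothetical credentials file path:

# Hypothetical File API path; substitute the path to your own credentials file.
cred_path = "/lakehouse/default/Files/geoanalytics/creds.txt"

# Set the property, then confirm the Spark session sees the expected value
# before the module is imported and reads it.
spark.conf.set("geoanalytics.auth.cred.file", cred_path)
assert spark.conf.get("geoanalytics.auth.cred.file") == cred_path

import geoanalytics_fabric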

Verify authorization

You can verify that you are correctly authorized to use GeoAnalytics for Microsoft Fabric by running any SQL function or tool, or by calling the authorization information function: geoanalytics_fabric.auth_info() in a PySpark notebook or .authInfo() in a Scala notebook. These functions return a DataFrame with two columns, name and value, which represent the name of a usage property and the value of that property, respectively. The properties returned include:

  • 'session_uptime'—The amount of time in milliseconds that GeoAnalytics for Microsoft Fabric has been authorized in the current session.
  • 'auth'—The type of authorization currently in use. Options are token/oauth (when authorizing with a username/password) or token/apikey (when authorizing with an API key).
  • 'scope'—The scope of the current authorization.
  • 'offline'—This will always be false since the module is connected for usage reporting.
  • 'metered'—This will always be true since the usage is being measured. Usage is measured in compute units per millisecond.
  • 'authorized'—true if the module is correctly authorized and ready to use.
  • 'billing_type'—The plan type.
  • 'available_hours'—The remaining compute unit-hours (also known as core-hours) available in your subscription.
  • 'session_usage'—The compute unit-milliseconds consumed in the current session.

If the GeoAnalytics for Microsoft Fabric module is not authorized, an empty DataFrame will be returned. The code snippet below shows an example of what the authorization information DataFrame might look like before and 10 seconds after authorization:

Python
import geoanalytics_fabric, time

print("Before authorization:")
geoanalytics_fabric.auth_info().show()

geoanalytics_fabric.auth(username="User1", password="p@ssw0rd")
time.sleep(10)

print("After authorization:")
geoanalytics_fabric.auth_info().show()
Result
Before authorization:
+----+-----+
|name|value|
+----+-----+
+----+-----+

After authorization:
+---------------+-----------+
|           name|      value|
+---------------+-----------+
| session_uptime|      10015|
|           auth|token/oauth|
|          scope|    session|
|        offline|      false|
|        metered|       true|
|     authorized|       true|
|       username|   username|
|   billing_type|    Prepaid|
|available_hours|    3718.87|
|  session_usage|          0|
+---------------+-----------+
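
Because auth_info() returns an ordinary Spark DataFrame, you can also check a property programmatically instead of reading the printed table. A minimal sketch, assuming (as in the output above) that entries in the value column come back as strings:

import geoanalytics_fabric

# Collect the name/value rows into a plain Python dictionary.
info = {row["name"]: row["value"] for row in geoanalytics_fabric.auth_info().collect()}

# An empty dictionary means the module has not been authorized in this session.
if info.get("authorized") == "true":
    print("GeoAnalytics for Microsoft Fabric is authorized.")
else:
    print("Not authorized yet; call geoanalytics_fabric.auth() first.")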

Deauthorization

Once you authorize GeoAnalytics for Microsoft Fabric, it will remain authorized until the Spark session ends or until you call the deauthorization function, geoanalytics_fabric.deauth().

Python
geoanalytics_fabric.deauth()
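
To confirm the module has been deauthorized, you can call the authorization information function again; as noted above, it returns an empty DataFrame when the module is not authorized. A short sketch:

import geoanalytics_fabric

# Release the authorization for the current Spark session.
geoanalytics_fabric.deauth()

# Expect an empty DataFrame (no rows) now that the module is deauthorized.
geoanalytics_fabric.auth_info().show()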
