ArcGIS GeoAnalytics for Microsoft Fabric must be authorized with a valid license before you run any function or tool.
You can authorize the module with a GeoAnalytics for Microsoft Fabric username and password or with an API key provided by
Esri. If the module is not authorized, functions and tools will fail to run with an authorization error from the com.esri.geoanalytics.internal package.
You can authorize GeoAnalytics for Microsoft Fabric in one of two ways:
- By calling geoanalytics_fabric.auth() in a PySpark notebook, or by calling .auth(), .authWithKey(), or .authWithCredFile() in a Scala notebook.
- By setting a Spark configuration property in the Fabric notebook before importing the GeoAnalytics for Microsoft Fabric module.
Authorization methods
You can authorize GeoAnalytics for Microsoft Fabric in a PySpark or Scala notebook. In a PySpark notebook, import the module and
call the authorization function geoanalytics_fabric.auth(). In a Scala notebook, import the module and call one of the
authorization methods: .auth(), .authWithKey(), or .authWithCredFile().
Username and Password
Provide a username and password for an active GeoAnalytics for Microsoft Fabric subscription. This securely authorizes GeoAnalytics for Microsoft Fabric over the internet using OAuth 2.0.
import geoanalytics_fabric
geoanalytics_fabric.auth(username="User1", password="p@ssw0rd")
API Key
Provide an API key for an active GeoAnalytics for Microsoft Fabric subscription. This also securely authorizes GeoAnalytics for Microsoft Fabric over the internet using OAuth 2.0.
import geoanalytics_fabric
geoanalytics_fabric.auth(api_key="AAPTxy8BH1VEsoebNVZXo8HurN-2tn61d0FQhPnODbRy4BoRx6c9QdIuMnUT...")
Credentials file
Provide the path to a credentials file containing the username and password for an active GeoAnalytics for Microsoft Fabric subscription. Alternatively, the file could contain an API key. In either case, using a credentials file securely authorizes GeoAnalytics for Microsoft Fabric over the internet using OAuth 2.0.
The credentials file can be accessed using the File API path option from a OneLake lakehouse connected to the Fabric notebook.
import geoanalytics_fabric
geoanalytics_fabric.auth(cred_file=r"/lakehouse/default/Files/.../creds.txt")
The following is an example of the contents of a credentials file when using a username and password:
username User1
password p@ssw0rd
The following is an example of the contents of a credentials file when using an API key:
apikey AAPTxy8BH1VEsoebNVZXo8HurN-2tn61d0FQhPnODbRy4BoRx6c9QdIuMnUT...
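The credentials file uses a simple key-value format, one entry per line. As an illustration of how such a file could be read, the sketch below parses it into a dictionary; the parse_creds helper is hypothetical and is not part of the GeoAnalytics for Microsoft Fabric module, which reads the file for you.

```python
# Hypothetical helper: parse a credentials file into a dict.
# Each line holds a key and a value separated by a space, e.g.
#   username User1
#   password p@ssw0rd
# or a single line:
#   apikey AAPT...
def parse_creds(path):
    creds = {}
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line:
                continue
            key, _, value = line.partition(" ")
            creds[key] = value
    return creds
```

With a username and password file, parse_creds returns a dict with "username" and "password" keys; with an API key file, a dict with a single "apikey" key.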
Spark configuration property
You can also authorize GeoAnalytics for Microsoft Fabric by setting a Spark configuration property before importing the
GeoAnalytics for Microsoft Fabric module. The credentials file can be accessed using the File API path option from
a OneLake lakehouse connected to the Fabric notebook.
The credentials file contains the username and password for an active GeoAnalytics for Microsoft Fabric subscription. For example:
username User1
password p@ssw0rd
Alternatively, the file can contain an API key for an active GeoAnalytics for Microsoft Fabric subscription. For example:
apikey AAPTxy8BH1VEsoebNVZXo8HurN-2tn61d0FQhPnODbRy4BoRx6c9QdIuMnUT...
In either case, using a credentials file will securely authorize GeoAnalytics for Microsoft Fabric over the internet using OAuth 2.0.
spark.conf.set("geoanalytics.auth.cred.file", r"/lakehouse/default/Files/.../creds.txt")
import geoanalytics_fabric
Verify authorization
You can verify that you are correctly authorized to use GeoAnalytics for Microsoft Fabric by running any SQL function or
tool, or by calling the authorization information function: geoanalytics_fabric.auth_info() in a PySpark notebook or
.authInfo() in a Scala notebook. These functions return a DataFrame with two columns, name and value, which represent
the name of a usage property and the value of that property, respectively. The properties returned include:
- 'session_uptime'—The amount of time in milliseconds that GeoAnalytics for Microsoft Fabric has been authorized in the current session.
- 'auth'—The type of authorization currently in use. Options are token/oauth (when authorizing with a username and password) or token/apikey (when authorizing with an API key).
- 'scope'—The scope of the current authorization.
- 'offline'—This will always be false since the module is connected for usage reporting.
- 'metered'—This will always be true since the usage is being measured. Usage is measured in compute units per millisecond.
- 'authorized'—true if the module is correctly authorized and ready to use.
- 'billing_type'—The plan type.
- 'available_hours'—The remaining compute unit-hours (also known as core-hours) available in your subscription.
- 'session_usage'—The compute unit-milliseconds consumed in the current session.
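Because session_usage is reported in compute unit-milliseconds while available_hours is reported in compute unit-hours, comparing the two requires a unit conversion. A minimal sketch (the helper name is hypothetical, not part of the module):

```python
# Hypothetical helper: convert compute unit-milliseconds (the unit of
# 'session_usage') to compute unit-hours (the unit of 'available_hours').
def usage_ms_to_hours(usage_ms):
    return usage_ms / (1000 * 60 * 60)
```

For example, a session_usage of 7,200,000 compute unit-milliseconds corresponds to 2 compute unit-hours.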
If the GeoAnalytics for Microsoft Fabric module is not authorized, an empty DataFrame will be returned. The code snippet below shows an example of what the authorization information DataFrame might look like before and 10 seconds after authorization:
import geoanalytics_fabric, time
print("Before authorization:")
geoanalytics_fabric.auth_info().show()
geoanalytics_fabric.auth(username="User1", password="p@ssw0rd")
time.sleep(10)
print("After authorization:")
geoanalytics_fabric.auth_info().show()
Before authorization:
+----+-----+
|name|value|
+----+-----+
+----+-----+
After authorization:
+---------------+-----------+
| name| value|
+---------------+-----------+
| session_uptime| 10015|
| auth|token/oauth|
| scope| session|
| offline| false|
| metered| true|
| authorized| true|
| username| username|
| billing_type| Prepaid|
|available_hours| 3718.87|
| session_usage| 0|
+---------------+-----------+
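To act on the authorization state programmatically, the name/value rows of the auth_info DataFrame can be collected into a dictionary. The sketch below illustrates the idea using a plain list of (name, value) pairs standing in for the collected rows; the is_authorized helper is hypothetical and not part of the module.

```python
# Hypothetical helper: decide whether the module is authorized from the
# name/value pairs returned by the authorization information function.
# `rows` stands in for collected DataFrame rows, i.e. a sequence of
# (name, value) pairs; an empty sequence means the module is not authorized.
def is_authorized(rows):
    props = {name: value for name, value in rows}
    return props.get("authorized") == "true"

# In a notebook, rows could be built from the real DataFrame, for example:
#   rows = [(r["name"], r["value"]) for r in geoanalytics_fabric.auth_info().collect()]
```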
Deauthorization
Once you authorize GeoAnalytics for Microsoft Fabric, it will remain authorized until the Spark session is ended or until you call the deauthorization function.
geoanalytics_fabric.deauth()