Locator and network dataset setup

GeoAnalytics Engine supports geocoding and network analysis tools. To use these tools, you need to set up the required components described below.

The geocoding tools require a locator, and the network analysis tools require a network dataset. The locator or network dataset must be locally accessible to every node in your Spark cluster. In a cloud environment, you can first upload the locator or network dataset to a cloud storage service such as Amazon S3 and then mount or copy it to each node's local file system. The target location on each node must have enough free disk space to store the locator or network dataset.
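Before copying data onto the nodes, it can help to confirm that the node-local staging directory exists and has room for the data. The helper below is a minimal sketch, not part of GeoAnalytics Engine; the function name and `required_bytes` parameter are illustrative.

```python
import shutil
from pathlib import Path

def staging_dir_ready(path, required_bytes):
    """Return True if `path` is an existing directory with at least
    `required_bytes` of free disk space for the locator or network dataset."""
    staging = Path(path)
    if not staging.is_dir():
        return False
    # shutil.disk_usage reports total/used/free bytes for the file system
    # containing the given path.
    free = shutil.disk_usage(staging).free
    return free >= required_bytes
```

You would run a check like this on each node (for example, from an init script or a bootstrap action) with `required_bytes` set to the size of the data you plan to copy.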

Here is an example of how to stage the locator or network dataset in Azure Databricks:

  1. Upload the locator or network dataset to a cloud file system like Azure Blob Storage.
  2. Install GeoAnalytics Engine on Azure Databricks.
  3. In a notebook, mount the locator or network dataset to DBFS using the dbutils.fs.mount command.
  4. Update the cluster-scoped init script to copy files from the mounted location to /databricks/:

       cp -r /dbfs/mnt/locators/. /databricks/locators/
       cp -r /dbfs/mnt/network_datasets/. /databricks/network_datasets/
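The copy in step 4 can be wrapped in a small init-script sketch like the one below. The `stage_data` function and its argument-based paths are illustrative (the real script would hard-code the Databricks paths shown above); errors are suppressed on the example calls so the sketch also runs outside a Databricks cluster, where /dbfs does not exist.

```shell
#!/bin/sh
# Sketch of a cluster-scoped init script that copies a mounted locator and
# network dataset to each node's local disk.
stage_data() {
    src="$1"
    dest="$2"
    mkdir -p "$dest"        # create the node-local target directory
    cp -r "$src/." "$dest/" # copy the mounted contents into it
}

# On Databricks, the paths from the steps above would be:
stage_data /dbfs/mnt/locators /databricks/locators 2>/dev/null || true
stage_data /dbfs/mnt/network_datasets /databricks/network_datasets 2>/dev/null || true
```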
