Write to feature services

There are two options for writing Spark DataFrames to feature services. First, GeoAnalytics for Microsoft Fabric supports saving to a feature service in ArcGIS Online or ArcGIS Enterprise with the Spark DataFrameWriter. Second, the ArcGIS API for Python allows you to write data in a Spatially Enabled DataFrame to a feature layer using the to_featurelayer() method. This tutorial shows how to write to feature services using both approaches.

Steps

Import

  1. In your notebook, import geoanalytics_fabric.

    Python
    import geoanalytics_fabric

Write to feature services using Spark DataFrameWriter

  1. Register a GIS with the ArcGIS Online or Portal for ArcGIS user that the feature service will be created under and shared with.

    Python
    geoanalytics_fabric.register_gis("myGIS", "https://arcgis.com", username="User", password="p@ssw0rd")
  2. Load a polygon feature service of world continents into a Spark DataFrame. The field shape is the geometry column of the DataFrame.

    Python
    url = "https://services.arcgis.com/P3ePLMYs2RVChkJx/arcgis/rest/services/World_Continents/FeatureServer/0"
    df = spark.read.format("feature-service").load(url)
    df.select("FID", "CONTINENT", "SQMI", "SQKM", "Shape_Area", "shape").show()
    Result
    +---+-------------+--------------+--------------+-------------+--------------------+
    |FID|    CONTINENT|          SQMI|          SQKM|   Shape_Area|               shape|
    +---+-------------+--------------+--------------+-------------+--------------------+
    |  1|       Africa|1.1583462724E7|3.0001150784E7|2559.07309772|{"rings":[[[39505...|
    |  2|         Asia|1.7317280092E7|4.4851729022E7|5432.08522748|{"rings":[[[-2.00...|
    |  3|    Australia|  2973612.2055|   7701651.076|695.539920644|{"rings":[[[1.768...|
    |  4|      Oceania|  165678.71418|  429107.61696|42.5654703343|{"rings":[[[2.003...|
    |  5|South America|  6856255.3355|1.7757690859E7|1539.31293336|{"rings":[[[-7481...|
    |  6|   Antarctica|  4754809.4571| 1.231494924E7|6054.02150735|{"rings":[[[-2.00...|
    |  7|       Europe| 3821854.34569|  9898596.9251|1444.39561322|{"rings":[[[26548...|
    |  8|North America|  9339528.4866|2.4189364532E7| 3708.7527567|{"rings":[[[-9092...|
    +---+-------------+--------------+--------------+-------------+--------------------+
  3. Write the Spark DataFrame of world continents to a feature service layer using the Spark DataFrameWriter. The service name must be unique. If the layer name is not specified, the service name is used as the layer name.

    Python
    service_name = "continents"
    df.write.format("feature-service") \
            .option("gis", "myGIS") \
            .option("serviceName", service_name) \
            .option("layerName", "layer") \
            .option("tags", "continents, boundaries") \
            .option("description", "This is an example feature service showing boundaries of world continents") \
            .save()
  4. Overwrite the existing layer with a Spark DataFrame containing only the North America boundary using the Spark DataFrameWriter. When you load the feature service layer back (see the read-back sketch following these steps), the layer contains one feature (North America).

    Python
    north_america = df.where("continent = 'North America'")
    
    # You can find the service URL in your ArcGIS Online content after saving the Spark DataFrame
    service_url = "https://<host>/<uniqueID>/ArcGIS/rest/services/<serviceName>/FeatureServer"
    north_america.write.format("feature-service") \
                 .option("gis", "myGIS") \
                 .option("serviceUrl", service_url) \
                 .option("layerName", "layer") \
                 .mode("overwrite") \
                 .save()
  5. Append a Spark DataFrame containing the South America boundary to the existing layer using the Spark DataFrameWriter. When you load the feature service layer back, the layer contains two features (North America and South America).

    Python
    south_america = df.where("continent = 'South America'")
    # You can find the service URL in your ArcGIS Online content after saving the Spark DataFrame
    south_america.write.format("feature-service") \
                 .option("gis", "myGIS") \
                 .option("serviceUrl", service_url) \
                 .option("layerName", "layer") \
                 .mode("append") \
                 .save()
  6. Append a Spark DataFrame containing the Europe boundary to the existing layer with the truncate option set to true. All records in the existing layer are removed before the DataFrame is appended. When you load the feature service layer back, the layer contains one feature (Europe).

    Python
    europe = df.where("continent = 'Europe'")
    # You can find the service URL in your ArcGIS Online content after saving the Spark DataFrame
    europe.write.format("feature-service") \
          .option("gis", "myGIS") \
          .option("serviceUrl", service_url) \
          .option("layerName", "layer") \
          .option("truncate", "true") \
          .mode("append") \
          .save()
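
To confirm the results described in steps 4 through 6, you can read the layer back into a Spark DataFrame. The sketch below is a minimal example rather than part of the tutorial output: it assumes the layer URL is the service URL with the layer index (0) appended, and that the registered "myGIS" connection can also be used when reading.

    Python
    # Read the written layer back to verify its contents (layer index 0 is assumed)
    layer_url = service_url + "/0"
    check_df = spark.read.format("feature-service") \
                    .option("gis", "myGIS") \
                    .load(layer_url)

    # After step 6, this should return a single row: Europe
    check_df.select("CONTINENT").show()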

Write to a feature service using the ArcGIS API for Python

  1. Import arcgis and sign in to ArcGIS Online or ArcGIS Enterprise. For more information, see Working with different authentication schemes.

    Python
    from arcgis.gis import GIS
    username = "<username>"
    password = "<password>"
    gis = GIS(username=username, password=password)
    print("Successfully logged in as: " + gis.properties.user.username)
  2. Load a polygon feature service of US state boundaries into a Spark DataFrame.

    Python
    url = "https://services.arcgis.com/P3ePLMYs2RVChkJx/ArcGIS/rest/services/USA_States_Generalized/FeatureServer/0"
    us_states = spark.read.format("feature-service").load(url)
  3. Convert the Spark DataFrame to a Spatially Enabled DataFrame. This requires that the arcgis module is installed; arcgis is imported automatically when you call st.to_pandas_sdf().

    Python
    import geoanalytics_fabric
    from geoanalytics_fabric.sql import functions as ST
    us_states_sdf = us_states.st.to_pandas_sdf()
  4. Export the Spatially Enabled DataFrame to a feature layer hosted in ArcGIS Online or ArcGIS Enterprise using the account you signed in with in step 1.

    Python
    feature_layer = us_states_sdf.spatial.to_featurelayer('US States', sanitize_columns=True)
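
The to_featurelayer() call returns an item referencing the new hosted feature layer. As a quick follow-up sketch (assuming the standard Item properties title and id from the ArcGIS API for Python), you can inspect the returned item to confirm the layer was published under your account.

    Python
    # Inspect the returned item to confirm the feature layer was published
    print(feature_layer.title)  # e.g. "US States"
    print(feature_layer.id)     # item ID of the new hosted feature layer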

What's next?

For more information, see the Writing GIS Data documentation and the related ArcGIS API for Python guide topics.
