There are two ways to write Spark DataFrames to feature services. First, GeoAnalytics Engine supports saving to a feature service in ArcGIS Online or ArcGIS Enterprise with the Spark DataFrame writer. Alternatively, the ArcGIS API for Python allows you to write data in a Spatially Enabled DataFrame to a feature layer using the to_featurelayer() method. This tutorial shows you how to write to feature services using both approaches.
Prerequisites
The following are required for this tutorial:
A running Spark session configured with ArcGIS GeoAnalytics Engine.
A Jupyter or JupyterLab notebook connected to your Spark session.
The arcgis module.
An internet connection (for accessing sample data).
Steps
Import and authorize
In your notebook, import geoanalytics and authorize the module using a username and password, license file, or token.
Python
import geoanalytics
geoanalytics.auth(username="user1", password="p@ssword")
Write to feature services using the Spark DataFrame writer
Register a GIS with the ArcGIS Online or Portal for ArcGIS user that the feature service is shared with.
Python
geoanalytics.register_gis("myGIS", "https://arcgis.com", username="User", password="p@ssw0rd")
Load a polygon feature service of world continents into a Spark DataFrame. The field shape
is the geometry column of the DataFrame.
Python
url = "https://services.arcgis.com/P3ePLMYs2RVChkJx/arcgis/rest/services/World_Continents/FeatureServer/0"
df = spark.read.format("feature-service").load(url)
df.show()
Result
+---+-------------+----------------+--------------+-------------------+------------------+--------------------+
|FID| CONTINENT| SQMI| SQKM| Shape__Area| Shape__Length| shape|
+---+-------------+----------------+--------------+-------------------+------------------+--------------------+
| 1| Africa|1.158346272399E7|3.0001150784E7| 3.3535112664014E13|4.91447980539965E7|{"rings":[[[39505...|
| 2| Asia| 1.7317280092E7|4.4851729022E7|1.14529033853897E14|3.11176818750514E8|{"rings":[[[-2.00...|
| 3| Australia| 2973612.2055| 7701651.076|9.65215194657644E12|2.99695387322192E7|{"rings":[[[1.768...|
| 4|North America| 9339528.4866|2.4189364532E7|1.11314432724737E14|5.95152476132253E8|{"rings":[[[-9092...|
| 5| Oceania| 165678.71418| 429107.61696|6.58166981274528E11| 2.617751878519E7|{"rings":[[[2.003...|
| 6|South America| 6856255.3355|1.7757690859E7|2.06843927975048E13|7.73729124857082E7|{"rings":[[[-7481...|
| 7| Antarctica| 4754809.4571| 1.231494924E7|6.96642125484067E14|2.53068488966892E8|{"rings":[[[-2.00...|
| 8| Europe| 3821854.34569| 9898596.9251|3.50892430260398E13|2.36911732881397E8|{"rings":[[[26548...|
+---+-------------+----------------+--------------+-------------------+------------------+--------------------+
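As a quick sanity check on the table's units, the SQKM column should equal the SQMI column converted to square kilometers (1 square mile = 2.58998811 square kilometers). Using the Africa row as an example:

```python
# Verify that the SQKM column is the SQMI column converted to km^2
# (1 sq mi = 2.58998811 sq km), using the Africa row shown above.
sqmi = 1.158346272399e7   # Africa, SQMI column
sqkm = 3.0001150784e7     # Africa, SQKM column
print(abs(sqmi * 2.58998811 - sqkm) / sqkm < 1e-3)  # True
```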
Write the Spark DataFrame of world continents to a feature service layer using the Spark DataFrame writer.
The service name must be unique. If a layer name is not specified, the service name is used as the layer name.
Python
service_name = "continents"
df.write.format("feature-service") \
    .option("gis", "myGIS") \
    .option("serviceName", service_name) \
    .option("layerName", "layer") \
    .option("tags", "continents, boundaries") \
    .option("description", "This is an example feature service showing boundaries of world continents") \
    .save()
Overwrite the existing layer with a Spark DataFrame of North America continent boundaries using the Spark DataFrame writer.
When the feature service layer is loaded back, it contains one feature (North America).
Python
north_america = df.where("CONTINENT = 'North America'")
# You can find the service URL in your ArcGIS Online content after saving the Spark DataFrame
service_url = "https://<host>/<uniqueID>/ArcGIS/rest/services/<serviceName>/FeatureServer"
north_america.write.format("feature-service") \
    .option("gis", "myGIS") \
    .option("serviceUrl", service_url) \
    .option("layerName", "layer") \
    .mode("overwrite") \
    .save()
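The serviceUrl placeholders follow a fixed pattern for hosted feature services. A small helper (hypothetical, for illustration only, using the sample org ID from the read URL earlier; your host and org ID will differ) shows how the pieces fit together:

```python
def hosted_service_url(host, org_id, service_name):
    # Hypothetical helper: assemble the root URL of a hosted feature
    # service from its parts. The actual URL is listed on the item's
    # page in your ArcGIS Online content after the first save.
    return f"https://{host}/{org_id}/ArcGIS/rest/services/{service_name}/FeatureServer"

url = hosted_service_url("services.arcgis.com", "P3ePLMYs2RVChkJx", "continents")
```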
Append a Spark DataFrame of South America continent boundaries to the existing layer using the Spark DataFrame writer.
When the feature service layer is loaded back, it contains two features (North America and South America).
Python
south_america = df.where("CONTINENT = 'South America'")
# You can find the service URL in your ArcGIS Online content after saving the Spark DataFrame
south_america.write.format("feature-service") \
    .option("gis", "myGIS") \
    .option("serviceUrl", service_url) \
    .option("layerName", "layer") \
    .mode("append") \
    .save()
Append a Spark DataFrame of Europe continent boundaries to the existing layer with the truncate option set to true.
All records in the existing layer are removed before the DataFrame is appended. When the feature service layer is loaded back, it contains one feature (Europe).
Python
europe = df.where("CONTINENT = 'Europe'")
# You can find the service URL in your ArcGIS Online content after saving the Spark DataFrame
europe.write.format("feature-service") \
    .option("gis", "myGIS") \
    .option("serviceUrl", service_url) \
    .option("layerName", "layer") \
    .option("truncate", "true") \
    .mode("append") \
    .save()
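To recap, the three writes above change the layer's records in different ways. Their semantics can be sketched in plain Python (a schematic model for illustration, not an API call):

```python
def apply_write(layer, features, mode, truncate=False):
    # Model how a feature-service write changes the records in a layer.
    if mode == "overwrite":
        return list(features)                    # replace everything
    if mode == "append":
        base = [] if truncate else list(layer)   # truncate clears first
        return base + list(features)
    raise ValueError(f"unsupported mode: {mode}")

layer = ["Africa", "Asia", "Australia", "North America",
         "Oceania", "South America", "Antarctica", "Europe"]
layer = apply_write(layer, ["North America"], "overwrite")       # 1 feature
layer = apply_write(layer, ["South America"], "append")          # 2 features
layer = apply_write(layer, ["Europe"], "append", truncate=True)  # 1 feature
```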
Write to a feature service using the ArcGIS API for Python
Import arcgis and log in to ArcGIS Online or ArcGIS Enterprise. For more information, see Working with different authentication schemes.
Python
from arcgis.gis import GIS

username = "<username>"
password = "<password>"
gis = GIS(username=username, password=password)
print("Successfully logged in as: " + gis.properties.user.username)
Load a polygon feature service of US state boundaries into a Spark DataFrame.
Python
url = "https://services.arcgis.com/P3ePLMYs2RVChkJx/ArcGIS/rest/services/USA_States_Generalized/FeatureServer/0"
us_states = spark.read.format("feature-service").load(url)
Convert the Spark DataFrame to a Spatially Enabled DataFrame. This requires the arcgis module, which is imported automatically when st.to_pandas_sdf() is called.
Python
us_states_sdf = us_states.st.to_pandas_sdf()
Export the Spatially Enabled DataFrame to a feature layer hosted in ArcGIS Online or ArcGIS Enterprise using the account that you signed in with in step 1.
Python
feature_layer = us_states_sdf.spatial.to_featurelayer('US States')
What's next?
For more information, see the Writing GIS Data documentation and the following ArcGIS API for Python guide topics: