Augment reality to navigate route


Use a route displayed in the real world to navigate.

Image of augment reality to navigate route sample

Use case

It can be hard to navigate using 2D maps in unfamiliar environments. You can use full-scale AR to show a route overlaid on the real world for easier navigation.

How to use the sample

The sample opens with a map centered on the current location. Tap the map to add an origin and a destination; the route will be shown as a line.

When ready, tap the camera button to start the AR navigation. Calibrate the heading before starting to navigate.

When you start, route instructions will be displayed and spoken. As you proceed through the route, new directions will be provided until you arrive.

How it works

  1. The map page is used to plan the route before starting the AR experience. See the Navigate route, Find route, and Offline routing samples for a more focused demonstration of that workflow.
  2. Pass the resulting RouteResult and the input RouteTask and RouteParameters to the view used for the AR portion of the navigation experience.
    • The route task and parameters are used to support a rerouting capability where routes are recalculated on-the-fly if you deviate. Due to service limitations, this sample doesn't support on-the-fly rerouting. You can incorporate offline routing to support rerouting in your app.
  3. Start ARKit/ARCore tracking with continuous location updates when the AR view is shown.
  4. Get the route geometry from the first route in the RouteResult. Use the scene's base surface to apply elevation to the line so that it will follow the terrain.
    • First, densify the polyline with GeometryEngine.densify(_:maxSegmentLength:) so that the elevation adjustment can be applied smoothly along the line.
    • Next, create a polyline builder with a spatial reference matching the input route geometry.
    • Get a list of all points in the polyline by iterating through the parts and the points along each part.
    • For each point in the polyline, use surface.elevation(at: point) to get the elevation for that point. Then create a new point with the x and y of the input and the returned elevation value as z. This sample adds 3 meters to that value so that the route line is visible above the road. Add the new point to the polyline builder with builder.add(newPoint).
    • Once all points have been given an elevation and added to the polyline builder, call toGeometry() on the builder to get the elevation-adjusted route line.
  5. Add the route geometry to a graphics overlay and add a renderer to the graphics overlay. This sample uses a MultilayerPolylineSymbol with a SolidStrokeSymbolLayer to visualize a tube along the route line.
  6. The WorldScaleSceneView has a calibration view that uses sliders to manipulate the heading (the direction you are facing) and the elevation. Because of limitations in on-device compasses, calibration is often necessary; small errors in heading cause large errors in the placement of scene content in the world.
    • The calibration view slider in the sample implements a 'joystick' interaction; the heading is adjusted faster the further you move from the center of the slider.
  7. When the user starts navigating, create a RouteTracker, providing a RouteResult and the index of the route you want to use; this sample always picks the first returned result.
  8. Create a location data source and listen for location change events. When the location changes, call track(_:) on the route tracker with the updated location.
  9. Keep the calibration view accessible throughout the navigation experience. As the user walks, small heading errors may become more noticeable and require recalibration.
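The route-tracking workflow in steps 7 and 8 can be sketched as follows. This is a condensed, illustrative snippet (the startTracking function name is hypothetical) that mirrors the pattern used in the full sample code below:

```swift
import ArcGIS

/// Creates a route tracker for a route result and feeds it device locations.
/// This is an illustrative sketch; see the full sample code for details.
func startTracking(routeResult: RouteResult, locationDataSource: LocationDataSource) async throws {
    // Step 7: Create a route tracker for the first route in the result.
    // The initializer is failable, so unwrap the result.
    guard let routeTracker = RouteTracker(
        routeResult: routeResult,
        routeIndex: 0,
        skipsCoincidentStops: true
    ) else { return }

    // Step 8: Start the location data source and pass each location
    // update to the tracker so it can update the navigation status.
    try await locationDataSource.start()
    for await location in locationDataSource.locations {
        try await routeTracker.track(location)
    }
}
```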

Relevant API

  • GeometryEngine
  • LocationDataSource
  • RouteResult
  • RouteTask
  • RouteTracker
  • Surface
  • WorldScaleSceneView

About the data

This sample uses Esri's world elevation service to ensure that route lines are placed appropriately in 3D space. It uses Esri's world routing service to calculate routes. The world routing service requires authentication and consumes ArcGIS Online credits.

Additional information

This sample requires a device that is compatible with ARKit on iOS.

Unlike other scene samples, there's no need for a basemap while navigating, because context is provided by the camera feed showing the real environment. The base surface's opacity is set to zero to prevent it from interfering with the AR experience.

A digital elevation model is used to ensure that the displayed route is positioned appropriately relative to the terrain of the route. If you don't want to display the route line floating, you could show the line draped on the surface instead.
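Draping could be achieved by changing the overlay's surface placement instead of adjusting elevation per point; a minimal sketch, assuming the same graphics overlay setup as this sample:

```swift
import ArcGIS

// Drape the route line on the terrain instead of placing it at absolute
// elevations. In a draped mode, no per-point elevation adjustment is needed,
// but the line sits on the ground rather than floating above it.
let routeOverlay = GraphicsOverlay()
routeOverlay.sceneProperties.surfacePlacement = .drapedFlat
```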

World-scale AR is one of three main patterns for working with geographic information in augmented reality. Augmented reality is made possible with the ArcGIS Runtime Toolkit. See Augmented reality in the guide for more information about augmented reality and adding it to your app.

Because most navigation scenarios involve traveling beyond the accurate range for ARKit/ARCore positioning, this sample relies on continuous location updates from the location data source. Because the origin camera is constantly being reset by the location data source, the sample doesn't allow the user to pan to calibrate or adjust the altitude with a slider. The location data source doesn't provide a heading, so the calibrated heading isn't overwritten when the location refreshes.

Tags

augmented reality, directions, full-scale, guidance, mixed reality, navigate, navigation, real-scale, route, routing, world-scale

Sample Code

AugmentRealityToNavigateRouteView.swift, AugmentRealityToNavigateRouteView.RoutePlannerView.swift
// Copyright 2024 Esri
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
//   https://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.

import ArcGIS
import ArcGISToolkit
import AVFoundation
import SwiftUI

struct AugmentRealityToNavigateRouteView: View {
    /// The data model for the selected route.
    @StateObject private var routeDataModel = RouteDataModel()
    /// A Boolean value indicating whether the route planner is showing.
    @State private var isShowingRoutePlanner = true
    /// The location data source used to access the device location.
    @State private var locationDataSource = SystemLocationDataSource()
    /// A scene with an imagery basemap.
    @State private var scene = Scene(basemapStyle: .arcGISImagery)
    /// The elevation surface set to the base surface of the scene.
    @State private var elevationSurface: Surface = {
        let elevationSurface = Surface()
        elevationSurface.navigationConstraint = .unconstrained
        elevationSurface.opacity = 0
        elevationSurface.backgroundGrid.isVisible = false
        return elevationSurface
    }()
    /// The elevation source with elevation service URL.
    @State private var elevationSource = ArcGISTiledElevationSource(url: URL(string: "https://elevation3d.arcgis.com/arcgis/rest/services/WorldElevation3D/Terrain3D/ImageServer")!)
    /// The graphics overlay containing a graphic.
    @State private var graphicsOverlay = GraphicsOverlay()
    /// The status text displayed to the user.
    @State private var statusText = "Adjust calibration before starting."
    /// A Boolean value indicating whether the user is navigating the route.
    @State private var isNavigating = false
    /// The result of the route selected in the route planner view.
    @State private var routeResult: RouteResult?

    init() {
        elevationSurface.addElevationSource(elevationSource)
        scene.baseSurface = elevationSurface
    }

    var body: some View {
        if isShowingRoutePlanner {
            RoutePlannerView(isShowing: $isShowingRoutePlanner)
                .onDidSelectRoute { routeGraphic, routeResult  in
                    self.routeResult = routeResult
                    graphicsOverlay = makeRouteOverlay(
                        routeResult: routeResult,
                        routeGraphic: routeGraphic
                    )
                }
                .task {
                    try? await elevationSource.load()
                }
        } else {
            VStack(spacing: 0) {
                WorldScaleSceneView { _ in
                    SceneView(scene: scene, graphicsOverlays: [graphicsOverlay])
                }
                .calibrationButtonAlignment(.bottomLeading)
                .onCalibratingChanged { isPresented in
                    scene.baseSurface.opacity = isPresented ? 0.6 : 0
                }
                .task {
                    try? await locationDataSource.start()

                    for await location in locationDataSource.locations {
                        try? await routeDataModel.routeTracker?.track(location)
                    }
                }
                .overlay(alignment: .top) {
                    Text(statusText)
                        .multilineTextAlignment(.center)
                        .frame(maxWidth: .infinity, alignment: .center)
                        .padding(8)
                        .background(.regularMaterial, ignoresSafeAreaEdges: .horizontal)
                }
                .onDisappear {
                    Task { await locationDataSource.stop() }
                }
                Divider()
            }
            .toolbar {
                ToolbarItemGroup(placement: .bottomBar) {
                    Button("Start") {
                        isNavigating = true
                        Task {
                            do {
                                try await startNavigation()
                            } catch {
                                print("Failed to start navigation.")
                            }
                        }
                    }
                    .disabled(isNavigating)
                }
            }
        }
    }

    /// Creates a graphics overlay and adds a graphic (with solid yellow 3D tube symbol)
    /// to represent the route.
    private func makeRouteOverlay(routeResult: RouteResult, routeGraphic: Graphic) -> GraphicsOverlay {
        let graphicsOverlay = GraphicsOverlay()
        graphicsOverlay.sceneProperties.surfacePlacement = .absolute
        let strokeSymbolLayer = SolidStrokeSymbolLayer(
            width: 1.0,
            color: .yellow,
            lineStyle3D: .tube
        )
        let polylineSymbol = MultilayerPolylineSymbol(symbolLayers: [strokeSymbolLayer])
        let polylineRenderer = SimpleRenderer(symbol: polylineSymbol)
        graphicsOverlay.renderer = polylineRenderer

        if let originalPolyline = routeResult.routes.first?.geometry {
            addingElevation(3, to: originalPolyline) { polyline in
                routeGraphic.geometry = polyline
                graphicsOverlay.addGraphic(routeGraphic)
            }
        }

        return graphicsOverlay
    }

    /// Densify the polyline geometry so the elevation can be adjusted every 0.3 meters,
    /// and add an elevation to the geometry.
    ///
    /// - Parameters:
    ///   - z: A `Double` value representing z elevation.
    ///   - polyline: The polyline geometry of the route.
    ///   - completion: A completion closure to execute after the polyline is generated with success or not.
    private func addingElevation(
        _ z: Double,
        to polyline: Polyline,
        completion: @escaping (Polyline) -> Void
    ) {
        if let densifiedPolyline = GeometryEngine.densify(polyline, maxSegmentLength: 0.3) as? Polyline {
            let polylineBuilder = PolylineBuilder(spatialReference: densifiedPolyline.spatialReference)
            Task {
                for part in densifiedPolyline.parts {
                    for point in part.points {
                        let elevation = try await elevationSurface.elevation(at: point)
                        // Put the new point `z` meters above the ground elevation.
                        let newPoint = GeometryEngine.makeGeometry(from: point, z: elevation + z)
                        polylineBuilder.add(newPoint)
                    }
                }
                completion(polylineBuilder.toGeometry())
            }
        } else {
            completion(polyline)
        }
    }

    /// Starts navigating the route.
    private func startNavigation() async throws {
        guard let routeResult else { return }
        let routeTracker = RouteTracker(
            routeResult: routeResult,
            routeIndex: 0,
            skipsCoincidentStops: true
        )
        guard let routeTracker else { return }

        routeTracker.voiceGuidanceUnitSystem = Locale.current.usesMetricSystem ? .metric : .imperial

        routeDataModel.routeTracker = routeTracker

        try await routeDataModel.routeTask.load()

        if routeDataModel.routeTask.info.supportsRerouting,
           let reroutingParameters = ReroutingParameters(
            routeTask: routeDataModel.routeTask,
            routeParameters: routeDataModel.routeParameters
           ) {
            try await routeTracker.enableRerouting(using: reroutingParameters)
        }

        statusText = "Navigation will start."
        await startTracking()
    }

    /// Starts monitoring multiple asynchronous streams of information.
    private func startTracking() async {
        await withTaskGroup(of: Void.self) { group in
            group.addTask { await trackStatus() }
            group.addTask { await routeDataModel.trackVoiceGuidance() }
        }
    }

    /// Monitors the asynchronous stream of tracking statuses.
    ///
    /// When new statuses are delivered, update the route's traversed and remaining graphics.
    private func trackStatus() async {
        guard let routeTracker = routeDataModel.routeTracker else { return }
        for await status in routeTracker.$trackingStatus {
            guard let status else { continue }
            switch status.destinationStatus {
            case .notReached, .approaching:
                if let route = routeResult?.routes.first {
                    let currentManeuver = route.directionManeuvers[status.currentManeuverIndex]
                    statusText = currentManeuver.text
                }
            case .reached:
                statusText = "You have arrived!"
            @unknown default:
                break
            }
        }
    }
}

extension AugmentRealityToNavigateRouteView {
    @MainActor
    class RouteDataModel: ObservableObject {
        /// An AVSpeechSynthesizer for text to speech.
        let speechSynthesizer = AVSpeechSynthesizer()
        /// The route task that solves the route using the online routing service, using API key authentication.
        let routeTask = RouteTask(url: URL(string: "https://route-api.arcgis.com/arcgis/rest/services/World/Route/NAServer/Route_World")!)
        /// The parameters for route task to solve a route.
        var routeParameters = RouteParameters()
        /// The route tracker.
        @Published var routeTracker: RouteTracker?
        /// The route result.
        @Published var routeResult: RouteResult?

        /// Monitors the asynchronous stream of voice guidances.
        func trackVoiceGuidance() async {
            guard let routeTracker = routeTracker else { return }
            for await voiceGuidance in routeTracker.voiceGuidances {
                speechSynthesizer.stopSpeaking(at: .word)
                speechSynthesizer.speak(AVSpeechUtterance(string: voiceGuidance.text))
            }
        }
    }
}
