Augment reality to navigate route


Use a route displayed in the real world to navigate.

Image of augment reality to navigate route sample

Use case

It can be hard to navigate unfamiliar environments using 2D maps. You can use full-scale AR to show a route overlaid on the real world for easier navigation.

How to use the sample

The sample opens with a map centered on the current location. Tap the map to add an origin and a destination; the route will be shown as a line.

When ready, tap the camera button to start the AR navigation. Calibrate the heading before starting to navigate.

When you start, route instructions will be displayed and spoken. As you proceed through the route, new directions will be provided until you arrive.

How it works

  1. The map page is used to plan the route before starting the AR experience. See Navigate route, Find route, and Offline routing samples for a more focused demonstration of that workflow.
  2. Pass the resulting RouteResult and the input RouteTask and RouteParameters to the view used for the AR portion of the navigation experience.
    • The route task and parameters are used to support a rerouting capability where routes are recalculated on-the-fly if you deviate. Due to service limitations, this sample doesn't support on-the-fly rerouting. You can incorporate offline routing to support rerouting in your app.
  3. Start ARKit tracking with continuous location updates when the AR view is shown.
  4. Get the route geometry from the first route in the RouteResult. Use the scene's base surface to apply elevation to the line so that it will follow the terrain.
    • First, densify the polyline with GeometryEngine.densify(_:maxSegmentLength:) to ensure that the elevation adjustment can be applied smoothly along the line.
    • Next, create a polyline builder with a spatial reference matching the input route geometry.
    • Get a list of all points in the polyline by iterating through the parts and the points along each part.
    • For each point in the polyline, use surface.elevation(for: point) to get the elevation for that point. Then create a new point with the x and y of the input and z as the returned elevation value. This sample adds 3 meters to that value so that the route line is visible above the road. Add the new point to the polyline builder with builder.add(newPoint)
    • Once all points have been given an elevation and added to the polyline builder, call toGeometry() on the polyline builder to get the elevation-adjusted route line.
  5. Add the route geometry to a graphics overlay and add a renderer to the graphics overlay. This sample uses a MultilayerPolylineSymbol with a SolidStrokeSymbolLayer to visualize a tube along the route line.
  6. The WorldScaleSceneView has a calibration view that uses sliders to manipulate the heading (direction you are facing) and elevation. Because of limitations in on-device compasses, calibration is often necessary; small errors in heading cause big problems with the placement of scene content in the world.
    • The calibration view slider in the sample implements a 'joystick' interaction; the heading is adjusted faster the further you move from the center of the slider.
  7. When the user starts navigating, create a RouteTracker, providing a RouteResult and the index of the route you want to use; this sample always picks the first returned result.
  8. Create a location data source and listen for location change events. When the location changes, call track(_:) on the route tracker with the updated location.
  9. Keep the calibration view accessible throughout the navigation experience. As the user walks, small heading errors may become more noticeable and require recalibration.
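The elevation-adjustment workflow in step 4 can be sketched as a single function. This is an illustrative sketch, not the sample's code: the function name is invented, and it follows the method names given in the steps above (GeometryEngine.densify(_:maxSegmentLength:), surface.elevation(for:), builder.add(_:), toGeometry()); check the signatures against the current API reference before using it.

```swift
import ArcGIS

/// Returns a copy of a route polyline raised 3 meters above the given surface,
/// so the line follows the terrain and stays visible above the road.
/// Illustrative sketch of step 4 above; names and labels are assumptions.
func makeElevatedRouteLine(from routeLine: Polyline, surface: Surface) async throws -> Geometry {
    // Densify so the elevation adjustment is applied smoothly along the line.
    guard let densified = GeometryEngine.densify(
        routeLine,
        maxSegmentLength: 0.3
    ) as? Polyline else {
        return routeLine
    }
    // Build a new polyline in the same spatial reference as the input geometry.
    let builder = PolylineBuilder(spatialReference: densified.spatialReference)
    // Iterate through parts and points along each part.
    for part in densified.parts {
        for point in part.points {
            // Sample the surface, then lift the point 3 m above the terrain.
            let elevation = try await surface.elevation(for: point)
            let newPoint = Point(
                x: point.x,
                y: point.y,
                z: elevation + 3,
                spatialReference: point.spatialReference
            )
            builder.add(newPoint)
        }
    }
    // The elevation-adjusted route line.
    return builder.toGeometry()
}
```

The resulting geometry is what gets added to the graphics overlay in step 5 and rendered with the tube symbol.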

Relevant API

  • GeometryEngine
  • LocationDataSource
  • RouteResult
  • RouteTask
  • RouteTracker
  • Surface
  • WorldScaleSceneView

About the data

This sample uses Esri's world elevation service to ensure that route lines are placed appropriately in 3D space. It uses Esri's world routing service to calculate routes. The world routing service requires authentication and consumes ArcGIS Online credits.

Additional information

This sample requires a device that is compatible with ARKit on iOS.

Unlike other scene samples, there's no need for a basemap while navigating, because context is provided by the camera feed showing the real environment. The base surface's opacity is set to zero to prevent it from interfering with the AR experience.

A digital elevation model is used to ensure that the displayed route is positioned appropriately relative to the terrain of the route. If you don't want to display the route line floating, you could show the line draped on the surface instead.

World-scale AR is one of three main patterns for working with geographic information in augmented reality. Augmented reality is made possible with the ArcGIS Maps SDK Toolkit. See Augmented reality in the guide for more information about augmented reality and adding it to your app.

Because most navigation scenarios involve traveling beyond the accurate range for ARKit positioning, this sample relies on continuous location updates from the location data source. Because the origin camera is constantly being reset by the location data source, the sample doesn't allow the user to pan to calibrate or adjust the altitude with a slider. The location data source doesn't provide a heading, so it isn't overwritten when the location refreshes.
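Steps 7–8 above, feeding continuous location updates into the route tracker, can be sketched as follows. This is a hedged illustration: the function name is invented, the RouteTracker initializer labels and the `locations` async stream are assumptions based on the steps described, so verify them against the current API reference.

```swift
import ArcGIS

/// Drives turn-by-turn tracking from a location data source.
/// Illustrative sketch of steps 7-8 above; initializer labels are assumptions.
func startTracking(
    routeResult: RouteResult,
    locationDataSource: LocationDataSource
) async throws {
    // Create a tracker for the first returned route, as this sample does.
    guard let routeTracker = RouteTracker(
        routeResult: routeResult,
        routeIndex: 0,
        skipsCoincidentStops: true
    ) else { return }

    // Start the data source, then forward each location change to the tracker
    // so it can update progress and emit new voice/text directions.
    try await locationDataSource.start()
    for await location in locationDataSource.locations {
        try await routeTracker.track(location)
    }
}
```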

Tags

augmented reality, directions, full-scale, guidance, mixed reality, navigate, navigation, real-scale, route, routing, world-scale

Sample Code

AugmentRealityToNavigateRouteView.swift
// Copyright 2024 Esri
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
//   https://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.

import ArcGIS
import CoreLocation
import SwiftUI

@available(macCatalyst, unavailable)
struct AugmentRealityToNavigateRouteView: View {
    /// The view model for the map view in the sample.
    @StateObject private var model = MapModel()

    /// The error shown in the error alert.
    @State private var error: Error?

    /// The point on the map where the user tapped.
    @State private var tapLocation: Point?

    var body: some View {
        MapView(
            map: model.map,
            graphicsOverlays: model.graphicsOverlays
        )
        .locationDisplay(model.locationDisplay)
        .onSingleTapGesture { _, mapPoint in
            tapLocation = mapPoint
        }
        .onDisappear {
            Task {
                await model.locationDisplay.dataSource.stop()
            }
        }
        .task(id: tapLocation) {
            guard let tapLocation else { return }

            do {
                try await model.addRouteGraphic(for: tapLocation)
            } catch {
                self.error = error
            }
        }
        .task {
            do {
                try await model.setUp()
            } catch {
                self.error = error
            }
        }
        .overlay(alignment: .top) {
            instructionText
        }
        .toolbar {
            ToolbarItemGroup(placement: .bottomBar) {
                toolbarButtons
            }
        }
        .errorAlert(presentingError: $error)
    }

    /// The buttons in the bottom toolbar.
    @ViewBuilder private var toolbarButtons: some View {
        Spacer()
        NavigationLink {
            ARRouteSceneView(model: model.sceneModel)
        } label: {
            Image(systemName: "camera")
                .imageScale(.large)
        }
        .disabled(!model.didSelectRoute)

        Spacer()
        Button {
            model.reset()
            model.statusText = "Tap to place a start point."
        } label: {
            Image(systemName: "trash")
                .imageScale(.large)
        }
        .disabled(!model.didSelectRouteStop && !model.didSelectRoute)
    }

    /// The instruction text in the overlay.
    private var instructionText: some View {
        Text(model.statusText)
            .multilineTextAlignment(.center)
            .frame(maxWidth: .infinity, alignment: .center)
            .padding(8)
            .background(.thinMaterial, ignoresSafeAreaEdges: .horizontal)
    }
}

@available(macCatalyst, unavailable)
private extension AugmentRealityToNavigateRouteView {
    // MARK: Map Model

    /// A view model for this example.
    @MainActor
    class MapModel: ObservableObject {
        /// A map with an imagery basemap style.
        let map = Map(basemapStyle: .arcGISImagery)

        /// The map's location display.
        let locationDisplay: LocationDisplay = {
            let locationDisplay = LocationDisplay()
            locationDisplay.autoPanMode = .recenter
            return locationDisplay
        }()

        /// The graphics overlay for the stops.
        private let stopGraphicsOverlay = GraphicsOverlay()

        /// A graphics overlay for route graphics.
        private let routeGraphicsOverlay: GraphicsOverlay = {
            let overlay = GraphicsOverlay()
            overlay.renderer = SimpleRenderer(
                symbol: SimpleLineSymbol(style: .solid, color: .yellow, width: 5)
            )
            return overlay
        }()

        /// The map's graphics overlays.
        var graphicsOverlays: [GraphicsOverlay] {
            return [stopGraphicsOverlay, routeGraphicsOverlay]
        }
        /// A point representing the start of navigation.
        private var startPoint: Point? {
            didSet {
                let stopSymbol = PictureMarkerSymbol(image: .stopA)
                let startStopGraphic = Graphic(geometry: startPoint, symbol: stopSymbol)
                stopGraphicsOverlay.addGraphic(startStopGraphic)
            }
        }

        /// A point representing the destination of navigation.
        private var endPoint: Point? {
            didSet {
                let stopSymbol = PictureMarkerSymbol(image: .stopB)
                let endStopGraphic = Graphic(geometry: endPoint, symbol: stopSymbol)
                stopGraphicsOverlay.addGraphic(endStopGraphic)
            }
        }

        /// A Boolean value indicating whether a route stop is selected.
        var didSelectRouteStop: Bool {
            startPoint != nil || endPoint != nil
        }

        /// A Boolean value indicating whether a route is selected.
        var didSelectRoute: Bool {
            startPoint != nil && endPoint != nil
        }

        /// The view model for scene view in the sample.
        let sceneModel = SceneModel()

        /// The status text displayed to the user.
        @Published var statusText = "Tap to place a start point."

        /// Performs important tasks including setting up the location display, creating route parameters,
        /// and loading the scene elevation source.
        func setUp() async throws {
            try await startLocationDisplay()
            try await makeParameters()
            try await sceneModel.loadElevationSource()
        }

        /// Starts the location display to show user's location on the map.
        func startLocationDisplay() async throws {
            // Request location permission if it has not yet been determined.
            let locationManager = CLLocationManager()
            if locationManager.authorizationStatus == .notDetermined {
                locationManager.requestWhenInUseAuthorization()
            }

            locationDisplay.dataSource = sceneModel.locationDataSource

            // Start the location display to zoom to the user's current location.
            try await locationDisplay.dataSource.start()
        }

        /// Creates walking route parameters.
        func makeParameters() async throws {
            let parameters = try await sceneModel.routeTask.makeDefaultParameters()

            if let walkMode = sceneModel.routeTask.info.travelModes.first(where: { $0.name.contains("Walking") }) {
                parameters.travelMode = walkMode
                parameters.returnsStops = true
                parameters.returnsDirections = true
                parameters.returnsRoutes = true
                sceneModel.routeParameters = parameters
            }
        }

        /// Adds a route graphic for the selected route using a given start and end point.
        /// - Parameter mapPoint: The map point for the route start or end point.
        func addRouteGraphic(for mapPoint: Point) async throws {
            if startPoint == nil {
                startPoint = mapPoint
                statusText = "Tap to place destination."
            } else if endPoint == nil {
                endPoint = mapPoint
                sceneModel.routeParameters.setStops(makeStops())

                let routeResult = try await sceneModel.routeTask.solveRoute(
                    using: sceneModel.routeParameters
                )
                if let firstRoute = routeResult.routes.first {
                    let routeGraphic = Graphic(geometry: firstRoute.geometry)
                    routeGraphicsOverlay.addGraphic(routeGraphic)
                    sceneModel.routeResult = routeResult
                    try await sceneModel.makeRouteOverlay()
                    statusText = "Tap camera to start navigation."
                }
            }
        }

        /// Creates the start and destination stops for the navigation.
        private func makeStops() -> [Stop] {
            guard let startPoint, let endPoint else { return [] }
            let stop1 = Stop(point: startPoint)
            stop1.name = "Start"
            let stop2 = Stop(point: endPoint)
            stop2.name = "Destination"
            return [stop1, stop2]
        }

        /// Resets the start and destination stops for the navigation.
        func reset() {
            routeGraphicsOverlay.removeAllGraphics()
            stopGraphicsOverlay.removeAllGraphics()
            sceneModel.routeParameters.clearStops()
            startPoint = nil
            endPoint = nil
        }
    }
}
