Use a route displayed in the real world to navigate.
## Use case
It can be hard to navigate using 2D maps in unfamiliar environments. You can use full-scale AR to show a route overlaid on the real-world for easier navigation.
## How to use the sample
The sample opens with a map centered on the current location. Tap the map to add an origin and a destination; the route will be shown as a line.
When ready, tap the camera button to start the AR navigation. Calibrate the heading before starting to navigate.
When you start, route instructions will be displayed and spoken. As you proceed through the route, new directions will be provided until you arrive.
## How it works
- The map page is used to plan the route before starting the AR experience. See the Navigate route, Find route, and Offline routing samples for a more focused demonstration of that workflow.
- Pass the resulting `RouteResult` and the input `RouteTask` and `RouteParameters` to the view used for the AR portion of the navigation experience.
    - The route task and parameters are used to support a rerouting capability where routes are recalculated on the fly if you deviate. Due to service limitations, this sample doesn't support on-the-fly rerouting. You can incorporate offline routing to support rerouting in your app.
- Start ARKit/ARCore tracking with continuous location updates when the AR view is shown.
- Get the route geometry from the first route in the `RouteResult`. Use the scene's base surface to apply elevation to the line so that it will follow the terrain.
    - First, densify the polyline with `GeometryEngine.densify(_:maxSegmentLength:)` to ensure that the elevation adjustment can be applied smoothly along the line.
    - Next, create a polyline builder with a spatial reference matching the input route geometry.
    - Get a list of all points in the polyline by iterating through parts and points along each part.
    - For each point in the polyline, use `surface.elevation(at: point)` to get the elevation for that point. Then create a new point with the x and y of the input and z as the returned elevation value. This sample adds 3 meters to that value so that the route line is visible above the road. Add the new point to the polyline builder with `builder.add(newPoint)`.
    - Once all points have been given an elevation and added to the polyline builder, call `toGeometry()` on the polyline builder to get the elevation-adjusted route line.
- Add the route geometry to a graphics overlay and add a renderer to the graphics overlay. This sample uses a `MultilayerPolylineSymbol` with a `SolidStrokeSymbolLayer` to visualize a tube along the route line.
- The `WorldScaleSceneView` has a calibration view that uses sliders to manipulate the heading (the direction you are facing) and elevation. Because of limitations in on-device compasses, calibration is often necessary; small errors in heading cause big problems with the placement of scene content in the world.
    - The calibration view slider in the sample implements a 'joystick' interaction; the heading is adjusted faster the further you move from the center of the slider.
- When the user starts navigating, create a `RouteTracker`, providing a `RouteResult` and the index of the route you want to use; this sample always picks the first returned result.
- Create a location data source and listen for location change events. When the location changes, call `track(_:)` on the route tracker with the updated location.
- Keep the calibration view accessible throughout the navigation experience. As the user walks, small heading errors may become more noticeable and require recalibration.
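The densify-and-elevate steps above can be sketched in plain Swift. `Point`, `densify(_:maxSegmentLength:)`, and `addingElevation(_:to:elevation:)` below are simplified, hypothetical stand-ins for the ArcGIS `Polyline`, `GeometryEngine.densify(_:maxSegmentLength:)`, and `Surface.elevation(at:)` APIs; only the geometry logic is illustrated.

```swift
import Foundation

// Hypothetical stand-in for an ArcGIS point; not the SDK type.
struct Point {
    var x, y, z: Double
}

/// Densifies a 2D path so that no segment is longer than `maxSegmentLength`.
func densify(_ points: [Point], maxSegmentLength: Double) -> [Point] {
    guard let first = points.first else { return [] }
    var result = [first]
    for (a, b) in zip(points, points.dropFirst()) {
        let length = ((b.x - a.x) * (b.x - a.x) + (b.y - a.y) * (b.y - a.y)).squareRoot()
        let steps = max(1, Int(ceil(length / maxSegmentLength)))
        for i in 1...steps {
            let t = Double(i) / Double(steps)
            result.append(Point(x: a.x + t * (b.x - a.x), y: a.y + t * (b.y - a.y), z: 0))
        }
    }
    return result
}

/// Replaces each vertex's z with the surface elevation plus a fixed offset
/// (3 meters in the sample, so the line floats visibly above the road).
func addingElevation(_ offset: Double, to points: [Point], elevation: (Point) -> Double) -> [Point] {
    points.map { Point(x: $0.x, y: $0.y, z: elevation($0) + offset) }
}

// Usage: a 1-meter segment densified every 0.3 meters over a flat 10 m surface.
let route = [Point(x: 0, y: 0, z: 0), Point(x: 1, y: 0, z: 0)]
let dense = densify(route, maxSegmentLength: 0.3)
let elevated = addingElevation(3, to: dense) { _ in 10 }
print(dense.count)                             // 5 vertices at x = 0, 0.25, 0.5, 0.75, 1
print(elevated.allSatisfy { $0.z == 13 })      // true: 10 m surface + 3 m offset
```

The real sample does the same work asynchronously because `Surface.elevation(at:)` fetches values from the elevation service.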
## Relevant API
- GeometryEngine
- LocationDataSource
- RouteResult
- RouteTask
- RouteTracker
- Surface
- WorldScaleSceneView
## About the data
This sample uses Esri's world elevation service to ensure that route lines are placed appropriately in 3D space. It uses Esri's world routing service to calculate routes; that service requires authentication and consumes ArcGIS Online credits.
## Additional information
This sample requires a device that is compatible with ARKit on iOS.
Unlike other scene samples, there's no need for a basemap while navigating, because context is provided by the camera feed showing the real environment. The base surface's opacity is set to zero to prevent it from interfering with the AR experience.
A digital elevation model is used to ensure that the displayed route is positioned appropriately relative to the terrain of the route. If you don't want to display the route line floating, you could show the line draped on the surface instead.
World-scale AR is one of three main patterns for working with geographic information in augmented reality. Augmented reality is made possible with the ArcGIS Runtime Toolkit. See Augmented reality in the guide for more information about augmented reality and adding it to your app.
Because most navigation scenarios involve traveling beyond the accurate range for ARKit/ARCore positioning, this sample relies on continuous location updates from the location data source. Because the origin camera is constantly being reset by the location data source, the sample doesn't allow the user to pan to calibrate or adjust the altitude with a slider. The location data source doesn't provide a heading, so it isn't overwritten when the location refreshes.
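The 'joystick' calibration interaction described under How it works can be roughly illustrated in isolation: the heading correction applied per update grows with the thumb's distance from the slider's center. The quadratic response curve and the `maxDegreesPerTick` constant below are hypothetical assumptions for illustration, not the toolkit's actual implementation.

```swift
import Foundation

/// Maps a slider offset in -1...1 (0 is the centered, resting position) to a
/// heading adjustment in degrees applied on each update tick.
func headingDelta(forSliderOffset offset: Double, maxDegreesPerTick: Double = 2.0) -> Double {
    let clamped = min(max(offset, -1), 1)
    // Square the magnitude so small offsets make fine corrections
    // while large offsets turn the heading quickly.
    return (clamped < 0 ? -1 : 1) * clamped * clamped * maxDegreesPerTick
}

print(headingDelta(forSliderOffset: 0))     // 0.0  – centered, no change
print(headingDelta(forSliderOffset: 0.5))   // 0.5  – fine adjustment
print(headingDelta(forSliderOffset: -1))    // -2.0 – fast adjustment
```

A nonlinear curve like this is a common design choice for calibration controls: it keeps precise adjustment easy near the center while still allowing large corrections at the extremes.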
## Tags
augmented reality, directions, full-scale, guidance, mixed reality, navigate, navigation, real-scale, route, routing, world-scale
## Sample Code
// Copyright 2024 Esri
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// https://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
import ArcGIS
import ArcGISToolkit
import AVFoundation
import SwiftUI
struct AugmentRealityToNavigateRouteView: View {
/// The data model for the selected route.
@StateObject private var routeDataModel = RouteDataModel()
/// A Boolean value indicating whether the route planner is showing.
@State private var isShowingRoutePlanner = true
/// The location data source that is used to access the device location.
@State private var locationDataSource = SystemLocationDataSource()
/// A scene with an imagery basemap.
@State private var scene = Scene(basemapStyle: .arcGISImagery)
/// The elevation surface set to the base surface of the scene.
@State private var elevationSurface: Surface = {
let elevationSurface = Surface()
elevationSurface.navigationConstraint = .unconstrained
elevationSurface.opacity = 0
elevationSurface.backgroundGrid.isVisible = false
return elevationSurface
}()
/// The elevation source with elevation service URL.
@State private var elevationSource = ArcGISTiledElevationSource(url: URL(string: "https://elevation3d.arcgis.com/arcgis/rest/services/WorldElevation3D/Terrain3D/ImageServer")!)
/// The graphics overlay containing the route graphic.
@State private var graphicsOverlay = GraphicsOverlay()
/// The status text displayed to the user.
@State private var statusText = "Adjust calibration before starting."
/// A Boolean value indicating whether the user is navigating the route.
@State private var isNavigating = false
/// The result of the route selected in the route planner view.
@State private var routeResult: RouteResult?
init() {
elevationSurface.addElevationSource(elevationSource)
scene.baseSurface = elevationSurface
}
var body: some View {
if isShowingRoutePlanner {
RoutePlannerView(isShowing: $isShowingRoutePlanner)
.onDidSelectRoute { routeGraphic, routeResult in
self.routeResult = routeResult
graphicsOverlay = makeRouteOverlay(
routeResult: routeResult,
routeGraphic: routeGraphic
)
}
.task {
try? await elevationSource.load()
}
} else {
VStack(spacing: 0) {
WorldScaleSceneView { _ in
SceneView(scene: scene, graphicsOverlays: [graphicsOverlay])
}
.calibrationButtonAlignment(.bottomLeading)
.onCalibratingChanged { isPresented in
scene.baseSurface.opacity = isPresented ? 0.6 : 0
}
.task {
try? await locationDataSource.start()
for await location in locationDataSource.locations {
try? await routeDataModel.routeTracker?.track(location)
}
}
.overlay(alignment: .top) {
Text(statusText)
.multilineTextAlignment(.center)
.frame(maxWidth: .infinity, alignment: .center)
.padding(8)
.background(.regularMaterial, ignoresSafeAreaEdges: .horizontal)
}
.onDisappear {
Task { await locationDataSource.stop() }
}
Divider()
}
.toolbar {
ToolbarItemGroup(placement: .bottomBar) {
Button("Start") {
isNavigating = true
Task {
do {
try await startNavigation()
} catch {
print("Failed to start navigation: \(error)")
}
}
}
.disabled(isNavigating)
}
}
}
}
/// Creates a graphics overlay and adds a graphic (with solid yellow 3D tube symbol)
/// to represent the route.
private func makeRouteOverlay(routeResult: RouteResult, routeGraphic: Graphic) -> GraphicsOverlay {
let graphicsOverlay = GraphicsOverlay()
graphicsOverlay.sceneProperties.surfacePlacement = .absolute
let strokeSymbolLayer = SolidStrokeSymbolLayer(
width: 1.0,
color: .yellow,
lineStyle3D: .tube
)
let polylineSymbol = MultilayerPolylineSymbol(symbolLayers: [strokeSymbolLayer])
let polylineRenderer = SimpleRenderer(symbol: polylineSymbol)
graphicsOverlay.renderer = polylineRenderer
if let originalPolyline = routeResult.routes.first?.geometry {
addingElevation(3, to: originalPolyline) { polyline in
routeGraphic.geometry = polyline
graphicsOverlay.addGraphic(routeGraphic)
}
}
return graphicsOverlay
}
/// Densify the polyline geometry so the elevation can be adjusted every 0.3 meters,
/// and add an elevation to the geometry.
///
/// - Parameters:
/// - z: A `Double` value representing z elevation.
/// - polyline: The polyline geometry of the route.
/// - completion: A completion closure to execute after the polyline is generated with success or not.
private func addingElevation(
_ z: Double,
to polyline: Polyline,
completion: @escaping (Polyline) -> Void
) {
if let densifiedPolyline = GeometryEngine.densify(polyline, maxSegmentLength: 0.3) as? Polyline {
let polylineBuilder = PolylineBuilder(spatialReference: densifiedPolyline.spatialReference)
Task {
for part in densifiedPolyline.parts {
for point in part.points {
async let elevation = elevationSurface.elevation(at: point)
// Put the new point 3 meters above the ground elevation.
let newPoint = GeometryEngine.makeGeometry(from: point, z: try await elevation + z)
polylineBuilder.add(newPoint)
}
}
completion(polylineBuilder.toGeometry())
}
} else {
completion(polyline)
}
}
/// Starts navigating the route.
private func startNavigation() async throws {
guard let routeResult else { return }
let routeTracker = RouteTracker(
routeResult: routeResult,
routeIndex: 0,
skipsCoincidentStops: true
)
guard let routeTracker else { return }
routeTracker.voiceGuidanceUnitSystem = Locale.current.usesMetricSystem ? .metric : .imperial
routeDataModel.routeTracker = routeTracker
try await routeDataModel.routeTask.load()
if routeDataModel.routeTask.info.supportsRerouting,
let reroutingParameters = ReroutingParameters(
routeTask: routeDataModel.routeTask,
routeParameters: routeDataModel.routeParameters
) {
try await routeTracker.enableRerouting(using: reroutingParameters)
}
statusText = "Navigation will start."
await startTracking()
}
/// Starts monitoring multiple asynchronous streams of information.
private func startTracking() async {
await withTaskGroup(of: Void.self) { group in
group.addTask { await trackStatus() }
group.addTask { await routeDataModel.trackVoiceGuidance() }
}
}
/// Monitors the asynchronous stream of tracking statuses.
///
/// When new statuses are delivered, update the route's traversed and remaining graphics.
private func trackStatus() async {
guard let routeTracker = routeDataModel.routeTracker else { return }
for await status in routeTracker.$trackingStatus {
guard let status else { continue }
switch status.destinationStatus {
case .notReached, .approaching:
if let route = routeResult?.routes.first {
let currentManeuver = route.directionManeuvers[status.currentManeuverIndex]
statusText = currentManeuver.text
}
case .reached:
statusText = "You have arrived!"
@unknown default:
break
}
}
}
}
extension AugmentRealityToNavigateRouteView {
@MainActor
class RouteDataModel: ObservableObject {
/// An AVSpeechSynthesizer for text to speech.
let speechSynthesizer = AVSpeechSynthesizer()
/// The route task that solves the route using the online routing service, using API key authentication.
let routeTask = RouteTask(url: URL(string: "https://route-api.arcgis.com/arcgis/rest/services/World/Route/NAServer/Route_World")!)
/// The parameters for route task to solve a route.
var routeParameters = RouteParameters()
/// The route tracker.
@Published var routeTracker: RouteTracker?
/// The route result.
@Published var routeResult: RouteResult?
/// Monitors the asynchronous stream of voice guidances.
func trackVoiceGuidance() async {
guard let routeTracker = routeTracker else { return }
for await voiceGuidance in routeTracker.voiceGuidances {
speechSynthesizer.stopSpeaking(at: .word)
speechSynthesizer.speak(AVSpeechUtterance(string: voiceGuidance.text))
}
}
}
}