Navigate in AR


Use a route displayed in the real world to navigate.

Image of navigate in AR

Use case

It can be hard to navigate using 2D maps in unfamiliar environments. You can use full-scale AR to show a route overlaid on the real world for easier navigation.

How to use the sample

The sample opens with a map centered on the current location. Tap the map to add an origin and a destination; the route will be shown as a line. Calibrate the heading before starting to navigate. When ready, tap 'Start AR' to start the AR navigation. When you start, route instructions will be displayed and spoken. As you proceed through the route, new directions will be provided until you arrive.

How it works

  1. The map page is used to plan the route before starting the AR experience. See the Find a route and Offline routing samples for a more focused demonstration of that workflow.
  2. Pass the resulting RouteResult to the activity used for the AR portion of the navigation experience.

    • The route task and parameters are used to support a rerouting capability where routes are recalculated on-the-fly if you deviate. Due to service limitations, this sample doesn't support on-the-fly rerouting. You can incorporate offline routing to support rerouting in your apps.
  3. Start ARCore tracking with continuous location updates when the AR view is shown.
  4. Get the route geometry from the first route in the RouteResult. Use the scene's base surface to apply elevation to the line so that it will follow the terrain.
  5. Add the route geometry to a graphics overlay and add a renderer to the graphics overlay. This sample uses a MultilayerPolylineSymbol with a SolidStrokeSymbolLayer to visualize a tube along the route line.
  6. Create a calibration view. This sample uses a slider to manipulate the heading (direction you are facing). Because of limitations in on-device compasses, calibration is often necessary; small errors in heading cause big problems with the placement of scene content in the world.

    • Note that while this sample implemented a slider, there are many possible strategies for implementing heading calibration.
    • While calibrating, the basemap is shown at 50% opacity, to allow you to compare the basemap imagery with what is seen by the camera. While this works in some environments, it won't work indoors, in forested areas, or if the ground truth has changed since the basemap imagery was updated. Alternative scenarios can involve orienting relative to landmarks (for example, stage sets at a concert) or starting at a known orientation by lining up with a static image.
    • The slider in the sample implements a 'joystick' interaction; the heading is adjusted faster the further you move from the center of the slider. There are many possible slider interactions you could choose to implement.
  7. When the user starts navigating, create a RouteTracker, providing a RouteResult and the index of the route you want to use; this sample always picks the first returned result.
  8. Create a location data source and listen for location change events. When the location changes, call routeTracker.trackLocationAsync with the updated location.
  9. Keep the calibration view accessible throughout the navigation experience. As the user walks, small heading errors may become more noticeable and require recalibration.
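
The 'joystick' heading interaction from step 6 can be sketched as plain logic, independent of the SDK. The quadratic scaling curve, method names, and numbers below are illustrative assumptions, not the sample's exact implementation:

```java
// Sketch of a 'joystick' slider: the further the thumb is from center,
// the larger the heading delta applied per tick. The quadratic curve is
// an assumption chosen to give fine control near the center.
public class HeadingJoystick {

    // map a slider position in [-1, 1] to a heading delta in degrees per tick
    public static double deltaForPosition(double position, double maxDeltaPerTick) {
        double magnitude = Math.min(1.0, Math.abs(position));
        return Math.signum(position) * magnitude * magnitude * maxDeltaPerTick;
    }

    // apply a delta to a heading, wrapping the result into [0, 360)
    public static double applyDelta(double heading, double delta) {
        double result = (heading + delta) % 360.0;
        return result < 0 ? result + 360.0 : result;
    }

    public static void main(String[] args) {
        // a centered thumb produces no rotation; full deflection gives the max delta
        System.out.println(deltaForPosition(0.0, 5.0)); // 0.0
        System.out.println(deltaForPosition(1.0, 5.0)); // 5.0
        // headings wrap around north
        System.out.println(applyDelta(359.0, 2.0)); // 1.0
    }
}
```

Because the delta is applied repeatedly while the thumb is held off-center, small deflections nudge the heading slowly while full deflection sweeps it quickly.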

Relevant API

  • ArcGISArView
  • LocationDataSource
  • RouteResult
  • RouteTask
  • RouteTracker
  • Surface

About the data

This sample uses Esri's world elevation service to ensure that route lines are placed appropriately in 3D space. It uses Esri's world routing service to calculate routes. The world routing service requires authentication and does consume ArcGIS Online credits.

Additional information

This sample requires a device that is compatible with ARCore 1.8 on Android.

Unlike other scene samples, there's no need for a basemap while navigating, because context is provided by the camera feed showing the real environment. The base surface's opacity is set to zero to prevent it from interfering with the AR experience. During calibration, the basemap is shown at 50% opacity to help the user verify that they have calibrated properly.

A digital elevation model is used to ensure that the displayed route is positioned appropriately relative to the terrain of the route. If you don't want to display the route line floating, you could show the line draped on the surface instead.
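
The elevation handling described above can be sketched without the SDK: once each route vertex carries a terrain elevation, a fixed offset lifts the line off the ground, and an offset of 0 effectively keeps it on the surface. The `Vertex` type below is a hypothetical stand-in, not an SDK class; the 3-meter value mirrors the sample.

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of offsetting a terrain-following route line above the ground.
// Vertex is a stand-in for the SDK's point type, for illustration only.
public class RouteElevation {
    public record Vertex(double x, double y, double z) {}

    // return a copy of the route with each vertex raised by offsetMeters
    public static List<Vertex> offsetAboveTerrain(List<Vertex> route, double offsetMeters) {
        List<Vertex> elevated = new ArrayList<>();
        for (Vertex v : route) {
            elevated.add(new Vertex(v.x(), v.y(), v.z() + offsetMeters));
        }
        return elevated;
    }

    public static void main(String[] args) {
        // z values already carry terrain elevation from the base surface
        List<Vertex> route = List.of(new Vertex(0, 0, 120.0), new Vertex(1, 1, 125.5));
        List<Vertex> floating = offsetAboveTerrain(route, 3.0);
        System.out.println(floating.get(0).z()); // 123.0
        System.out.println(floating.get(1).z()); // 128.5
    }
}
```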

Real-scale AR is one of three main patterns for working with geographic information in augmented reality. See Augmented reality in the guide for more information.

Because most navigation scenarios involve traveling beyond the accurate range for ARCore positioning, this sample relies on continuous location updates from the location data source. Because the origin camera is constantly being reset by the location data source, the sample doesn't allow the user to pan to calibrate or adjust the altitude with a slider. The location data source doesn't provide a heading, so it isn't overwritten when the location refreshes.
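
The interplay described above can be sketched as plain logic: the calibration offsets persist across location updates, each update rebuilds the camera position from the new fix plus the accumulated vertical offset, and the calibrated heading is left untouched. The class below is a hypothetical stand-in for the SDK's origin-camera handling, not real API.

```java
// Sketch of origin-camera updates with persistent calibration offsets.
// The double[] {x, y, z, heading} stands in for the SDK's Camera type.
public class OriginCameraUpdater {
    private double verticalOffset = 0.0;
    private double heading = 0.0;

    // the altitude joystick accumulates vertical deltas
    public void addAltitudeDelta(double delta) { verticalOffset += delta; }

    // the heading joystick rotates the calibrated heading directly
    public void addHeadingDelta(double delta) { heading += delta; }

    // a location update replaces x/y/z but preserves the calibrated heading
    public double[] cameraForLocation(double x, double y, double z) {
        return new double[] { x, y, z + verticalOffset, heading };
    }

    public static void main(String[] args) {
        OriginCameraUpdater updater = new OriginCameraUpdater();
        updater.addAltitudeDelta(2.0);
        updater.addHeadingDelta(15.0);
        double[] cam = updater.cameraForLocation(10.0, 20.0, 100.0);
        System.out.println(cam[2]); // 102.0 - altitude includes the calibration offset
        System.out.println(cam[3]); // 15.0 - heading survives the location refresh
    }
}
```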


Tags

augmented reality, directions, full-scale, guidance, mixed reality, navigate, navigation, real-scale, route, routing, world-scale

Sample Code
/*
 * Copyright 2019 Esri
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *    http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package com.esri.arcgisruntime.sample.navigateinar;

import java.util.ArrayList;
import java.util.LinkedList;

import android.Manifest;
import android.content.pm.PackageManager;
import android.content.res.Resources;
import android.graphics.Color;
import android.os.Bundle;
import android.speech.tts.TextToSpeech;
import android.util.Log;
import android.view.View;
import android.widget.Button;
import android.widget.TextView;
import android.widget.Toast;
import androidx.annotation.NonNull;
import androidx.appcompat.app.AppCompatActivity;
import androidx.core.app.ActivityCompat;
import androidx.core.content.ContextCompat;

import com.esri.arcgisruntime.geometry.Point;
import com.esri.arcgisruntime.location.AndroidLocationDataSource;
import com.esri.arcgisruntime.mapping.ArcGISScene;
import com.esri.arcgisruntime.mapping.ArcGISTiledElevationSource;
import com.esri.arcgisruntime.mapping.Basemap;
import com.esri.arcgisruntime.mapping.NavigationConstraint;
import com.esri.arcgisruntime.mapping.Surface;
import com.esri.arcgisruntime.mapping.view.Camera;
import com.esri.arcgisruntime.mapping.view.Graphic;
import com.esri.arcgisruntime.mapping.view.GraphicsOverlay;
import com.esri.arcgisruntime.mapping.view.LayerSceneProperties;
import com.esri.arcgisruntime.navigation.RouteTracker;
import com.esri.arcgisruntime.symbology.MultilayerPolylineSymbol;
import com.esri.arcgisruntime.symbology.SimpleRenderer;
import com.esri.arcgisruntime.symbology.SolidStrokeSymbolLayer;
import com.esri.arcgisruntime.symbology.StrokeSymbolLayer;
import com.esri.arcgisruntime.symbology.SymbolLayer;
import com.esri.arcgisruntime.tasks.networkanalysis.RouteResult;
import com.esri.arcgisruntime.toolkit.ar.ArLocationDataSource;
import com.esri.arcgisruntime.toolkit.ar.ArcGISArView;
import com.esri.arcgisruntime.toolkit.control.JoystickSeekBar;

public class ARNavigateActivity extends AppCompatActivity {

  private static final String TAG = ARNavigateActivity.class.getSimpleName();

  private ArcGISArView mArView;

  private TextView mHelpLabel;
  private View mCalibrationView;

  public static RouteResult sRouteResult;

  private ArcGISScene mScene;

  private boolean mIsCalibrating = false;
  private RouteTracker mRouteTracker;
  private TextToSpeech mTextToSpeech;

  private float mCurrentVerticalOffset;

  @Override
  protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    // layout resource name assumed from the sample's resources
    setContentView(R.layout.activity_ar);

    // ensure a route has been set by the previous activity
    if (sRouteResult == null || sRouteResult.getRoutes().isEmpty()) {
      String error = "Route not set before launching activity!";
      Toast.makeText(this, error, Toast.LENGTH_SHORT).show();
      Log.e(TAG, error);
      finish();
      return;
    }

    // the camera permission is required before the AR view can start
    requestPermissions();
  }


  private void navigateInAr() {
    // get references to the ui views defined in the layout
    // (resource ids assumed from the sample's layout resources)
    mArView = findViewById(R.id.arView);
    mHelpLabel = findViewById(R.id.helpLabelTextView);
    mCalibrationView = findViewById(R.id.calibrationView);
    mArView.registerLifecycle(getLifecycle());
    // disable touch interactions with the scene view
    mArView.getSceneView().setOnTouchListener((view, motionEvent) -> true);
    // create a scene and add it to the scene view
    mScene = new ArcGISScene(Basemap.createImagery());
    mArView.getSceneView().setScene(mScene);
    // create and add an elevation surface to the scene
    ArcGISTiledElevationSource elevationSource = new ArcGISTiledElevationSource(getString(R.string.elevation_url));
    Surface elevationSurface = new Surface();
    elevationSurface.getElevationSources().add(elevationSource);
    mScene.setBaseSurface(elevationSurface);
    // allow the user to navigate underneath the surface
    elevationSurface.setNavigationConstraint(NavigationConstraint.NONE);
    // hide the basemap. The image feed provides map context while navigating in AR
    elevationSurface.setOpacity(0f);
    // disable plane visualization. It is not useful for this AR scenario.
    mArView.getArSceneView().getPlaneRenderer().setEnabled(false);
    mArView.getArSceneView().getPlaneRenderer().setVisible(false);
    // add an ar location data source to update location
    mArView.setLocationDataSource(new ArLocationDataSource(this));

    // create and add a graphics overlay for showing the route line
    GraphicsOverlay routeOverlay = new GraphicsOverlay();
    mArView.getSceneView().getGraphicsOverlays().add(routeOverlay);
    Graphic routeGraphic = new Graphic(sRouteResult.getRoutes().get(0).getRouteGeometry());
    routeOverlay.getGraphics().add(routeGraphic);
    // display the graphic 3 meters above the ground
    routeOverlay.getSceneProperties().setSurfacePlacement(LayerSceneProperties.SurfacePlacement.RELATIVE);
    routeOverlay.getSceneProperties().setAltitudeOffset(3);
    // create a renderer for the route geometry
    SolidStrokeSymbolLayer strokeSymbolLayer = new SolidStrokeSymbolLayer(1, Color.YELLOW, new LinkedList<>(),
        StrokeSymbolLayer.LineStyle3D.TUBE);
    strokeSymbolLayer.setCapStyle(StrokeSymbolLayer.CapStyle.ROUND);
    ArrayList<SymbolLayer> layers = new ArrayList<>();
    layers.add(strokeSymbolLayer);
    MultilayerPolylineSymbol polylineSymbol = new MultilayerPolylineSymbol(layers);
    SimpleRenderer polylineRenderer = new SimpleRenderer(polylineSymbol);
    routeOverlay.setRenderer(polylineRenderer);

    // create and start a location data source for use with the route tracker
    AndroidLocationDataSource trackingLocationDataSource = new AndroidLocationDataSource(this);
    trackingLocationDataSource.addLocationChangedListener(locationChangedEvent -> {
      if (mRouteTracker != null) {
        // pass new location to the route tracker
        mRouteTracker.trackLocationAsync(locationChangedEvent.getLocation());
      }
    });
    trackingLocationDataSource.startAsync();

    // show/hide calibration view
    Button calibrationButton = findViewById(R.id.calibrateButton);
    calibrationButton.setOnClickListener(v -> {
      // toggle calibration
      mIsCalibrating = !mIsCalibrating;
      if (mIsCalibrating) {
        // show the basemap at half opacity so it can be compared against the camera feed
        mScene.getBaseSurface().setOpacity(0.5f);
        mCalibrationView.setVisibility(View.VISIBLE);
      } else {
        // hide the basemap again once calibration is done
        mScene.getBaseSurface().setOpacity(0f);
        mCalibrationView.setVisibility(View.GONE);
      }
    });

    // start navigation
    Button navigateButton = findViewById(R.id.navigateStartButton);
    // start turn-by-turn when the user is ready
    navigateButton.setOnClickListener(v -> {
      // create a route tracker with the route result, using the first route
      mRouteTracker = new RouteTracker(this, sRouteResult, 0);
      // initialize text-to-speech to play navigation voice guidance
      mTextToSpeech = new TextToSpeech(this, status -> {
        if (status != TextToSpeech.ERROR) {
          mTextToSpeech.setLanguage(Resources.getSystem().getConfiguration().locale);
        }
      });
      mRouteTracker.addNewVoiceGuidanceListener((RouteTracker.NewVoiceGuidanceEvent newVoiceGuidanceEvent) -> {
        // get new guidance
        String newGuidanceText = newVoiceGuidanceEvent.getVoiceGuidance().getText();
        // display and then read out the new guidance
        mHelpLabel.setText(newGuidanceText);
        // read out directions
        mTextToSpeech.speak(newGuidanceText, TextToSpeech.QUEUE_FLUSH, null);
      });
      mRouteTracker
          .addTrackingStatusChangedListener((RouteTracker.TrackingStatusChangedEvent trackingStatusChangedEvent) -> {
            // display updated guidance
            mHelpLabel.setText(mRouteTracker.generateVoiceGuidance().getText());
          });
    });

    // wire up joystick seek bars to allow manual calibration of height and heading
    JoystickSeekBar headingJoystick = findViewById(R.id.headingJoystick);
    // listen for calibration value changes for heading
    headingJoystick.addDeltaProgressUpdatedListener(delta -> {
      // get the origin camera
      Camera camera = mArView.getOriginCamera();
      // add the heading delta to the existing camera heading
      double heading = camera.getHeading() + delta;
      // get a camera with a new heading
      Camera newCam = camera.rotateTo(heading, camera.getPitch(), camera.getRoll());
      // apply the new origin camera
      mArView.setOriginCamera(newCam);
    });
    JoystickSeekBar altitudeJoystick = findViewById(R.id.altitudeJoystick);
    // listen for calibration value changes for altitude
    altitudeJoystick.addDeltaProgressUpdatedListener(delta -> {
      mCurrentVerticalOffset += delta;
      // get the origin camera
      Camera camera = mArView.getOriginCamera();
      // elevate the camera by the delta
      Camera newCam = camera.elevate(delta);
      // apply the new origin camera
      mArView.setOriginCamera(newCam);
    });
    // this step is handled on the back end anyway, but we're applying a vertical offset to every update as per the
    // calibration step above
    mArView.getLocationDataSource().addLocationChangedListener(locationChangedEvent -> {
      Point updatedLocation = locationChangedEvent.getLocation().getPosition();
      mArView.setOriginCamera(new Camera(
          new Point(updatedLocation.getX(), updatedLocation.getY(), updatedLocation.getZ() + mCurrentVerticalOffset),
          mArView.getOriginCamera().getHeading(), mArView.getOriginCamera().getPitch(),
          mArView.getOriginCamera().getRoll()));
    });

    // remind the user to calibrate the heading and altitude before starting navigation
    Toast.makeText(this, "Calibrate your heading and altitude before navigating!", Toast.LENGTH_LONG).show();
  }

  /**
   * Request the camera permission for API level 23+.
   */
  private void requestPermissions() {
    // define permission to request
    String[] reqPermission = { Manifest.permission.CAMERA };
    int requestCode = 2;
    if (ContextCompat.checkSelfPermission(this, reqPermission[0]) == PackageManager.PERMISSION_GRANTED) {
      navigateInAr();
    } else {
      // request permission
      ActivityCompat.requestPermissions(this, reqPermission, requestCode);
    }
  }

  /**
   * Handle the permissions request response.
   */
  @Override
  public void onRequestPermissionsResult(int requestCode, @NonNull String[] permissions, @NonNull int[] grantResults) {
    if (grantResults.length > 0 && grantResults[0] == PackageManager.PERMISSION_GRANTED) {
      navigateInAr();
    } else {
      // report to user that permission was denied
      Toast.makeText(this, getString(R.string.navigate_ar_permission_denied), Toast.LENGTH_SHORT).show();
    }
    super.onRequestPermissionsResult(requestCode, permissions, grantResults);
  }

  @Override
  protected void onPause() {
    if (mArView != null) {
      // pause ARCore tracking while the activity is in the background
      mArView.stopTracking();
    }
    super.onPause();
  }

  @Override
  protected void onResume() {
    super.onResume();
    if (mArView != null) {
      // resume ARCore tracking with continuous location updates
      mArView.startTracking(ArcGISArView.ARLocationTrackingMode.CONTINUOUS);
    }
  }
}