
Display scenes in augmented reality

ArcGIS Runtime supports three augmented reality (AR) patterns through a combination of low-level API features and components in the ArcGIS Runtime toolkits referred to as AR Toolkit. AR Toolkit is open source, so you can use it as is or modify its components to meet your needs. You can keep your changes proprietary or share them with the open source community.

In addition to AR Toolkit features, you'll need to use the following ArcGIS Runtime features when creating AR experiences:

  • Scene view space effect control — Disable rendering the 'starry sky' effect to display scene content on top of a camera feed.
  • Scene view atmosphere effect control — Disable rendering the atmosphere effect to avoid obscuring rendered content.
  • Surface transparency — Hide the ground when rendering world-scale AR because the camera feed, not the basemap, is providing context for your GIS content. You can use a semitransparent surface to calibrate your position in world-scale AR.
  • Scene view navigation constraint — By default, scene views constrain the camera to being above the ground. You should disable this feature to enable users to use world-scale AR underground (for example, while in a basement). The navigation constraint will interfere with tabletop AR if the user attempts to look at the scene from below.
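In the ARSceneView API used later in this topic, these controls correspond to the following properties (a sketch; `scene` stands for your scene object):

```csharp
_arSceneView.SpaceEffect = SpaceEffect.None;               // no 'starry sky' behind the camera feed
_arSceneView.AtmosphereEffect = AtmosphereEffect.None;     // don't obscure rendered content
scene.BaseSurface.Opacity = 0;                             // hide the ground in world-scale AR
scene.BaseSurface.NavigationConstraint = NavigationConstraint.None; // allow the camera underground
```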

Enable your app for AR using AR Toolkit

  1. Install AR Toolkit using the installation instructions provided in each toolkit repo on GitHub. While all ArcGIS Runtime SDKs have a toolkit, not all toolkits support augmented reality; check the toolkit repo for your SDK to confirm AR support.
  2. Add an ARSceneView to your app.
  3. Configure privacy and permissions.
  4. Now you're ready to add tabletop AR, add flyover AR, or add world-scale AR to your app.

Add an AR view to your app

ARSceneView uses an underlying ARKit or ARCore view and an ArcGIS Runtime SceneView. ARSceneView subclasses SceneView, so you can access scene view properties, including the Scene property, directly. Use the following properties on ARSceneView to configure AR:

  • TranslationFactor – controls the relationship between physical device position changes and changes in the position of the scene view's camera. This is useful for tabletop and flyover AR.
  • OriginCamera – controls the initial position of the scene view's camera. When position tracking is started, ARSceneView transforms the scene view camera's position using a transformation matrix provided by ARKit or ARCore. Once the origin camera is set, the manipulation of the scene view's camera is handled automatically.
  • SetInitialTransformation – takes a point on the screen, finds the surface represented by that point, and applies a transformation such that the origin camera is pinned to the location represented by that point. This is useful for pinning content to a surface, which is needed for tabletop AR.

To use ARSceneView, first add it to the view, then configure the lifecycle methods to start and stop tracking as needed.

using Android.App;
using Android.Graphics;
using Android.OS;
using Android.Support.V7.App;
using Android.Views;
using Android.Widget;
using Esri.ArcGISRuntime.ARToolkit;

namespace MyArApp
{
    public class BasicARExample : AppCompatActivity
    {
        // Hold references to the UI controls.
        private ARSceneView _arSceneView;

        protected override void OnCreate(Bundle bundle)
        {
            base.OnCreate(bundle);
            CreateLayout();
        }

        private void CreateLayout()
        {
            SetContentView(MyArApp.Resource.Layout.BasicARExample);
            _arSceneView = FindViewById<ARSceneView>(MyArApp.Resource.Id.arSceneView);
        }

        protected override async void OnPause()
        {
            base.OnPause();
            await _arSceneView.StopTrackingAsync();
        }

        protected override async void OnResume()
        {
            base.OnResume();
            await _arSceneView.StartTrackingAsync(ARLocationTrackingMode.Ignore);
        }
    }
}

Configure privacy and permissions

Before you can use augmented reality, you'll need to request location and camera permissions.

On Android, you'll need to request camera and location permissions before using ARCore. Ensure that the following permissions are specified in AndroidManifest.xml:

android.permission.CAMERA
android.permission.ACCESS_FINE_LOCATION

When starting the AR experience, ensure that the user has granted permissions. See Xamarin's documentation for details.
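For example, with Xamarin.Essentials you can check and request the permissions at runtime before starting tracking (a sketch; adapt to your permissions library):

```csharp
using Xamarin.Essentials;

// Request camera and location permissions before starting AR tracking.
private async Task<bool> EnsureArPermissionsAsync()
{
    PermissionStatus camera = await Permissions.RequestAsync<Permissions.Camera>();
    PermissionStatus location = await Permissions.RequestAsync<Permissions.LocationWhenInUse>();
    return camera == PermissionStatus.Granted && location == PermissionStatus.Granted;
}
```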

Note that the device must support ARCore for ARSceneView to work on Android. Google maintains a list of supported devices. ARCore is a separate installable component delivered via Google Play.

Note:

AR is a resource-intensive feature. Just because a device supports ARCore does not mean it can provide the performance and reliability required to meet your users' needs. Understand your target audience and ensure that you test your AR experience on realistic user devices. Do not make the mistake of only testing on high-end developer devices.

Add the following to the application definition in AndroidManifest.xml to ensure ARCore is installed with your app. This metadata is required for AR to work on Xamarin. You can specify optional or required depending on whether your app should work when ARCore is not present.

<!-- Indicates that app requires ARCore ("AR Required"). Causes Google
         Play Store to download and install ARCore along with the app.
         For an "AR Optional" app, specify "optional" instead of "required".
    -->
<application ...>
    <meta-data android:name="com.google.ar.core" android:value="required" />
</application>

The following declaration (outside of the application element) will ensure that the app only displays in the Play Store if the device supports ARCore:

<!-- Indicates that app requires ARCore ("AR Required"). Ensures app is only
         visible in the Google Play Store on devices that support ARCore.
         For "AR Optional" apps remove this line.
    -->
<uses-feature android:name="android.hardware.camera.ar" android:required="true" />

Once you have installed the toolkit, configured your app to meet privacy requirements, requested location permissions, and added an ARSceneView to your app, you can begin implementing your AR experience.

Understand common AR patterns

There are many AR scenarios you can achieve with Runtime. This SDK recognizes the following common patterns for AR:

  • Flyover – Flyover AR is a kind of AR scenario that allows you to explore a scene using your device as a window into the virtual world. A typical flyover AR scenario will start with the scene’s virtual camera positioned over an area of interest. You can walk around and reorient the device to focus on specific content in the scene.
  • Tabletop – A kind of AR scenario where scene content is anchored to a physical surface, as if it were a 3D-printed model. You can walk around the tabletop and view the scene from different angles.
  • World-scale – A kind of AR scenario where scene content is rendered exactly where it would be in the physical world. This is used in scenarios ranging from viewing hidden infrastructure to displaying waypoints for navigation. In AR, the real world, rather than a basemap, provides the context for your GIS data.

Each experience is built using some combination of the features in Runtime and the toolkit and some basic behavioral assumptions.

AR pattern     | Origin camera                                            | Translation factor                                              | Scene view                                 | Base surface
---------------|----------------------------------------------------------|-----------------------------------------------------------------|--------------------------------------------|-------------------------
Flyover AR     | Above the tallest content in the scene                   | A large value to enable rapid traversal; 0 to restrict movement | Space effect: Stars; Atmosphere: Realistic | Displayed
Tabletop AR    | On the ground at the center or lowest point on the scene | Based on the size of the target content and the physical table  | Space effect: None; Atmosphere: None       | Optional
World-scale AR | At the same location as the physical device camera       | 1, to keep virtual content in sync with real-world environment  | Space effect: None; Atmosphere: None       | Optional for calibration

Note:

While the table describes the settings for the common AR scenarios in general, you have freedom to define your own configurations and build scenarios not explicitly mentioned in this guide.

Add tabletop AR to your app

Tabletop AR allows you to use your device to interact with scenes as if they were 3D-printed models sitting on your desk. You could, for example, use tabletop AR to virtually explore a proposed development without needing to create a physical model.

Note:

See Learn more below for links to additional resources.

Configure content for tabletop AR

The goal of a tabletop AR experience is to create the illusion of scene content sitting on your desk. Consider the following guidelines when creating content for your tabletop experience:

  • Keep scene content focused; ideally a single geographic feature or a small area of a city.
  • Ensure that scene content looks OK from below; your users should be able to look at the scene from any angle.
  • Don't use an unbounded basemap. A basemap covering a large area will obscure the real-world table the scene is anchored to.
  • Don't allow users to select arbitrary scenes for use in tabletop scenarios; most scenes do not follow these guidelines and will result in a poor AR experience.
  • Consider exporting basemap tiles to the device for faster access to tiles within the targeted area of your scene. Exporting basemap tiles gives you precise control over the area of the content in your scene.

Implement tabletop AR

Tabletop AR often allows users to place scene content on a physical surface (e.g. top of a desk) of their choice. Once the content is placed, it stays anchored to the surface as the user moves around it.

  1. Create an ARSceneView and add it to the view.
  2. When tracking is ready and at least one plane has been found, wait for the user to tap. You can use the PlanesDetectedChanged event to be notified when planes are detected.
    public override void LoadView()
    {
        // After creating and displaying views…
        // Get notification when planes are detected
        _arSceneView.PlanesDetectedChanged += ARSceneView_PlanesDetectedChanged;
    }
    
    private void ARSceneView_PlanesDetectedChanged(object sender, bool planeDetected)
    {
        if (planeDetected)
        {
            BeginInvokeOnMainThread(EnableTapToPlace);
        }
    }
    
    private void EnableTapToPlace()
    {
        // Show the help label.
        _helpLabel.Hidden = false;
        _helpLabel.Text = "Tap to place the scene.";
    
        // Wait for the user to tap.
        _arSceneView.GeoViewTapped += _arSceneView_GeoViewTapped;
    }
  3. Once the user has tapped a point, call SetInitialTransformation. The AR toolkit will use the native platform’s plane detection to position the virtual camera relative to the plane. If the result is true, the transformation has been set successfully and you can place the scene.
    private void _arSceneView_GeoViewTapped(object sender, Esri.ArcGISRuntime.UI.Controls.GeoViewInputEventArgs e)
    {
        if (_arSceneView.SetInitialTransformation(e.Position))
        {
            DisplayScene();
        }
    }
  4. Create and display the scene. Set the scene's surface opacity to 0 to avoid obscuring the table, and set the navigation constraint on the scene's base surface to None. For demonstration purposes, this code uses the Philadelphia mobile scene package because it is particularly well-suited for tabletop display. You can download that .mspk and add it to your project to make the code below work. Alternatively, you can use any scene for tabletop mapping, but for the best experience you should follow the guidelines in Configure content for tabletop AR.
    private Scene _tabletopScene;
    
    private async void DisplayScene()
    {
        // Get the downloaded mobile scene package.
        MobileScenePackage package = await MobileScenePackage.OpenAsync("path_to_.mspk");
        await package.LoadAsync();
        _tabletopScene = package.Scenes.First();
    
        // Hide the base surface.
        _tabletopScene.BaseSurface.Opacity = 0;
    
        // Enable subsurface navigation. This allows you to look at the scene from below.
        _tabletopScene.BaseSurface.NavigationConstraint = NavigationConstraint.None;
    
        // Show the scene.
        _arSceneView.Scene = _tabletopScene;
    
        UpdateTranslationFactorAndOriginCamera();
    }
  5. Find an anchor point in the scene. You can use a known value, a user-selected value, or a computed value. For simplicity, this example uses a known value. Place the origin camera at that point.
    private void UpdateTranslationFactorAndOriginCamera()
    {
        // Create a camera at the bottom and center of the scene.
        //    This camera is the point at which the scene is pinned to the real-world surface.
        Camera originCamera = new Camera(39.95787000283599, -75.16996728256345, 8.813445091247559,
                                         0, 90, 0);
    
        // Set the origin camera.
        _arSceneView.OriginCamera = originCamera;
    }
  6. Set the translation factor on the ArcGIS AR view so that the whole scene can be viewed by moving around it. A useful formula for determining this value is translation factor = virtual content width / desired physical content width. The desired physical content width is the size of the physical table while virtual content width is the real-world size of the scene content; both measurements should be in meters.
    private void UpdateTranslationFactorAndOriginCamera()
    {
        // Continued from above...
    
        // The width of the scene content is about 800 meters.
        double geographicContentWidth = 800;
    
        // The desired physical width of the scene is 1 meter.
        double tableContainerWidth = 1;
    
        // Set the translation factor based on the scene content width and desired physical size.
        _arSceneView.TranslationFactor = geographicContentWidth / tableContainerWidth;
    }

Guidelines for tabletop AR user experience

To create an optimal tabletop mapping experience:

  • Provide user feedback for ARKit tracking issues. See Apple's Human Interface Guidelines for details.
  • Set expectations about lighting before starting the experience. AR doesn't work well in low light.
  • Don't allow users to view arbitrary content in AR. Use curated content that has been designed for an optimal tabletop experience.
  • Keep scene content focused; this should ideally be a single geographic feature or a small area of a city.
  • Ensure that the scene looks good from below; your users may attempt to look at the scene from any angle.
  • Don't use an unbounded basemap. A basemap covering a large area will obscure the real-world table that the scene is anchored to.

Learn more

Add flyover AR to your app

Flyover AR displays a scene while using the movement of the physical device to move the scene view camera. For example, you can walk around while holding up your device as a window into the scene. Unlike other AR experiences, the camera feed is not displayed to the user, making flyover more similar to a traditional virtual reality (VR) experience.

Flyover is the simplest AR scenario to implement, as there is only a loose relationship between the physical world and the rendered virtual world. With flyover, you can imagine your device as a window into the virtual scene.

Note:

See Learn more below for links to additional resources.

Implement flyover AR

  1. Create the AR view and add it to the UI.
  2. Create the scene, add any content, then display it. This example uses an integrated mesh layer of an area on the US-Mexico border.
    private void DisplayScene()
    {
        // Create the scene with a basemap.
        Scene flyoverScene = new Scene(Basemap.CreateImagery());
    
        // Create the integrated mesh layer and add it to the scene.
        IntegratedMeshLayer meshLayer = new IntegratedMeshLayer(new System.Uri("https://www.arcgis.com/home/item.html?id=dbc72b3ebb024c848d89a42fe6387a1b"));
        flyoverScene.OperationalLayers.Add(meshLayer);
    
        // Show the scene.
        _arSceneView.Scene = flyoverScene;
    
        // TODO – configure origin camera for AR
    }
  3. Place the origin camera above the content you want the user to explore, ideally in the center. Typically, you’ll want to place the origin camera above the highest point in your scene.
    private async void DisplayScene()
    {
        // Continued from above
        // Wait for the layer to load so that extent is available.
        await meshLayer.LoadAsync();
    
        // Start with the camera at the center of the mesh layer.
        Envelope layerExtent = meshLayer.FullExtent;
        Camera originCamera = new Camera(layerExtent.GetCenter().Y, layerExtent.GetCenter().X, 600, 0, 90, 0);
        _arSceneView.OriginCamera = originCamera;
    }
  4. Set the translation factor to allow rapid traversal of the scene. The translation factor defines the relationship between physical device movement and virtual camera movement. To create a more immersive experience, set the space effect on the scene view to Stars and the atmosphere effect to Realistic . Disable the navigation constraint on the scene's base surface to prevent camera position problems near the ground.
    private void DisplayScene()
    {
        // Continued from above
    
        // Disable navigation constraint
        flyoverScene.BaseSurface.NavigationConstraint = NavigationConstraint.None;
    
        // Set the translation factor to enable rapid movement through the scene.
        _arSceneView.TranslationFactor = 1000;
    
        // Enable atmosphere and space effects for a more immersive experience.
        _arSceneView.SpaceEffect = SpaceEffect.Stars;
        _arSceneView.AtmosphereEffect = AtmosphereEffect.Realistic;
    }

Guidelines for flyover AR

Consider the following guidelines for creating high-quality flyover AR experiences:

  • Provide actionable feedback to the user when there are ARKit tracking issues. Tracking problems caused by environmental factors like low light can break the AR experience.
  • Set expectations about the environment before starting the AR experience. Flyover AR doesn't work well in low light or in tight, constrained spaces.
  • Start the AR experience in the center of the area of interest for your scene. Users are likely to move around freely, including rotating and looking behind where they originally start.

Learn more

Add world-scale AR to your app

A world-scale AR experience is defined by the following characteristics:

  • Scene content is placed in the context of the real world by matching the scene view's virtual camera position and orientation to that of the device's physical camera.
  • Context aids, like the basemap, are hidden; the camera feed provides real-world context.

Some example use cases of world-scale AR include:

  • Visualizing hidden infrastructure, like sewers, water mains, and telecom conduits.
  • Maintaining context while performing rapid data collection for a survey.
  • Visualizing a route line while navigating.

Note:

See Learn more below for links to additional resources.

Configure content for world-scale AR

The goal of a world-scale AR experience is to create the illusion that your GIS content is physically present in the world around you. There are several requirements for content used in world-scale AR that go beyond what is typically required for 2D mapping:

  • Ensure that all data has an accurate elevation (or Z) value. For dynamically generated graphics (for example, route results), use an elevation surface to add elevation.
  • Use an elevation source in your scene to ensure all content is placed accurately relative to the user.
  • Don't use 3D symbology that closely matches the exact shape of the feature it represents. For example, do not use a generic tree model to represent tree features or a fire hydrant model to represent fire hydrant features. Generic symbology won't capture the unique geometry of actual real-world objects and will highlight minor inaccuracies in position.
  • Consider how you present content that would otherwise be obscured in the real world, as the parallax effect can make that content appear to move unnaturally. For example, underground pipes will 'float' relative to the surface, even though they are at a fixed point underground. Have a plan to educate users, or consider adding visual guides, like lines drawn to link the hidden feature to the obscuring surface (for example, the ground).

Understand location tracking options for world-scale AR

There are a few strategies for determining the device's position in the world and maintaining that position over time:

  • Use the device's location data source (for example, GPS) to acquire an initial position, then make further position updates using ARKit or ARCore only.
  • Use the location data source continuously.

With continuous updates, the origin camera is set every time the location data source provides a new update. With a one-time update, the origin camera is set only once.

There are benefits and drawbacks to each approach that you should consider when designing your AR experience:

  • One-time update
    • Advantage: ARKit/ARCore tracking is more precise than most location data sources.
    • Advantage: Content stays convincingly pinned to its real-world position, with minimal drifting or jumping.
    • Disadvantage: Error accumulates the further you venture from where you started the experience.
  • Continuous update
    • Advantage: Works over a larger area than ARKit or ARCore alone.
    • Disadvantage: Visualized content will jump as you move through the world and the device's location is updated (as infrequently as once per second, rather than ARKit's 60 times per second).
    • Disadvantage: Because the origin camera is constantly being reset, you can't use panning to manually correct position errors.

You don't need to make a binary choice between approaches for your app. Your app can use continuous updates while the user moves through larger areas, then switch to a primarily ARKit- or ARCore-driven experience when you need greater precision.

With ARSceneView, the choice of location strategy is specified in the call to StartTrackingAsync. To change the location update mode, stop tracking and then resume tracking with the desired mode.

Note:

In the samples, continuous update mode is referred to as Roaming and one-time update mode is referred to as Local.
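The mode switch described above can be sketched as follows (assuming an ARSceneView field named `_arSceneView`):

```csharp
// Switch from continuous location updates to a one-time update at runtime.
// Stop tracking first, then restart with the desired mode.
private async Task SwitchToOneTimeUpdatesAsync()
{
    await _arSceneView.StopTrackingAsync();
    await _arSceneView.StartTrackingAsync(ARLocationTrackingMode.Initial);
}
```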

Implement world-scale AR

  1. Create an ARSceneView and add it to the view.
  2. Configure the ARSceneView with a location data source. The location data source provides location information for the device. The AR scene view uses the location data source to place the virtual scene camera close to the location of the physical device's camera.
    _arView.LocationDataSource = new SystemLocationDataSource();
  3. Configure the scene for AR by setting the space and atmosphere effects and adding an elevation source, then display it.
    private void ConfigureSceneForAR()
    {
        // Create the scene with an imagery basemap.
        _arView.Scene = new Scene(Basemap.CreateImagery());

        // Create and add the elevation surface.
        _elevationSource = new ArcGISTiledElevationSource(new Uri("https://elevation3d.arcgis.com/arcgis/rest/services/WorldElevation3D/Terrain3D/ImageServer"));
        _elevationSurface = new Surface();
        _elevationSurface.ElevationSources.Add(_elevationSource);
        _arView.Scene.BaseSurface = _elevationSurface;

        // Remove the navigation constraint.
        _elevationSurface.NavigationConstraint = NavigationConstraint.None;

        // Disable space and atmosphere effects.
        _arView.SpaceEffect = SpaceEffect.None;
        _arView.AtmosphereEffect = AtmosphereEffect.None;
    }
  4. Start tracking using one of two location update modes, continuous or one-time.
    protected override async void OnResume()
    {
        base.OnResume();

        // Continuous update mode
        await _arView.StartTrackingAsync(ARLocationTrackingMode.Continuous);

        // One-time mode
        //await _arView.StartTrackingAsync(ARLocationTrackingMode.Initial);
    }
  5. Provide a calibration UI to allow your users to correct heading, elevation, and location errors.
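As a hedged sketch of step 5, many calibration UIs work by nudging the origin camera in response to user input; the control names here are hypothetical:

```csharp
// Apply a user-supplied elevation offset by elevating the origin camera.
// _elevationSlider and _lastElevation are hypothetical; Camera.Elevate returns
// a new camera offset vertically by the given number of meters.
private void ElevationSlider_ValueChanged(object sender, EventArgs e)
{
    double delta = _elevationSlider.Value - _lastElevation;
    _arView.OriginCamera = _arView.OriginCamera.Elevate(delta);
    _lastElevation = _elevationSlider.Value;
}
```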

Learn more

Guidelines for world-scale AR

Consider the following guidelines to help you overcome common obstacles to creating a useful world-scale AR experience:

  • Create a calibration experience that works for your users' needs in the environment they will be working in. See Enable calibration for world-scale AR for details.
  • Educate your users on what to expect from world-scale experiences, including calibration and parallax issues.
  • Consider how and where your users will use your AR experience. Due to limitations in GNSS and compass accuracy, your users will need to calibrate their initial starting position. Ensure that your users have the tools they need to calibrate their position, whether that is a known landmark, a fixed calibration point, or comparison to known reference features. See Enable calibration for world-scale AR for details.
  • Consider encouraging users to use high-accuracy positioning devices if practical.

Enable calibration for world-scale AR

World-scale AR depends on a close match between the positions and orientations of the device's physical camera and the scene view's virtual camera. Any error in the device's position or orientation will degrade the experience. Consider each of the following key properties as common sources of error:

  • Heading – usually determined using a magnetometer (compass) on the device
  • Elevation/Altitude (Z) – usually determined using GPS/GNSS or a barometer
  • Position (X,Y) – usually determined using GPS/GNSS, cell triangulation, or beacons

Showing a semitransparent basemap for comparison with the ground truth provided by the camera feed makes these errors visible.

Design a calibration workflow

There are many ways to calibrate position and heading. In most scenarios, you'll need to provide one or more points of comparison between scene content and the real-world ground truth. Consider the following options for allowing the user to visually calibrate the position:

  • Align the imagery on the basemap with the camera feed.
  • Align a known calibration feature with its real-world equivalent (for example, a previously recorded tree feature).
  • Define a start point and heading and direct the user.

Consider the following UI options for allowing the user to adjust the calibration:

  • Display sliders for orientation and elevation adjustment.
  • Use 'joystick' sliders, where the further from the center the slider moves, the faster the adjustment goes.
  • Use an image placed in a known position in conjunction with ARKit / ARCore image detection to automatically determine the device's position.

Explicitly plan for calibration when designing your AR experiences. Consider how and where your users will use your app. Not all calibration workflows are appropriate for all locations or use cases.

Note:

As described in Understand location tracking options for world-scale AR, you can configure the AR view to reset the device's position using the location data source continuously or just once. If you would like to manually adjust position or altitude while using continuous location updates, you will need to implement a custom location data source.
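For example, a 'joystick' heading adjustment might look like the following sketch (the timer and slider are hypothetical; Camera.RotateTo is part of the ArcGIS Runtime API):

```csharp
// Called on a timer tick while the joystick slider is held off-center.
// The further the slider is from center, the larger the per-tick heading change.
private void HeadingJoystick_Tick(object sender, EventArgs e)
{
    double offset = _headingSlider.Value; // e.g. -10 .. 10, with 0 at center
    Camera current = _arView.OriginCamera;
    _arView.OriginCamera = current.RotateTo(current.Heading + offset, current.Pitch, current.Roll);
}
```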

Identify real-world and in-scene objects

ArcGIS Runtime scene views have two methods for determining the location in a scene that corresponds to a point on the device's screen:

  • ScreenToBaseSurface – ignores non-surface content, like 3D buildings
  • ScreenToLocationAsync – includes non-surface content

ARSceneView has ARScreenToLocation, which:

  1. Performs a hit test using ARKit / ARCore to find a real-world plane.
  2. Applies a transformation to determine the physical position of that plane relative to the known position of the device's camera.
  3. Returns the real-world position of the tapped plane.

You can use ARScreenToLocation to enable field data collection workflows where users tap to identify real-world objects in the camera feed as detected by ARKit / ARCore. The position of the tapped object will be more accurate than using the device's location, as you might with a typical field data collection process.
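A minimal sketch of that workflow, using the GeoViewTapped event shown earlier in this topic:

```csharp
// On tap, use ARKit/ARCore plane detection to get the real-world location
// of the tapped surface, then use it (for example, as a new feature's geometry).
private void ArView_GeoViewTapped(object sender, Esri.ArcGISRuntime.UI.Controls.GeoViewInputEventArgs e)
{
    MapPoint realWorldPoint = _arView.ARScreenToLocation(e.Position);
    if (realWorldPoint != null)
    {
        // Create a feature or graphic at realWorldPoint.
    }
}
```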

Manage vertical space in world-scale AR

Accurate positioning is particularly important to world-scale AR; even small errors can break the perception that the virtual content is anchored in the real world. Unlike 2D mapping, Z values are important. And unlike traditional 3D experiences, you need to know the position of the user's device.

Be aware of the following common Z-value challenges that you're likely to encounter while building AR experiences:

  • Many kinds of Z values – Android and iOS devices differ in how they represent altitude/elevation/Z values.
  • Imprecise altitude – Altitude/elevation is the least precise measurement offered by GPS/GNSS. In testing, we found devices reported elevations anywhere between 10 and 100 meters above or below the true value, even under ideal conditions.

Many kinds of Z values

Just as there are many ways to represent position using X and Y values, there are many ways to represent Z values. GPS devices tend to use two primary reference systems for altitude/elevation:

  • WGS84 – Height Above Ellipsoid (HAE)
  • Orthometric – Height Above Mean Sea Level (MSL)

The full depth of the differences between these two references is beyond the scope of this topic, but do keep in mind the following facts:

  • Android devices return elevations in HAE, while iOS devices return altitude in MSL.
  • It is not trivial to convert between HAE and MSL; MSL is based on a measurement of the Earth's gravitational field. There are many models, and you may not know which model was used when generating the data.
  • Esri's world elevation service uses orthometric altitudes.
  • The difference between MSL and HAE varies by location and can be on the order of tens of meters. For example, at Esri's HQ in Redlands, California, the MSL altitude is about 30 meters higher than the HAE elevation.

It is important that you understand how your Z values are defined to ensure that data is placed correctly in the scene. For example, the Esri world elevation service uses MSL for its Z values. If you set the origin camera using an HAE Z value, you could be tens of meters off from the desired location.

To gain a deeper understanding of these issues, see ArcUser: Mean Sea Level, GPS, and the Geoid.

                        Adjust Z values on Android

                        Because many existing data sets and Esri services use orthometric (MSL) Z values, it is convenient to get MSL values from the location data source. Although Android natively provides values in WGS84 HAE, you can listen for NMEA messages from the on-board GPS to get elevations relative to MSL if the device supports it.

                        To consume MSL elevations in the AR scene view, you’ll need to create a custom location data source. See the public samples for a complete MSL-adjusted location data source. The following example demonstrates how to listen for NMEA messages on Android:

                        using System;
                        using System.Globalization;
                        using System.Threading.Tasks;
                        using Android.Content;
                        using Android.Locations;
                        using Esri.ArcGISRuntime.Geometry;
                        using Esri.ArcGISRuntime.Location;
                        // Alias to disambiguate from Android.Locations.Location.
                        using Location = Esri.ArcGISRuntime.Location.Location;

                        public class MSLAdjustedARLocationDataSource : LocationDataSource
                        {
                            public enum AltitudeAdjustmentMode
                            {
                                GpsRawEllipsoid,
                                NmeaParsedMsl
                            }
                            private AltitudeAdjustmentMode _currentMode = AltitudeAdjustmentMode.GpsRawEllipsoid;
                        
                            // Enable configuration of the altitude mode, adding or removing NMEA listener as needed.
                            public AltitudeAdjustmentMode AltitudeMode {
                                get => _currentMode;
                                set
                                {
                                    _currentMode = value;
                        
                                    if (_currentMode == AltitudeAdjustmentMode.NmeaParsedMsl)
                                    {
                                        GetLocationManager().AddNmeaListener(_listener);
                                    }
                                    else
                                    {
                                        GetLocationManager().RemoveNmeaListener(_listener);
                                    }
                                }
                            }
                        
                            // Object to handle NMEA messages from the onboard GNSS device.
                            private readonly NmeaListener _listener = new NmeaListener();
                        
                            // Allow setting an altitude offset.
                            private double _altitudeOffset;
                            public double AltitudeOffset
                            {
                                get => _altitudeOffset;
                                set
                                {
                                    _altitudeOffset = value;
                        
                                    // Raise a location changed event if possible.
                                    if (_lastLocation != null)
                                    {
                                        BaseSource_LocationChanged(_baseSource, _lastLocation);
                                    }
                                }
                            }
                        
                            // Track the last location so that a location changed
                            // event can be raised when the altitude offset is changed.
                            private Location _lastLocation;
                        
                            public IntPtr Handle => throw new NotImplementedException();
                        
                            // Track the last elevation received from the GNSS.
                            private double _lastNmeaElevation;
                        
                            // Use the underlying system location data source.
                            private readonly SystemLocationDataSource _baseSource;
                        
                            private readonly Context _context;
                        
                            public MSLAdjustedARLocationDataSource(Context context)
                            {
                                _context = context;
                        
                                // Create and listen for updates from a new system location data source.
                                _baseSource = new SystemLocationDataSource();
                                _baseSource.HeadingChanged += BaseSource_HeadingChanged;
                                _baseSource.LocationChanged += BaseSource_LocationChanged;
                        
                                // Listen for altitude change events from the onboard GNSS.
                                _listener.NmeaAltitudeChanged += (o, e) =>
                                {
                                    _lastNmeaElevation = e.Altitude;
                                };
                            }
                        
                            private void BaseSource_LocationChanged(object sender, Location e)
                            {
                                // Store the last location to enable raising change events.
                                _lastLocation = e;
                        
                                // Intercept location change events from the base source and either
                                // apply an altitude offset, or return the offset altitude from the latest NMEA message.
                                MapPoint newPosition = null;
                                switch (AltitudeMode)
                                {
                                    case AltitudeAdjustmentMode.GpsRawEllipsoid:
                                        newPosition = new MapPoint(e.Position.X, e.Position.Y, e.Position.Z + AltitudeOffset, e.Position.SpatialReference);
                                        break;
                                    case AltitudeAdjustmentMode.NmeaParsedMsl:
                                        newPosition = new MapPoint(e.Position.X, e.Position.Y, _lastNmeaElevation + AltitudeOffset, e.Position.SpatialReference);
                                        break;
                                }
                        
                                Location newLocation = new Location(newPosition, e.HorizontalAccuracy, e.Velocity, e.Course, e.IsLastKnown);
                        
                                UpdateLocation(newLocation);
                            }
                        
                            private void BaseSource_HeadingChanged(object sender, double e)
                            {
                                UpdateHeading(e);
                            }
                        
                            protected override Task OnStartAsync() => _baseSource.StartAsync();
                        
                            protected override Task OnStopAsync() => _baseSource.StopAsync();
                        
                            private LocationManager _locationManager;
                        
                            private LocationManager GetLocationManager()
                            {
                                if (_locationManager == null)
                                {
                                    _locationManager = (LocationManager)_context.GetSystemService("location");
                                }
                                return _locationManager;
                            }
                        
                            private class NmeaListener : Java.Lang.Object, IOnNmeaMessageListener
                            {
                                private long _lastTimestamp;
                                private double _lastElevation;
                        
                                public event EventHandler<AltitudeEventArgs> NmeaAltitudeChanged;
                        
                                public void OnNmeaMessage(string message, long timestamp)
                                {
                                    if (message.StartsWith("$GPGGA") || message.StartsWith("$GNGNS") || message.StartsWith("$GNGGA"))
                                    {
                                        var parts = message.Split(',');
                        
                                        if (parts.Length < 10)
                                        {
                                            return; // Not enough fields to contain an altitude value.
                                        }
                        
                                        string mslAltitude = parts[9];
                        
                                        if (string.IsNullOrEmpty(mslAltitude)) { return; }
                        
                                        if (double.TryParse(mslAltitude, NumberStyles.Float, CultureInfo.InvariantCulture, out double altitudeParsed))
                                        {
                                            if (timestamp > _lastTimestamp)
                                            {
                                                _lastElevation = altitudeParsed;
                                                _lastTimestamp = timestamp;
                                                NmeaAltitudeChanged?.Invoke(this, new AltitudeEventArgs { Altitude = _lastElevation });
                                            }
                                        }
                                    }
                                }
                        
                                public class AltitudeEventArgs
                                {
                                    public double Altitude { get; set; }
                                }
                            }
                        }
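                        Once defined, a custom data source like this replaces the AR scene view’s default location data source. The following sketch assumes the ArcGIS Runtime .NET AR toolkit’s ARSceneView and its LocationDataSource property; adjust names to your toolkit version:

```csharp
// Hypothetical wiring inside an Activity that hosts an ARSceneView (_arSceneView).
var locationDataSource = new MSLAdjustedARLocationDataSource(this)
{
    // Use NMEA-parsed MSL altitudes when the device supports them.
    AltitudeMode = MSLAdjustedARLocationDataSource.AltitudeAdjustmentMode.NmeaParsedMsl
};

// Replace the default location data source before starting location tracking.
_arSceneView.LocationDataSource = locationDataSource;
```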

                        Visualize planes and features detected by ARKit and ARCore

                        Some workflows, like tapping to place a tabletop scene or collecting a feature, rely on ARKit / ARCore features that detect planes. Plane visualization is particularly useful for two common scenarios:

                        • Visualization provides feedback to users, so they know which surfaces the app has detected and can interact with.
                        • Visualization is useful while developing and testing your app.

                        You can configure the AR scene view to render the planes that ARKit / ARCore detects:

                        _arSceneView.RenderPlanes = true;