Collect data in AR

Tap on real-world objects to collect data.

Screenshot: a feature being recorded, with a prompt to specify feature attributes

Use case

You can use AR to quickly photograph an object and automatically determine its real-world location, making data collection more efficient. For example, you could quickly catalog trees in a park while maintaining visual context of which trees have already been recorded, with no need for spray paint or tape.

How to use the sample

Before you start, go through the on-screen calibration process to ensure accurate positioning of recorded features.

When you tap, an orange diamond will appear at the tapped location. You can move around to visually verify that the tapped point is in the correct physical location. When you're satisfied, tap the '+' button to record the feature. An image from the camera feed will automatically be attached to the recorded feature.

WARNING: Photo capture is completely automatic; consider your surroundings when adding features.

How it works

  1. Create an ARSceneView. Set both the atmosphere effect and the space effect to None. Create and show a scene in the scene view.
  2. Load the feature service and display it with a feature layer.
  3. Create and add the elevation surface to the scene.
  4. Create a graphics overlay for planning the location of features to add. Configure the graphics overlay with a renderer and add the graphics overlay to the scene view.
  5. When the user taps the screen, use ARSceneView.ARScreenToLocation, which relies on ARKit/ARCore plane detection, to find the real-world location of the tapped object.
  6. Add a graphic to the graphics overlay to preview where the feature will be placed, and allow the user to visually verify the placement.
  7. When the user presses the button, capture the current camera image from ARSceneView.ArSceneView.ArFrame.AcquireCameraImage(). Rotate the image appropriately and convert it to a JPEG for efficient storage.
  8. Prompt the user for a tree health value, then create the feature. Upon successful creation, use feature.AddAttachment to add the image. (A condensed sketch of these steps follows this list.)
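
The following condensed C# sketch shows how these steps might fit together. It is a sketch, not the sample's actual code: the class and member names, the URI parameters, and the 'Health' attribute key are assumptions, and error handling plus the platform-specific image rotation and JPEG encoding are omitted.

using System;
using System.Collections.Generic;
using System.Drawing;
using System.Linq;
using System.Threading.Tasks;
using Esri.ArcGISRuntime.ARToolkit;
using Esri.ArcGISRuntime.Data;
using Esri.ArcGISRuntime.Geometry;
using Esri.ArcGISRuntime.Mapping;
using Esri.ArcGISRuntime.Symbology;
using Esri.ArcGISRuntime.UI;
using Esri.ArcGISRuntime.UI.Controls;

public class CollectDataARSketch
{
    // Created by the platform UI in the real sample; assumed fields here.
    private ARSceneView _arSceneView;
    private GraphicsOverlay _graphicsOverlay;
    private ServiceFeatureTable _featureTable;

    // Steps 1-4: scene with no atmosphere or space effect, feature layer,
    // elevation surface, and a preview overlay rendered as an orange diamond.
    public void ConfigureScene(Uri elevationServiceUri, Uri treeSurveyServiceUri)
    {
        _arSceneView.AtmosphereEffect = AtmosphereEffect.None;
        _arSceneView.SpaceEffect = SpaceEffect.None;

        var scene = new Scene(Basemap.CreateImagery());
        scene.BaseSurface.ElevationSources.Add(new ArcGISTiledElevationSource(elevationServiceUri));

        _featureTable = new ServiceFeatureTable(treeSurveyServiceUri);
        scene.OperationalLayers.Add(new FeatureLayer(_featureTable));
        _arSceneView.Scene = scene;

        _graphicsOverlay = new GraphicsOverlay
        {
            Renderer = new SimpleRenderer(
                new SimpleMarkerSymbol(SimpleMarkerSymbolStyle.Diamond, Color.Orange, 15)),
            SceneProperties = { SurfacePlacement = SurfacePlacement.Absolute }
        };
        _arSceneView.GraphicsOverlays.Add(_graphicsOverlay);
        _arSceneView.GeoViewTapped += ArSceneView_GeoViewTapped;
    }

    // Steps 5 & 6: hit-test detected planes and preview the placement.
    private void ArSceneView_GeoViewTapped(object sender, GeoViewInputEventArgs e)
    {
        // Uses ARKit/ARCore plane detection; null when no plane is under the tap.
        MapPoint realWorldLocation = _arSceneView.ARScreenToLocation(e.Position);
        if (realWorldLocation == null) { return; }

        _graphicsOverlay.Graphics.Clear();
        _graphicsOverlay.Graphics.Add(new Graphic(realWorldLocation));
    }

    // Steps 7 & 8: create the feature, then attach the captured camera image.
    // 'jpegData' is the AR camera frame after rotation and JPEG encoding
    // (platform-specific, omitted here).
    public async Task CreateFeatureAsync(int treeHealth, byte[] jpegData)
    {
        Graphic previewGraphic = _graphicsOverlay.Graphics.First();
        var attributes = new Dictionary<string, object> { { "Health", treeHealth } };

        var feature = (ArcGISFeature)_featureTable.CreateFeature(attributes, previewGraphic.Geometry);
        await _featureTable.AddFeatureAsync(feature);
        await feature.AddAttachmentAsync("tree.jpg", "image/jpeg", jpegData);
        await _featureTable.ApplyEditsAsync();
    }
}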

Relevant API

  • ARSceneView
  • GraphicsOverlay
  • SceneView
  • Surface

About the data

The sample uses a publicly editable sample tree survey feature service. You can use AR to quickly record the location and health of a tree while seamlessly capturing a photo.

Additional information

This sample requires a device that is compatible with ARKit 1 on iOS or ARCore 1.8 on Android.

There are two main approaches for identifying the physical location of a tapped point:

  • ARSceneView.ARScreenToLocation - uses plane detection provided by ARKit/ARCore to determine where in the real world the tapped point is.
  • SceneView.ScreenToLocation - determines where the tapped point is in the virtual scene. This is problematic when the scene's opacity is 0, because you can't see which part of the scene you're tapping. The scene view's calculation also doesn't account for real-world objects; for example, tapping on a tree might return a point on the basemap many meters behind the tree.

This sample only uses the ARScreenToLocation approach, as it is the only way to get accurate positions for features not directly on the ground in real-scale AR.
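
As a brief illustration of the difference (assuming _arSceneView is the sample's ARSceneView and screenPoint is a tapped screen position; ScreenToLocationAsync is the asynchronous form of the virtual-scene hit test in the .NET API):

// Plane-detection hit test, used by this sample: intersects surfaces that
// ARKit/ARCore has detected; returns null when no plane is under the tap.
MapPoint realWorldPoint = _arSceneView.ARScreenToLocation(screenPoint);

// Virtual-scene hit test: intersects the virtual scene instead, so tapping
// a real tree can return a basemap point many meters behind it.
MapPoint virtualScenePoint = await _arSceneView.ScreenToLocationAsync(screenPoint);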

Note that unlike other scene samples, a basemap isn't shown most of the time, because the real world provides the context. Only while calibrating is the basemap displayed at 50% opacity, to give the user a visual reference to compare to.
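
A minimal sketch of that toggle, assuming _scene is the displayed scene and isCalibrating reflects whether the calibration UI is active:

// Show the basemap at 50% opacity while calibrating; hide it otherwise.
foreach (Layer baseLayer in _scene.Basemap.BaseLayers)
{
    baseLayer.Opacity = isCalibrating ? 0.5 : 0.0;
}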

Real-scale AR is one of three main patterns for working with geographic information in augmented reality. See Augmented reality in the guide for more information.

See the 'Edit feature attachments' sample for more specific information about the attachment editing workflow.

This sample uses a combination of two location data source modes: continuous update and one-time update, presented as 'roaming' and 'local' calibration modes in the app. The error in the position provided by ARKit/ARCore increases as you move further from the origin, resulting in a poor experience when you move more than a few meters away. The location provided by GPS is more useful over large areas, but not good enough for a convincing AR experience on a small scale. With this sample, you can use 'roaming' mode to maintain good enough accuracy for basic context while navigating a large area. When you want to see a more precise visualization, you can switch to 'local' (ARKit/ARCore-only) mode and manually calibrate for best results.
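
In the ARToolkit, these modes correspond to the ARLocationTrackingMode value used when location tracking starts. A hedged sketch of switching between them (the sample itself uses a custom MSLAdjustedARLocationDataSource; restarting tracking like this is a simplification):

// 'Roaming': GPS updates continuously correct ARKit/ARCore drift over large areas.
await _arSceneView.StartTrackingAsync(ARLocationTrackingMode.Continuous);

// 'Local': take one initial GPS fix, then rely on ARKit/ARCore tracking
// plus manual calibration for precise, small-area work.
await _arSceneView.StopTrackingAsync();
await _arSceneView.StartTrackingAsync(ARLocationTrackingMode.Initial);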

Tags

attachment, augmented reality, capture, collection, collector, data, field, field worker, full-scale, mixed reality, survey, world-scale

Sample Code

This sample includes JoystickSeekBar.cs (shown below), CollectDataAR.cs, and MSLAdjustedARLocationDataSource.cs.

using Android.Content;
using Android.Util;
using AndroidX.AppCompat.Widget;
using ArcGISRuntime;
using System;
using System.Timers;

namespace ArcGISRuntimeXamarin.Samples.ARToolkit.Controls
{
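    // A SeekBar that behaves like a joystick: the thumb snaps back to center
    // when released, and while it is held off-center a timer repeatedly raises
    // DeltaProgressChanged with a signed delta value.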
    public class JoystickSeekBar : AppCompatSeekBar
    {
        private const double DefaultMin = 0;
        private const double DefaultMax = 100;
        private const long DefaultDeltaIntervalMillis = 250;

        private readonly double _min = DefaultMin;
        private readonly double _max = DefaultMax;
        private double _deltaProgress;

        public event EventHandler<DeltaChangedEventArgs> DeltaProgressChanged;

        private readonly Timer _eventTimer = new Timer();

        public JoystickSeekBar(Context context) : base(context)
        {
            Progress = (int)(_max * 0.5);
        }

        public JoystickSeekBar(Context context, IAttributeSet attrs) : base(context, attrs)
        {
            var attributes = context.Theme.ObtainStyledAttributes(attrs, Resource.Styleable.JoystickSeekBar, 0, 0);
            _min = attributes.GetFloat(Resource.Styleable.JoystickSeekBar_jsb_min, (float)DefaultMin);
            _max = attributes.GetFloat(Resource.Styleable.JoystickSeekBar_jsb_max, (float)DefaultMax);

            if (_min > _max)
            {
                throw new AndroidRuntimeException("Attribute jsb_min must be less than attribute jsb_max");
            }

            Min = (int)_min;
            Max = (int)_max;
            Progress = (int)(((_max - _min) * 0.5) + _min);

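            // While the user holds the thumb off-center, report the current
            // delta on a fixed interval rather than only when it changes.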
            _eventTimer.Elapsed += (o, e) =>
            {
                DeltaProgressChanged?.Invoke(this, new DeltaChangedEventArgs() { DeltaProgress = _deltaProgress });
            };

            _eventTimer.Interval = DefaultDeltaIntervalMillis;

            ProgressChanged += JoystickSeekBar_ProgressChanged;
            StartTrackingTouch += JoystickSeekBar_StartTrackingTouch;
            StopTrackingTouch += JoystickSeekBar_StopTrackingTouch;
        }

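        // On release, stop reporting and snap the thumb back to center.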
        private void JoystickSeekBar_StopTrackingTouch(object sender, StopTrackingTouchEventArgs e)
        {
            _deltaProgress = 0;
            _eventTimer.Stop();

            Progress = (int)(((_max - _min) * 0.5) + _min);
        }

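        // Begin reporting delta values while the user is touching the bar.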
        private void JoystickSeekBar_StartTrackingTouch(object sender, StartTrackingTouchEventArgs e)
        {
            _eventTimer.Start();
        }

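        // Square the progress for finer control near zero, preserving the sign
        // so negative progress (possible when jsb_min is negative) moves backward.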
        private void JoystickSeekBar_ProgressChanged(object sender, ProgressChangedEventArgs e)
        {
            _deltaProgress = (float)(Math.Pow(this.Progress, 2) / 25 * (this.Progress < 0 ? -1.0 : 1.0));
        }
    }

    public class DeltaChangedEventArgs : EventArgs
    {
        public double DeltaProgress { get; set; }
    }
}
