When deciding which tools your users need and which map interactions make sense for their tasks, it's important to know which tools are available, along with current trends and user expectations. Equally important is understanding device and platform patterns, as well as the various environments your users work in. ArcGIS Runtime SDKs provide many tools and capabilities out of the box, such as those for navigating the map, displaying information, and performing actions in context.
Easy and focused apps are trending
Apps are becoming more focused, expectations of ease of use (getting the job done quickly) are rising, and user perception matters more in app design than ever before.
Especially in the world of smartphones and tablets, today's trend is toward apps that are focused and targeted at a specific use case. Users are accustomed to using multiple apps to get their work done, and they've come to expect each app to help them accomplish one task, or a small number of tasks, quickly and effortlessly.
Devices have become a part of most users' everyday life, even to the point that some users develop an emotional attachment to them. App builders and UX designers alike recognize that this attachment can come from a pleasant experience or a feeling of satisfaction when using an app. These feelings can derive from the app making the task fun, from the UI being particularly pleasing, or from the task simply being easy to do.
Devices (even desktops) have changed significantly. UIs are simplified, often animated, and aesthetically appealing. Touch screens and touch-enabled mouse devices have had an enormous impact on the user experience. In addition to expecting a fast and easy experience, users have come to expect using an app to be an enjoyable, even delightful, experience.
The focused apps of today are built to run on a variety of devices, including small form factors. Long toolbars with multiple tools, which some GIS users have grown accustomed to, are unwieldy and unnecessary. The story is similar for other tools. For example, in most cases a pop-up is a better choice for an app than an Identify button: the pop-up saves the user an extra click to get to the same information. The same goes for a long table of contents pane with an endless list of layers. Often a long table of contents is a sign that the app doesn't target a specific enough task.
Think about how your users will want to interact with the map. Older applications required you to pick a particular tool on a toolbar to pan, zoom in, zoom out, or view the full extent. With touch screens, mouse devices, and trackpads, many of these tools can be provided through gestures, removing the need for a toolbar altogether. In addition, the proliferation of devices that can report their own location, using GPS or other techniques, means you can use this information to change the map's extent without the user doing anything.
ArcGIS Runtime SDKs provide many of these interactions for you out of the box. But you still need to consider whether they're suitable for your app. What will the user expect to happen when they touch the map? Will they be able to use two fingers to pinch and spread the map (are they wearing gloves, for instance)? Will they know to try a long press, double-tap, or a left/right swipe to open the search tool? For answers to questions like these, see the section Understand your users and their surrounding environment.
ArcGIS Runtime SDKs provide many gestures out of the box. Gestures aren't the answer to everything, but if you keep things simple, intuitive, and follow the device's standard practices, you can provide a lot of functionality in your mapping app without a toolbar.
The Android Design documentation provides more detailed information on gestures.
The ArcGIS Runtime SDK for Android MapView control provides the following map gestures by default:
- Map pan—Drag
- Map fling—Drag and release while still moving
- Zoom in—Double tap
- Zoom in—Pinch open
- Zoom out—Two-finger tap
- Zoom out—Pinch close
You can also override these gestures, or add your own, by extending MapOnTouchListener and setting an instance on the MapView (using the View.setOnTouchListener() method). MapOnTouchListener has the following events for you to handle:
- onDoubleTap—Notified when a single-pointer-double-tap gesture occurs
- onDragPointerMove—Notified when a part of a single-touch-drag gesture event occurs
- onDragPointerUp—Notified when a part of a single-touch-drag gesture event occurs
- onLongPress—Notified when a long-press gesture occurs
- onMultiPointersSingleTap—Notified when a two-pointers-single-tap gesture occurs
- onPinchPointersDown—Notified when a part of a pinch gesture occurs
- onPinchPointersMove—Notified when a part of a pinch gesture occurs
- onPinchPointersUp—Notified when a part of a pinch gesture occurs
- onSingleTap—Notified when a single-pointer-single-tap gesture occurs
- onTouch—Called when a touch event is dispatched to a view
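As a sketch of this pattern, a custom listener might look like the following. This assumes the legacy (10.2.x) ArcGIS Runtime SDK for Android API; the class name and the tap handling shown are invented for illustration, not taken from a sample.

```java
// Hypothetical custom listener extending the SDK's MapOnTouchListener.
// Assumes an existing MapView named mapView and the legacy 10.2.x API.
public class MyMapTouchListener extends MapOnTouchListener {

    private final MapView mapView;

    public MyMapTouchListener(Context context, MapView view) {
        super(context, view);
        this.mapView = view;
    }

    @Override
    public boolean onSingleTap(MotionEvent event) {
        // Convert the screen tap to a map point, e.g. to identify a feature
        // or show a callout at that location.
        Point mapPoint = mapView.toMapPoint(event.getX(), event.getY());
        // ... act on mapPoint here ...
        return true; // true = handled; suppresses the default behavior
    }

    @Override
    public void onLongPress(MotionEvent event) {
        // e.g. begin adding a feature at the pressed location
        super.onLongPress(event);
    }
}

// Attach the listener to the MapView:
mapView.setOnTouchListener(
        new MyMapTouchListener(getApplicationContext(), mapView));
```

Returning true from a handler tells the MapView the event was consumed, so the built-in gesture (such as zoom on double tap) is skipped for that event.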
For an example of a custom MapOnTouchListener, see the Geometry Editor sample, where it is used to provide a sketch layer for editing.
Once you've built some default map interactions or gesture-based tools, the next step is to display some information to the user. You can turn complex data into rich, decision-making information for your users. If you understand your users and know what they want to achieve, displaying the right information in an appropriate way can reduce the need for additional tools. Older GIS applications have displayed tables and rows of data to the user, but modern approaches are more intuitive and powerful.
ArcGIS Runtime SDKs provide several API components to display information:
- Graphics—ArcGIS Runtime SDKs provide a graphics layer, which lets you add graphics (points, lines, polygons, or text) to a map and associate each one with a symbol so it looks the way you want. Graphics let you provide a lot of information about something the user has tapped or an area they're interested in. However, you might need a legend or additional information so the user can interpret what you've added. For details on graphics layers, see Add graphics and text.
- Callouts—A callout is a small view that displays on top of the map, with a leader that can point to a feature. Callouts work well as long as you have enough screen real estate, and you can embed text, images, and other items in them to provide information. Because a callout is tied to a specific geographic location, it behaves as part of the map, moving as the map is panned or zoomed. Use the Callout class, which is obtained from the MapView.
- Pop-ups—Supported throughout the ArcGIS platform, pop-ups are a powerful way to bring your maps to life. ArcGIS Runtime SDKs fully support pop-up configurations and provide views to display their content without you having to write any of the UI code yourself. See the pop-up documentation for the ArcGIS Runtime SDK for Android for more information.
- GPS/location information—The ArcGIS Runtime SDK for Android provides an out-of-the-box location display that uses the Android platform's location services. It shows the device's current location and approximate accuracy on the map. This component can also automatically pan the map as the location changes, and it offers modes suited to navigation or compass-based directions. For more information, see the API reference for the LocationDisplayManager.
- Platform-specific views—For native apps, there's a wide array of platform-specific views, widgets, and graphical layouts you can use to display information, such as graphs, pie charts, meters, categorized text, and so on. See the Android Design documentation to understand which are available and how to use them.
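As a rough illustration, the graphics, callout, and location display components described above might be wired up as follows. This is a sketch against the legacy (10.2.x) ArcGIS Runtime SDK for Android API; the coordinates and callout text are invented for the example.

```java
// Sketch only: assumes the legacy (10.2.x) ArcGIS Runtime SDK for Android
// and an existing MapView named mapView. Values are illustrative.

// Graphics: add a symbolized point graphic to a graphics layer.
GraphicsLayer graphicsLayer = new GraphicsLayer();
mapView.addLayer(graphicsLayer);
Point point = new Point(-117.196, 34.057); // hypothetical location
SimpleMarkerSymbol symbol =
        new SimpleMarkerSymbol(Color.RED, 12, SimpleMarkerSymbol.STYLE.CIRCLE);
graphicsLayer.addGraphic(new Graphic(point, symbol));

// Callout: show a small view anchored to the same map location.
Callout callout = mapView.getCallout();
TextView content = new TextView(mapView.getContext());
content.setText("Hydrant inspection due"); // hypothetical content
callout.setContent(content);
callout.show(point);

// Location display: show the device's position and pan with it.
LocationDisplayManager ldm = mapView.getLocationDisplayManager();
ldm.setAutoPanMode(LocationDisplayManager.AutoPanMode.LOCATION);
ldm.start();
// Call ldm.stop() when the map is paused to save battery.
```

Starting the location display prompts the platform for location updates, so the app also needs the appropriate location permissions declared in its manifest.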
When you provide non-gesture-based functionality or tools for your users, make sure they appear in the right place at the right time. They don't always need to be launched from a toolbar.
For instance, let's say that a default map interaction of a long press adds a feature to the map and displays a list of actions or tools that can be performed on that location, such as viewing more details, zooming, or getting directions to it. You can vary these actions based on the type of location the user pressed. This approach is typically better than exposing the actions on a toolbar, where they would be active only after a location had been selected: greyed out most of the time, taking up valuable screen real estate, and leaving the user unsure when they can be used.
For capabilities that may be hidden, such as a long press in a blank area of the map as a way to add a feature, add in-app help. For example, to ensure the user knows that a long press adds a feature, when the user first opens the app, you could have an image of a finger "fly in" from the right, then animate it to mimic adding a feature, with a dismiss control the user can press once they understand the message being conveyed.
With contextual actions, you need to provide a clear exit path for the user, so they can return to where they came from. Use your target platform's best practices and navigation tools to provide this behavior.
For example, in Android, this means ensuring the Back button and action bar Up button behave as expected. For more information on Android navigation design patterns, see Navigation with Back and Up on the Android developers site.
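A minimal sketch of handling the action bar Up button in an Activity might look like this; NavUtils comes from the Android support library, and the activity's parent is assumed to be declared in the manifest.

```java
// Minimal sketch: respond to the action bar Up button in an Activity.
// android.R.id.home is the Up affordance; NavUtils resolves the parent
// activity declared in AndroidManifest.xml.
@Override
public boolean onOptionsItemSelected(MenuItem item) {
    if (item.getItemId() == android.R.id.home) {
        // Navigate up to this activity's declared parent.
        NavUtils.navigateUpFromSameTask(this);
        return true;
    }
    return super.onOptionsItemSelected(item);
}
```

The Back button, by contrast, is handled by the system and walks the task's back stack, so it usually needs no code unless you override the default behavior.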
Not everything can be done without a toolbar. If your app requires a toolbar, make sure you plan out where the tools will display and how the user will access or interact with them. You have many options to consider here, some of which depend on the platform and form factor of the devices you’re targeting.
Tools on top of the map
Many map-centric applications add tools that hover on top of the map. This approach makes sense for tools that are directly associated with the map, such as navigation buttons (which may make it easier than a gesture to zoom and pan the map with one hand or when wearing gloves). The GPS tool is often placed near the top of the map, and since it usually navigates the map to the current location, this placement makes sense.
Tools on a toolbar
Android has explicit design guidelines for adding tools to a toolbar (or ActionBar, in Android's case) that are relevant to all platforms. Essentially, tools should be visible if they adhere to the FIT scheme (frequently used, important, and/or typical). If you are adding tools to your toolbar, try to adhere to this guidance. Cluttered toolbars make it difficult for users to find what they are looking for.
Another approach is to provide the user with toolbars or menus that are viewable only through a gesture or button click. This allows a number of tools to be shown at once but without them cluttering up the app's interface at all times. Patterns in use for hidden toolbars include the following:
- Slide out drawers—Menus or tools display from the side of the screen via a gesture or a button click. Lots of apps are using this approach so that commonly used tools don't clutter the app's screen real estate. The tools are easy for the user to show at any time and can provide a long list of scrollable menus or tools. Most apps use menu items with clear text and icons so the user knows exactly what will happen when these are pressed.
- Tool wheel—Menus and tools display in a wheel shape via a gesture or button click. This approach is more commonly used for showing multiple tool buttons (typically without accompanying text) rather than menus. Like slide out drawers, this approach provides quick access to a number of tools for the user to quickly switch between. It's been used in advanced drawing apps to let users quickly switch between a vast array of draw tools, and in camera apps, where the user needs to quickly pick the right setting to take pictures.
Platform concepts and patterns
Above all, it's important for your user experience and user interface design to incorporate guidelines, best practices, and design patterns specific to the platform you're using. This includes the icon styles to use (flat vs. 3D), the use and design of tabs (top or bottom of the screen), app navigation tools (OS back button vs. app back buttons), and how the interface changes as screens rotate or as the app appears on different devices.
Guidelines for Android are provided on Android's developer site, in the Design section.
Your users and their environment
Below are additional important questions to ask about your users and their surrounding environment. The answers can affect the user experience and how you present tools.
- Where will they be using the application?—Outdoors in strong sunlight? Indoors in a dark basement? At night? Or all three? The answer will affect the design of your UI and icons. A white background with dark text is good for an app used in strong daylight. A dark background with light-colored text is best for dark conditions. Does your app need both a daytime mode and a night mode, with different icon sets for each?
- How will your users be working with the application and what else might they be doing at the same time?—Will they be sitting down? Standing up? Driving? Flying helicopters? (ArcGIS Runtime SDK apps have been built for pilots.) The answer tells you whether they are holding the device in one hand, two hands, or relying on a device docking station. This information, in turn, helps you determine which tools and gestures will work. For example, a pinch won't work to zoom out of a map if the user only has one hand available. Voice commands or some other handling should be considered in that case.
- What will users be wearing?—Will they be wearing gloves? If so, button and tool sizes need to take this into account. For example, bigger buttons and controls are needed for gloved fingers. Will they be wearing sunglasses, helmets, or other items that will get in the way of the screen and affect how the screen looks?
- What sort of device will they be using?—A tablet, phone, laptop, or an embedded device? These form factors will likely affect your choice of how best to provide tools. Phones require the ability to work with a single hand for most things. Tablets typically allow the user to place them on a surface so that two hands can be used (but this depends on how they will be using the application). For laptops, typically two hands are available for gestures and touches. Embedded devices might come with multiple ways to control the app, such as hard buttons, dials, and switches. An app may sometimes be connected to a keyboard and at other times not; in that case, you need to account for both scenarios.