IBM Immersive Data

An augmented reality app for data visualization

More data has been created in the past two years than in the entire prior history of the human race. Businesses are racing to collect, organize, and analyze all of this data, because doing so yields insights that can propel them ahead of the competition. IBM Immersive Data is a mobile AR app designed to visualize data and expedite this process.

I was a UX Designer on this project. I worked with two other designers and our design lead, and collaborated closely with developers (essential when working with a difficult-to-implement medium such as AR). When I joined the team, we began transitioning the existing product from the Microsoft HoloLens to Apple ARKit.

When Immersive Data first began development, head-mounted displays such as the HoloLens were the only available option. However, given the multi-thousand-dollar price tag for the hardware, the limited field of view, and concerns about graphics quality, we decided to port our work to the newly available ARKit. The fact that everyone already had a phone in their pocket removed a lot of friction from adoption and made it more likely that users would add the product to their data science toolkit.

My job involved re-imagining the headset experience and interactions for mobile AR users.

THE PROBLEM

The data science process has several key stages: identifying a problem, collecting data, cleaning data, exploring data, transforming data, modeling data, and communicating the results. IBM Immersive Data targets two of them: exploring the data and communicating findings.

Data exploration is one of the most time-consuming stages of the process, as data scientists need to analyze and test many different relationships in order to generate insights. By visualizing data in 3D and viewing it from new angles and perspectives, we can help data scientists identify key patterns, relationships, and outliers in seconds rather than the hours it would typically take.

When it comes to communicating the findings of their work, data scientists and business analysts often struggle to explain complex insights to non-technical decision-makers. Not everyone enjoys spreadsheets and graphs! Augmented reality provides an engaging and intuitive way to comprehend data, giving data scientists a better way to persuade stakeholders of the value of their data discoveries.

I created an empathy map and as-is scenario of our primary persona, the data scientist. We used this as a starting point for ideation during a client workshop.

WHY AUGMENTED REALITY?


Before diving into any AR project, it’s important to consider what benefits, if any, augmented reality will provide over a traditional web or mobile experience. Through our domain research and user research studies with data scientists and business analysts, we identified four key reasons why viewing data in AR beats exploring it on a computer:

  1. You can see more data at once. It can be difficult to see many variables on a computer screen, but it’s easy in AR. Immersive visualizations let you see many dimensions in the data at once. Users can easily adjust variables to change the information they’re seeing. They can also compare multiple data visualizations side by side.
  2. You can walk around the data. Data can lie hidden behind other data. By walking around the visualization, AR allows users to experience multiple viewpoints and ensure they’re not missing anything. Clusters and points of interest stand out in AR. Our users said they had an easier time noticing outliers and patterns because they could approach and back away from the visualization.
  3. You can learn the way humans were meant to. Humans have been navigating 3D spaces and manipulating 3D objects for millennia. Interacting with AR objects is a natural and intuitive way to learn and understand.
  4. You can improve accuracy and perception. A study by researchers at NASA and Caltech found that, “Immersion provides benefits beyond the traditional desktop visualization tools: it leads to a demonstrably better perception of datascape geometry, more intuitive data understanding, and a better retention of the perceived relationships in the data.”

We conducted user research & testing with data scientists in order to validate our assumptions.

EXPECTED VS. ACTUAL USER BEHAVIOR


As we transitioned our product from headsets to tablets, we encountered a number of challenges. We went from manipulating 3D objects in 3D space (on the headset) to manipulating 3D objects on a 2D screen, which creates a kind of dissociation. We wanted the interactions to feel natural and intuitive, but during user testing our participants struggled. Much of this boiled down to the gap between the user behavior we expected and the behavior we observed. As it turned out, the mental model of a user wearing an AR headset is very different from the mental model of a user holding a tablet.

The video below shows a user interacting with the head-mounted version of the product. Notice how the user approaches the visualization and walks around it. This was an important behavior within the product as observing the data from different perspectives is key to uncovering patterns and relationships. We found that headset users were naturally drawn towards the visualization.


Unfortunately, when we tested the same use case on a tablet, users no longer walked around the visualization. Instead, they stood still and tried to pinch, zoom, and rotate the 3D object on the screen. In other words, they defaulted to the traditional phone and tablet behaviors they were accustomed to. Since actual user behavior and expectations deviated from our earlier (headset-based) observations, we ended up adding touch-based rotation to the product, as sketched below. We still wanted to find ways to encourage behaviors like walking, but we recognized the need for alternative methods in order to meet users where they were.
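To make that accommodation concrete, here is a minimal sketch of touch-based rotation using ARKit with SceneKit. The class and node names are illustrative assumptions, not our production code:

```swift
import UIKit
import SceneKit
import ARKit

class VisualizationViewController: UIViewController {
    let sceneView = ARSCNView()          // the AR viewport (illustrative)
    let visualizationNode = SCNNode()    // root node of the 3D chart (illustrative)

    override func viewDidLoad() {
        super.viewDidLoad()
        // Meet the tablet mental model: a two-finger twist rotates the chart,
        // so users aren't forced to walk around it to change their viewpoint.
        let rotation = UIRotationGestureRecognizer(
            target: self, action: #selector(handleRotation(_:)))
        sceneView.addGestureRecognizer(rotation)
    }

    @objc func handleRotation(_ gesture: UIRotationGestureRecognizer) {
        guard gesture.state == .changed else { return }
        // Spin the visualization around its vertical (y) axis by the gesture delta.
        visualizationNode.eulerAngles.y -= Float(gesture.rotation)
        gesture.rotation = 0  // reset so each callback delivers an increment
    }
}
```

Resetting `gesture.rotation` after each callback keeps the rotation incremental, so the chart tracks the fingers rather than jumping.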

The video below shows the headset method for adjusting the placement of the visualization: clicking the magenta button to enter edit mode, moving the object, and clicking the button again to exit. For headset users, the interaction was simple and easy to discover.



Again, while this was intuitive to headset wearers, during testing those using a tablet struggled to discover and use this interaction. They naturally defaulted to traditional touch-screen behaviors such as drag-and-drop and pinch-to-zoom. It was an important lesson: bear in mind the existing mental model of mobile users when designing a mobile AR experience. As a result, we simplified the interactions to match users' mental models (below).
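As an illustration of the simplified direct-drag placement, here is a hedged sketch continuing the hypothetical VisualizationViewController from the earlier snippet. It uses ARKit's raycasting API (iOS 13+), which postdates our original build:

```swift
import UIKit
import SceneKit
import ARKit

// One-finger drag repositions the chart on a detected surface, replacing
// the headset's enter/exit edit mode. Names are illustrative.
extension VisualizationViewController {
    @objc func handlePan(_ gesture: UIPanGestureRecognizer) {
        let location = gesture.location(in: sceneView)
        // Cast a ray from the touch point onto detected horizontal planes.
        guard let query = sceneView.raycastQuery(from: location,
                                                 allowing: .existingPlaneGeometry,
                                                 alignment: .horizontal),
              let hit = sceneView.session.raycast(query).first else { return }
        // Move the visualization to where the finger points on the surface.
        let t = hit.worldTransform.columns.3
        visualizationNode.simdWorldPosition = SIMD3<Float>(t.x, t.y, t.z)
    }
}
```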



APP NAVIGATION


Designing the app's navigation meant accounting for several distinct layers:

  • Navigating within the experience by manipulating the AR objects in the viewport
  • 2D UI elements that allow the user to make selections about what they can see and manipulate (contextual to the visualization in view)
  • Navigation that enables management of the overall experience: for example, user login and authentication, integration with external web-based software, user settings, etc.

Let’s take a look at each of these aspects of navigation.

2D vs 3D


As we translated product elements from headset to tablet, we carefully questioned whether each should remain in 3D or be redesigned in 2D. A key consideration was the need for precision, since data scientists require a high level of granularity in their work. Because the z-position (distance from the user) of an AR object can make its touch target on a phone very small, users found it tough to make fine adjustments. As a result, we relocated our adjustment tools to the 2D UI.
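To make the touch-target problem concrete, here is an illustrative (hypothetical) helper that estimates how large an AR control appears on screen; the projected size shrinks as the object moves away from the camera:

```swift
import Foundation
import SceneKit
import ARKit

// Estimate the on-screen width (in points) of an AR control of a given
// physical width: project two world-space points one control-width apart
// and measure the distance between them in screen space.
func projectedWidth(of node: SCNNode, physicalWidth: Float,
                    in sceneView: ARSCNView) -> CGFloat {
    let p0 = node.worldPosition
    let p1 = SCNVector3(p0.x + physicalWidth, p0.y, p0.z)
    let s0 = sceneView.projectPoint(p0)
    let s1 = sceneView.projectPoint(p1)
    return CGFloat(hypotf(s1.x - s0.x, s1.y - s0.y))
}
```

A handle a meter or two away can easily fall below Apple's recommended ~44 pt minimum touch target, which is the kind of constraint that pushed our fine-grained adjustment controls into the 2D UI.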

It was easier to manipulate tools (such as the data collection box) that were located in the AR space when using a headset.

We went through numerous iterations, playing with effects such as partial transparency in order to keep the user immersed in AR. Unfortunately, accessibility suffered as a result.


We ultimately decided to use opaque 2D UI elements.


Sometimes it made sense to use traditional 2D elements rather than forcing the user to struggle in 3D.



Menu design


Before a user can view a visualization, they must first make a selection from the menu. The original headset experience featured glass cubes that represented data visualizations. We liked the physical embodiment of the dataset, so we tried to keep this in the app.

An early menu iteration showing the data visualization files as physical glass cubes.


I explored designs for a radial menu to allow quick access to recent visualizations.


But things became complicated as we dug into the user’s workflow. Data scientists showed us how their datasets were organized within specific projects. They were also interested in creating sub-visualizations that were derived from a prior view. We wanted to give users a way to save and organize all these different explorations.

We learned there could be data visualizations within other data visualizations. This forced us to rethink the file structure.


Given the potential complexity, we decided it was best to keep the in-app file system consistent with the project structure users already had in their desktop tools. As we worked on product integrations (more on that below), we mirrored that organizational structure.


We ultimately went with a file system similar to what our users work with in their desktop tools. This reduced confusion and sped up navigation.


When it came to designing the menu action buttons, we had two clear goals: keep the interface as minimal as possible, and differentiate between local and global actions. Since users could access the app on a phone, we were working with a limited “window” into the 3D world, so we didn’t want to clutter the viewing field.

We went through many different iterations, playing with things like positioning and transparency. We eventually settled on collapsible navigation to maximize screen real estate for the AR view.

We ultimately divided our actions into two groups, placing global navigation on the left, and context-sensitive actions on the right.



Login flow


Because of issues related to data privacy and permissions, it was important that the app have a secure login and authentication process. Other considerations that went into the design of this flow included:

  • Giving users the ability to test out the app using sample data. It was important that we didn’t lose potential users because they didn’t already have an IBM product user ID.
  • Providing a method and guidance for scanning a QR code for fast authentication.
  • Using surface detection or another positioning method for optimal placement of the AR data visualization (see the sketch after this list).
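A minimal sketch of that surface-detection step, assuming an ARSCNView-based setup (the function name is illustrative):

```swift
import ARKit

// Turn on horizontal plane detection so the visualization can be anchored
// to a real surface such as a table or the floor.
func startSurfaceDetection(on sceneView: ARSCNView) {
    let configuration = ARWorldTrackingConfiguration()
    configuration.planeDetection = [.horizontal]
    sceneView.session.run(configuration)
    // ARSCNViewDelegate's renderer(_:didAdd:for:) will then fire with an
    // ARPlaneAnchor that can serve as the chart's initial position.
}
```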

Sketching out the "happy path" user flow for app login.


Exploring how to provide guidance and feedback through animations, text, and haptics.
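On the haptics side, UIKit's feedback generators make cues like this straightforward. A small illustrative example, assuming we fire a success tap when a surface is found:

```swift
import UIKit

// Illustrative haptic cue: a success tap (paired with on-screen text and
// animation) once a surface is detected and the visualization is placeable.
func surfaceFoundFeedback() {
    let feedback = UINotificationFeedbackGenerator()
    feedback.prepare()                       // reduces latency of the next tap
    feedback.notificationOccurred(.success)  // distinct "it worked" pattern
}
```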



FINAL PRODUCT





PRODUCT INTEGRATIONS


The IBM Immersive Data app doesn’t function independently. In fact, this is what makes it so powerful — it works in conjunction with IBM’s robust data analysis tools, allowing users to take their existing datasets and launch them in augmented reality. My work included designing product integrations with software such as IBM Watson Studio and IBM Augmented Data Explorer to enable a seamless transition from the web experience to the mobile AR view.

Discoverability of the AR feature was a concern, so we found ways to draw the users' attention.


I created detailed specs to pass on to the developers.


Our product integrations let users scan a QR code and launch their AR visualization within seconds. See the process in action below.
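Under the hood, the scan step could be implemented with Apple's Vision framework. A hedged sketch follows; the payload format and the launchVisualization(from:) helper are assumptions, not the shipped implementation:

```swift
import ARKit
import Vision

// Hand the decoded payload to the app; in practice it would carry an
// authenticated link to the user's dataset (hypothetical helper).
func launchVisualization(from payload: String) {
    // Resolve the payload and present the AR visualization (app-specific).
}

// Look for a QR code in the current camera frame.
func detectQRCode(in frame: ARFrame) {
    let request = VNDetectBarcodesRequest { request, _ in
        guard let barcode = request.results?.first as? VNBarcodeObservation,
              let payload = barcode.payloadStringValue else { return }
        launchVisualization(from: payload)
    }
    request.symbologies = [.qr]  // only interested in QR codes
    let handler = VNImageRequestHandler(cvPixelBuffer: frame.capturedImage,
                                        options: [:])
    try? handler.perform([request])
}
```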




IMPACT


Immersive Data has been demoed at numerous events and well received by sponsor users. IBM partner Ari Kaplan, a sports analyst, used Immersive Data to dig into data on baseball players (if you’ve seen or read Moneyball, you’ll know this strategy of using statistical analysis to find undervalued players and build winning teams). Based on his discoveries in our immersive visualizations, Ari was able to sign a multi-million-dollar player contract.

Sponsor user Ari Kaplan uses complex spreadsheets and dashboards to evaluate baseball players. Thanks to Immersive Data, he was able to find an undervalued player and sign a multi-million-dollar contract before the end of free agency.


“When making recommendations to the MLB, I need to consider many factors including whether pitchers are left handed or right, if there’s a prior injury, ERA, salary expectations, and predicted performance. By comparing across all of these dimensions in a single visualization, I can make better, faster judgements.”
-Ari Kaplan

Client events were a fantastic opportunity to gather product feedback and feature requests, as well as understand business needs.


Presenting our product to global executives. Immersive Data was well received.


Immersive Data is available as a demo at IBM's Client Innovation Centers globally. It consistently receives positive feedback from visitors.


Immersive Data has won several design awards, including:

  • 2019 Indigo Design Awards: UX, Interface, and Navigation, Gold
  • 2019 Indigo Design Awards: Interactive Design, Silver
  • 2018 Spark Design Awards, Bronze


LEARNINGS


Designing in augmented reality was challenging because we often found ourselves designing without precedent. Although general guidelines exist, there weren’t always design patterns for the things we wanted to do, especially when it came to manipulating AR objects. My biggest takeaway from this project was to fail fast and often: generate lots of different ideas and test them. Testing with the target user was important for validating the core product ideas, while testing with anyone and everyone willing to participate helped us make sure the gestures, interactions, and interface made sense. A lot of seemingly good ideas ended up on the cutting room floor, but the better we understood and met the mental model of mobile users, the faster we found our way to the right answers.