More data has been created in the past two years than in the entire prior history of the human race. Businesses are racing to collect, organize, and analyze all of this data, because the ability to do so yields meaningful insights that can propel them ahead of the competition. IBM Immersive Data is a mobile AR app designed to visualize data and expedite this process.
I worked on this project as a UX Designer, alongside two other designers and our design lead, and collaborated closely with developers (essential when working in a difficult-to-implement medium such as AR). When I joined the team, we began transitioning the existing product from the Microsoft HoloLens to Apple's ARKit.
When Immersive Data first began development, head-mounted displays such as the HoloLens were the only available option. However, due to the multi-thousand-dollar price tag of the hardware, the limited field of view, and graphics quality concerns, we decided to port our work to the newly available ARKit. The fact that everyone had a phone in their pocket removed a lot of the friction from adoption and made it more likely that our users would add the product to their data science toolkit. My tasks included figuring out how to translate the headset experience and interactions into the mobile AR app.
The data science process has several key stages: identifying a problem, collecting data, cleaning data, exploring data, transforming data, modeling data, and communicating the results. IBM Immersive Data targets two of these stages: exploring the data and communicating findings.
Data exploration is one of the most time-consuming stages of the process, as data scientists need to analyze and test many different relationships in order to generate insights. By visualizing data in 3D and viewing it from new angles and perspectives, we can help data scientists identify key patterns, relationships, and outliers in seconds – as opposed to the hours it would typically take.
When it comes to communicating the findings of their work, data scientists and business analysts often struggle to explain complex insights to non-technical decision-makers. Not everyone enjoys spreadsheets and graphs! Augmented reality provides an engaging and intuitive way to comprehend data, giving data scientists a better way to persuade stakeholders of the value of their data discoveries.
Before diving into any AR project, it’s important to consider what benefits, if any, augmented reality will provide over a traditional web or mobile experience. Through our domain research and user research studies with data scientists and business analysts, we came up with these key reasons why viewing data in AR is better than exploring it on a computer:
As we transitioned the product from headsets to tablets, we encountered a number of challenges. We went from manipulating 3D objects in 3D space (on the headset) to manipulating 3D objects on a 2D screen, which creates a kind of disconnect. We wanted the interactions to feel natural and intuitive, but during user testing our participants struggled. Much of this came down to the gap between the user behavior we expected and the behavior we actually observed. As it turned out, the mental model of a user wearing an AR headset is very different from that of a user holding a tablet.
The video below shows a user interacting with the head-mounted version of the product. Notice how the user approaches the visualization and walks around it. This was an important behavior within the product, as observing the data from different perspectives is key to uncovering patterns and relationships. We found that headset users were naturally drawn toward the visualization.
Unfortunately, when we tested the same use case on a tablet, users no longer walked around the visualization. Instead, they stood still and tried to pinch, zoom, and rotate the 3D object on the screen. In other words, they defaulted to the traditional phone and tablet behaviors they were accustomed to. Since actual user behavior and expectations deviated from our prior (headset-based) observations, we ended up adding touch-based rotation to the product. We still wanted to find ways to encourage behaviors like walking, but we recognized the need for alternative methods in order to meet users where they were.
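For readers curious what that looks like in practice, here is a minimal sketch of how touch-based rotation might be wired up with ARKit and SceneKit. This is not the product’s actual code: `sceneView` and `visualizationNode` are hypothetical names for the AR view and the node holding the 3D chart.

```swift
import UIKit
import SceneKit
import ARKit

// Sketch of a touch-based rotation interaction for an AR visualization.
class VisualizationViewController: UIViewController {
    var sceneView: ARSCNView!          // the AR camera view (hypothetical)
    var visualizationNode: SCNNode!    // the node holding the 3D chart (hypothetical)
    private var initialRotation: Float = 0

    override func viewDidLoad() {
        super.viewDidLoad()
        let rotationGesture = UIRotationGestureRecognizer(
            target: self, action: #selector(handleRotation(_:)))
        sceneView.addGestureRecognizer(rotationGesture)
    }

    @objc func handleRotation(_ gesture: UIRotationGestureRecognizer) {
        switch gesture.state {
        case .began:
            // Remember where the visualization started so the twist is relative.
            initialRotation = visualizationNode.eulerAngles.y
        case .changed:
            // A two-finger twist spins the visualization around its vertical
            // axis, mirroring the photo-rotation gesture users already know.
            visualizationNode.eulerAngles.y = initialRotation - Float(gesture.rotation)
        default:
            break
        }
    }
}
```

Mapping the gesture onto the object’s y-axis keeps the data upright while still letting a stationary user see it from every side.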
The video below shows the headset method for adjusting the placement of the visualization: clicking the magenta button to enter edit mode, moving the object, and clicking the button again to exit. The interaction was simple and easy to discover.
While this was intuitive for headset wearers, tablet users struggled during testing to discover and use this interaction. They naturally defaulted to traditional touchscreen behaviors such as drag & drop and pinch to zoom. It was an important lesson: bear in mind the existing mental model of mobile users when designing a mobile AR experience. As a result, we simplified the interactions to match users’ mental models (below).
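A drag interaction like this can be built from a standard pan gesture plus an ARKit hit test. The sketch below reuses the hypothetical `sceneView` and `visualizationNode` from the earlier example and re-anchors the node wherever the touch ray meets a detected plane, so placement behaves like familiar drag & drop.

```swift
import UIKit
import ARKit

// Sketch: drag the visualization across detected surfaces with one finger.
extension VisualizationViewController {
    func setUpDragGesture() {
        let pan = UIPanGestureRecognizer(target: self, action: #selector(handleDrag(_:)))
        sceneView.addGestureRecognizer(pan)
    }

    @objc func handleDrag(_ gesture: UIPanGestureRecognizer) {
        guard gesture.state == .changed else { return }
        let point = gesture.location(in: sceneView)
        // Cast a ray from the touch point into the scene and move the node
        // to the nearest point on an already-detected plane.
        if let result = sceneView.hitTest(point, types: .existingPlaneUsingExtent).first {
            let t = result.worldTransform.columns.3
            visualizationNode.position = SCNVector3(t.x, t.y, t.z)
        }
    }
}
```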
Designing navigation for the app required taking into account several different layers of navigation:
Let’s take a look at each of these aspects of navigation.
As we translated product elements from headset to tablet, we carefully questioned whether each should remain in 3D or be redesigned in 2D. A key factor in this decision was the need for precision, since data scientists require a high level of granularity in their work. Because an AR object’s z-position (its distance from the user) can shrink its touch target on a phone screen, users found it difficult to make fine adjustments. As a result, we relocated our adjustment tools to the 2D UI.
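To make the problem concrete: the farther a 3D control sits from the camera, the fewer screen points it occupies, so a target that is comfortable up close becomes impossibly small across the room. A hypothetical helper like the one below (SceneKit/ARKit again, with illustrative names) projects a node’s bounding sphere into screen space and flags it when it falls below Apple’s recommended 44pt minimum touch target.

```swift
import ARKit
import SceneKit

// Approximate a node's on-screen radius by projecting its bounding sphere.
func onScreenRadius(of node: SCNNode, in sceneView: ARSCNView) -> CGFloat {
    let (center, radius) = node.boundingSphere
    let worldCenter = node.convertPosition(center, to: nil)  // to world space
    // Project the sphere's center and a point one radius to its side.
    let edge = SCNVector3(worldCenter.x + radius, worldCenter.y, worldCenter.z)
    let p1 = sceneView.projectPoint(worldCenter)
    let p2 = sceneView.projectPoint(edge)
    return CGFloat(hypotf(p2.x - p1.x, p2.y - p1.y))
}

func isComfortableTouchTarget(_ node: SCNNode, in sceneView: ARSCNView) -> Bool {
    // 44pt is the minimum touch target in Apple's Human Interface Guidelines.
    return onScreenRadius(of: node, in: sceneView) * 2 >= 44
}
```

A check like this makes it easy to see why a 2D control panel, which always renders at a fixed size, was the more reliable home for fine-grained adjustments.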
Before a user can view a visualization, they must first make a selection from the menu. The original headset experience featured glass cubes that represented data visualizations. We liked the physical embodiment of the dataset, so we tried to keep this in the app.
But things became complicated as we dug into the user’s workflow. Data scientists showed us how their datasets were organized within specific projects. They were also interested in creating sub-visualizations that were derived from a prior view. We wanted to give users a way to save and organize all these different explorations.
Given the potential complexity, we decided it was best to keep the file system within the app similar to the project structure users already had in place within their desktop-based tools. As we worked on product integrations (more on that below), we mimicked the organizational structure.
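As a sketch of what that hierarchy might look like in code, projects contain datasets, and each dataset accumulates saved explorations (sub-visualizations derived from earlier views). All type and field names here are illustrative, not the product’s actual data model.

```swift
// Illustrative model mirroring the desktop project hierarchy.
struct Project {
    let name: String
    var datasets: [Dataset]
}

struct Dataset {
    let name: String
    var explorations: [Visualization]   // saved views of this dataset
}

struct Visualization {
    let title: String
    let derivedFrom: String?            // title of the parent view, if any
}
```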
When it came to designing the menu action buttons, we had two clear goals: keep the interface as minimal as possible, and differentiate between local and global actions. Since users could access the app on a phone, we were working with a limited “window” into the 3D world, so we didn’t want to clutter the viewing field.
Because of issues related to data privacy and permissions, it was important that the app had a secure login and authentication process. Other considerations that went into the design of this flow included:
The IBM Immersive Data app doesn’t function independently. In fact, this is what makes it so powerful — it works in conjunction with IBM’s robust data analysis tools, allowing users to take their existing datasets and launch them in augmented reality. My work included designing product integrations with software such as IBM Watson Studio and IBM Augmented Data Explorer to enable a seamless transition from the web experience to the mobile AR view.
Our product integrations let users scan a QR code and launch their AR visualization within seconds. See the process in action below.
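Under the hood, a QR handoff like this can be built on AVFoundation’s metadata scanning. The sketch below is a plausible reconstruction rather than the actual integration code: it assumes the QR code encodes a URL identifying the dataset, and `loadVisualization(from:)` is a hypothetical hook into the AR view.

```swift
import AVFoundation
import UIKit

// Sketch: scan a QR code and hand its payload off to the AR experience.
class QRScannerController: UIViewController, AVCaptureMetadataOutputObjectsDelegate {
    let session = AVCaptureSession()

    override func viewDidLoad() {
        super.viewDidLoad()
        guard let camera = AVCaptureDevice.default(for: .video),
              let input = try? AVCaptureDeviceInput(device: camera) else { return }
        session.addInput(input)

        let output = AVCaptureMetadataOutput()
        session.addOutput(output)
        output.setMetadataObjectsDelegate(self, queue: .main)
        output.metadataObjectTypes = [.qr]   // only look for QR codes
        session.startRunning()
    }

    func metadataOutput(_ output: AVCaptureMetadataOutput,
                        didOutput metadataObjects: [AVMetadataObject],
                        from connection: AVCaptureConnection) {
        guard let qr = metadataObjects.first as? AVMetadataMachineReadableCodeObject,
              let payload = qr.stringValue,
              let url = URL(string: payload) else { return }
        session.stopRunning()
        loadVisualization(from: url)   // hypothetical hook into the AR view
    }

    func loadVisualization(from url: URL) { /* fetch dataset and present AR scene */ }
}
```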
Immersive Data has been demoed at numerous events and well received by sponsor users. IBM partner Ari Kaplan, a sports analyst, used Immersive Data to dig into data on baseball players (if you’ve seen or read Moneyball, you’ll recognize this strategy of using statistical analysis to find undervalued players and build winning teams). The discoveries Ari made with our immersive visualizations helped him sign a multi-million-dollar player contract.
“When making recommendations to the MLB, I need to consider many factors including whether pitchers are left handed or right, if there’s a prior injury, ERA, salary expectations, and predicted performance. By comparing across all of these dimensions in a single visualization, I can make better, faster judgements.”
-Ari Kaplan
Immersive Data has won several design awards, including:
Designing in augmented reality was challenging because we often found ourselves designing without precedent. Although there are general guidelines available, there weren’t always design patterns for the things we wanted to do, especially when it came to manipulating AR objects. The biggest takeaway I had from this project was to fail fast and often. This meant generating lots of different ideas and testing them. Testing with the target user was important for ensuring the core product ideas made sense, but testing with anyone and everyone willing to participate was helpful in making sure the gestures, interactions, and interface made sense. A lot of seemingly good ideas ended up on the cutting room floor, but the more we understood and met the mental model of mobile users, the faster we found our way to the right answers.