Interaction Designer


Stage AR

Augmented Reality Design

Product Design

Motion Design

 

Stage AR

 

Challenge

Abstraction creates emotional distance in a presentation

Outcome

We designed an immersive, collaborative presentation tool for the HoloLens.

Personal Contributions

Object Library UI/UX, Animation, Prototype, Sound Design, Narration

Timeline

Twelve weeks

Process

Research > Unity Prototyping > Testing > Final Concept Video

 

Objective

Design a compelling product, idea, or solution that utilizes Microsoft’s HoloLens.

Audience

Users of all levels of technical proficiency who want to create, edit, and present their story for maximum emotional impact, all within the HoloLens.

Team

Oscar Marczynski, Kelsy Aschenbeck, Spencer Weglin, Henry Davis

 
 

 

AR in Movies

Hollywood has long been an inspiration when designing products for new technology. In our popular media scan of AR gestures, two scenes stood out.

 
 
 
Minority Report: Tom Cruise described challenges in filming due to the physical strain of having to hold his arms in place for multiple takes

Blade Runner 2049: In this scene, a "dream designer" uses a physical controller to make detailed, real-time manipulations to holographic assets

 
 

Minority Report is commonly referenced in AR interface design. These images were revolutionary and remain visually stunning. However, the user experience leaves room for improvement: holding the arms perpendicular to the body to initiate every interaction causes tremendous strain for the user.

This scene from Blade Runner 2049 improves on the Minority Report user experience, in that the "dream designer" uses an ergonomic controller to manipulate her assets. However, the truly revolutionary aspect of AR technology is the move away from reliance on physical artifacts and controllers.

 
 
 

 

AR in reality: Microsoft HoloLens

The University of Washington made two HoloLens units available for our team to try out. We focused on becoming familiar with the universal gestures by performing simple tasks, such as recalling and reviewing pictures in the photo library and making simple scaling manipulations.

 
 
 
Microsoft HoloLens' "bloom" gesture

The "bloom" gesture is required to initiate any interaction with the HoloLens UI. I found that after the initial excitement of recreating some of my favorite sci-fi scenes wore off, it was frustrating to rely on gestural cues to get things done. This is partly due to limitations in gaze tracking: the HoloLens forces users to turn their whole head toward an object they'd like to select, so it feels as if the forehead, rather than the hands, is in charge of selecting assets and buttons. I found myself using Cortana, Microsoft's voice-activated digital assistant, to get to the screens I needed, and more and more I simply said "select" while facing a button rather than using a gesture.

 
 

 

Rapid prototyping in AR

Our first Unity prototype

Sketches for object library and asset selection flows

 
 
 

Key findings: Life-size > Hi-Res

There were some unavoidable roadblocks while testing in Unity. The more realistic a 3D asset was designed to be, the more unusable the interaction became, due to polygon-count-related lag in Unity's holographic emulator.

However, through rapid iteration, we learned that simplifying assets actually led to a clearer UI and greater user satisfaction.

In summary, the impact of seeing an asset at its real-life scale far outweighs the desire to see that asset in real-life resolution.

Design Principles

  • Let users voice-initiate tasks

  • Allow creation of scenes/slides in any sized room

  • Utilize Microsoft's extensive object library to present scenes/collections of related objects upon user search

  • Allow users to edit their presentations in AR, on the spot (stay away from holograms that imitate flat screens)

  • Make it a collaborative tool

 
 

 

 

Final Features

 
 
 

Voice-Activated Placement

After defining a working area with an artboard, users can simply ask Cortana, Microsoft’s voice assistant, to place an object. From there, users can select and edit it.

 
 
 
 

Suggested Objects

Once a user requests an object, related objects appear on a dockable, movable shelf, allowing the user to build out whole scenes in the time it would take to design a single slide in a traditional presentation platform.

 
 
 
 

Storyboard Mode

Once scenes are created, users can drag and drop them in any order, giving an instant visual overview of their presentation.

 
 
 
 

Interactive Mode

Users can assign assets as trigger points, making presentations uniquely dynamic and adaptive to their audience.

 
 

 

Final Prototype Video

Stage AR | Maximum audience impact

In our concept video, we feature two designers who are building a presentation to advocate for safer bicycle lanes in their neighborhood. Walking through a life-sized scene of an easily avoidable accident allows the viewing audience to truly internalize the presenters' talking points, leaving a lasting emotional impact.