JAUNT VR CONSUMER APP : CASE STUDY


Project Temple

Reimagining the VR video browsing experience


 
 

THE JAUNT VR APP PROMISES

A HIGH QUALITY AND EXPANSIVE LIBRARY OF VR CONTENT PLAYABLE ACROSS PLATFORMS TO CONSUMERS AROUND THE WORLD

 
 

From Zion to Temple

Initially, the app was known as “Zion” and served to showcase both the Jaunt One cinematic VR camera and the Jaunt Cloud Services stitching and publishing solution.

As VR’s popularity grew, so did the ever-expanding library of 360 videos. As a product, “Zion” faced performance and architecture constraints, an input system that did not take full advantage of the new VR controllers, and an unscalable design for organizing and discovering the ever-growing library of content.



JAUNT VR PROJECT TEMPLE

RESEARCH & DISCOVERY


 
 

JAUNT VR PROJECT TEMPLE

BROWSE


Use Depth to Give Context

One of the problems unique to VR is maintaining user context. A browse pattern that uses depth can help users understand where they are, how they got there, how to get back and where to go from here.


 

JAUNT VR PROJECT TEMPLE

BROWSE BOARD SYSTEM


 

Server-Side Playlists

Publishing Managers have the ability to feature content and configure playlists from the Jaunt XR Platform’s Media Manager.

 

Playlist Types

Board, Channel & Title playlists can have additional playlist types nested within them. This design affords flexibility and scalability.
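The nesting described above can be sketched as a small recursive data model. The type and field names here are assumptions for illustration, not the production schema:

```typescript
// Hypothetical model of the nested playlist system: any playlist type
// can contain further playlists of any type.
type PlaylistType = "board" | "channel" | "title";

interface Playlist {
  id: string;
  type: PlaylistType;
  name: string;
  titles: string[];     // title IDs surfaced directly by this playlist
  children: Playlist[]; // nested playlists of any type
}

// Flatten a playlist tree into the ordered list of title IDs a client would render.
function collectTitles(p: Playlist): string[] {
  return [...p.titles, ...p.children.flatMap((c) => collectTitles(c))];
}
```

Because the structure is recursive, a publishing manager can nest a title playlist inside a channel inside a board without any client-side changes.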

 

JAUNT VR PROJECT TEMPLE

BRANDED CHANNELS


 

Branded Immersive Environments

Branded channels are dedicated areas that contain a channel’s content library within a branded immersive environment, providing a unique experience that can be customized through content and featuring.

 

OLYMPIC CHANNEL 360 ENVIRONMENT TOUR

 

JAUNT VR PROJECT TEMPLE

TITLE PREVIEW


 

Previews for User Discovery

Title Previews are used to promote user discovery. Each preview state provides additional types of information and interactivity to help quickly move the user from exploring to discovering titles they may like.

 

Title Preview States

State 1: The flat thumbnail creates the initial visual representation of a title in the board system.

State 2: A hover state that quickly lets the user discover additional information about the title, such as a short meta description and a look into an image from the title.

State 3: A selected state with the full description and additional meta information. “Play” and “download” prompts give the user actionable next steps.

 

Title Preview Assets

Flat Thumbnails provide a “poster” image as a standard 16:9 or (for featuring) a large 1:1 format.

Stereo Parallax Thumbnails display (on hover) a larger stereoscopic window that allows the user to peer into an immersive frame chosen from the title.

Stereo Preview Spheres let the user interact and “grab” a full spherical image from the title preview and view it from within the sphere. A comfortable transition allows the user to go from holding the smaller sphere at their controller to viewing it from inside a full spherical stereo environment.

 
 

JAUNT VR PROJECT TEMPLE

TITLE DOWNLOAD


 

Download Manager

The download system provides high-quality downloads to the client through a queue-based system. When the user initiates a download, it spawns a “Downloads” playlist board in the “Home” area that is responsive to the number of titles in the playlist. A visual system gives clear user feedback, surfacing the status and management of a downloaded title from the title preview panel, with primary and secondary controls provided as needed. Download status is also displayed on the title thumbnail wherever it appears within the system.
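As a rough sketch of the queue behavior described above (the class and field names are illustrative, not the production API):

```typescript
// Hypothetical download queue: titles wait in order, and status is
// queryable so it can be shown on any thumbnail in the system.
type DownloadStatus = "queued" | "downloading" | "complete" | "error";

interface DownloadItem {
  titleId: string;
  status: DownloadStatus;
  progress: number; // 0..1
}

class DownloadQueue {
  private items: DownloadItem[] = [];

  enqueue(titleId: string): void {
    // Ignore duplicate requests for a title already in the queue.
    if (!this.items.some((i) => i.titleId === titleId)) {
      this.items.push({ titleId, status: "queued", progress: 0 });
    }
  }

  // The next title waiting to download, in the order it was queued.
  next(): DownloadItem | undefined {
    return this.items.find((i) => i.status === "queued");
  }

  // Status shown on the "Downloads" board and on every thumbnail.
  statusOf(titleId: string): DownloadStatus | undefined {
    return this.items.find((i) => i.titleId === titleId)?.status;
  }
}
```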

 
 

Download Flow

The title preview panel provides user control and feedback for downloads as well as an error-checking and notification system to remind the user when a title is downloaded or when network or download errors occur.

 
 

JAUNT VR PROJECT TEMPLE

PLAYER


 

The Jaunt Player

The core technology behind the Jaunt Player utilizes the XRP Player Engine SDK, providing high-quality playback of content-adaptive streams and monolithic format downloads across platforms.

A unique user experience is integrated within the player. A spherical object appears in depth containing the player controls, language and subtitle access, and scrubbing transport features. The player sphere lets the user see nearly the full breadth of a 360 video within their FOV. While scrubbing, the player sphere displays the video within it in stereoscopic depth, and only when the scrubbing controls are released does the surrounding video environment sphere update. This avoids potential user discomfort or disorientation.

A transition point for entering, exiting and pausing a video title is part of the unique player design. Transitions provide affordances for maintaining user context as well as a smooth path in and out of a stereoscopic 360 video environment.

 

VIEW THE JAUNT PLAYER PROTOTYPE

 

JAUNT VR PROJECT TEMPLE

FINAL OVERVIEW


The Final Evolution

The Temple product launch brought all the pieces together after extensive iteration, prototyping and testing. Initially launched in Q4 2017, the new Jaunt VR consumer app “Temple” was featured at the Windows Mixed Reality platform launch after a month of testing and feedback with the WMR team. “Temple” was released on the Oculus, Vive and Sony PlayStation VR platforms in Q1 2018.

 

JAUNT VR PRODUCT DESIGN EVOLUTION VIDEO

 

 

TEAM CONTRIBUTORS

Additional design contributions by Ethan Miller, Matthew Luther.

Unity & technical prototyping by Gregory Lutter, Matthew Luther.

Concept Illustrations by Javier Lazlo.


AT&T 5G - 3D HOLOGRAMS FOR CONSUMER AR : CASE STUDY


3D Holograms in AR

The AT&T 5G Pilot Product Launch

 
 

JAUNT XR CAST PROMISES

AN EASY-TO-USE, HIGH FIDELITY, TRANSPORTABLE SOFTWARE & HARDWARE SOLUTION FOR ONSITE PERFORMANCE CAPTURE AND LIVE/STREAMING DELIVERY TO AR APPS.

 

JAUNT XR CAST TECHNOLOGY


Jaunt XRCast is a volumetric capture technology that records human performance in 3D for playback or live streaming in AR applications. It comprises a portable multi-sensor array, or “capture stage,” and features a simple calibration and capture application for on-site events. The capture technology processes the depth (voxel) and RGB information into a streamable format that includes meshes and UV texture maps. The quality and density of the texture and mesh are optimized in real time for the platform, device and network conditions.
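The real-time quality selection described above might look something like the following sketch. The tiers, thresholds and triangle counts are invented for illustration; the actual optimization logic is not public:

```typescript
// Hypothetical per-stream quality selection: fewer triangles on
// low-end devices or constrained networks, more when conditions allow.
interface StreamFrame {
  mesh: ArrayBuffer;    // triangle mesh for this frame
  texture: ArrayBuffer; // UV texture map for this frame
}

function targetMeshDensity(
  bandwidthMbps: number,
  deviceTier: "low" | "high",
): number {
  const base = deviceTier === "high" ? 50_000 : 20_000; // assumed triangle budgets
  // Halve the budget on constrained networks (threshold is illustrative).
  return bandwidthMbps < 10 ? Math.floor(base / 2) : base;
}
```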


 
 

How do we elevate Jaunt’s latest product, XR Cast, and show its creative potential?

 
 

PILOT PROJECT

Demonstrating viable use cases for XRCast as a product brought Jaunt into a partnership with AT&T. AT&T’s upcoming 5G network promises consumers low latency and very high data throughput. As lead designer, I was tasked with demonstrating to AT&T how XRCast could give consumers a look at what is possible with 5G.

“How do we demonstrate
the kinds of experiences
possible through 5G?”


“How can we make
low-latency and high
data throughput
meaningful to the
consumer?” -AT&T
 

JAUNT XR CAST

TAILGATE LAUNCH EVENT

The partnership between AT&T, Nokia and Jaunt would mark XRCast’s first product release, taking the form of an event activation meant to highlight the upcoming features of AT&T’s 5G network. Held at the Atlanta Falcons vs. Dallas Cowboys NFL Game Day Tailgate village, the event would provide an XR Cast demonstration that utilized a 5G network to deliver an AR experience for fans to discover.

The pitch - “There’s no better way to show off your team spirit than with your very own touchdown dance - in augmented reality.”

The event would feature two volumetric stages set in the tailgating fan area. Fans would be able to create a 3D hologram of their end zone dance that is both a memorable and shareable experience.

 

JAUNT XR CAST

DESIGN GOALS

The overarching message from the partnership with AT&T was to increase awareness of the types of experiences possible with 5G. To amplify this message, it was clear that providing a social sharing aspect was a key goal. At its core, XR Cast is designed to capture human performance in a unique way, which would also be an important focus of the design. Lastly, delivering an AR experience today requires solving many technical gaps that still exist between the technology and consumers.

 

JAUNT XR CAST

CORE FEATURES

To make this product launch event successful for everyone, it was absolutely necessary to create an end-to-end system focused on reliability, simplicity and performance in a unique network condition. A core feature of the XR Cast product design is to be simple, flexible and reliable at scale. A key success metric for this demonstration was social sharing by the consumer.

 

JAUNT XR CAST

SYSTEM ARCHITECTURE

The system architecture provides several modular components that can be customized and managed for multiple use cases from the XR Platform. Event managers can configure an event or distribution channel, and the system essentially creates a closed loop that integrates all the individual components. These components are tied together by a data architecture built around five main IDs: User ID, Capture ID, Marker ID, Event ID (distribution channel) and Share ID (social).
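A minimal sketch of how the five IDs might tie together in a single record (the field names and lookup are assumptions for illustration, not the production data model):

```typescript
// Hypothetical record linking the five IDs the architecture manages.
interface CaptureRecord {
  userId: string;    // participant who signed up at the event
  captureId: string; // one volumetric capture session
  markerId: string;  // physical AR marker bound to the capture
  eventId: string;   // event / distribution channel
  shareId?: string;  // created when the capture is shared socially
}

// Resolve a scanned marker back to its capture within a given event.
function findByMarker(
  records: CaptureRecord[],
  eventId: string,
  markerId: string,
): CaptureRecord | undefined {
  return records.find((r) => r.eventId === eventId && r.markerId === markerId);
}
```

Scoping the lookup to an event is what lets markers be reused across separate activations without collisions.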

 

JAUNT XR CAST

USER JOURNEY

This user journey maps out the entire experience end-to-end. Storyboarding helps communicate and illustrate who each of our users are and how their goals need to be met. The experience itself is further defined through user personas, journey maps and initial user flows for each of the components in the design: user sign-up, the capture operator app, the AR kiosk demo experience and the sharing application (which delivers users’ captures to their own devices for social sharing after the event).

 

 

JAUNT XR CAST

USER PERSONAS

Throughout the user journey, several user types are identified, such as Brand Ambassadors, the Capture Stage Operator and the Participants who attend the event. Each user type is assigned a specific role with specific goals they want to achieve.

 
 

JAUNT XR CAST

JOURNEY MAPS

Journey maps provide a deeper understanding of the user’s experience, training and comfort in accomplishing their goals. Pain points and touch points are identified that should be taken into consideration in the design.

 


 
 

JAUNT XR CAST

WHY AN AR MARKER?

Delivering AR experiences to consumer devices presents a few problems that need to be overcome. Many user devices do not yet support AR and its computer vision frameworks. Delivering a specific AR experience through a dedicated app requires additional steps the user must take to locate it (on a store), download it and install it.

For volumetric captures of humans, each marker is unique and is tied to a specific user ID and their capture ID.

Many users find that moving, sliding and rotating the physical marker is intuitive and provides a unique ability to interact with their capture in their own physical space. All early indications during testing and demo events showed users playfully interacting in fun and completely intuitive ways, such as putting multiple captures together to dance with each other or holding the markers up to their faces.

 
 

JAUNT XR CAST

AR MARKER SYSTEM

The AR marker, or to be specific, the WebAR marker system is supported by most current user devices and works across the widest range of browsers and platforms. Delivering AR experiences (to a first-time AR user) in a browser provides many advantages over dedicated apps constrained to only the latest consumer devices. It is dead simple: click a URL, place the marker in the camera view, view in AR. No flags, no downloads, few blacklisted devices. This solution overcomes many technology gaps in a way that is quite simple for the user.
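The “click a URL, view in AR” entry point implies the shared URL carries the identifiers the browser app needs. A hypothetical sketch of that entry point, with an invented host and invented query parameter names:

```typescript
// Hypothetical parser for a unique share URL that carries the IDs the
// browser app needs, so no app install or account is required.
function parseShareUrl(
  url: string,
): { captureId: string; markerId: string } | null {
  const u = new URL(url);
  const captureId = u.searchParams.get("capture"); // assumed parameter name
  const markerId = u.searchParams.get("marker");   // assumed parameter name
  return captureId && markerId ? { captureId, markerId } : null;
}
```

Everything after the click (loading the capture, tracking the marker) can then key off these two IDs without any further user input.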

 
 

JAUNT XR CAST

AR DEMO KIOSK

The WebAR marker system supports the widest range of current user devices, browsers and platforms. The AR experience at the demo kiosk area utilizes a unique AR table surface design that provides a micro-texture for surface tracking stability as well as a number of detectable “glyphs” to keep the entire AR scene calibrated. When AR markers are placed on the surface and recognized by the camera, the AR Demo Kiosk App spawns and tracks a specific capture for each unique marker recognized.

 
 

JAUNT XR CAST

MULTI CAPTURE PLAYBACK

The provided tablet devices run the Demo Kiosk App and let the user view animated scene objects such as a floating 3D blimp and miniature capture stages at each “Live” end zone. Participant captures play back simultaneously for each of the unique AR markers in view. Pre-placed captures, calibrated to the scene via “glyph” detection, are also viewable; these feature the Atlanta Falcons cheerleaders, mascot and football stars. The user simply picks up a device and begins viewing the entire AR scene and interacting with the captures.

 
 

JAUNT XR CAST

DAY OF EVENT OVERVIEW

Watch our users’ journey through the entire live event: from sign-up to capture, and then, moments later, seeing their own 3D holograms for the first time!

 

EVENT DAY USER INTERACTION

 

JAUNT XR CAST

SOCIAL SHARING APP

Post-event, delivering each participant’s own 3D captured performance to experience on their own devices was the cornerstone of creating a shareable experience. The social sharing app design focused on reducing as many user pain points as possible, including eliminating the need for an app download and considering device support and first-time user expectations. Because the app experience lives in the browser, a user simply receives a text/email notification at the user ID they provided on the day of the event. The notification provides a simple clickable URL unique to their performance capture.

The web app is designed for viewing the capture in both WebGL and WebAR environments, further increasing support for older user devices and browsers. Additionally, the user can change the theme of the 3D environment their capture appears in, such as an Atlanta Falcons theme or a Dallas Cowboys theme. The main goals of the browser app are to view the capture, customize it, create a shareable asset from the experience and then share it on social channels. AR viewing mode is secondary to maximizing the number of users who can access and view a 3D capture, and it also requires a marker to track the capture in world space. A lost-marker flow that lets the user save and print a marker image was an additional consideration in the design.

 
 

VIEW FACEBOOK DEEP LINKING TEST PROTOTYPE


JAUNT XR CAST

SHARING APP FLOWS

Early wireframes and flows for the “sharing” app begin to define and shape the features that allow users to reach specific goals. Here I have begun to define features such as viewing the user’s captures in two modes, WebGL (supported by most devices and browsers) and WebAR, along with additional features: customizable themes and the beginnings of the social sharing flow.

 
 

“PLAY” CLICK THROUGH WIREFRAME PROTOTYPE BELOW

 

JAUNT XR CAST

MAXIMIZE SCREEN AREA

Through wireframe prototyping and testing, design decisions were made that focus on simplifying the user interactions, creating the shareable asset (snapshot) and maximizing the usable screen real estate for viewing and interacting in WebGL and WebAR. Snapshots are created from a specific frame of the capture playback and are generated on the user’s device in the format supported by each social channel.

 
 

 

JAUNT XR CAST

SHARING APP IN BROWSER

The sharing app allows for a simple in-browser experience for both mobile and desktop interactions (such as rotation and scale) and includes generous tap/click targets. The accordion UI “shelf” is optimized to always maximize the usable screen area not already occupied by lower navigation menus or header bars.

 
 

VIEW SOCIAL SHARING APP IN BROWSER


 
 

TEAM CONTRIBUTORS

Additional design contributions by Alejandro Delgado.

Unity & technical prototyping by Gregory Lutter.

Front end development & testing by Neil Sotirakopoulos, Sarah Parks.


AXON - INTERACTIVE EXPERIENTIAL TRAINING: CASE STUDY

 

EXPERIENTIAL TRAINING

VR TRAINING FOR GLOBAL LAW ENFORCEMENT

The use case for training in VR has proven successful over time in every metric used to measure the medium. To take advantage of VR, I believe in an experiential approach to training. By putting the user in life-like situations and scenarios that allow for interactivity, branching narratives and changes in point of view, the user has an opportunity to learn, understand and even empathize at a much greater level. The opportunity to “relive” the experience from another character’s point of view deepens the user’s learning and can affect their own decisions.

Law enforcement training often requires that each “trainee” be certified as having been administered required information and measured in their situational response to it (also known as scenario-based training). This type of training is a complement to “tactical” training, which varies widely across law enforcement agencies and regulations.

 
 

BUILDING A VR TRAINING PLATFORM

Axon provides training and certification for its Taser brand and BodyCam products to law enforcement officers and agents around the world. It employs highly experienced and trained Senior Master Instructors to develop and administer the training programs to Master Instructors, who then train the regional Field Instructors. To provide training at a global scale, Axon has initiated an effort to create a robust digital training platform, Axon Academy, that can be accessed by regional instructors in the field as well as their trainees.

Jaunt and Axon partnered together to develop a phase 1 product for VR Experiential Training that would provide platform integrations, user training data, analytics and the ability to publish a library of interactive VR experiences (utilizing the Jaunt XR Platform) to a dedicated Axon VR Training app deployed on Oculus Go headsets.


 

The VR Training App

DESIGNING FOR THE OCULUS GO

The VR Training App would be specifically designed for mobile VR platforms. The Oculus Go provides cost-to-performance, mobility, reliability and ease-of-use benefits for the Instructors in the field, who are primarily new to VR. Axon provided very clear guidance that getting buy-in from Master Instructors for VR Experiential Training was critical to the success of this product/service design.

 

Who Are We Designing For?

THE TRAINING PYRAMID

The Axon training structure scales as a pyramid, creating an ever-expanding network of Axon Certified Field Instructors. Axon Certified Regional Instructors can expect to instruct 200K+ trainees throughout the world. They are deployed on a scheduled basis with local and regional law enforcement departments.

The Axon Certified Regional Instructors administer pre-planned training exercises that bring scenario-based training and certification modules to the trainees for a variety of certifications required for the trainee force to carry and use Axon devices, many of which require re-certification after a period of time.

The Axon Certified Regional Instructor must also maintain certification throughout the year by attending required certification training events provided by the Axon Master Instructor courses.

Axon Master Instructor courses are created and administered by the top level subject matter experts at Axon - Senior Master Instructors.

 

The Training Environment

SCENARIO BASED TRAINING IN THE FIELD

Scenario-based training for law enforcement is a series of modules designed to put the trainee into as close to a real-life situation as possible. The subject matter experts interviewed explained that their methods can include stunt actors, pyrotechnics and the physical realities a law enforcement officer/agent might encounter in a real-life scenario. Scenarios are typically set up with team instruction and then move out into a separate physical space. After the scenario has played out, the trainees must document the details, their actions and the outcome as they perceived it. The instructors evaluate their performance as well as their documentation and provide direct feedback to the trainees. The VR Experiential Training design decisions were deeply informed by the Axon subject matter experts and are intended to replicate the already established cadence, intensity and flow of an Axon training day.

 

What Problems Are We Trying To Solve?

MOBILITY, EASE-OF-USE, PERFORMANCE DATA

The unique Axon training environment presents a number of design problems to solve. The introduction of unfamiliar hardware, and the need to interact with it, had to be addressed for both the Instructors and Trainees, with simplicity and ease-of-use as the goals for successful adoption by both.

  1. Instructors will need to be able to maintain and quickly set up the Oculus Go devices and the appropriate training content in the field.

  2. Training sites do not always have network connections or internet services thus putting an emphasis on mobility and offline capabilities.

  3. Data for each unique user session (such as date, location, user ID, Instructor ID, session performance data and training module version) must be maintained on the Axon Academy platform for certification, legal and analytical purposes.

  4. All Instructor and Trainee interactions must follow a clear and easy-to-discover mental model of how to achieve tasks and goals.
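The per-session record from point 3, together with the offline requirement from point 2, could be sketched as follows. The field names and the sync flag are illustrative assumptions, not the Axon Academy schema:

```typescript
// Hypothetical offline-first session record: sessions captured at a
// disconnected training site are held until they can be uploaded.
interface TrainingSession {
  date: string;          // ISO date of the session
  location: string;
  userId: string;        // trainee
  instructorId: string;
  moduleVersion: string; // training module version, kept for certification records
  score: number;         // session performance data
  synced: boolean;       // false until uploaded to the platform
}

// Sessions recorded offline that still need to reach the platform.
function pendingUpload(sessions: TrainingSession[]): TrainingSession[] {
  return sessions.filter((s) => !s.synced);
}
```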

 

DESIGNING USER FLOWS AND JOURNEY MAPS

Defining user flows and journey maps helps flesh out possible solutions and refine them based upon pain points and touch points for each user.

 
 

The VR Training Modules

Scenario Based Training

The Axon VR Training App environment provides for a scalable and customizable training “library” to be maintained by the Instructor through the Instructor Mode.

The Training Mode feature is designed around a minimal UI and simple interaction patterns that can be quickly discovered by the Trainee without Instructor help or lengthy tutorials, allowing each Trainee to focus on the task at hand and successfully complete the module(s).

AXON VR TRAINING APP ENVIRONMENT

 

Interactivity In VR Training

Decision Based Branching Narratives & Multiple Points Of View

The VR Training Modules live within a 360 environment and include multiple VR-video-based interactive scenarios with decision-based responses that trigger branching in the narrative. Each decision the user makes produces an immediate narrative change, and the results of those decisions can be reviewed through the perspective of other characters within the narrative to achieve a “relive it through the eyes of…” experience.
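The decision-based branching described above can be sketched as a recursive node structure. This is a simplified illustration, not the production module format:

```typescript
// Hypothetical branching-narrative node: each 360 video segment offers
// decisions, and each decision leads to another segment (or ends the module).
interface ScenarioNode {
  videoId: string;     // 360 video segment to play
  perspective: string; // whose point of view this segment shows
  decisions: { label: string; next: ScenarioNode | null }[]; // null = module ends
}

// Walk a sequence of decision indices from a starting node to the
// resulting scene, or null if the path terminates or is invalid.
function resolve(start: ScenarioNode, choices: number[]): ScenarioNode | null {
  let node: ScenarioNode | null = start;
  for (const c of choices) {
    if (!node || !node.decisions[c]) return null;
    node = node.decisions[c].next;
  }
  return node;
}
```

Replaying the same choice sequence against a tree authored from another character’s perspective is one way the “relive through the eyes of…” review could reuse this structure.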

 

Interactive Templates

While each interactive module can contain any number of perspectives and scenarios, a foundational template was designed and tested to provide a clear flow for how each user achieves completion of a module. This provides accurate scoring from module to module, independent of the content or the number of narrative decision branches, and allows user session data trajectories to be captured in a scalable way.

 

Decision Based Scenarios

Branching Narrative Interactivity

I’ve added a short example of the interactions I designed for this schizophrenia scenario, based upon the Subject Matter Experts’ input and the behavioral models they defined.

JAUNT VR 360 WEB & NATIVE MOBILE APPS: UX DISCOVERY 2.0

 

DESIGN EVOLUTION 2.0

WEB360 & MOBILE APP 360/VR 2.0

In 2016 the Jaunt mission statement was to become the “Netflix of VR”.

The Jaunt library of cinematic VR content had grown considerably and I was asked to propose a V2.0 redesign of the consumer web and native mobile app user experience.

MOBILE APP 360/VR 2.0

The Jaunt VR native mobile apps (Android/iOS) were originally envisioned as an experimental “3D” app primarily focused on exploration, browsing and playing of a selection of cinematic VR experiences. 

Google Cardboard and Samsung Gear VR held an early lead in consumer interest in mobile VR due to their accessibility and low cost.

The Jaunt VR 1.0 design had capitalized on this new consumer enthusiasm by supporting a fully immersive experience, including gaze-based navigation and in-headset browsing for Samsung Gear VR and Google Cardboard.

As mobile VR matured, it became evident that mental models for gaze-based interactions were not ideal, and a clear split in VR platforms had emerged: devices with a controller and those without.

 

THE SWITCH TO 2D BROWSING

A new design would replace gaze-based interactions with a native mobile touch browsing experience while supporting 360 & VR viewing in the player view. Improving the browsing experience for traversing a large library of content would improve a few key metrics, such as video completion rate and videos watched per user session.

GRID PATTERNS & WIREFRAMES

• Exploring grid patterns for traversing a large content library.

• The browser experience across desktop and mobile features content positioning and greater flexibility.

• A clear visual system and hierarchy is defined to promote exploration and discovery by the user.

 
 

IMPROVED SEARCH

DISCOVERY & SEARCH

While working within the constraints of the current data architecture, improving user discoverability through search results was a feature focus. When users try a search and get no results, there’s a high risk of abandonment. But paying attention to design details can turn a potential “No results” into a meaningful result and an opportunity for content discovery.

 

CONTENT FEATURING

DRIVING DISCOVERY & EXPLORATION

Featuring and content placement by curation are controlled by a publishing manager. Navigation, categories and content playlists can be added, removed or changed based upon product and user insights. A clear visual hierarchy within a design system allows for discovery-based decisions and browsing optimized for the cross-platform native experience and the web360 experience.

 
 
 

IMMERSIVE 3D ENVIRONMENTS


DESIGN CONCEPTS

The creative process, in partnership with brands and studios such as FX Networks, Kia and Olympic Channel, begins with understanding what feeling and message one wants to create within the 3D space. How the space will be used and how it will look and feel is best expressed at the early stage through concepts and explorations that bring along everyone involved. In addition, identifying the scope and types of asset designs allows for managing the entire 3D creation pipeline.

I design these environments from concept to completion while managing the 2D and 3D asset production pipeline both internally and externally. In partnership with a concept illustrator, many creative decisions can be iterated on quickly and easily to respond to brand partner feedback before beginning production of the environment.


 

 

3D IMMERSIVE ENVIRONMENTS IN VR

DESIGNING FOR VIRTUAL SPACES

Creating immersive environments requires a deep understanding of design for 3D spaces. Scale, depth and lighting are all used to help set the stage for an immersive experience that can convey many things that are meant to be felt not just seen. I’ve highlighted a few unique 3D environments that were created as spaces for partner brands, featured experiences and content “channels” for the Jaunt VR app.

 
 

OLYMPIC CHANNEL

2018 WINTER OLYMPICS

 

OLYMPIC CHANNEL VR ENVIRONMENT

 

KIA STINGER

VIRTUAL TEST DRIVES

 

KIA STINGER VR ENVIRONMENT

 

AXON TASER

EXPERIENTIAL TRAINING

 

AXON TASER TRAINING VR ENVIRONMENT

 

TASTEMADE

VR BRANDED CHANNEL

 

TASTEMADE VR ENVIRONMENT

 

TEAM CONTRIBUTORS

Additional contributions to:

Modeling, texturing, lighting and rendering by Saga Alayyoubi & Drive Studios LA.

Concept Illustrations by Javier Lazlo.