Vans – THIS TIES US TOGETHER AR Experience

Helping Vans Scale AR Retail Windows

This holiday season, Vans wanted to extend their “THIS TIES US TOGETHER” campaign into their brick-and-mortar stores across North America by creating an exclusive AR-driven experience in each of their 500 storefronts.

The WebAR experience blends physical installation and creative technology into an innovative retail window-shopping experience.

The Experience

Users scan the QR code on the Vans retail window display to unlock a vibrantly animated 3D scene. They then choose one of three “virtual gifts,” each inspired by a unique story: American indie pop band Muna, queer art and music collective Bottom Feeders, and a group of artists and puppeteers at the historic Bob Baker Marionette Theater in LA.

After selecting a gift, users are prompted to “scratch to unwrap” it. The gift joyfully explodes and takes over the entire retail window, its wrapping paper bursting off in a flurry of dynamic 3D elements that transform into their selected family’s style.

The experience closes with a final message, “Happy Holidays From The Vans Family, To Every Kind of Family,” and surprises users with the ability to interact and play with the now-transformed space.

In the spirit of Vans’ THIS TIES US TOGETHER campaign, the experience celebrates bringing groups of people together in a deep and meaningful way, growing beyond a group of friends and into a true family.

Design Approach

Pulling inspiration from the larger campaign, we stayed true to its established visual language while taking full advantage of three-dimensional space to create the most engaging and immersive experience possible.

We used animation to express each family’s unique vibe while staying true to the larger campaign: Muna’s scene was explosive and fun, Bottom Feeders’ scene took inspiration from their DIY nature, and Bob Baker’s was flowing and elegant.

A 270-degree Experience 

Creating an experience for 500+ storefront locations requires a scalable approach. One of the biggest challenges is the variability of the retail window configurations, particularly the space in front of the windows, which ranges from a large open area in a mall setting to the narrowest of sidewalks in a busy city. Our visual design process had to account for this limitation throughout.

We created a 270-degree curved experience that surrounds the user and encourages them to virtually look around the experience, while keeping them in a fixed location. Once all the pieces were in place, we continuously iterated and adapted the 3D scenes to create the most user-friendly experience.
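The arc layout described above can be sketched as a small positioning helper. This is a minimal illustration, not the actual Vans build: the radius, element counts, and coordinate conventions are assumptions (standard WebGL camera space, where -Z points forward from the viewer).

```javascript
// Sketch: distribute N scene elements along a 270-degree arc centered on
// the viewer, so the scene surrounds them while they stay in one spot.
// Radius and conventions are illustrative assumptions, not production values.
const ARC_DEGREES = 270;
const RADIUS = 3; // hypothetical distance (metres) from the viewer

function arcPositions(count, arcDegrees = ARC_DEGREES, radius = RADIUS) {
  const arc = (arcDegrees * Math.PI) / 180;
  const start = -arc / 2; // centre the arc directly in front of the viewer
  return Array.from({ length: count }, (_, i) => {
    // Spread elements evenly; a single element sits dead centre.
    const t = count === 1 ? 0.5 : i / (count - 1);
    const angle = start + t * arc;
    return {
      x: radius * Math.sin(angle),
      z: -radius * Math.cos(angle), // -Z is "forward" in WebGL camera space
      yRotation: -angle,            // turn each element to face the viewer
    };
  });
}

const positions = arcPositions(5);
```

Keeping the user at a fixed location and rotating only the device is what makes the same scene work in both a wide mall concourse and a narrow sidewalk.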

Interactive Hero Element 

We took inspiration from each family’s unique forms of expression and built a hero element that users can play and interact with after unwrapping their gift.

Tapping on Muna’s guitar brings the whole scene to life with a rhythmic pulsing, swiping Bottom Feeders’ skateboard triggers different tricks and a shuffling of the whole scene, and swaying Bob Baker’s marionette makes the whole scene sway along with it.
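The per-family interactions above amount to a mapping from a recognized gesture on the hero element to a scene animation. A minimal sketch of that routing follows; the gesture names and animation hooks (`pulse`, `shuffleTrick`, `sway`) are hypothetical, chosen only to mirror the description.

```javascript
// Sketch: route a gesture on each family's hero element to its scene
// animation. Names are illustrative, not the production API.
const heroInteractions = {
  muna:          { gesture: 'tap',   animate: scene => scene.pulse() },
  bottomFeeders: { gesture: 'swipe', animate: scene => scene.shuffleTrick() },
  bobBaker:      { gesture: 'drag',  animate: scene => scene.sway() },
};

// Returns true if the gesture matched the family's hero interaction.
function handleGesture(family, gesture, scene) {
  const entry = heroInteractions[family];
  if (entry && entry.gesture === gesture) {
    entry.animate(scene);
    return true;
  }
  return false;
}
```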

Tech Approach

The Retail Window

Tool collaborated closely with Vans’ window display team to ensure that the design, placement, and configuration of every element would allow the best optimization and scalability across all the stores. To let users “Tap to Select” a specific gift and unwrap its 3D scene, we implemented computer vision alongside 8th Wall’s SLAM tracking capabilities.

The computer vision model was trained using photos of the different physical objects in the window displays. This approach detects the selected gift's position on the user's device and allows a seamless synchronization between the real world and the digital world. In addition, we used computer vision to identify the various window configurations of Vans’ retail stores. 
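One plausible shape for the tap-to-gift mapping is sketched below, assuming the detector returns labeled bounding boxes in normalized screen coordinates; the detection format, labels, and threshold are assumptions for illustration, not the production model's output.

```javascript
// Sketch: map a user's tap to a detected gift. Assumes the detector
// returns { label, score, box } with boxes in normalized [0,1] screen
// coordinates — a hypothetical format, not the actual model output.
function pickGift(detections, tap, minScore = 0.6) {
  // Keep confident detections whose box contains the tap point,
  // preferring the highest-scoring match.
  const hits = detections.filter(d =>
    d.score >= minScore &&
    tap.x >= d.box.x && tap.x <= d.box.x + d.box.w &&
    tap.y >= d.box.y && tap.y <= d.box.y + d.box.h
  );
  hits.sort((a, b) => b.score - a.score);
  return hits[0] ?? null;
}
```

Because the detected box lives in the same screen space as the tap, the selected gift’s on-screen position can anchor the unwrapped 3D scene, keeping the real and digital worlds in sync.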

To run computer vision and AR tracking simultaneously, we implemented a Web Worker to delegate all the image-detection work, keeping the main thread’s WebGL loop free to manage AR tracking and camera rendering. We also integrated an image tracker to stabilize the center of the scene and recalibrate the computer-vision position if an offset occurs.
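The key to this split is back-pressure: the render loop fires every frame, but the worker should only receive a new camera frame once the previous detection has returned. A minimal sketch of that scheduling logic follows, with the actual Worker wiring elided; the class and method names are hypothetical.

```javascript
// Sketch: keep image detection off the main thread without flooding the
// worker. Only the back-pressure logic is modeled here; in practice
// postFrame would be frame => worker.postMessage(frame). Names are
// illustrative, not the production API.
class DetectionScheduler {
  constructor(postFrame) {
    this.postFrame = postFrame;
    this.busy = false; // true while a detection is in flight
  }

  // Called once per render frame from the main WebGL loop.
  // Returns true if the frame was sent, false if it was dropped.
  tick(frame) {
    if (this.busy) return false; // drop the frame; AR tracking continues
    this.busy = true;
    this.postFrame(frame);
    return true;
  }

  // Called when the worker posts its detection result back.
  onResult() {
    this.busy = false;
  }
}
```

Dropping stale frames is usually acceptable here because the image tracker re-anchors the scene each time a fresh detection arrives.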