Niantic
This project was created at a hackathon put on by Niantic as part of its 2022 Lightship VPS World Tour. Our task was to build an experience using 8th Wall's Lightship Visual Positioning System (VPS).
CONTEXT
Our team was fortunate enough to earn 1st Place in the 8th Wall (Web AR) pool of the competition. This was the first official hackathon I participated in, so I went in wanting to learn and apply the XR skills I had gained over the past two years of solo learning.
IDEATION
We had to quickly get to know one another and our skills, then land on an idea that was viable within the 24 hours we had to create. What brought us together as a group was one shared interest: fashion. We took that common interest and began brainstorming. Once we had an idea we all felt good about, I sketched the concept on paper to get full team alignment on what we were going to build.
The final concept: a live, location-based, multiplayer AR fashion show that lets users watch and react in real time.
Concepting with Teammates
Final Concept Sketch
PREPARATION
Our only guideline for this challenge was that we had to use Lightship's VPS technology to create a location-based experience. We chose a unique spot at the hackathon venue, a subway car (pictured below), as the stage for our fashion show. It evoked the essence of New York's recognizable street style while also making for an interesting runway setting.
While the devs set up the code environment, I began 3D scanning the subway car.
Scanning Subway Car for Lightship VPS
With the location scanned, we could import the mesh into Blender, our 3D modeling software, and begin designing the show around it.
Bringing model into Blender
BUILDING AND DEBUGGING
Now it was time to start implementing our ideas. Because of the time crunch, it was imperative that we work out the bugs and establish what worked as quickly as possible. To create the 3D models and animations, I settled on the following workflow:
1. Create a ReadyPlayerMe (RPM) avatar
2. Export the RPM avatar as .glb
3. Convert the .glb to FBX
4. Import into Mixamo; rig, animate, and export
5. Import into Blender; animate a walk along a path
6. Export as .glb
7. Import the model into the 8th Wall environment
A condensed view of this process is seen below.
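For the final import step, 8th Wall Web projects commonly build on A-Frame. As a rough sketch, the scene markup for loading an exported avatar might look like this (the file name, IDs, and attributes here are illustrative, not our exact scene):

```html
<!-- Hypothetical A-Frame markup for loading the exported .glb.
     animation-mixer comes from the aframe-extras components. -->
<a-assets>
  <a-asset-item id="rpm-avatar" src="assets/rpm-avatar.glb"></a-asset-item>
</a-assets>

<!-- Play the baked walk animation on loop once the model loads -->
<a-entity
  gltf-model="#rpm-avatar"
  animation-mixer="clip: *; loop: repeat">
</a-entity>
```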
Challenge: Debugging AR can be quite time-consuming, since testing requires pulling out an actual device to see whether things work as intended. The time crunch only amplified this. We also ran into a number of issues that took a while to solve, such as the culling on our models, shown below.
Debugging Challenges
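For context on that culling bug: three.js (which 8th Wall builds on) frustum-culls each mesh against its original bounding box, so an avatar animating along a path can vanish mid-walk. A minimal sketch of one common fix, assuming a loaded glTF scene graph with three.js-style `isMesh`/`children` fields:

```javascript
// Walk a loaded model's scene graph and disable per-mesh frustum
// culling, so skinned meshes aren't hidden when their original
// (pre-animation) bounding box leaves the camera's view frustum.
// The node shape (isMesh, frustumCulled, children) follows three.js
// Object3D conventions; the traversal itself is plain recursion.
function disableFrustumCulling(node) {
  if (node.isMesh) node.frustumCulled = false;
  (node.children || []).forEach(disableFrustumCulling);
}
```

In a three.js-based scene this would run once on the loaded model, e.g. `disableFrustumCulling(gltf.scene)` inside the loader's callback.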
FINAL OUTPUT
After a long day of debugging and trial and error, we worked as a team to not only fix the errors but also successfully add live multiplayer capabilities. Using a JavaScript library called GunJS, we implemented a P2P system that let users throw reactions and watch the show together live. The final results can be seen below.
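As a rough sketch of the pattern, each client publishes reaction events into a shared graph node and subscribes to everyone else's writes. The payload fields, keys, and relay URL below are illustrative assumptions, not our exact schema:

```javascript
// Shape of a single reaction event shared between peers.
// Field names (user, emoji, t) are illustrative placeholders.
function makeReaction(user, emoji) {
  return { user, emoji, t: Date.now() };
}

// With GunJS, each client writes reactions into a shared set and
// subscribes to writes from peers, roughly like so:
//
//   const gun = Gun(['https://relay.example.com/gun']);
//   const reactions = gun.get('fashion-show').get('reactions');
//
//   reactions.set(makeReaction('guest-42', 'clap'));      // send
//   reactions.map().on((r) => showFloatingReaction(r));   // receive
```

Because GunJS syncs peer-to-peer, every viewer sees the stream of reactions without us standing up a dedicated game server during the hackathon.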
CONCLUSION
My first official hackathon experience was gratifying to say the least. I was fortunate enough to work with an amazing team that came together extremely well. We had great synergy in terms of both skills and interests, which I believe is what propelled us to the win. I will definitely be attending more hackathons in the future!
Group Photo