For this week's assignment we were to use AR targets to explore storytelling and narrative. I thought it would be fun, both for this class and for thesis ideation, to do a lotus-style brainstorm around the central question "How to Convey a Narrative Experience Using AR Targets" (post here).
Vuforia & Unity
In class we learned how to use Unity + Vuforia for target tracking; however, I think I learned my lesson about taking extra time when upgrading! I upgraded my Unity install and ran into some trouble, so I will make sure to sign up for office hours. Originally I was going to explore targets by working with embroidered / quilted targets. I did successfully upload the embroidery to my Vuforia developer portal (post here). It was interesting to see the augmentability rating.
Eyejack app + Image targets
In office hours Sarah Rothberg suggested I try the Eyejack AR app as a quick & dirty way to prototype a target-based AR experience. It is a good tool for thinking through and sketching out the interactions you might want to develop later with Vuforia + Unity.
For homework I made a small scavenger-hunt-inspired experience where you try to catch a ladybug and let it back outside. My oldest nephew is obsessed with scavenger hunts right now, and I was thinking about how AR targets could illustrate a story while helping navigate the user through a space. I was also inspired by the CERN Big Bang AR app and how they designed around action prompts (like holding out your hand) that made it feel as if you were directly affecting the animation, even though the animation would move forward without it.
AR target Test – help the ladybug back home
Eyejack was fairly glitchy as a handheld experience, especially for a user who is new to the experience you built and is hunting around for the exact target mark. Maybe that is the use case: it works well as a sketching tool, but the experience is better built out with another tool for later and final states.