Magic Windows: Vuforia & Image Targets

Getting Started

Step 1: create dev account @ https://developer.vuforia.com/

 

Adding personal target tests: embroidered ash leaves


Step 2: image targets tutorial 

    • On the developer portal go to Develop/License Manager and generate a license (keep it handy!)
    • under Develop/Target Manager, create an image target database and add your target image(s). Note each target's star rating and feature quality. The info below is pulled from the Vuforia Developer portal; a sketch of how these run-time detection events surface in a Unity script follows this list:
      • “Image Targets are detected based on natural features that are extracted from the target image and then compared at run time with features in the live camera image. The star rating of a target ranges between 1 and 5 stars; although targets with low rating (1 or 2 stars) can usually detect and track well. For best results, you should aim for targets with 4 or 5 stars. To create a trackable that is accurately detected, you should use images that are:”
        • Attribute – Example:
          • Rich in detail: street scene, group of people, collages and mixtures of items, or sport scenes
          • Good contrast: has both bright and dark regions, is well lit, and not dull in brightness or color
          • No repetitive patterns: grassy field, the front of a modern house with identical windows, and other regular grids and patterns (examples of patterns to avoid)
  • download the image database package (as a Unity Editor package)
  • make sure you're running a Unity 2019.2.x version
  • follow the platform guides to set up for iOS or Android
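
Since the targets are detected and tracked at run time, it can help to see how those detection events show up in a Unity script. Below is a minimal sketch modeled on the DefaultTrackableEventHandler that ships with the Vuforia samples; the class name SimpleTrackableHandler is my own, and the API (ITrackableEventHandler, TrackableBehaviour) is from the Vuforia Engine 8.x builds bundled with Unity 2019.2, so names may differ in newer releases.

using UnityEngine;
using Vuforia;

// A trimmed-down take on the sample DefaultTrackableEventHandler: it listens
// for the image target being found or lost at run time and toggles the
// renderers of everything parented under the target accordingly.
// (Assumes the Vuforia Engine 8.x API; names may differ in other versions.)
public class SimpleTrackableHandler : MonoBehaviour, ITrackableEventHandler {

    private TrackableBehaviour trackable;

    void Start () {
        trackable = GetComponent<TrackableBehaviour>();
        if (trackable != null)
            trackable.RegisterTrackableEventHandler(this);
    }

    public void OnTrackableStateChanged(TrackableBehaviour.Status previousStatus,
                                        TrackableBehaviour.Status newStatus) {
        bool found = newStatus == TrackableBehaviour.Status.DETECTED ||
                     newStatus == TrackableBehaviour.Status.TRACKED ||
                     newStatus == TrackableBehaviour.Status.EXTENDED_TRACKED;

        // show the child content only while the target is being tracked
        foreach (Renderer r in GetComponentsInChildren<Renderer>(true))
            r.enabled = found;
    }
}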

 

 

Iterating on the in-class tutorial

Image Target class example

  • create new project (3D)
  • create new scene
    • Open build settings and make sure the right platform is selected
    • Open Player Settings: make sure to add your package/bundle name, etc.
    • Under XR Settings: activate Vuforia
  • delete the default Camera and Directional Light from the scene hierarchy
  • add an ARCamera
    • configure the Vuforia settings
      • add your App License Key (generated in Step 2)

 

  • under Assets/Import Package, select Custom Package and choose the Samples.unitypackage or your own target image database
  • add an ImageTarget prefab to the hierarchy
    • make sure to select the right database
    • make sure to select the right image target
  • add a sphere primitive as a child of the image target (name it 'eye')
    • scale (0.2, 0.2, 0.2)
    • position (0.0, 0.148, 0.0)
    • create and apply a white 'eye' material
  • add a second sphere as a child of 'eye' and name it 'pupil'
    • scale (0.25, 0.25, 0.1)
    • position (0, 0.012, 0.471)
    • create and apply a new black 'pupil' material
  • create a new C# script and name it 'LookAtMe' (the file name has to match the class name in the code below)

using UnityEngine;

public class LookAtMe : MonoBehaviour {

    // the transform of the object we want to be looking at
    public Transform target;

    // Update is called once per frame
    void Update () {
        // rotate this gameobject so that it faces the target
        this.transform.LookAt(target);
    }
}
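
A note on how this works: transform.LookAt rotates the GameObject so that its local +Z (forward) axis points at the target; that is why the pupil sits offset along the eye's local Z axis and swings around to face the camera together with the eye.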

  • add the new script to the ‘eye’ GameObject in the hierarchy
  • make sure to link the ARCamera (what we want the eye to be looking at) to the script's target property (the sketch after this list shows a way to wire this up automatically)
  • test! Move around and check if the eye is always looking at the camera
  • make a prefab out of ‘eye’
  • add any more necessary instances of the ‘eye’ prefab
  • Have fun!
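
Linking the target field by hand is fine for one eye, but once 'eye' is a prefab with several instances it gets repetitive. The sketch below is only an idea, not part of the tutorial (the EyeSpawner name, the localPositions list, and the Camera.main fallback are all my assumptions): attached to the ImageTarget, it spawns copies of the eye prefab and points each one's LookAtMe at the AR camera.

using UnityEngine;

// Hypothetical helper: spawn several copies of the 'eye' prefab under the
// image target and hook each one's LookAtMe up to the AR camera, so the
// target field doesn't have to be linked by hand for every instance.
public class EyeSpawner : MonoBehaviour {

    public GameObject eyePrefab;      // the 'eye' prefab made above
    public Transform arCamera;        // drag the ARCamera here in the Inspector
    public Vector3[] localPositions;  // where each eye sits relative to the image target

    void Start () {
        // fall back to Camera.main if the ARCamera is tagged MainCamera
        if (arCamera == null && Camera.main != null)
            arCamera = Camera.main.transform;

        foreach (Vector3 pos in localPositions) {
            // parent each new eye to the image target this script sits on
            GameObject eye = Instantiate(eyePrefab, transform);
            eye.transform.localPosition = pos;

            LookAtMe look = eye.GetComponent<LookAtMe>();
            if (look != null)
                look.target = arCamera;
        }
    }
}

To try it, attach the script to the ImageTarget, drag in the eye prefab and the ARCamera, and add a few entries to localPositions.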

 

 

 

Questions for Rui

  • module for mobile environment setup?

Magic Windows: HW 1.1 & 1.2

HW 1.1 – Augment a Space

For this assignment I knew I wanted to explore and learn a little MadMapper. I've always heard about it but had never gotten to try it – their free demo projects their logo, so please ignore the large lettering ❤ I love projection mapping that highlights details of architectural spaces. Along the lines of "if these walls could talk," I thought about all the sounds and music the walls around me have absorbed over the years, including a music video collection that meant a lot to me when I was younger: the Work of Directors series from Spike Jonze, Chris Cunningham, and Michel Gondry.

Several of those videos introduced me to the idea of "augmenting" or hacking our perception and reality, and they felt like a great building block, so to speak. Although this exploration isn't necessarily dynamic (there is no human interaction), it still feels like a special moment of altering an everyday surrounding: a brick wall in our apartment here. I could imagine making it interactive, where tapping one of the blocks would trigger a ripple effect of some kind.

Five music videos from the Work of Directors series, which I grew up watching a lot, inspired the selection below:


HW 1.2 – Give a favorite AR/XR inspiration / example

I’ve enjoyed playing with the Wonderscope app recently. Its use of voice activation to help the user feel more connected to the characters of the story while building reading skills felt compelling. With all voice assisted technologies and speech recognition it always leaves the question of which voice is it trained for? But felt like it was definitely a playful use of animation & ar. It also provides  a great onboarding process before a story unfolds.