Magic Windows: W3 – Object Augmentation

What does it mean to augment objects, both technically and conceptually?

This week Rui brought up a few thinking points and walked us through a Vuforia tutorial. Class began by thinking about the philosophy of animism as a powerful approach to object augmentation.

“Animism is the belief that objects, places and creatures all possess a distinct spiritual essence. Potentially, animism perceives all things — animals, rocks, rivers, weather systems, human handiwork, and perhaps even words — as animated and alive” (from Wikipedia).

We also thought about developmental psychology theorist Piaget and how kids are egocentric: for a set number of years they tend to have only themselves as a reference for how they experience the world, and in that frame they embody objects as a relational exercise. For example, instead of saying we got lost, we sometimes say the car got lost, when really it was a combination of us, our GPS, and so on.

Rui challenged us to move beyond thinking about projection, past the hologram assistant.

He also talked about how to augment objects as an interface. How do you extend the current tool and its current functionality? Can a water bottle also tell you about the amount of water drunk over the day? How do you invoke the augmentation? Look at the form and materiality of things and consider how to blend that form into the AR space so the augmentation feels more naturally tied to the object.

He then brought up the sensation of pareidolia: seeing faces in everyday objects. The brain sometimes reads for easy patterns and fills in the dots, a quick assessment serving as a signal-to-noise mechanism that reaches conclusions that aren't always correct, like an outlet being a face. Or Rui's parallel example of hearing Portuguese vowel sounds when listening to Russian. Think also about how computer vision / facial recognition works.

Then we walked through a bunch of fun examples of augmenting objects.

 

Suwappu – Berg London

 

Garden Friends – Nicole He

Tape Drawing – Bill Buxton

https://www.billbuxton.com/tapeDrawing.htm

 

Smarter Objects

 

Invoked computing

 

MARIO

 

Paper Cubes – Anna Fusté, Judith Amores

 

 

InForm

 

 

When Objects Dream – ECAL

 

HW for next week:

  1. Eyes on
    Please read at least one of the following:

    Extra reading: Radical Atoms
    It is a nice pairing with Ivan Sutherland’s ‘The Ultimate Display’.

     

  2. Hands on
    Write a brief paragraph about your idea on augmenting an object.
    This should be the initial concept for next week’s assignment.
    Be clear about your idea, inspirations and potential execution plan.

  3. Brain on
    List potential input (sensory) and output (feedback) modes that you can leverage in the real world, then translate these into technical possibilities using current technology. You can use your mobile device, but feel free to bring in any other technology or platform (Arduino, etc.) that you can implement.
    Choose one of each (input, output) and create a simple experience to show off their properties, but also their affordances and constraints.
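As one hypothetical input/output pairing for this assignment (a sketch of my own, not something we were given; the class, threshold, and interval values are made up): the phone's accelerometer as input and haptics as output.

```swift
import CoreMotion
import UIKit

// Input: accelerometer (Core Motion). Output: haptic feedback (UIKit).
// Tilting the phone past a threshold triggers a physical "bump" you can feel.
final class TiltBuzzer {
    private let motion = CMMotionManager()
    private let haptic = UIImpactFeedbackGenerator(style: .medium)

    func start() {
        guard motion.isAccelerometerAvailable else { return }
        motion.accelerometerUpdateInterval = 0.1 // seconds between readings
        motion.startAccelerometerUpdates(to: .main) { [weak self] data, _ in
            guard let x = data?.acceleration.x else { return }
            // Buzz when the phone tilts noticeably to either side.
            if abs(x) > 0.5 {
                self?.haptic.impactOccurred()
            }
        }
    }

    func stop() {
        motion.stopAccelerometerUpdates()
    }
}
```

A pairing like this shows off the affordance (continuous, ambient input) and the constraint (haptics are binary and easy to over-trigger) in one small experience.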

Mobile Lab: Hw2 – interactive poster


 

For this week’s homework I tried out two ideas but went for the more minimalist, shape-based graphic style of musician Tycho’s record art, specifically Montana. It reminded me of some of the Swiss poster design references from class. In my sketch I played with increasing the duration of the animation by 0.25 with each consecutive triangle in the sequence. I also used scaleEffect to make the triangles larger.

 

 


 

 

Was really inspired by the tutorial below. I would like to restructure my code to have a similar repeat structure to the rectangle in the video. Instead of copying and pasting seven individual triangles into a ZStack and tweaking them, it would be more efficient to have a function that draws the triangle once, then play with offset, color (or opacity), and animation time.
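As a sketch of the refactor I have in mind (the shape, colors, and exact values here are placeholders, since I lost the original sketch): a single Triangle Shape drawn once, then repeated with ForEach, deriving opacity, scale, and animation duration from the layer index.

```swift
import SwiftUI

// Returns the animation duration for the i-th triangle:
// a base of 0.5s, growing by 0.25s per layer in the stack.
func duration(for index: Int) -> Double {
    0.5 + 0.25 * Double(index)
}

// A reusable triangle Shape so it only has to be defined once.
struct Triangle: Shape {
    func path(in rect: CGRect) -> Path {
        var p = Path()
        p.move(to: CGPoint(x: rect.midX, y: rect.minY))
        p.addLine(to: CGPoint(x: rect.maxX, y: rect.maxY))
        p.addLine(to: CGPoint(x: rect.minX, y: rect.maxY))
        p.closeSubpath()
        return p
    }
}

struct PosterView: View {
    @State private var animate = false

    var body: some View {
        ZStack {
            // Seven layers: each slightly more transparent, larger
            // when animated, offset upward, and slower than the last.
            ForEach(0..<7) { i in
                Triangle()
                    .fill(Color.orange.opacity(1.0 - 0.1 * Double(i)))
                    .scaleEffect(animate ? 1.0 + 0.15 * CGFloat(i) : 1.0)
                    .offset(y: -12 * CGFloat(i))
                    .animation(.easeInOut(duration: duration(for: i)), value: animate)
            }
        }
        .onTapGesture { animate.toggle() }
    }
}
```

Each layer still gets its own offset, opacity, and timing, but tweaking the design now means changing one formula instead of seven copies.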

In very ITP fashion I lost my sketch (and will rebuild), but at the very least I had video of it when it was working. I hope to post to GitHub soon.

https://www.youtube.com/watch?v=Xvm8hrvCk58&feature=youtu.be

 

Tutorials:

 

Questions:

  • how to parse a crash log
  • how to fix team names
  • how to import fonts
  • how to work with sound files
  • easy eyedropper / color picker?

 

Week 2 lab/homework covered a range of things including:

  • Anatomy of an iOS App, UI Design, Prototyping Tools
  • Swift Programming, Swift playgrounds
  • SwiftUI Basics
  • UI development, layout design

Design and develop an interactive poster starting with the poster code kit

  • Select an artist, movie, musician, etc. as inspiration
  • Poster needs to include type AND images (graphic or photography)
  • Make it interactive: On tap apply animation
  • Post documentation to #sp2020-homework channel on Slack. Include the following:
    • Source of inspiration
    • 1-2 sentence description
    • 2-3 screenshots
    • short video
    • GitHub link to code

 

Magic Windows: Hw 2 – Storytelling through AR targets

For this week we were to use AR targets to explore storytelling / narrative. I thought it would be fun, for this class and for thesis ideation, to do a lotus-style brainstorm around the central question “How to Convey a Narrative Experience using AR targets” (post here).

 

Vuforia & Unity

In class we learned how to use Unity + Vuforia for target tracking; however, I think I learned my lesson about taking extra time when upgrading! I upgraded Unity and was having a little trouble with it, so I will make sure to sign up for office hours. Originally I was going to explore targets by working on embroidered / quilted targets. I did upload the embroidery successfully in my Vuforia developer portal (post here). It was interesting to see the augment-ability rating.


 

 

Eyejack app + Image targets 


In office hours Sarah Rothberg suggested I try the Eyejack AR app as a way to quick-and-dirty prototype a target-based AR experience. It is a good tool to get you thinking and sketching out the actions you might want to develop later with Vuforia + Unity.

For homework I made a small scavenger-hunt-inspired experience where you try to catch a ladybug and let it back outside. My oldest nephew is obsessed with scavenger hunts right now, and I was thinking about how AR targets could illustrate a story while helping navigate the user through a space. I was also inspired by the CERN Big Bang AR app and how they designed around action prompts (like placing out your hand) that made it feel as if you were directly affecting the animation, even though the animation would move forward without it.

AR Target Test – help the ladybug back home

Eyejack was fairly glitchy as a handheld experience, especially for a user who is new to the experience you built and is hunting around for the exact target mark. Maybe that is a case for treating it as a good sketching tool, with later and final states better built out in another tool.

 

 

Magic Windows: AR Target brainstorm


Lotus style brainstorm around “How to Convey a Narrative Experience using AR targets”

Center of Lotus: 

  • What are the strengths of AR targets?
  • What are the weaknesses of AR targets generally?
  • How to explore AR targets in thesis quilt project
  • How to break away from a rectilinear target shape / boundary of mobile device?
  • How to further mesh the AR world into the physical environment / tricks to avoid gimmicky pitfalls?
  • Who is telling the story – narrator/reader relationship or perspective?
  • What is the experience “end goal” vibe?
  • What materials do you want to augment / use as targets?

 

Petal pages:

  • What are the strengths of AR targets?
    • How to maximize for AR’s ability to be experienced on the move, through a wide range of physical space?
    • How to best get at the feeling of discovery?
    • How do you want to “edit”/ manipulate surroundings?
    • What “realities” do you want to question?
    • How to use voice activation as a narrative device?
    • What is not “visible” in physical world that you wish to see or experience?
    • What is “visible” in physical world that you wish to rearrange, remove or re-imagine?
    • How might sound be altered- shifted or triggered to convey a narrative through target based AR

 

  • What are weaknesses of AR? 
    • How to break out of AR’s boundary of the mobile device shape?
    • How to avoid the experience feeling “gimmicky” or derivative?
    • How to make it accessible, both for those who don’t own smart mobile devices and for a range of abilities?
    • how to design for quick data download?
    • how do you minimize the motion sickness for other participants who are not controlling the device?
    • How to make it a social interaction
    • how to make the mobile screen not the end goal but to better observe the physical world around you
    • What are some onboarding tricks for AR use? Or quick prompts to make the experience feel more “immersive” / interactive?

 

  • What materials do you want to augment / use as targets?
    • objects / moments of a relationship to something or someone?
    • something along the lines of ingredients in the kitchen suggesting recipes with items on hand
    • best ways to use AR to reveal the multiplicity of histories (or untold histories)
    • how to bring to light / manifest stories through objects
    • how to draw with our bodies in space?
    • seeing other human interaction in the AR world, multi-player interaction
    • thinking about wearables, how to create fun narrative interactions through worn objects
    • augmenting symbolic physical or public spaces, ex: Stonewall Forever AR at Christopher Park, or in a digital humanities way using the plaques in the botanical garden’s Shakespeare Garden to augment different passages

 

  • What is the experience “end goal” vibe 
    • what design aesthetic? [nostalgic, referential to a specific moment in design history]
    • how do you encourage user to experience / play w/ several elements if designed as a nonlinear, explorative story experience?
    • Does it celebrate or commemorate something specific?
    • Does the interaction contribute to a larger dialogue?
    • what is the duration of the experience?
    • who is the primary & secondary user?
    • how to design for an immersive educational tool / experience?
    • what emotional tone do you want?

 

  • How to keep AR from feeling gimmicky? How to further mesh the physical & AR spaces together? 
    • What are AR techniques that other people feel worn out on?
    • How to have the experience grounded in geolocation?
    • what are optical tricks that AR apps have used to help you suspend disbelief?
    • how to have animation style or visuals better match the physical environment? (if it elevates the story)
    • What are new innovations in the AR space?
    • What are techniques /experiences that feel overplayed to you?
    • What are everyday tasks that AR could help make easier, more accessible, beneficial for routine use? think openlab parsons
    • How can sounds best support and round out the experience?

 

  • How to break away from a rectilinear target shape / mobile screen? 
    • how to fold the AR mobile device into an outward-facing screen of a wearable?
    • what are effective playful 3d everyday objects to use as targets?
    • how to creatively design sfx and music in interaction
    • building a physical handheld object around the mobile space
    • how to use textures naturally found in our everyday?
    • are there physical objects in public space that can be used?
    • could the time of day create different targets from the same object?
    • could using prompts through geolocation support a more exciting experience?

 

  • how to effectively / creatively use AR targets in thesis quilt? 
    • how to have multiple pathways / pages in AR experience [magnifier / find leaf mode vs a more info button / return home screen]
    • how to fold in real world collection or experience into the trigger
    • what is the action that brings the user into the ‘magic circle’ as we talked about in Joy of Games?
    • Which AR tools to use in the deliverable time frame?
    • What types of mark making in quilting can be mirrored in the ar animation experiment
    • what size or spacing between targets is most effective
    • how to fold in gamification or badge collection?
    • how to have it feel like it naturally enhances the project / feels integrated vs a tacked-on, buzzwordy tech method?

 

  • who is telling the story?
    • which voice do you want to use?
      • ex:3rd person omniscient
    • if in a digital humanities exercise of “thick” storymapping, who built the map on which the stories are being placed or embedded?
    • historical / educational?
    • revealing multiple histories & stories around a specific topic
      • oral histories
    • does the augmented target unfold the narrative through onscreen visuals (having to watch) versus visual cues that inspire you to look around the screen?
    • how to develop experience to be player input driven, like a choose your own adventure or generative storyline?
    • how to design as a culture jamming activity / experience? who is dictating the culture? and who is creating the jam
    • what are storytelling narration or poetic devices that could be helpful? (like breaking the 4th wall (is there a 4th wall in AR?) or onomatopoeia / metaphors)

Filtering

Highlighting

Value mapping – Prioritization

The art of paying attention: Meeting quilt maker Adelia Moore

Last week I had the chance to meet Nancy’s friend Adelia and talk about her experiences with quilting. It was so wonderful to hear her talk through her creative process and show examples of different things she’s made. More soon! She also mentioned to check out the episode of Mr. Rogers that she was on, and an article she wrote in The Atlantic titled “Mr Rogers: the art of paying attention.”

 

Thesis: W2 – MVP

For homework this week we were to create an MVP prototype of our thesis: to imagine as if it were due, and have the interaction at least playtest-able.

Since my Vuforia tests are taking a little longer than expected, I am hoping to storyboard out in drawings and then have a printed quilt with attachable “leaf IDs” that can be activated through Eyejack to simulate the AR experience (playing off of some of the observations from playing with different AR apps).

After spending a month back in TN with family, I got to hang out with my nephews a good bit. We got the oldest an ElectroDough kit; although he loved everything about it, it was clear that the badges really helped him go through things in a specific order and increased his desire to achieve all stages. It inspired me to think about whether there was a way that parents could add on different leaf patches (each unlocking additional AR interaction) as they come across them in the real world, similar to a leaf ID printout or field guide.


AR Strengths / Weaknesses

https://www.youtube.com/watch?v=MNP77K9RsjI

For this week thesis teacher Sarah asked me to playtest as many AR apps as I could and see what their strengths and weaknesses were, then to think about which strengths make sense to design into my MVP / first prototype. For this round I got to test out about 10 different ones and hope to keep exploring more:

 

David Bowie AR Exhibit 


  • good range of media available (costumes, liner notes, videos, etc)
  • liked the map of gallery lay out / table of contents
  • good onboarding
  • audio levels of various media (narrator vs song clips) were surprisingly different / a little too jarring
  • reiterated the importance of mastering / unifying & double checking levels for possible use cases (ex: with earbuds vs over ear headphones)

 

Wonderscope

  • voice activation to move story forwards
    • encourages interaction with characters / makes the illusion feel more believable as a dialogue exchange
    • builds reading skills
  • some stories have difficulty with scale if you’re immobile (like tucked in for bed), but that is meant to encourage the user to move around
  • good onboarding to make sure the room is light enough and that the mic & camera permissions are on


  • Wonderscope Story – Red Riding Hood
    • uses the ability to go into spaces like entering the house with your phone moving through the door
    • like the others, can click on items that the character needs to move the story forward
      • has a visual indicator of sparkle particles that radiate vertically off the item to be found / clicked (helpful with …)
    • increases interactivity by having user read verb commands that then affect the animations like “spin” or “sprinkle”
    • liked the educational aspect of the Big Dipper & the North Star

 

  • Wonderscope Story – Sinclair Snakes – Museum Mischief
    • liked the use of different tool interactions
      • flashlight to search for hiding snake
      • duster to dust for footprints
      • how the ar museum rooms highlight

 

Big Bang in AR : Cern 

Weird Type – Zach Lieberman 

Starwalk

 

 

My Caterpillar 

Some nature apps (non ar)

Picture this

Seek  

 

Magic Windows: Vuforia & Image Targets

Getting Started

Step 1: create dev account @ https://developer.vuforia.com/

 

Adding personal target tests: embroidered ash leaves


Step 2: image targets tutorial 

    • On the developer portal go to Develop/License Manager and generate a license (keep it handy!)
    • under Develop/Target Manager create an image target and add your target image(s). Please note its rating and feature quality. The info below is pulled from the Vuforia Developer portal:
      • “Image Targets are detected based on natural features that are extracted from the target image and then compared at run time with features in the live camera image. The star rating of a target ranges between 1 and 5 stars; although targets with low rating (1 or 2 stars) can usually detect and track well. For best results, you should aim for targets with 4 or 5 stars. To create a trackable that is accurately detected, you should use images that are:”
        • Rich in detail – e.g. a street scene, group of people, collages and mixtures of items, or sport scenes
        • Good contrast – has both bright and dark regions, is well lit, and not dull in brightness or color
        • No repetitive patterns – avoid grassy fields, the front of a modern house with identical windows, and other regular grids and patterns
  • download the image target database package (as a Unity Editor package)
  • make sure you are running Unity on a 2019.2.x version
  • follow the guides to set up on iOS or Android

 

 

Iterating off the in-class tutorial

Image Target class example

  • create new project (3D)
  • create new scene
    • Open build settings and make sure the right platform is selected
    • Open Player settings: make sure to add your package/bundle name, etc
    • Under XR settings: activate Vuforia
  • delete camera and directional light from the project hierarchy
  • add an ARCamera
    • config Vuforia settings
      • add App License Key

 

  • under Assets/Import Package select Custom Package and select the Samples.unitypackage or your own target image database
  • add IMG target prefab to the hierarchy
    • make sure to select the right database
    • make sure to select the right image target
  • add a sphere primitive as a child of the image target (name it ‘eye’)
    • scale xyz (0.2, 0.2, 0.2)
    • position (0.0, 0.148, 0.0)
    • create and apply a white ‘eye’ material
  • add a second sphere as a child of ‘eye’ and name it ‘pupil’
    • scale xyz (0.25, 0.25, 0.1)
    • position (0, 0.012, 0.471)
    • create and apply a new black ‘pupil’ material
  • create a new C# script and name it ‘LookAtMe’ (the file name must match the class name)

using UnityEngine;

public class LookAtMe : MonoBehaviour {

    // the Transform of the object we want to be looking at
    public Transform target;

    // Update is called once per frame
    void Update () {
        // use the target's coordinates to rotate this GameObject towards it
        this.transform.LookAt(target);
    }
}

  • add the new script to the ‘eye’ GameObject in the hierarchy
  • make sure to link the ARCamera (what we want the eye to be looking at) to the target transform property
  • test! Move around and check if the eye is always looking at the camera
  • make a prefab out of ‘eye’
  • add any more necessary instances of the ‘eye’ prefab
  • Have fun!

 

 

 

Questions for Rui

  • module for mobile environment setup?


So you want to build an AR project…

This week we had residents Ilana and Terrick talk to us about building out AR/VR projects for thesis. Since I hope to potentially explore the AR route for thesis, as well as for Magic Windows class, I wanted to note some of the points here too. Office hours this week with thesis advisor Sarah also came with a two-fold assignment:

1: to play with and dissect as many AR apps as possible – what’s working and what’s not?

2: to start making, since playing with quick AR creation tools like EyeJack will help get at which interactions for my quilt feel successful versus not.

We also talked about the target audience of kids and connecting them to local ecology through plant identification.

 

Thinking about an AR/VR project?

In their thesis presentation the residents raised some good consideration points: the importance of content and design, concept first, with the tech then helping to lift it up.

  • Why is AR the right medium for your project?
  • Content First and Design
  • Technology “second”
  • Understand what your project is about
    • consider – is it the right medium? if yes, why?
  • Look for references and understand what they did right/wrong
    • what platforms did they use? what is the scale of the project?

Some AR References

Anna Ridler – Wikileaks: A Love Story (2016)


 

Zach Lieberman – Weird Type App (2019)

 

Planeta – David Bowie Is – AR exhibition App  (2018)

 

Wonderscope (2018)

 

 

What is your MVP (Minimum Viable Product)?

  • What are your goals / MVP? 
    • Playing off of what we talked about in thesis class last week, always check back in with yourself about what your MVP is: a version of your final thesis output that has just enough features to satisfy your benchmarks and provide feedback for future product development. Remember that the ITP thesis is more of a “proof of concept” prototype that you can then apply to grants, residencies, etc. for incubation / final production.
  • Think about Content and Design, A lot! 
    • “According to what your project goal and MVP is, understand what is the right content to present, interactions you will need to create and aesthetics. This will guide you to decide on the tech.”
  • Sketch your experience and think about the user 
    • “When/where will people use it? How easy/hard should it be for people to access it?”
  • Think about the aesthetic you aim to create 

 

Advice for project builds

Terrick gave us a great breakdown of his learnings from his VR thesis and Ilana of her AR thesis. Both offered great advice for building out projects:

  1. Don’t underestimate the planning stage
  2. Users won’t always do what you want them to
  3. Photorealism isn’t always the best approach to creating believable experiences
  4. Spend ample time on sound

 

  1. Users are not used to having their mobile cameras act as a ‘portal’. People still see the camera only as a documenting tool. So keep in mind you will have to educate the user all the time.
  2. Create Gaze cues.
  3. Play with scale.
  4. Make a great Onboarding. Assume people won’t realize things by themselves and test.
  5. Understand that you have two spaces: the 2D UI and the 3D space. Design for both.
  6. Consider also that you have two spaces for gathering user input. Take advantage of that.
  7. Don’t forget about sound.

 

Technology

Define your project – Is it:

  • functional product
  • gaming
  • storytelling
  • experiment / artistic

 

Possible AR tech: 

 

Tech Learnings 

From Ilana:

  1. Building takes time. A lot. The development process is super slow, especially when you are working on geolocation projects. So plan accordingly.
  2. To scale things accurately, you will be building in your device and testing all the time. A good tip is to build a system in the UI that will be only for your development process.
  3. If you are working with geolocation, don’t expect things to be in the exact coordinates in real space that you place them in your code. Our phone’s GPS is not accurate.
  4. This is a fast paced tech environment. Things change, versions get updated and sometimes, if you are working with different libraries and SDKs, at some point they may not work together anymore. So always have your MVP in mind and goals to prioritize.
  5. If you plan on putting your app on the AppStore, make sure to test it on multiple devices.
  6. It is a new medium and there are not a lot of people doing projects with this technology. Send e-mails, participate in communities and meetups. People will be excited to talk to you and share experiences.
  7. Just user test a lot. Don’t assume anything will work well until it does.

 

From Terrick:

  1. Not everyone knows what VR is. Onboarding is important!
  2. It’s very easy to cause motion sickness.
  3. Try to use existing assets when you can.
  4. Know the version of your project!

From both – User Test a Lot

 

Some AR tools:

Mobile Lab: W1 assignment – 1 button app

 

For week one we were to “Create a One Button app” using the one button code kit. I decided to make a “free museum access” app that lists all the museums we can get into free with our NYU student ID. It’s such a great benefit and I am always forgetting about it! This helps me keep the list in one place, but it also works as a visual suggestion of somewhere to go that day. The list is pulled from the post on the NYU site here.

After adding images to the list I realized that the white text was hard to read, so I looked around and found a way to create a background box with opacity. However, the background box is not responsive when you rotate the phone from portrait to landscape. Something to troubleshoot in next steps.
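One possible fix I want to try (a sketch, untested against my actual code; MuseumLabel and the styling values are made up for illustration): instead of sizing the background box manually, let SwiftUI derive it from the padded text, so it reflows when the phone rotates.

```swift
import SwiftUI

struct MuseumLabel: View {
    let name: String

    var body: some View {
        Text(name)
            .font(.title)
            .foregroundColor(.white)
            .padding()
            // The background is sized from the padded text itself,
            // so it adapts when the text wraps or the phone rotates.
            .background(Color.black.opacity(0.5))
            .cornerRadius(8)
    }
}
```

Because the background modifier comes after padding, the box always hugs the text rather than holding a fixed frame.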

I also tried it as a randomized list, but enjoyed the more straightforward click-through better for reiterating what the waived-admission options were. Initially I was thinking of a “What museum should I go to today?” randomized app, and it reminded me just how often I forget about the free access we get with our student IDs at certain ones.

I could see a next-step iteration combining some of the things learned from the SwiftUI lab with map location, or having the address / train stops in a VStack below the museum name.

 

Questions:

  • how to outline text, like if using white text how to create a drop shadow or black outline to better separate from background image
  • how to have font color responsive if people select light or dark mode on their phones
  • best practices for crediting assets in app code
  • how to have the app restart from beginning when reopened, currently will start from last frame clicked
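For the first question above, one approach worth trying (a sketch; the text and values are placeholders, not from my app) is stacking SwiftUI’s .shadow modifier, which reads like a soft outline around white text:

```swift
import SwiftUI

struct OutlinedText: View {
    var body: some View {
        Text("Free with NYU ID")
            .font(.largeTitle)
            .foregroundColor(.white)
            // Two stacked dark shadows act like a soft black outline,
            // separating white text from a bright background image.
            .shadow(color: .black, radius: 1)
            .shadow(color: .black, radius: 1)
    }
}
```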