Magic Windows: HW3

Hands on
Write a brief paragraph about your idea for augmenting an object.
This should be the initial concept for next week's assignment.
Be clear about your idea, inspirations, and potential execution plan.

For this assignment I hope to explore an aspect of my thesis project. For my thesis I'm working on a memory quilt designed for my nephews (target age 4/5-ish) that uses AR to unlock memories from the tree-identification walks they've taken with their family (mainly their parents). The project is inspired by the tree-ID walks they already go on, my nephews' enjoyment of AR, and the quilts made in our family. The quilt will include repurposed squares from my brother's childhood quilt.

The augmented object will be patches on the quilt, illustrated with contour lines of leaves from local trees in their neighborhood, like a tulip poplar. When the phone (running Vuforia) is held over a patch, the name of the tree will appear, encouraging the name to be read aloud. As the name is read, the leaf and the words will fill in with color (similar to the voice interaction in Wonderscope). Once the whole name has been said, it will unlock images of trees they've found on past walks / hikes (that have been tagged to that target).
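To make the voice interaction a bit more concrete, here is a tiny sketch (in Swift, purely as a thought experiment and not code from an existing project) of the fill-in logic: compare the words spoken so far against the tree name on the patch and compute how much of the leaf / label should be colored in. The function name and the "Tulip Poplar" target are placeholders.

```swift
import Foundation

// Sketch of the fill-in logic: how much of the tree name has been said so far?
// Returns 0.0 (nothing said yet) up to 1.0 (whole name said -> unlock the photos).
func fillProgress(for treeName: String, transcript: String) -> Double {
    let targetWords = treeName.lowercased().split(separator: " ")
    let spokenWords = Set(transcript.lowercased().split(separator: " "))
    guard !targetWords.isEmpty else { return 0 }
    let matched = targetWords.filter { spokenWords.contains($0) }.count
    return Double(matched) / Double(targetWords.count)
}

// Example: saying "tulip" fills half the leaf; saying the whole name would
// unlock the walk / hike photos tagged to that image target.
print(fillProgress(for: "Tulip Poplar", transcript: "tulip"))            // 0.5
if fillProgress(for: "Tulip Poplar", transcript: "tulip poplar") >= 1.0 {
    print("unlock photos tagged to this target")
}
```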

Brain on
List potential input (sensory) and output (feedback) modes that you can leverage in the real world – translate these into technical possibilities using current technology (you can use your mobile device, but feel free to bring in any other technology or platform – Arduino, etc. – that you can implement).

Input / Sensory

  • sound
  • speech
  • peripheral senses – ambient light to signal a shift in weather (thinking of the experiments mentioned in reading Tangible Bits: Towards Seamless Interfaces Between People, Bits and Atoms  by Hiroshi Ishii and Brygg Ullmer)
  • temperature
  • motion
  • human connection / touch
  • a force / or amount of pressure exerted
  • distance between objects
  • time (between actions) / duration
  • accelerometer / orientation / tilt sensors
  • sentiment analysis / emotional quality of messages & speech
  • Arduino + distance sensor
  • FSR (force-sensitive resistor)

Output / Feedback

  • Arduino + heating pad + thermochromic ink (ex: Jingwen Zhu's Heart on My Dress)
  • animation (play, pause, skip around time codes)
  • air flow like Programmable Air / expand – contract
  • Arduino + a range of outputs like NeoPixels
  • animate or unlock something visual on a mobile device

Choose one of each (input, output) and create a simple experience to show off their properties, but also their affordances and constraints.

Experiments: (quick prototype – sliding images / illustrations into view as they would be unlocked through audio / speech recognition)

I would like to do another quick experiment that gets both sides of the interaction. So far I was only able to get the SFSpeechRecognizer example working; I would love for specific words to activate the animations shown above (a rough sketch of that idea is below). I was also able to get the Unity tutorials from the past couple of weeks up and running on my updated Unity again ❤
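Here is a rough sketch of how that next step might look: wiring the standard SFSpeechRecognizer / AVAudioEngine setup so that a specific word in the live transcript fires a callback that could start the fill-in animation. It assumes an iOS app that has already requested microphone and speech-recognition permission; the KeywordListener class and onKeywordHeard callback are my own placeholder names, not part of any existing project.

```swift
import Speech
import AVFoundation

// Rough sketch: listen for a keyword (e.g. "poplar") in the live transcript and
// fire a callback that could trigger the leaf fill-in animation. Assumes speech
// and microphone authorization have already been granted.
final class KeywordListener {
    private let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US"))
    private let audioEngine = AVAudioEngine()
    private let request = SFSpeechAudioBufferRecognitionRequest()
    private var task: SFSpeechRecognitionTask?

    func start(keyword: String, onKeywordHeard: @escaping () -> Void) throws {
        request.shouldReportPartialResults = true

        // Stream microphone audio into the recognition request.
        let inputNode = audioEngine.inputNode
        let format = inputNode.outputFormat(forBus: 0)
        inputNode.installTap(onBus: 0, bufferSize: 1024, format: format) { buffer, _ in
            self.request.append(buffer)
        }
        audioEngine.prepare()
        try audioEngine.start()

        // Check each partial transcript for the target word.
        task = recognizer?.recognitionTask(with: request) { result, _ in
            guard let spoken = result?.bestTranscription.formattedString.lowercased() else { return }
            if spoken.contains(keyword.lowercased()) {
                onKeywordHeard()   // e.g. start the fill-in animation / unlock the images
            }
        }
    }
}

// Usage sketch: try listener.start(keyword: "poplar") { playFillAnimation() }
// (playFillAnimation is a placeholder for whatever animation hook the app exposes.)
```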

And I loved reading the papers provided, including Tangible Bits: Towards Seamless Interfaces Between People, Bits and Atoms by Hiroshi Ishii and Brygg Ullmer of the MIT Media Lab / Tangible Media Group, and Bricks: Laying the Foundations for Graspable User Interfaces.
