AR Quilt


Title: Memory Quilt: Let's Go for a Walk

Description: Memory Quilt is an AR project for my nephews. Embroidered leaves of neighborhood trees found on the quilt become shifting repositories of memories, unlocked through a companion AR mobile app.

Process


Celebrating their routine walks, their shared love of nature, and how they might share those memories with one another.


User Testing 


Some questions

  • How might we share new and old memories through objects?
  • How might generational storytelling inform social AR experiences?
  • How might we design digital experiences that encourage off-screen exploration?
  • What does it mean to craft personalized apps? Can they offer a certain kind of warmth?
  • What aesthetic and interaction design decisions might better integrate the two?

Some points of inspiration 

“By learning from a traditional medium, we’ve got a new technique for drawing on computers whose value extends way beyond the automotive industry, where tape drawing originated, and can benefit anybody putting geometry into a computer.”

SVA MFA interaction design student work from: https://sva.edu/academics/graduate/mfa-interaction-design
www.danielstarrason.com
  • Abbot device for outdoor exploration paper
  • Bill Buxton’s tape drawing → translation of physical tools and making to digital
  • SVA Interaction Design MFA project (tree rings)
  • Embroidered landscapes from the Safnasafnið museum near Akureyri, Iceland
  • The rise of digital naturalism and outdoor exploration apps

Notes from presentation w/ guest reviewers

What an amazing set of presentations from classmates! The reviewers also had a wonderful set of insights for everyone. Feeling so inspired by everyone’s work. Here’s some feedback/takeaways:

Q’s = feedback from reviewers

A’s = solutions for next draft

 

Technical clarity  

Q: Unsure which quilts are being used/mended. Is it one? Is it all of them? Any of them?

A: Include a layered final-quilt rendering / breakdown of influences. Could be a visual timeline of how different family quilts influenced the design decisions, or a GIF of the final quilt build.

 

Q: What’s the technical journey of the memory? Illustrate the mechanisms making it happen: from captured on mobile, to discoverable through the quilt, to generated tappable virtual leaves, each revealing a memory (technical demo). An illustrated pathway making the opaque black box transparent.

Ex: How does a maple-leaf memory become digitally tied to its image target? How does the app know to generate virtual leaves and reveal the correctly associated memories? Is it connected to a cloud database like Firebase to help with world saving?

A: A technical diagram, moving beyond the UX/UI: showing the data model, flow diagrams of how things are paired, and the pathways from camera to SwiftUI image picker to Reality Composer to SwiftUI.
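To make that answer concrete, here is a minimal sketch of what the data model and pairing logic could look like. The field and function names are hypothetical (the actual app is built with SwiftUI and Reality Composer; Python is used here only to illustrate the flow):

```python
import json
from dataclasses import dataclass, asdict

# Hypothetical data model: each embroidered leaf on the quilt acts as an
# image target, and each captured memory is tagged with that target's name.
@dataclass
class Memory:
    species: str      # image-target name, e.g. "maple_fairfax"
    photo_path: str   # local file (or cloud URL) of the captured photo
    note: str         # a small written memory
    date: str         # ISO date the memory was captured
    lat: float        # geotag latitude
    lon: float        # geotag longitude

def memories_for_target(memories, target_name):
    """When the AR session recognizes an image anchor, look up the memories
    paired with that leaf so the app can generate tappable virtual leaves."""
    return [m for m in memories if m.species == target_name]

def save_collection(memories, path):
    """Persist the collection as JSON; a cloud store like Firebase could
    replace this local file to support cross-device world saving."""
    with open(path, "w") as f:
        json.dump([asdict(m) for m in memories], f, indent=2)
```

Recognizing the maple image anchor would then call something like `memories_for_target(all_memories, "maple_fairfax")`, and each returned memory becomes one virtual leaf.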

 

Q: Are the leaves tapped during the augmented reality sequence real or virtual?

Using an image of the real leaf in the UX of the AR/Explore mode rendering was visually appealing, but it confused how the app is used.

A: Have more explicit text/an arrow that says “tap virtual leaves”; right now it just says “leaves around you”.

Q: Is it limited to just photos? What kinds of memories are being stored / what range of media?

    • Could it be a larger range of media? Storytelling through quilts is an oral-history tradition, so it would be nice to have more audio storytelling / sound and music incorporated, maybe video or GIFs.

A: Have an MVP of images, but reiterate that ideally it would support the full range, maybe even the ability to record audio vs. write text with each memory that is stored.

 

Q: How many memories are there, and how do they then manifest as virtual leaves?

  • How is world saving and file management addressed? (technical)
  • How does Reality Composer communicate with SwiftUI?
  • GPS / map of memory locations?
  • A digital background that makes sense?

 

Conceptual clarity

  • Continue pulling the thread of looking at past ways of making and telling stories to inform new ones.
  • Reviewers want to hear more stories like the nephew one woven throughout.

Mobile Lab: W12

 


 

App Title: Memory Quilt / Let’s Go for a Walk

Short description: Memory Quilt is an AR quilt project for my nephews. Created by mending a family quilt, it celebrates their family’s love for nature, time spent together, and how they might share those stories with one another. Embroidered leaves of neighborhood trees back in Tennessee found on the quilt become shifting repositories of memories, unlocked through a companion AR mobile app.

Update: This week I focused on building a mockup of the “Add to Collection” mode and tested it through Sketch remotely with my brother back in TN. Continuing to work on the AR Explore mode, thinking both about the interaction and the balance of screen/world space.

 

User Testing

1st Sketch Prototype

User tester 1  

  • would like the ability to write a small something about the memory, what makes the moment unique
  • date & location, an ability to geotag

 

User tester 2 

  • have the main buttons closer to where the thumb naturally lies, at the bottom of the screen
  • have the option stay on each leaf page instead of a single screen asking whether to take a new picture or choose from the camera roll

 

 

 

2nd Sketch Prototype

User tester 3

  • ability to edit memory

 

 

 

Explore Quilt AR mode: Scene flow brainstorm


 

  • The main question is: how should an image appear once it’s been tagged to a species?
  • Hoping to start from the act of picking up leaves from the ground to look at them closer
  • Might also be good to look at Adobe Aero

 

 

Some Influences 

www.danielstarrason.com

I’ve also been revisiting some projects that I feel influenced this one on a subconscious level: an early Kimchi & Chips exploration, as well as some really beautiful embroidered Icelandic landscapes from the turn of the 20th century at the Safnasafnið museum near Akureyri.

 

 

Explore Quilt Mode: building out custom behaviors 


Image Anchor: embroidered maple leaf

  • falling leaves = primitive shapes + physics enabled
  • on-tap behavior to pull up the UI / tagged leaf memory

 

 

 

Exploring quilt AR experience w/ Reality Composer


After office hours with Mobile Lab and Thesis, I’ve decided to explore my project’s AR side with Reality Composer / RealityKit + SwiftUI.

Since travel is limited, I could still test with family in TN by emailing them the image targets and distributing the app via TestFlight. I could test UI aspects with Sketch + InVision, with a link that leads to a clickable playtest version ❤

This week, now that some of the quilt patchwork and embroidered targets are done, I’m revisiting the interaction and experience. It can be easy to fall down the rabbit hole of learning a new tool or software. A common thread in past office hours, whether with Nien, Sarah, Marina, Rui, or Ilana, has been to worry less about how to make it work technically and to focus more on the conceptual side and storytelling.

For W10 I plan to focus on one tree interaction before moving on to the others, starting with the maple tree on Fairfax.


 

Thesis Update + Mobile lab / 4.15.20

Navigating Reality Composer

 


Mobile Lab: W9


For my final app I plan to keep iterating on the Trees on My Block list app from the midterm, inspired by the NYC Tree Map site and its JSON data set. I hope to translate some of the things I love so much about that project into app form while keeping it hyper-local, one neighborhood block view at a time. I’d love for it to be a more fluid vs. hardcoded experience so I could share the app with friends and family in NYC too. Things I would like to try for:

 

2 Explore Modes

  • Map View: using geolocation data, see where you are on the map and the trees you’re next to
  • Species List View: navigate to a clickable list of species

Ability to update user location data + tree map load based on address input

Design/build it so a user could input an address and load their block’s tree map. When an address is entered, the app would load both their current location and their block’s tree map with the corresponding tree species, based on the NYC Tree Map JSON.
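As a sketch of that address-to-block lookup, assuming tree records shaped loosely like the NYC tree data (the field names and the block heuristic here are illustrative, not the census’s actual schema):

```python
# Illustrative records; the real NYC tree JSON has many more fields,
# and these names are assumptions rather than its exact schema.
TREES = [
    {"species": "red maple", "address": "12 Fairfax Ave"},
    {"species": "pin oak",   "address": "15 Fairfax Ave"},
    {"species": "ginkgo",    "address": "8 Ocean Pkwy"},
]

def block_of(address):
    """Reduce a street address to its block; here, naively, the street name."""
    return address.split(" ", 1)[1].lower()

def trees_on_block(trees, user_address):
    """Given the user's address input, load their block's tree map."""
    target = block_of(user_address)
    return [t for t in trees if block_of(t["address"]) == target]
```

The real app would swap the hardcoded list for the downloaded tree map JSON, which is what would make the experience fluid rather than hardcoded.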

 

A more intimate / less tutorial-driven design aesthetic

 

Mobile Lab: W8

This week for homework we were to explore Reality Composer: pick three separate areas (AR Health, AR Learning, AR Retail, AR Work, and AR Entertainment) and design an app concept for each.

 

AR Swatch Helper (AR Work / AR Retail)

When looking at a possible knitting project or a new pattern, there’s always the swatch gauge, which is supposed to give you an estimate of how many stitches equal a measured surface area based on needle size, material, and tightness of stitch. Being new to knitting, I still forget the importance of making a gauge swatch and dive straight in, only to realize the project might be too small a stitch for me right now. Hopefully that intuition will develop over time, but I would love an AR app that could parse the swatch gauge measurement for me and generate one at real scale. Sometimes having a to-scale visual in your everyday view is the most helpful : )
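The arithmetic behind that idea is simple: a gauge like “20 stitches and 26 rows = 10 cm” maps stitch counts to real-world size, which is what the app would compute before rendering a swatch at scale (the function name is hypothetical):

```python
def swatch_size_cm(stitches, rows, gauge_stitches, gauge_rows, gauge_cm=10.0):
    """Convert a pattern's stitch/row counts into real-world dimensions,
    given a gauge of gauge_stitches x gauge_rows per gauge_cm square."""
    width = stitches / gauge_stitches * gauge_cm
    height = rows / gauge_rows * gauge_cm
    return width, height

# A 120-stitch by 156-row panel at a 20 st x 26 row / 10 cm gauge
# works out to a 60 cm x 60 cm piece.
```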

 


 

 

 

 

Object X-ray Machine (AR Learning)

Image source: https://www.edn.com/under-the-hood-optical-wireless-mouse-rodentia-analogus/

Object X-ray Machine is an educational AR app that overlays a “teardown” view of everyday electronic objects, letting you learn more about the behind-the-scenes parts that make up our electronic tools.

This quick prototype uses an image-recognition trigger of the mouse plus the area around it. Ideally it would use object recognition so the device could be held in hand with the overlay, not dependent on feature-recognition targets outside the object itself for activation. When the camera recognizes a device like a mouse or phone, it would reveal the motherboard and components within, with clickable parts for more information.

 

https://vimeo.com/user89676923

 

AR plant care (AR Entertainment / Learning) – Object Tracking 


An AR app that logs your plants and can remind you when to water your plant friends. I was able to snag a screenshot when it triggered correctly, but it only worked for a quick second; I need to do a better object scan for object detection. Instead of a speech bubble, it could list the last time watered and count down to the next watering.
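The logging and reminder logic behind that countdown is straightforward; here is a small sketch (per-plant watering intervals are my assumption, not something the prototype implements):

```python
from datetime import date, timedelta

def next_watering(last_watered, interval_days):
    """Given when a plant was last watered and its watering interval,
    return the next due date (what the AR overlay would count down to)."""
    return last_watered + timedelta(days=interval_days)

def days_until_watering(last_watered, interval_days, today):
    """Days remaining until the next watering; negative means overdue."""
    return (next_watering(last_watered, interval_days) - today).days
```

Tapping a recognized plant could then show “watered 4 days ago, water again in 3 days” instead of the speech bubble.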

 

 

Future Ideabank:

  1. Untold Histories (AR Learning)
  2. Stretch for the stars (AR Health) 
  3. Object based radio generator (AR Entertainment)

 

Concepts to think on:

  • object tracking
  • screen triggers
  • tap triggers
  • scene transitions

 


Mobile Lab: AR Lecture

What is Augmented Reality 

  • this form of AR has been around for a while
  • AR as a marketing tool
  • medical usage
  • gaming, etc.

 

 

Current state of AR / why are we seeing more AR 

  • scale of feasibility
  • increased accessibility of having AR in our pockets
  • the first wave was a little cheesy; now we’re getting into deeper usage of AR
  • LiDAR sensor on the new iPad

 

 

What makes AR possible?

  • recent advancements in software & hardware
  • visual inertial odometry: using the camera plus motion sensors to track position
  • feature points / comparing camera frames: what is constant and what is moving?
  • one thing AR does really well is measuring

 

 

Book: Augmented Human