Mobile Lab: W9

[Screenshot: Trees on My Block midterm app]

For my final app I plan to keep iterating on the Trees on My Block list app from the midterm, inspired by the NYC Tree Map site and its JSON dataset. I hope to translate some of the things I love so much about that project into app form while keeping it hyper-local, one neighborhood block view at a time. I would love for it to be a more fluid vs. hardcoded experience so I could share the app with friends and family in NYC too. Things I would like to try for:

 

Two explore modes

  • Map View – use geolocation data to see where you are on the map and the trees you're next to (minimal sketch after this list)
  • Species List View – navigate to a clickable list of species
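A minimal sketch of the Map View mode, assuming a standard MapKit + Core Location setup; the view controller name, the Info.plist permission setup, and the example Honeylocust pin are placeholders, not the midterm code:

```swift
import MapKit
import CoreLocation
import UIKit

// Sketch of the Map View explore mode: follow the user's position and
// drop a pin per nearby tree. Assumes NSLocationWhenInUseUsageDescription
// is set in Info.plist.
class BlockMapViewController: UIViewController {
    let mapView = MKMapView()
    let locationManager = CLLocationManager()

    override func viewDidLoad() {
        super.viewDidLoad()
        mapView.frame = view.bounds
        view.addSubview(mapView)

        locationManager.requestWhenInUseAuthorization()
        mapView.showsUserLocation = true     // blue dot at your position
        mapView.userTrackingMode = .follow   // keep your block in view

        // Each tree pulled from the NYC Tree Map JSON becomes a pin;
        // the species and coordinate below are a made-up example.
        let tree = MKPointAnnotation()
        tree.title = "Honeylocust"
        tree.coordinate = CLLocationCoordinate2D(latitude: 40.7291,
                                                 longitude: -73.9937)
        mapView.addAnnotation(tree)
    }
}
```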

Ability to update user location data + reload the tree map based on address input

Design/build it so a user could input an address and the app would load that block's tree map. When an address is entered, it would load both the current location and the block's tree map with the corresponding tree species, based on the NYC Tree Map JSON.
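A hedged sketch of that flow: geocode whatever address the user types, then pull the trees within roughly a block of it. The Socrata endpoint (uvpi-gqnh, the 2015 Street Tree Census) and its field names are my assumption about what sits behind the NYC Tree Map JSON, so worth verifying against the open data portal:

```swift
import CoreLocation
import Foundation

// Minimal tree record; field names assume the Street Tree Census schema.
struct Tree: Decodable {
    let spc_common: String?   // common species name
    let latitude: String?
    let longitude: String?
}

let geocoder = CLGeocoder()   // kept alive for the async lookup

// Geocode a typed address, then fetch trees within ~100 m (about a block).
func loadTrees(forAddress address: String) {
    geocoder.geocodeAddressString(address) { placemarks, _ in
        guard let coord = placemarks?.first?.location?.coordinate else { return }
        // Socrata's within_circle(column, lat, lon, radius_in_meters) filter.
        let query = "https://data.cityofnewyork.us/resource/uvpi-gqnh.json"
            + "?$where=within_circle(the_geom,\(coord.latitude),\(coord.longitude),100)"
        guard let url = URL(string: query) else { return }
        URLSession.shared.dataTask(with: url) { data, _, _ in
            guard let data = data,
                  let trees = try? JSONDecoder().decode([Tree].self, from: data)
            else { return }
            for tree in trees {
                print(tree.spc_common ?? "unknown species")
            }
        }.resume()
    }
}
```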

 

A more intimate, less tutorial-driven design aesthetic

 

Mobile Lab: W8

This week for homework we were to explore Reality Composer: pick three separate areas (from AR Health, AR Learning, AR Retail, AR Work, and AR Entertainment) and design an app concept for each.

 

AR Swatch Helper (AR Work / AR Retail)

When looking at possible knitting textile projects or a new pattern, there's always the swatch gauge, which is supposed to give you a guesstimate of how many stitches equal a measured surface area based on needle size, material, and tightness of stitch. Being new to knitting, I still forget the importance of making a gauge swatch and dive straight in, only to realize maybe it's too small of a stitch project for me right now. Although hopefully that intuition will develop over time, I would love an AR app that could parse the swatch gauge measurement for me and generate one at real scale. Sometimes having a visual to scale in your everyday view is the most helpful : )
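As a rough sketch of the "generate one at real scale" part: a RealityKit plane can be sized straight from the gauge math, since a gauge is just stitches (and rows) per 10 cm. The function and parameter names here are hypothetical:

```swift
import RealityKit
import UIKit

// Turn a pattern's stitch/row counts plus the stated gauge
// (stitches and rows per 10 cm) into a real-scale AR plane.
func makeSwatchEntity(stitches: Float, rows: Float,
                      gaugeStitches: Float, gaugeRows: Float) -> ModelEntity {
    // Physical size in metres: count ÷ (count per 10 cm) × 0.1 m.
    let width  = stitches / gaugeStitches * 0.1
    let height = rows / gaugeRows * 0.1
    let mesh = MeshResource.generatePlane(width: width, height: height)
    let material = SimpleMaterial(color: .systemTeal, isMetallic: false)
    return ModelEntity(mesh: mesh, materials: [material])
}

// Example: 40 sts × 56 rows at a gauge of 20 sts / 28 rows per 10 cm
// yields a 0.2 m × 0.2 m plane to anchor in the camera view.
```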

 

[Photos: gauge woes, swatches]

Object Xray Machine (AR Learning)

[Image: inside an optical mouse, from https://www.edn.com/under-the-hood-optical-wireless-mouse-rodentia-analogus/]

Object Xray Machine is an educational AR app that overlays a "teardown" view on everyday electronic objects. Learn more about the behind-the-scenes parts that make up our electronic tools.

This quick prototype uses an image-recognition trigger on the mouse plus the area around it. Ideally it would use object recognition, so the device could be held in hand with the overlay and not depend on feature-recognition targets outside the object itself for activation. When the camera recognizes a device like a mouse or phone, it would reveal the motherboard and components within, with clickable parts for more information.
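A rough ARKit sketch of that image-recognition trigger; the "TeardownTargets" resource group and "mouse_teardown" overlay asset are hypothetical stand-ins, and swapping to ARWorldTrackingConfiguration with detectionObjects would give the hand-held object-recognition version:

```swift
import ARKit
import SceneKit
import UIKit

class XrayViewController: UIViewController, ARSCNViewDelegate {
    let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        sceneView.delegate = self
        view.addSubview(sceneView)

        // Trigger on reference photos of the mouse + surrounding area.
        let config = ARImageTrackingConfiguration()
        if let targets = ARReferenceImage.referenceImages(
                inGroupNamed: "TeardownTargets", bundle: nil) {
            config.trackingImages = targets
        }
        sceneView.session.run(config)
    }

    // When an image is recognized, lay the teardown view over it.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode,
                  for anchor: ARAnchor) {
        guard let imageAnchor = anchor as? ARImageAnchor else { return }
        let size = imageAnchor.referenceImage.physicalSize
        let plane = SCNPlane(width: size.width, height: size.height)
        plane.firstMaterial?.diffuse.contents = UIImage(named: "mouse_teardown")
        plane.firstMaterial?.transparency = 0.9
        let overlay = SCNNode(geometry: plane)
        overlay.eulerAngles.x = -.pi / 2   // lie flat on the recognized image
        node.addChildNode(overlay)
    }
}
```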

 

https://vimeo.com/user89676923

 

AR plant care (AR Entertainment / Learning) – Object Tracking 

[Screenshot: the plant-care speech bubble triggering in AR]

An AR app that logs when you watered your plant friends and can remind you when it's time again. I was able to snag a screenshot when it triggered correctly, but it only worked for a quick second; I need to do a better object scan for object detection. Instead of a speech bubble, it could list the last time watered and a countdown to the next watering.
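For the logging/reminder half, a hedged sketch using local notifications (all names here are hypothetical, and the app would also need to request notification permission once at launch):

```swift
import Foundation
import UserNotifications

// Schedule a reminder counting down from the last watering.
func scheduleWateringReminder(plantName: String, lastWatered: Date,
                              intervalDays: Double) {
    let formatter = DateFormatter()
    formatter.dateStyle = .medium

    let content = UNMutableNotificationContent()
    content.title = "Water \(plantName)"
    content.body = "Last watered \(formatter.string(from: lastWatered))."

    // Fire when the per-plant interval since the last watering runs out.
    let nextDue = lastWatered.addingTimeInterval(intervalDays * 86_400)
    let delay = max(nextDue.timeIntervalSinceNow, 60)
    let trigger = UNTimeIntervalNotificationTrigger(timeInterval: delay,
                                                    repeats: false)
    let request = UNNotificationRequest(identifier: "water-\(plantName)",
                                        content: content, trigger: trigger)
    UNUserNotificationCenter.current().add(request)
}
```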

 

 

Future Ideabank:

  1. Untold Histories (AR Learning)
  2. Stretch for the stars (AR Health) 
  3. Object based radio generator (AR Entertainment)

 

Concepts to think on:

  • object tracking
  • screen triggers
  • tap triggers
  • scene transitions

 

Resources: