Translation Study #1: Folding Flowers was created during my time in Gene Kogan’s Neural Aesthetic class at ITP NYU. The exploration feels deeply rooted in my experience at SFPC, when Gene first introduced us to machine learning and style transfer, and to the idea of collaborating with machines.
I also remember Zach explaining the beauty and interesting things that can happen through translation, whether moving from one medium to another, as we experienced in his class Recreating the Past, or thinking conceptually about how something in the world might be articulated through new types of media. I was additionally inspired by our classmate Robby Kraft, who walked us through a workshop on different methods of origami and, with them, tactile ways of thinking.
In this experiment I used RunwayML’s training feature to explore how computation might “fold” images of flowers into flower origami, and to witness its thinking through latent space.
[Origami Image Set: Applying Sam Lavigne / antiboredom’s flickr scraper to gather public images from the Flickr group Flower Origami, then training a model on that set of flower images via RunwayML.]
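For the gathering step I used antiboredom’s flickr-scrape tool, but the underlying idea is a call to Flickr’s public API. The sketch below shows one minimal way to pull image URLs from a group pool; the API key and group ID are placeholders you would supply yourself, and this is an illustration rather than the exact script I ran.

```python
# Minimal sketch of gathering a Flickr group pool, assuming you have a Flickr
# API key. API_KEY and GROUP_ID are placeholders, not values from the project.
import requests

API_KEY = "YOUR_FLICKR_API_KEY"        # placeholder
GROUP_ID = "FLOWER_ORIGAMI_GROUP_ID"   # placeholder: numeric id of the group


def fetch_group_photo_urls(api_key, group_id, per_page=100, page=1):
    """Return direct image URLs for one page of a Flickr group pool."""
    resp = requests.get(
        "https://www.flickr.com/services/rest/",
        params={
            "method": "flickr.groups.pools.getPhotos",
            "api_key": api_key,
            "group_id": group_id,
            "per_page": per_page,
            "page": page,
            "format": "json",
            "nojsoncallback": 1,
        },
    )
    photos = resp.json()["photos"]["photo"]
    # Build the static image URL for each photo record.
    return [
        f"https://live.staticflickr.com/{p['server']}/{p['id']}_{p['secret']}_b.jpg"
        for p in photos
    ]


if __name__ == "__main__":
    for url in fetch_group_photo_urls(API_KEY, GROUP_ID):
        print(url)
```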
The next test, this weekend, is with Swindell Pottery, using her and her dad’s pottery archive alongside an environmental dataset such as landscapes or trees. One thing that seems to help in particular is that the origami are photographed against a blank backdrop or surface. I really like the movement between what looks like a flat, 2D image scape and its transformation into a 3D object. It almost looks like the AI is learning how to fold and make its own origami inspired by flowers and landscapes.
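That fold-like morphing comes from walking through the model’s latent space: picking two latent vectors and rendering the images in between. Here is a minimal sketch of that idea, assuming a trained generator (such as a RunwayML-trained StyleGAN checkpoint) that can turn a latent vector into an image; the rendering call itself is left out because it depends on how you export or host the model.

```python
# A minimal latent-space "walk": linearly interpolate between two latent
# vectors. Each interpolated vector would be passed to a trained generator
# (hypothetical here) to render one frame of the morph.
import numpy as np

LATENT_DIM = 512  # typical StyleGAN latent size; adjust to your model


def latent_walk(z_start, z_end, steps=30):
    """Yield latent vectors interpolated linearly between two seeds."""
    for t in np.linspace(0.0, 1.0, steps):
        yield (1.0 - t) * z_start + t * z_end


rng = np.random.default_rng(0)
z_a = rng.standard_normal(LATENT_DIM)  # e.g. a sample that reads as a flat sheet
z_b = rng.standard_normal(LATENT_DIM)  # e.g. a sample that reads as a folded flower

frames = list(latent_walk(z_a, z_b))
# Rendering frames[0] .. frames[-1] with the trained generator produces the
# unfolding/folding sequence described above.
```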
If you were to use it as an actual training model, training it for fewer steps might be helpful.