Automata_Salvation

AUTOMATA. VERY EXCITED.

For the first assignment, we have to develop an idea for a project that involves Automata.

  • Two quick studies that explore some facet of the design, experience, or concept. They can use any materials or methods, including software, models, mechanism mockups, etc.
  • At least one physical study.

Because I’m still dizzy in the aftermath of Maker Faire, I don’t have the whole picture of what I want to do with automata yet. But I know I’m fascinated by chain reactions, parts triggering parts sequentially in strange and unexpected ways, and I’d love to try out whirligigs!

The idea of the project is “after finding out what your decision did to you and bearing all the consequences, you will be less mad at yourself after watching yourself being punished,” aka Salvation.

For the past two months, I felt like I was running along the edge of a cliff, holding all my stuff and running forward clumsily while the deadlines kept poking me from behind. Sometimes it made me want to kill myself (METAPHORICALLY). And this whirligig by artist Ben Thal somehow has a healing effect on me.

I love this kind of simple movement repeating again and again and again; it lets me stare at it tirelessly and helps me clear my mind, especially the self-destruction genre.

And my first design for this concept is Stab My Face.

stab my face

sketch

Stabbing My Face

It’s also a practice of #157 from Henry T. Brown’s amazing 507 Mechanical Movements. I chose it because it builds nice momentum to strike the stab, instead of moving at a constant speed all the time.


Ideally, it should be driven by a motor instead of my clumsy fingers, and everything should be properly designed and fabricated instead of taped on top of the shop table.

And my second design for this concept is Pull the Guts Out.

pull the guts out

It’s an automaton that keeps pulling the guts out. The guts only move when the arm moves downward.

Using the same gear setup, #157, the girl’s arm rotates up and down. Once the arm goes down, the stick connected to it pushes the toothed gear to rotate counterclockwise, which drags the soft-material guts counterclockwise too. The tricky part is that I’m not sure about the movement when the arm moves upward. Ideally the teeth would act like a ratchet, engaging on the downstroke and slipping on the upstroke, so the gear barely moves when the arm goes back up.

 

HHouse_Immersive Performance_Elevator Ticket

For the first week’s assignment – a simple immersive performance that puts a new slant on an everyday situation or interaction – Ben, Tom, and I present the Elevator Ticket!

And here’s the nicely put-together write-up by Tom:

In New York there isn’t much that’s more routine than taking an elevator ride. With the exception of freak accidents, elevators usually function the same way: press the button, wait, get in and awkwardly attempt not to look at other people, get off.

As routine as elevator rides may be, we imagined disrupting the routine of elevator usage.

In the style of old train tickets, we created an elevator ticket and instructed visitors to the NYU Tisch building that a ticket was required for using the elevator.

Using the reasoning of ‘safety’ and ‘efficiency’, we asked visitors for their destination floors and collected tickets prior to boarding the elevator. The ticket was not presented as an option but as a requirement for using the elevator.

The responses varied from mild amusement to annoyance and non-compliance. It was interesting that most elevator users accepted the reasoning of ‘efficiency’ or ‘safety’ at face value, even though we were not dressed in a way that might suggest authority.

We imagined that to enhance the performance we could dress in uniforms and have operators standing inside the elevator, asking passengers for tickets and pressing the floor buttons for them. During the ride the operator could announce imaginary floor news and wish passengers a merry imaginary-scenario day on arrival.

{Tune In, Tune Out}


For the final of Mash-up class, Jeff Ong and I teamed up and made “Tune In, Tune Out”. It’s a website that uses your computer’s on-board microphone, listens to the surrounding environmental sound, and translates it into a musical landscape.

An accompanying visualization appears on-screen, along with options to change the musical fundamentals (key, scale, mode) of the sounds. Built with the Web Audio API and Tone.js for audio, and Three.js for visuals.
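The core loop is easy to sketch. Below is a minimal, illustrative version in plain JavaScript (not our actual code, which uses Tone.js): it opens the microphone, measures the room’s loudness with an AnalyserNode, and quantizes that level onto a pentatonic scale. The scale array, note range, and timing are placeholder choices.

// Minimal sketch: microphone level --> note on a major pentatonic scale.
// Assumes a browser with Web Audio + getUserMedia; all tuning values are placeholders.
const ctx = new AudioContext();
const scale = [0, 2, 4, 7, 9]; // major pentatonic intervals, in semitones

function midiToFreq(m) { return 440 * Math.pow(2, (m - 69) / 12); }

navigator.mediaDevices.getUserMedia({ audio: true }).then((stream) => {
  const mic = ctx.createMediaStreamSource(stream);
  const analyser = ctx.createAnalyser();
  analyser.fftSize = 256;
  mic.connect(analyser); // analyse only; don't route the mic to the speakers

  const bins = new Uint8Array(analyser.frequencyBinCount);

  setInterval(() => {
    analyser.getByteFrequencyData(bins);
    const level = bins.reduce((a, b) => a + b, 0) / bins.length; // 0..255
    const step = scale[Math.floor((level / 255) * scale.length) % scale.length];

    // play a short note whose pitch follows the room's loudness
    const osc = ctx.createOscillator();
    const gain = ctx.createGain();
    osc.frequency.value = midiToFreq(60 + step); // around middle C
    gain.gain.setValueAtTime(0.2, ctx.currentTime);
    gain.gain.exponentialRampToValueAtTime(0.001, ctx.currentTime + 0.4);
    osc.connect(gain).connect(ctx.destination);
    osc.start();
    osc.stop(ctx.currentTime + 0.4);
  }, 500);
});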

We started from the idea of filtering out the “sound of the sound”. So much effort and energy goes into our attempts to “escape” our immediate sonic environment: conversations, distracting noises, and the surrounding chaos we try to drown out.

There’s no denying the need to focus, and there’s also no shortage of aural distractions, especially living in urban areas. “Tune In, Tune Out” offers an alternative to the music and tools we use to tune out the noise around us, channeling that very noise into a sonic landscape.

The_eight_musical_modes

An additional project inspiration and goal was to build some basic tools that make it easier to use music theory concepts in tandem with data representation. There are different types of key, mode, scale, and sound wave to choose from. Thanks to Aaron Arntz for the music theory fly-by!
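As a sketch of what those helpers could look like (the interval tables below are standard music theory; the function and object names are just my placeholders):

// Build the frequencies of a scale from a root MIDI note and a mode.
// The interval tables are standard; the names are illustrative placeholders.
const MODES = {
  ionian:  [0, 2, 4, 5, 7, 9, 11], // the major scale
  dorian:  [0, 2, 3, 5, 7, 9, 10],
  aeolian: [0, 2, 3, 5, 7, 8, 10], // the natural minor scale
};

function scaleFrequencies(rootMidi, mode) {
  return MODES[mode].map(
    (semitones) => 440 * Math.pow(2, (rootMidi + semitones - 69) / 12)
  );
}

// e.g. scaleFrequencies(60, "dorian") --> the seven frequencies of C dorian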

GitHub here.

{Rabbit_Hole}

{currently works with the Chrome and Firefox browsers}

For the composition assignment, the final of the Coding for Emotional Impact class, I wanted to create something with multiple layers that is self-explanatory. Inspired by Andy’s description of computer vision as a rabbit hole (and because I’ve been learning Three.js on my own recently), I wanted to make a game about the “Rabbit Hole”, and my biggest assumption is that everyone is sort of down the rabbit hole.

ps. It’s not really a fun game to play. I’m still not sure whether it should be fun to play or just an emotion-building road to nowhere…

– Title
Rabbit Hole
– Environment
environment     environment2
– Audience
Whoever is also down the rabbit hole, or wonders how it feels down there.
 
– Narrative arc
Rabbit Hole – a metaphor for the conceptual path which is thought to lead to the true nature of reality. Infinitesimally deep and complex; venturing too far down is probably not that great of an idea.

Literary Nonsense – has no system of logic, although it may imply the existence of an inscrutable one, just beyond our grasp.

And below are snapshots of what I’ve built so far. I made my own models in Maya and drew the textures in Photoshop. For a while it couldn’t be viewed online because of a web-related issue loading the music (SOLVED by hard-coding the URL of the music file path). But I still have no idea how to do the transition from scene to scene…
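Since each scene lives on its own HTML page, one simple approach (just a sketch of an idea, not something the project does yet) would be to fade to black and then navigate to the next page:

// Sketch: fade in a fullscreen black overlay, then load the next scene's page.
// The 1s duration and the example URL are placeholders.
function goToScene(url) {
  var overlay = document.createElement("div");
  overlay.style.cssText =
    "position:fixed;top:0;left:0;width:100%;height:100%;" +
    "background:#000;opacity:0;transition:opacity 1s;z-index:999";
  document.body.appendChild(overlay);
  void overlay.offsetWidth; // force a style flush so the transition actually runs
  overlay.style.opacity = "1";
  overlay.addEventListener("transitionend", function () {
    window.location.href = url; // e.g. "index_D.html"
  });
}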

SCENE_ZERO: http://www.rabbithole.link/

open

SCENE_ONE: http://www.rabbithole.link/index_D.html


SCENE_TWO: http://www.rabbithole.link/index_G.html


SCENE_THREE: http://www.rabbithole.link/index_S.html


SCENE_FOUR: http://www.rabbithole.link/index_M.html

maze

SCENE_FIVE: http://www.rabbithole.link/index_T.html

TV

SCENE_SIX: http://www.rabbithole.link/index_F.html

jump

SCENE_SEVEN: http://www.rabbithole.link/index_V.html

voice

SCENE_EIGHT: http://www.rabbithole.link/index_E.html

Elevator

( Three.js + web stuff ) == super deep rabbit hole.

Virtual Reality Tour of Met

For my internship during the Spring 2014 semester in the Media Lab of The Metropolitan Museum of Art, I hooked up

    1. 3D models of the Met from the Architecture Department
    2. the official Audio Guide
    3. 3D models of art pieces in the Greek and Roman galleries, made by 3D-scanning them from photos
    4. Unity as the game engine
    5. the Oculus Rift virtual reality head-mounted display as the controller

and created an immersive virtual reality tour of the Met!

With the Oculus Rift, users can wander around the museum, listen to the audio guide while admiring the art pieces, walk upstairs, watch butterflies, get blocked by a huge bowl, and go inside the surreal mash-up models (credits to Decho <horse> and Rui <uncolored triangulars>).

IDEA

With a background as a VFX artist in 3D animation and post-production, I’ve always been interested in 3D and how it can be made interactive in creative ways. Once I got the chance to intern in the Media Lab of the Met and learned we could access the 3D models of the museum, I wanted to use the Oculus Rift to walk inside a fantasy version of the Met and enjoy an immersive experience of the space.

 

PROJECT_DEVELOPMENT

Virtual Met Museum –> Fantasy Experiment –> Art piece + Audio Guide

 

BASIC_SETUP_HOW_TO

First of all, there’s tons of basic knowledge about Unity here, and how to set up a project from scratch here.

 

✓ Import BIM 3D models into Unity

Basically, just put the .fbx file into the Assets folder of the project you just created. It’s not too complicated, but there’s one thing you should be aware of: the SCALE. It’s good practice to set the scale correctly in the modeling application before importing the model into Unity; the associated details are described below:

  • 1 Unity unit = 1m
  • the fewer GameObjects the better. Also, use 1 material if you can
  • useful link: wiki unity3d

 

✓ Oculus Rift Plugin Setup in Unity 3D

Just follow the clear instructions on YouTube!

 

✓ Add collider to meshes

In order to prevent the player from walking through meshes (e.g. walls, stairs), we need to add a Collider component to the models. The steps are as below:

  • select the model
  • go to the Inspector
  • Add Component –> Physics –> Box Collider or Mesh Collider
  • a Mesh Collider fits the geometry more precisely than a Box Collider, but is also more expensive to use.

collider copy

 

✓ Occlusion Culling

Occlusion culling means that objects you aren’t looking at aren’t rendered, so the game will run faster.

  •  geometry must be broken into sensibly sized pieces.
    • if you have one object that contains all the furniture, either all or none of the entire set of furniture will be culled.
  • tag all scene objects that you want to be part of the occlusion as Occluder Static in the Inspector.
  • Bake!
  • useful link: unity3d manual

 

✓ Import 3D-Scanned Models from 123D Catch

  • Take about 20 photos around the object you want to 3D scan (a full 360 degrees!).
  • Upload the photos to 123D Catch.
  • Yeah, now you’ll have both an .obj model file and a texture file!
  • Just download the files and drag the whole folder into the Assets folder of Unity!

 

POSSIBILITIES

  • Provide access for people who can’t visit the museum in person.
  • Installation design simulation.

 

Thanks_to

It was a really good experience interning at the Media Lab of the Met. I know I want to keep working in 3D and also step into the virtual reality world with the Oculus Rift, and it was a great match that I could have this topic as my own project while also meeting the needs of the Met! From this internship I gained valuable resources from the museum and got to know amazing mentors and colleagues from the Lab. This project led me into the world of virtual reality, and I’m glad and thankful to be a Spring ’14 intern of the Media Lab of The Metropolitan Museum of Art.

{Walk & Talk}


Walk & Talk

is the final project by Ziv and me for Spatial Media.

 

Ideas

For the final of Spatial Media there were no restrictions on content or context, so, after our own struggles with brainstorming, we decided to make a project that helps people brainstorm! Taking inspiration from Land and Sea, a game Ziv knew from Israel, and from Scott Snibbe’s Boundary Functions, which explores the relation between people and spaces, we built a system in which people expand their territory by walking and shaking; once they stop moving, their territories shrink and eventually disappear. Based on research suggesting that body movements can influence problem solving (e.g. Science Daily, May 13, 2009), it has the potential to be installed in an office space to help employees brainstorm.

traditional_territory_small

 

Concept

Each person in the game is assigned an initial territory, which they can expand by walking. If they don’t walk, their territory gets smaller. Likewise, if they don’t talk (to brainstorm, or just to chit-chat), their territory shrinks. This way, people have to walk and talk in order to keep their territory.
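To make that rule concrete, here’s a toy sketch in JavaScript (the actual piece is written in openFrameworks; the growth and decay rates and the movement threshold are made-up placeholders):

// Toy model of the Walk & Talk rule: movement grows a territory, stillness shrinks it.
const GROW = 1.5;     // radius gained per update while moving
const DECAY = 0.5;    // radius lost per update while still
const MIN_MOVE = 3;   // centroid motion (in pixels) that counts as "walking"

function updateTerritory(t, centroidNow) {
  const dx = centroidNow.x - t.centroid.x;
  const dy = centroidNow.y - t.centroid.y;
  const moved = Math.hypot(dx, dy) > MIN_MOVE;
  t.radius = Math.max(0, t.radius + (moved ? GROW : -DECAY));
  t.centroid = centroidNow; // the territory follows the person
  return t;
}

The same decay branch could also listen for talking, shrinking the territory whenever the room goes quiet.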

 

Context

This game can be used for multiple purposes, e.g. DECISION MAKING • BRAINSTORMING • VOTING ON IDEAS • BUDGET PLANNING. Overall, the main function is provoking thoughts about sharing and space through movement.

 

Technique

process

Tools: openFrameworks, 1 Kinect, 2 projectors.

 

Developing

1) The first attempt: expanding the territories based on the blob movement captured by the Kinect.

2) Using ofPolyline to smooth the shape of the geometry (see the smoothing sketch after this list).

3) Beautiful mistakes ;)

4) Final version of color-filled geometry.
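For step 2, ofPolyline ships a getSmoothed() moving-average smoother; as an illustration of the same idea, here’s a JavaScript sketch of Chaikin corner cutting, a comparable smoothing technique (the pass count is a placeholder):

// Chaikin corner cutting: each pass replaces every segment with two points
// at 1/4 and 3/4 along it, rounding the polyline's corners.
function chaikin(points, passes) {
  let pts = points;
  for (let p = 0; p < passes; p++) {
    const out = [];
    for (let i = 0; i < pts.length - 1; i++) {
      const a = pts[i], b = pts[i + 1];
      out.push({ x: 0.75 * a.x + 0.25 * b.x, y: 0.75 * a.y + 0.25 * b.y });
      out.push({ x: 0.25 * a.x + 0.75 * b.x, y: 0.25 * a.y + 0.75 * b.y });
    }
    pts = out;
  }
  return pts;
}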

EmoCode final_{Rabbit Hole}

Rabbit Hole

  • “down the rabbit hole”, a metaphor for adventure into the unknown, from its use in Alice’s Adventures in Wonderland
  • a slang expression for a psychedelic experience, from the same usage
  • ARG, Alternate Reality Game (http://en.wikipedia.org/wiki/Alternate_reality_game#Unique_terminology)
  • TINAG, This Is Not A Game
  • Literary nonsense

    • these supernatural phenomena are not nonsensical if they have a discernible logic supporting their existence
    • has no system of logic, although it may imply the existence of an inscrutable one, just beyond our grasp.
  • Eating and devouring

    Carina Garland notes how the world is “expressed via representations of food and appetite”, naming Alice’s frequent desire for consumption (of both food and words), her ‘Curious Appetites’.[26] Often, the idea of eating coincides to make gruesome images. After the riddle “Why is a raven like a writing-desk?”, the Hatter claims that Alice might as well say, “I see what I eat…I eat what I see” and so the riddle’s solution, put forward by Boe Birns,[27] could be that “A raven eats worms; a writing desk is worm-eaten”; this idea of food encapsulates the idea of life feeding on life, for the worm is being eaten and then becomes the eater – a horrific image of mortality.

    Nina Auerbach discusses how the novel revolves around eating and drinking which “motivates much of her [Alice’s] behaviour”, for the story is essentially about things “entering and leaving her mouth”.[28] The animals of Wonderland are of particular interest, for Alice’s relation to them shifts constantly because, as Lovell-Smith states, Alice’s changes in size continually reposition her in the food chain, serving as a way to make her acutely aware of the ‘eat or be eaten’ attitude that permeates Wonderland.[29]

  •  How Doth the Little Crocodile
    • How doth the little crocodile
      Improve his shining tail,
      And pour the waters of the Nile
      On every golden scale!
      How cheerfully he seems to grin,
      How neatly spreads his claws,
      And welcomes little fishes in
      With gently smiling jaws!
  • The effect of nonsense is often caused by an excess of meaning, rather than a lack of it.

Awareness

(3/25_Updated_footage version)

It’s a project of material experiments and mycelium network simulation. The ultimate goal is to pull humans’ relationship with fungus closer, increase awareness, and explore uses of mycelium by holding workshops and gathering public input.

 

 

material experiment

In 2007, Eben Bayer and Gavin McIntyre noticed mycelium’s self-assembling, glue-like character. By growing mycelium with agricultural byproducts, they discovered a biological, durable, and compostable material that performs, and they founded a company called Ecovative. Their products are pressed during production, and are thick, chunky, and volumetric. Inspired by artist Eric Klarenbeek‘s 3D-printed chair with straw, I guessed that as long as I followed the principle that “mycelium digests nutrients and water and grows harder”, the production process could be free-form and without boundaries. So I gave it a try.

diagram

For the blender part, the ratio of mycelium + straw to water is approximately 2:1.

Hang the balls in a separate area to avoid contamination. After 3–5 days the balls will turn visibly white, showing the mycelium growing.

After 10 days, harvest the balls, pop the balloons, and voilà!

Put them aside and let their interiors dry for a day (they had been sealed off by the balloons).


 

 

mycelium network

I’m also interested in how mycelium communicates. The roots of most land plants are colonised by mycorrhizal fungi that provide mineral nutrients in exchange for carbon, and according to “Underground signals carried through common mycelial networks warn neighbouring plants of aphid attack” in Ecology Letters, by Zdenka Babikova, Lucy Gilbert, Toby J. A. Bruce, Michael Birkett, John C. Caulfield, Christine Woodcock, John A. Pickett, and David Johnson, mycorrhizal mycelia can also act as a conduit for signalling between plants, serving as an early warning system for herbivore attack.

Screen Shot 2014-03-19 at 10.39.44 AM

The experiment is based on the fact that Vicia faba emits plant volatiles, particularly methyl salicylate, which make the bean plants repellent to aphids but attractive to aphid enemies such as parasitoids. It sets up five Vicia faba plants, has only one of them attacked by aphids, and connects that plant to the others either with or without root contact, and with or without mycelium (as in the picture on the right above). The result (picture on the left above) shows that the plants connected to the donor (infested with aphids) by mycelium act the same as the donor, producing volatiles that repel aphids and attract the aphids’ enemies. This underground messaging system allows neighbouring plants to invoke herbivore defenses before being attacked.

This interests me a lot, and I want to use it as the content to inform people about this amazing behavior of fungus by visualizing the network of mycelium. The idea is –>

  1. when there’s no one around, each mycelium bulb breathes in its own pattern, presented with LEDs, and a video plays footage of fungus & mycelium life.
  2. once someone comes near, the mycelium bulbs communicate with each other, lighting up and off one by one, and the video switches to human-related footage (e.g. garbage, oil spills, and mycoremediation).


footage Breathing, password: fungus

footage Aware, password: fungus

 

 

And here is my Arduino code. I used analogWrite on the PWM pins.

//#include <LED.h>
#include <NewPing.h>

#define TRIGGER_PIN 8
#define ECHO_PIN 7
#define MAX_DISTANCE 30

//for ultrasonic sensor
NewPing sonar(TRIGGER_PIN, ECHO_PIN, MAX_DISTANCE);
int value;
int interval;  //to trigger the change of LEDs

//for smoothing the sensor readings (declared but not used in this version)
const int numReadings = 5;
int readings[numReadings];
int oriReading;
int index = 0;
int total = 0;
int average = 0;

//pin
int ledPins[] = { 
  3,5,6,9,10,11 };

int lastFade[6] = {
  0};
int level[] = {
  10, 23, 45, 50, 100, 205};

//output
int maxV = 220;
int minV = 5;

//slope & intercept for the linear fade-in/fade-out ramps (level = a*t + b)
double ain[6], bin[6], aex[6], bex[6];

//time
double inTime[] = {
  1500, 1700, 1900, 2000, 2100, 2300};
double pauseTime[] = {
  350, 400, 450, 500, 550, 600};
double outTime[] = {
  2000, 2200, 2400, 2500, 2600, 2800};
double thirdT[6], cycleT[6];
double levels[6];

boolean lightUp[6];
int awareTime[] = {
  0, 1, 2, 3, 4, 5};
int awareOriTime[] = {
  0, 1, 2, 3, 4, 5};

void setup() {
  Serial.begin(9600);

  //smoothing
  for(int i=0; i<numReadings; i++){
    readings[i] = 0;
  }

  for(int i=0; i<6; i++) {
    pinMode(ledPins[i], OUTPUT);

    thirdT[i] = inTime[i] + pauseTime[i];
    cycleT[i] = inTime[i] + pauseTime[i] + outTime[i];

    ain[i] = (maxV - minV)/inTime[i];
    bin[i] = minV;
    aex[i] = (minV - maxV)/outTime[i];
    bex[i] = maxV - aex[i]*(inTime[i]+pauseTime[i]);

    lightUp[i] = false;
  }  
}

unsigned long tstart[6];
double time;

void loop() {

  //ultrasonic sensor
  oriReading = sonar.ping();
  value = (int) oriReading/US_ROUNDTRIP_CM;  //distance in cm (0 = no echo in range)
  Serial.write(value);  //send the distance to Processing, which switches footage with it

  for(int thisChannel=0; thisChannel<6; thisChannel++) {

    //if detect ppl, all light up
    if(value > 0) {

      //toggle this LED every 6 passes (the staggered initial values offset the channels)
      if ( (awareTime[thisChannel])%6 == 0 ) {
        lightUp[thisChannel] = !lightUp[thisChannel];
      }

      if(lightUp[thisChannel] == true)
        levels[thisChannel] = 255;
      else
        levels[thisChannel] = 0;

      analogWrite(ledPins[thisChannel], levels[thisChannel]);

      //determine whether to restart the cycle of time
      awareTime[thisChannel] += 1;

      if( awareTime[thisChannel] >= (180) )
        awareTime[thisChannel] = awareOriTime[thisChannel];  //restore this channel's offset
    } 

    //if not, do LED pattern
    else {

      if (lastFade[thisChannel] <= inTime[thisChannel]) {
        levels[thisChannel] = int( ain[thisChannel]*lastFade[thisChannel] + bin[thisChannel] );
      }
      else if (lastFade[thisChannel] <= thirdT[thisChannel]) {
        levels[thisChannel] = maxV;
      }
      else {
        levels[thisChannel] = int( aex[thisChannel]*lastFade[thisChannel] + bex[thisChannel] );
      }

      analogWrite(ledPins[thisChannel], levels[thisChannel]);
      delay(1);

      //determine whether to restart the cycle of time
      if(lastFade[thisChannel] >= cycleT[thisChannel]) {
        lastFade[thisChannel] = 0;
        tstart[thisChannel] = millis();
      }
      else {
        lastFade[thisChannel] = millis() - tstart[thisChannel];
      }
    }
  }
}

 

And here is my Processing code, which switches footage based on the serial signal received from the Arduino.

import processing.serial.*;
import processing.video.*;
import java.awt.MouseInfo;
import java.util.Arrays;
import java.util.Collections;
import java.awt.Rectangle;

Movie aware;
Movie grow;
boolean playGrow = true;

Serial myPort;

void setup() {
  size(displayWidth, displayHeight);
  if (frame != null) {
    frame.setResizable(true);
  }
  background(0);
  // Load and play the video in a loop
  aware = new Movie(this, "aware_2.mp4");
  grow = new Movie(this, "grow_v2.mp4");
  aware.loop();
  grow.loop();

//  println(Serial.list());  //uncomment to list the serial ports
  String portName = Serial.list()[5];  //the Arduino's index varies per machine
  myPort = new Serial(this, portName, 9600);
}

void movieEvent(Movie m) {
  m.read();
}

void draw() {
  if(playGrow)
    image(grow, 0, 0, width, height);
  else
    image(aware, 0, 0, width, height);
}

void serialEvent (Serial myPort) {
  int inByte = myPort.read();
  println(inByte);

  if (inByte > 10)
    playGrow = false;
  else
    playGrow = true;

}

void keyPressed() {
  if(key == '1')
    playGrow = true;
  if(key == '2')
    playGrow = false;
}

int mX;
int mY;

boolean sketchFullScreen() {
  return true;
}

void mousePressed() {
  //remember where inside the frame the drag started
  mX = mouseX;
  mY = mouseY;
}

void mouseDragged() {
  frame.setLocation(
  MouseInfo.getPointerInfo().getLocation().x-mX, 
  MouseInfo.getPointerInfo().getLocation().y-mY);
}

public void init() {
  frame.removeNotify();
  frame.setUndecorated(true);
  frame.addNotify();
  super.init();
}

 

 

photos of fabrication

For further development, I’m thinking about maybe cooperating with Kate‘s “mushroom craft” and holding some craft workshops! Through the whole process of making these mycelium light bulbs, I went through fabrication work I had never tried before, and it felt great! I think direct “hand” touch is the most effective way to pull the relationship between people and materials closer.

By starting the production process from searching for and growing the material, we can better appreciate the resources we take from nature and be more aware of environmental issues. Not just sitting there receiving the news from TV, but actually caring and being aware of it, because you feel it affecting the fabrication process directly. The maker/crafter spirit is one of the answers for the future.

Timing and Pacing

For this week’s subject, Timing and Pacing, I chose “No Safe-House” from the soundtrack of The Grand Budapest Hotel to decode.

timing

 

effect I intend to achieve

–> emotion accumulation, cheerful and narrative.

notes

  • library I used for the 3D camera –> http://mrfeinberg.com/peasycam/#about
  • using PShape to store the tetrahedra I made, and setting their movements with trigonometry functions, noise, HSL color, and hard-coded frameCount!!! (see how long and tedious my code is :P)
  • issue to work on: since I used frameCount, the timing differs depending on how fast the computer runs; I need to use millis() next time! (see the sketch below)
  • next step: use the Minim library to generate the patterns directly from analysis of the sound file.
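The frameCount issue is easy to sketch. Here’s the fix in JavaScript (the same logic maps directly onto Processing’s millis()); drawTetrahedron and the rotation rate are placeholders:

// Drive the animation from elapsed time, not frame count, so the pacing
// stays locked to the soundtrack at any frame rate.
let start = null;

function animate(now) {
  if (start === null) start = now;
  const t = (now - start) / 1000; // seconds since the animation began
  const angle = t * Math.PI / 2;  // a quarter turn per second on any machine
  drawTetrahedron(angle);         // placeholder for the actual drawing code
  requestAnimationFrame(animate);
}
requestAnimationFrame(animate);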

codes 
