{Tune In, Tune Out}


For the final of the Mash-up class, Jeff Ong and I teamed up and made “Tune In, Tune Out”. It’s a website that uses your computer’s on-board microphone, listens to the surrounding environmental sound, and translates it into a musical landscape.

An accompanying visualization appears on-screen, along with options to change the musical fundamentals (key, scale, mode) of the sounds. Built with the Web Audio API and Tone.js for audio, and Three.js for visuals.
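
The core loop is simple: open the mic, meter the room’s loudness, and map loud moments to notes of the current scale. A minimal sketch of that loop against today’s Tone.js API (the scale, threshold, and loudness-to-note mapping here are my placeholder assumptions, not the project’s actual values):

import * as Tone from "tone";

const scale = ["C4", "D4", "E4", "G4", "A4"];   // placeholder: C major pentatonic

async function start() {
  await Tone.start();                           // audio must begin from a user gesture
  const mic = new Tone.UserMedia();
  const meter = new Tone.Meter();
  const synth = new Tone.Synth().toDestination();
  await mic.open();                             // ask the browser for microphone access
  mic.connect(meter);

  setInterval(() => {
    const db = meter.getValue() as number;      // current room loudness in dB
    if (db > -40) {                             // placeholder threshold
      // louder environment -> higher scale degree
      const idx = Math.min(scale.length - 1, Math.floor((db + 40) / 8));
      synth.triggerAttackRelease(scale[idx], "8n");
    }
  }, 250);
}

document.addEventListener("click", start, { once: true });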

We started with the idea of filtering out the “sound of the sound”. So much effort and energy goes into our attempts to “escape” our immediate sonic environment: conversations, distracting noises, all the ways we try to drown out the surrounding chaos.

There’s no denying the need to focus, and there’s also no shortage of aural distractions, especially in urban areas. “Tune In, Tune Out” offers an alternative to the music and tools we use to tune out the noise around us, channeling that very noise into a sonic landscape.

[image: the eight musical modes]

An additional inspiration and goal for the project is to build some basic tools that make it easier to use music theory concepts in tandem with data representation. There are different kinds of key, mode, scale, and sound wave to choose from. Thanks to Aaron Arntz for the music theory fly-by!
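
As a sketch of what such a tool might look like (a hypothetical helper, not the project’s actual code): build the scale from a root note and the chosen mode’s interval pattern, then quantize incoming data onto it.

// interval patterns (in semitones) for a few of the modes
const MODES: Record<string, number[]> = {
  ionian:  [0, 2, 4, 5, 7, 9, 11],  // major
  dorian:  [0, 2, 3, 5, 7, 9, 10],
  aeolian: [0, 2, 3, 5, 7, 8, 10],  // natural minor
};

// all notes of a key/mode across a few octaves, as MIDI numbers
function scaleNotes(rootMidi: number, mode: string, octaves = 2): number[] {
  const notes: number[] = [];
  for (let o = 0; o < octaves; o++)
    for (const step of MODES[mode]) notes.push(rootMidi + 12 * o + step);
  return notes;
}

// snap a normalized data value in [0, 1] to a note of the scale
function quantize(v: number, notes: number[]): number {
  return notes[Math.min(notes.length - 1, Math.floor(v * notes.length))];
}

// e.g. quantize(0.7, scaleNotes(60, "dorian")) -> a MIDI note in C dorian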

GitHub here.

{Rabbit_Hole}

{currently works in Chrome and Firefox}

For the composition assignment, the final of the Coding for Emotional Impact class, I wanted to create something with multiple layers that is self-explanatory. Inspired by Andy’s description of computer vision as a rabbit hole (and because I’ve been teaching myself Three.js recently), I wanted to make a game about the “Rabbit Hole”, and my biggest assumption is that everyone is, in some way, down a rabbit hole.

P.S. It’s not really a fun game to play. I’m still torn on whether it should be fun to play or just an emotion-building road to nowhere…

– Title
Rabbit Hole
– Environment
[environment images]
– Audience
Anyone who is also down the rabbit hole, or wonders how it feels down there.
 
– Narrative arc
Rabbit Hole – a metaphor for the conceptual path that is thought to lead to the true nature of reality. Infinitesimally deep and complex; venturing too far down is probably not that great of an idea.

Literary Nonsense – has no system of logic, although it may imply the existence of an inscrutable one, just beyond our grasp.

And below are snapshots of what I’ve built so far. I made my own models in Maya and drew the textures in Photoshop. For a while it couldn’t be viewed online because of a web-related issue loading the music (SOLVED by hard-coding the URL of the music file path). But I still have no idea how to do the transition from scene to scene…
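
(One simple way to handle it, since each scene lives on its own page: fade a fullscreen overlay to black, then navigate. A minimal sketch in TypeScript, with hypothetical names, not the project’s code:)

function goToScene(url: string, fadeMs = 800) {
  const overlay = document.createElement("div");
  overlay.style.cssText =
    "position:fixed;top:0;left:0;width:100%;height:100%;" +
    `background:#000;opacity:0;transition:opacity ${fadeMs}ms;z-index:9999;`;
  document.body.appendChild(overlay);
  overlay.getBoundingClientRect();   // force layout so the CSS transition actually runs
  overlay.style.opacity = "1";
  setTimeout(() => { window.location.href = url; }, fadeMs);
}

// e.g. goToScene("index_D.html") when SCENE_ZERO ends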

SCENE_ZERO: http://www.rabbithole.link/

open

SCENE_ONE: http://www.rabbithole.link/index_D.html


SCENE_TWO: http://www.rabbithole.link/index_G.html


SCENE_THREE: http://www.rabbithole.link/index_S.html


SCENE_FOUR: http://www.rabbithole.link/index_M.html

maze

SCENE_FIVE: http://www.rabbithole.link/index_T.html

TV

SCENE_SIX: http://www.rabbithole.link/index_F.html

jump

SCENE_SEVEN: http://www.rabbithole.link/index_V.html

voice

SCENE_EIGHT: http://www.rabbithole.link/index_E.html

Elevator

( Three.js + web stuff ) == super deep rabbit hole.

Virtual Reality Tour of the Met

For my internship during the Spring 2014 semester at the Media Lab of The Metropolitan Museum of Art, I hooked up

    1. 3D models of the Met from the Architecture Department
    2. the official Audio Guide
    3. 3D models of art pieces in the Greek and Roman galleries, made by 3D-scanning photos
    4. Unity as the game engine
    5. the Oculus Rift virtual-reality head-mounted display as the controller

and created an immersive virtual reality tour of the Met!

With the Oculus Rift, users can wander around the museum, listen to the audio guide while admiring art pieces, walk upstairs, watch butterflies, get blocked by a huge bowl, and step inside the surreal mash-up models (credits to Decho <horse> and Rui <uncolored triangles>).

IDEA

With a background as a VFX artist in 3D animation and post-production, I’ve always been interested in 3D and how it can be made interactive in creative ways. Once I got the chance to intern at the Media Lab of the Met and learned we could access the 3D models of the museum, I wanted to use the Oculus Rift to walk inside a fantasy version of the Met and enjoy an immersive experience of the space.

 

PROJECT_DEVELOPMENT

Virtual Met Museum –> Fantasy Experiment –> Art piece + Audio Guide

 

BASIC_SETUP_HOW_TO

First of all, there’s tons of basic knowledge about Unity here. And how to set up a project from scratch, here.

 

✓ Import BIM 3D models into Unity

Basically, just put the .fbx file into the Assets folder of the project you just created. It’s not too complicated, but there’s one thing you should be aware of: the SCALE. It’s good practice to set the scale correctly in the modeling application before importing the model into Unity; the associated details are below:

  • 1 Unity unit = 1m
  • the fewer GameObjects the better. Also, use 1 material if you can
  • useful link: wiki unity3d

 

✓ Oculus Rift Plugin Setup in Unity 3D

Just follow the clear instructions on YouTube!

 

✓ Add collider to meshes

To prevent the player from walking through meshes (e.g. walls, stairs), we need to add a Collider component to the models. Steps below:

  • select model
  • @inspector
  • Add Component –> Physics –> Box Collider or Mesh Collider
  • A Mesh Collider fits the mesh more precisely than a Box Collider, but is also more expensive to use.


 

✓ Occlusion Culling

Means that objects hidden from the camera’s view aren’t rendered, so the game will run faster.

  •  geometry must be broken into sensibly sized pieces.
    • if you have one object that contains all the furniture, either all or none of the entire set of furniture will be culled.
  • mark all scene objects that you want to take part in occlusion as Occluder Static in the Inspector.
  • Bake!
  • useful link: unity3d manual

 

✓ Import 3D-Scanned Models from 123D Catch

  • Take about 20 photos around the object you want to 3D-scan (a full 360 degrees!).
  • Upload the photos to 123D Catch.
  • Yeah, now you’ll have both the .obj model file and the texture file!
  • Just download them, and drag the whole folder into the Assets folder of Unity!

 

POSSIBILITIES

  • Give access to people who can’t visit the museum in person.
  • Installation design simulation.

 

Thanks_to

It was a really good experience interning at the Media Lab of the Met. I knew I wanted to keep working in 3D and also step into the virtual reality world with the Oculus Rift, and it was a great match that I could have this topic as my own project while also meeting the needs of the Met! From this internship I gained valuable resources from the museum and got to know amazing mentors and colleagues from the Lab. This project led me into the world of virtual reality, and I’m glad and thankful to have been a Spring ’14 intern of the Media Lab of The Metropolitan Museum of Art.

{Walk & Talk}


Walk & Talk

is the final project by Ziv and me for Spatial Media.

 

Ideas

For the final of Spatial Media there was no restriction on content or context, so, after our own struggle with the brainstorming process, we decided to make a project that helps with brainstorming! Drawing inspiration from Land and Sea, a game Ziv knew from Israel, and Scott Snibbe’s Boundary Functions, which explores the relation between people and spaces, we built a system where people expand their territory by walking and shaking; once they stop moving, their territories shrink and eventually disappear. Based on research showing that body movements can influence problem solving (e.g. Science Daily, May 13, 2009), it has the potential to be installed in office spaces to help employees brainstorm.


 

Concept

Each person in the game is assigned an initial territory, which they can expand by walking. If they don’t walk, their territory gets smaller; likewise, if they don’t talk (to brainstorm, or just to chit-chat), their territory shrinks. People have to walk and talk to keep their territory, as the rule sketched below shows.
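
A minimal sketch of that rule (my reconstruction in TypeScript; the installation itself ran on openFrameworks with Kinect blob tracking, and all the constants here are made up):

interface Player {
  radius: number;    // current territory radius
  moving: boolean;   // from Kinect blob tracking
  talking: boolean;  // from microphone level
}

const GROW = 1.5;    // growth per frame while walking and talking
const DECAY = 0.4;   // shrinkage per frame otherwise
const MAX_R = 300;   // cap so one player can't take the whole floor

function updateTerritory(p: Player) {
  if (p.moving && p.talking) {
    p.radius = Math.min(MAX_R, p.radius + GROW);
  } else {
    p.radius = Math.max(0, p.radius - DECAY);  // stop walking and talking and it fades away
  }
}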

 

Context

This game can be used for multiple purposes, e.g. DECISION MAKING • BRAINSTORMING • VOTING ON IDEAS • BUDGET PLANNING. Overall, its main function is to provoke thoughts about sharing and space through movement.

 

Technique

[process diagram]

Tools: openFrameworks, one Kinect, two projectors.

 

Developing

1) The first attempt at expanding the territories based on the blob movement captured by the Kinect.

2) Using ofPolyline to smooth the shapes of the geometry.

3) Beautiful mistakes ;)

4) The final version, with color-filled geometry.

EmoCode final_{Rabbit Hole}

Rabbit Hole

  • “down the rabbit hole”, a metaphor for adventure into the unknown, from its use in Alice’s Adventures in Wonderland
  • a slang expression for a psychedelic experience, from the same usage
  • ARG, Alternate Reality Game (http://en.wikipedia.org/wiki/Alternate_reality_game#Unique_terminology)
  • TINAG, “This Is Not A Game”
  • Literary nonsense

    • these supernatural phenomena are not nonsensical if they have a discernible logic supporting their existence
    • has no system of logic, although it may imply the existence of an inscrutable one, just beyond our grasp.
  • Eating and devouring

    Carina Garland notes how the world is “expressed via representations of food and appetite”, naming Alice’s frequent desire for consumption (of both food and words), her ‘Curious Appetites’.[26] Often, the idea of eating coincides to make gruesome images. After the riddle “Why is a raven like a writing-desk?”, the Hatter claims that Alice might as well say, “I see what I eat…I eat what I see” and so the riddle’s solution, put forward by Boe Birns,[27] could be that “A raven eats worms; a writing desk is worm-eaten”; this idea of food encapsulates the idea of life feeding on life, for the worm is being eaten and then becomes the eater – a horrific image of mortality.

    Nina Auerbach discusses how the novel revolves around eating and drinking which “motivates much of her [Alice’s] behaviour”, for the story is essentially about things “entering and leaving her mouth”[28] The animals of Wonderland are of particular interest, for Alice’s relation to them shifts constantly because, as Lovell-Smith states, Alice’s changes in size continually reposition her in the food chain, serving as a way to make her acutely aware of the ‘eat or be eaten’ attitude that permeates Wonderland.[29]

  •  How Doth the Little Crocodile
    • How doth the little crocodile
      Improve his shining tail,
      And pour the waters of the Nile
      On every golden scale!
      How cheerfully he seems to grin,
      How neatly spreads his claws,
      And welcomes little fishes in
      With gently smiling jaws!
  • The effect of nonsense is often caused by an excess of meaning, rather than a lack of it.

Awareness

(3/25_Updated_footage version)

It’s a project of material experimentation and mycelium-network simulation. The ultimate goal is to bring humans’ relationship with fungus closer, increase awareness, and explore uses of mycelium by holding workshops and gathering ideas from the public.

 

 

material experiment

In 2007, Eben Bayer and Gavin McIntyre noticed mycelium’s self-assembling, glue-like character. By growing mycelium with agricultural byproducts, they discovered a biological, durable, and compostable material that performs, and they founded a company called Ecovative. Their products are compressed during production, and are thick, chunky, and volumetric. Inspired by artist Eric Klarenbeek‘s 3D-printed piece with straw, I guessed that as long as I followed the principle that “mycelium digests nutrients and water and grows harder”, the production process could be free-form and without boundaries. So I gave it a try.

[diagram: production process]

For the blender step, the ratio of mycelium + straw to water is approximately 2:1.

Hang the balls in a separate area to avoid contamination. After 3–5 days, the balls will turn visibly white, showing the mycelium growing.

After 10 days, harvest the balls and pop the balloons, and voila!

Set them aside for a day to let their interiors dry (they were blocked by the balloons).


 

 

mycelium network

I’m also interested in how mycelium communicates. The roots of most land plants are colonised by mycorrhizal fungi that provide mineral nutrients in exchange for carbon, and according to “Underground signals carried through common mycelial networks warn neighbouring plants of aphid attack” (Ecology Letters), by Zdenka Babikova, Lucy Gilbert, Toby J. A. Bruce, Michael Birkett, John C. Caulfield, Christine Woodcock, John A. Pickett and David Johnson, mycorrhizal mycelia can also act as a conduit for signalling between plants, acting as an early warning system for herbivore attack.

[figure: experiment results (left) and setup (right)]

The experiment relies on the fact that Vicia faba emits plant volatiles, particularly methyl salicylate, which make the bean plants repellent to aphids but attractive to aphid enemies such as parasitoids. It sets up five Vicia faba plants, has only one of them (the Donor) attacked by aphids, and connects the Donor to the other plants either with or without root contact, and with or without mycelium (as in the picture on the right above). The result (picture on the left above) shows that the plants connected to the Donor (infested with aphids) by mycelium act the same as the Donor, producing volatiles that repel aphids and attract the aphids’ enemies. This underground messaging system allows neighboring plants to invoke herbivore defenses before they are attacked.

This interests me a lot, and I want to use it as the content to inform people about the amazing behavior of fungus by visualizing the mycelium network. The idea is –>

  1. When there’s no one around, each mycelium bulb “breathes” in its own pattern, presented with LEDs, while a video plays footage of fungus & mycelium life.
  2. Once someone comes near, the mycelium bulbs communicate with each other, lighting up and turning off one by one, and the video switches to human-related footage (e.g. garbage, oil spills, and mycoremediation).


footage Breathing, password: fungus

footage Aware, password: fungus

 

 

And here is my Arduino code. I drive the LEDs with analogWrite on the PWM pins (3, 5, 6, 9, 10, 11).

//#include <LED.h>
#include <NewPing.h>

#define TRIGGER_PIN 8
#define ECHO_PIN 7
#define MAX_DISTANCE 30

//for ultrasonic sensor
NewPing sonar(TRIGGER_PIN, ECHO_PIN, MAX_DISTANCE);
int value;
int interval;  //to trigger the change of LEDs

//for smoothing
const int numReadings = 5;
int readings[numReadings];
int oriReading;
int index = 0;
int total = 0;
int average = 0;

//pin
int ledPins[] = { 3, 5, 6, 9, 10, 11 };

int lastFade[6] = { 0 };
int level[] = { 10, 23, 45, 50, 100, 205 };

//output
int maxV = 220;
int minV = 5;

//slope & intercept
double ain[6], bin[6], aex[6], bex[6];

//time
double inTime[] = { 1500, 1700, 1900, 2000, 2100, 2300 };
double pauseTime[] = { 350, 400, 450, 500, 550, 600 };
double outTime[] = { 2000, 2200, 2400, 2500, 2600, 2800 };
double thirdT[6], cycleT[6];
double levels[6];

boolean lightUp[6];
int awareTime[] = { 0, 1, 2, 3, 4, 5 };
int awareOriTime[] = { 0, 1, 2, 3, 4, 5 };

void setup() {
  Serial.begin(9600);

  //smoothing
  for(int i=0; i<numReadings; i++){
    readings[i] = 0;
  }

  for(int i=0; i<6; i++) {
    pinMode(ledPins[i], OUTPUT);

    thirdT[i] = inTime[i] + pauseTime[i];
    cycleT[i] = inTime[i] + pauseTime[i] + outTime[i];

    ain[i] = (maxV - minV)/inTime[i];
    bin[i] = minV;
    aex[i] = (minV - maxV)/outTime[i];
    bex[i] = maxV - aex[i]*(inTime[i]+pauseTime[i]);

    lightUp[i] = false;
  }  
}

unsigned long tstart[6];
double time;

void loop() {

  //ultrasonic sensor
  oriReading = sonar.ping();              // echo time in microseconds
  value = oriReading / US_ROUNDTRIP_CM;   // convert ping time to centimeters

  for(int thisChannel=0; thisChannel<6; thisChannel++) {

    //if detect ppl, all light up
    if(value > 0) {

      //toggle the blink state every 6 loop passes
      if ( (awareTime[thisChannel])%6 == 0 ) {
        lightUp[thisChannel] = !lightUp[thisChannel];
      }

      if(lightUp[thisChannel] == true)
        levels[thisChannel] = 255;
      else
        levels[thisChannel] = 0;

      analogWrite(ledPins[thisChannel], levels[thisChannel]);

      //determine whether to restart the time cycle
      awareTime[thisChannel] += 1;

      if( awareTime[thisChannel] >= 180 )
        awareTime[thisChannel] = awareOriTime[thisChannel];  //reset to this channel's original phase offset
    } 

    //if not, do LED pattern
    else {

      if (lastFade[thisChannel] <= inTime[thisChannel]) {
        levels[thisChannel] = int( ain[thisChannel]*lastFade[thisChannel] + bin[thisChannel] );
      }
      else if (lastFade[thisChannel] <= thirdT[thisChannel]) {
        levels[thisChannel] = maxV;
      }
      else {
        levels[thisChannel] = int( aex[thisChannel]*lastFade[thisChannel] + bex[thisChannel] );
      }

      analogWrite(ledPins[thisChannel], levels[thisChannel]);
      delay(1);

      //determine whether to restart the time cycle
      if(lastFade[thisChannel] >= cycleT[thisChannel]) {
        lastFade[thisChannel] = 0;
        tstart[thisChannel] = millis();
      }
      else {
        lastFade[thisChannel] = millis() - tstart[thisChannel];
      }
    }
  }
}

 

And my Processing code, which switches footage based on the serial signal received from the Arduino.

import processing.serial.*;
import processing.video.*;
import java.awt.MouseInfo;

Movie aware;
Movie grow;
boolean playGrow = true;

Serial myPort;

void setup() {
  size(displayWidth, displayHeight);
  if (frame != null) {
    frame.setResizable(true);
  }
  background(0);
  // Load and play the video in a loop
  aware = new Movie(this, "aware_2.mp4");
  grow = new Movie(this, "grow_v2.mp4");
  aware.loop();
  grow.loop();

//  println(Serial.list());               // uncomment to find your port index
  String portName = Serial.list()[5];     // index of the Arduino's port on my machine
  myPort = new Serial(this, portName, 9600);
}

void movieEvent(Movie m) {
  m.read();
}

void draw() {
  if(playGrow)
    image(grow, 0, 0, width, height);
  else
    image(aware, 0, 0, width, height);
}

void serialEvent (Serial myPort) {
  int inByte = myPort.read();
  println(inByte);

  if (inByte > 10)
    playGrow = false;
  else
    playGrow = true;

}

void keyPressed() {
  if(key == '1')
    playGrow = true;
  if(key == '2')
    playGrow = false;
}

int mX;
int mY;

// record the click position inside the window so dragging moves the frame smoothly
void mousePressed() {
  mX = mouseX;
  mY = mouseY;
}

boolean sketchFullScreen() {
  return true;
}

void mouseDragged() {
  frame.setLocation(
  MouseInfo.getPointerInfo().getLocation().x-mX, 
  MouseInfo.getPointerInfo().getLocation().y-mY);
}

public void init() {
  frame.removeNotify();
  frame.setUndecorated(true);
  frame.addNotify();
  super.init();
}

 

 

photos of fabrication


For further development, I’m thinking about maybe cooperating with Kate‘s “mushroom craft” and holding some craft workshops! Through the whole process of making those mycelium light bulbs, I went through fabrication work I’d never tried before, and it felt great! I think direct “hand” touch is the most effective way to bring people and materials closer together.

By starting production from sourcing and growing the material, we can better appreciate the resources we take from nature and become more aware of environmental issues. Not just sitting there receiving news from the TV, but actually caring and being aware, because you feel those issues affecting the fabrication process directly. The maker/crafter spirit is one of the answers for the future.

Timing and Pacing

For this week’s subject, Timing and Pacing, I chose “No Safe-House” from the soundtrack of The Grand Budapest Hotel to decode.

[timing diagram]

 

effect I intend to achieve

–> emotional accumulation; cheerful and narrative.

notes

  • library I used for the camera in 3D –> http://mrfeinberg.com/peasycam/#about
  • using PShape to store the tetrahedra I made, and setting their movements with trigonometry functions, noise, HSL color, and hard-coded frameCount values!!! (see how long and tedious my code is :P)
  • issue to work on: since I used frameCount, the timing is different every run, depending on how fast the computer runs. Need to use millis() next time (see the sketch after this list)!
  • the next step will be using the Minim library to generate the patterns directly from analysis of the sound file.
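
Processing’s millis() returns the elapsed milliseconds since the sketch started, so motion driven by it runs at the same speed on every machine. A sketch of the same idea in browser terms (TypeScript; the timestamp requestAnimationFrame hands you plays the role of millis(), and the numbers here are made up):

let start: number | undefined;

function animate(now: number) {
  if (start === undefined) start = now;
  const t = (now - start) / 1000;   // seconds elapsed, machine-independent
  const angle = t * Math.PI;        // half a rotation per second on any machine
  // ...draw with `angle` instead of frameCount * someFactor...
  requestAnimationFrame(animate);
}
requestAnimationFrame(animate);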

codes 


Roofwing


A flying + walking surreal experience designed for the rooftop. Work done with the amazing John.

It all started from an assignment in the Spatial Media class, with specific requirements as below:

+ be site specific (i.e. be designed for an actual location)
+ be designed for use by more than one person
+ involve a horizontal surface

START FROM SPACE

Because we wanted the experience to be strongly connected with the space, we developed the idea starting from the space itself. We chose the rooftop for its exciting yet rusty character, and we thought it has the potential to be a relaxing playground for citizens, especially in cities with limited space.

CONCEPT

+ Utilize the height and windiness of the space directly
+ Cooperate with your teammates to maximize and exaggerate the excitement


SWINGS PRODUCTION (making a miniature!)

+ wood
+ necklace chains & rings
+ thin hemp rope
+ webcam
+ monitor
+ white board

TECHNIQUES

+ openFrameworks
+ videoGrabber
+ videoPlayer
+ color detection (sketched below)
+ image sequences
+ soundPlayer
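
The color-detection step works roughly like this (a reconstruction of the idea as browser pixel code in TypeScript; the original used openFrameworks, and the tolerance and blob-size thresholds are assumptions): scan a video frame for pixels near the target color and return their centroid.

function findColor(
  ctx: CanvasRenderingContext2D, w: number, h: number,
  target: [number, number, number], tolerance = 60
): { x: number; y: number } | null {
  const { data } = ctx.getImageData(0, 0, w, h);   // RGBA pixel bytes
  let sx = 0, sy = 0, n = 0;
  for (let i = 0; i < data.length; i += 4) {
    const dr = data[i] - target[0];
    const dg = data[i + 1] - target[1];
    const db = data[i + 2] - target[2];
    if (dr * dr + dg * dg + db * db < tolerance * tolerance) {
      const x = (i / 4) % w;
      sx += x; sy += (i / 4 - x) / w; n++;         // accumulate matching pixel positions
    }
  }
  return n > 50 ? { x: sx / n, y: sy / n } : null; // ignore tiny noise blobs
}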

Swing range test

Infinite Loop
This is one of those amazing beautiful mistakes: an infinite feedback loop between the monitor showing what the camera captures and the camera capturing the monitor.

HOW IT WORKS

+ Team A & Team B
+ The five swingers must cooperate to swing as high as they can to shoot out a strong thunderbolt; otherwise a weak thunderbolt, or NO thunderbolt, is shot out (each with a different thunder sound effect).
+ People in the center can block the thunderbolts by stepping on them (scoring sound effect + spark sequence), so the two teams have to try to rope in the people in the center.
+ Once a thunderbolt reaches the other side, a strong thunderbolt scores 2 points and a weak one scores 1 point.
+ SWING OR DIE.
