David Bowie Is – AR Exhibition

David Bowie Is – AR Exhibition is an AR project by The David Bowie Archive, Sony Music Entertainment (Japan) Inc., and Planeta. In this post, I jotted down some notes from my development work as Lead Developer, under Director Nick Dangerfield, Art Director Seth Tillett, and Tech Director Dan Brewster.

Paper Interaction

The physical exhibition has a lot of 2D materials, and for browsing them in a mobile app there is a subtle balance we want to maintain, a balance between efficiency and AR-ness. We tried several approaches, shuffling, fading in and out, but ultimately it feels most natural to see them lying or leaning on some sort of surface.

And just as you would expect and do in real life, we made the 2D materials move with your fingertip.

The effect works especially well on small paper cutouts.
(Note – all videos in this post are direct screen recordings on my old iPhone 6s.)
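For reference, here is a minimal sketch of how this kind of drag can work in Unity (my own illustration with assumed names, not the project's actual code): raycast from the touch position, pick the artifact under the finger, and slide it along the plane the papers rest on.

using UnityEngine;

// Hypothetical sketch: drag a flat artifact along the surface it rests on.
public class PaperDrag : MonoBehaviour {
	Transform dragged;                                     // the artifact currently under the finger
	Plane surface = new Plane(Vector3.up, Vector3.zero);  // assume the tabletop sits at y = 0

	void Update () {
		if (Input.touchCount == 0) { dragged = null; return; }

		Touch touch = Input.GetTouch(0);
		Ray ray = Camera.main.ScreenPointToRay(touch.position);

		// On touch begin, pick whatever artifact the finger is over
		if (touch.phase == TouchPhase.Began) {
			RaycastHit hit;
			if (Physics.Raycast(ray, out hit)) dragged = hit.transform;
		}

		// While dragging, move it to where the finger ray meets the tabletop plane
		float dist;
		if (dragged != null && surface.Raycast(ray, out dist)) {
			dragged.position = ray.GetPoint(dist);
		}
	}
}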

And since this is tabletop AR, some 2D artifacts need to be stacked in order for everything to fit, so we need a way to shuffle them.

After testing, we found that bringing the “focus” artifact to the front of the stack after it has been sent back feels the most natural and intuitive.

Paper stack shuffle.
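As an illustration of the shuffle (hypothetical names, not the project's code), the stack can live in a simple list, with the focused artifact re-inserted at the front whenever the current front one is sent back:

using UnityEngine;
using System.Collections.Generic;

// Hypothetical sketch: a paper stack where the "focus" artifact is brought to the front.
public class PaperStack : MonoBehaviour {
	public List<Transform> papers = new List<Transform>(); // index 0 = front of the stack
	public float layerGap = 0.002f;                        // tiny offset so papers don't z-fight

	public void BringToFront (Transform focus) {
		papers.Remove(focus);
		papers.Insert(0, focus);
		Restack();
	}

	void Restack () {
		// Re-lay the papers so the front one sits slightly above the rest
		for (int i = 0; i < papers.Count; i++) {
			Vector3 p = papers[i].localPosition;
			papers[i].localPosition = new Vector3(p.x, (papers.Count - 1 - i) * layerGap, p.z);
		}
	}
}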

Scene Changes Mask

To pack the whole physical show into a tabletop AR mobile app, it is important to find a way to navigate the rich contents.

It includes over 400 high-resolution captures of David Bowie’s costumes, sketches, handwritten lyrics, notes, music videos, and original works of art, presented in striking arrangements and immersive settings, as well as dozens of never-before-seen items, including archival videos, drawings, photographs, and notes.

davidbowieisreal.com

Within the show, we decided to condense each section into a digestible diorama, and make the dioramas navigable through a map.

And within each diorama, we need to make some sub-scene changes as well to accommodate all the contents.

For this, I used a lot of invisible 3D masks.

The mask uses the depth buffer to occlude content and prevent it from being shown. It is a simple but effective graphics trick, and more can be found here.

For example, in Hunger City, two masks are placed on the sides of the stage to hide the off-stage contents.
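For reference, the trick boils down to a material whose shader writes only to the depth buffer. Here is a minimal sketch of such a shader, assuming the standard Unity depth-mask approach rather than the app's exact shader:

Shader "Custom/InvisibleMask" {
	SubShader {
		// Render before regular geometry so the mask fills the depth buffer first
		Tags { "Queue" = "Geometry-10" }
		ColorMask 0   // write nothing to the color buffer
		ZWrite On     // but do write depth, so anything behind the mask is occluded
		Pass {}
	}
}

Content placed behind a quad carrying this material fails the depth test and simply never shows up.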

Scene Making

Under the guidance of the Director and Art Director, and with feedback from the whole development team, I built the AR scenes, including: Early Influences, Space Oddity, Cultural Influences, Songwriting, Recording, Characters, Collaborations, Life On Mars, Ziggy Stardust, Hunger City, Stage and Screen, New York, and Black & White Years. Below are in-app screen recordings of some of those scenes.

Early Influences

Hunger City

Black and White Years

Stage and Screen

New York

Ziggy Stardust

MASK v2.0 – Unity!

Unity3D version of MASK!

Since it’s much more stable as an app than as a webpage, maybe it’s worth a try to make a Unity version of the MASK collection. Hmm, such a sweet dilemma!

The code for accessing the camera on the phone… omg, so simple!

// mCamera is a WebCamTexture field; displayPlan is the plane that shows the camera feed
WebCamDevice[] devices = WebCamTexture.devices;
mCamera = new WebCamTexture ();

// Optional: list the detected camera devices
//for(var i=0; i<devices.Length; i++){
//	Debug.Log(devices[i].name);
//}

if (devices.Length > 0) {
	// Use the first camera, stream it onto the display plane's material, and start playback
	mCamera.deviceName = devices[0].name;
	displayPlan.GetComponent<Renderer>().material.mainTexture = mCamera;
	mCamera.Play ();
}

Summer Throwback Three: We Are What We Pretend To Be Inc. established!

The third major experiment I had this summer was a streetcar journey. At first it was just another experiment with Unity, but then it led me to an idea for the project I want to do this year as a Research Fellow at ITP.

I decided to establish a corporation called WE ARE WHAT WE PRETEND TO BE Inc.

It’s inspired by a quote from writer Kurt Vonnegut:

“We are what we pretend to be, so we must be careful about what we pretend to be.”

I discovered this quote while doing research for my thesis, and it inspired me a lot in shaping the idea and making the MASK series.

WE ARE WHAT WE PRETEND TO BE Inc. (WAWPRE) is an entity for my series of experiments about the relationship between our free will and our unfree (uncontrollable) will. Can I change my unfree will by changing my free will? How can I shift self-identity by altering my own and others’ perspectives?

 

The first product of WE ARE WHAT WE PRETEND TO BE Inc. is MASK, a customized VR headset that helps the user become what they want to be.

MASK lineup

 

The second service of WE ARE WHAT WE PRETEND TO BE Inc. is THE TRAM.

We Are What We Pretend To Be

It’s an installation that provides a customized virtual reality experience: users (aka passengers) design their tram journey based on their mood, destination, favorite color, and clothes to wear. Also, through a cutout hole in the cardboard box, which is usually covered with a curtain, I take a picture of the passengers, so the passengers are able to see themselves in the VR.

WE ARE WHAT WE PRETEND TO BE Inc. THE TRAM takes you wherever you want and fulfills whatever desire you have, and therefore you become whomever you want. You set yourself free.

I joined Internet Yami-Ichi in New York this summer and displayed my little handcrafted Home Depot cardboard box as an installation + experience seller. For the software, I used Unity and developed it on and off since August, after I was notified that I had been selected to join the event. As for the box, I made it from recycled materials I found on a junk shelf, in one night before the event! Holy cow, SO NERVOUS.

It was a really successful experiment for a first-ever user test! The users stayed in it for a long time, more than a minute on average, which is a huge difference for me. I think that’s because the tram keeps moving, the scenery outside the window keeps changing, and ghost passengers get on when the tram stops at a station; the user keeps expecting things to happen, and thus enjoys the VR experience for longer.

Screen captures with different customized setups

 

Content development

It’s a subtraction process, because of limited time haha

You can escape into different spaces, under different circumstances.

Different customized options & graphic design

Ideas finalizing

 

Internet Yami-Ichi 


 

Thank you all my lovely first ever passengers<3 <3 <3

Let me thank you with this all-passengers-together gif <3<3<3 Thank you guys!!!


 

Summer Throwback Two: Big Mac Index VR

This summer, I interned at The Economist Media Lab, a tight team of Ron Diorio, Frank J. Andrejasich, and Ziv Schneider. The goal of the internship was to explore how The Economist can use virtual reality to enhance editorial pieces, reach a broader audience, and create unusual experiences.

Because of my previous 3D data visualization experiments, The Economist was interested in how they could do data viz in VR, since data representation is one of their fortes. Luckily, I could choose what I wanted to create, so I picked the Big Mac Index. Why? Because it is BURGER, dude. BURGER.

About Big Mac Index:

One of The Economist’s most famous indexes. The earliest data starts in September 1986, and it has been used as a semi-humorous illustration of purchasing power parity (PPP). It makes exchange-rate theory a bit more digestible.

For this, I use the twice-yearly data since 2000, and the countries include: Argentina, Australia, Brazil, Britain, Canada, Chile, China, Czech Republic, Denmark, Euro area, Hong Kong, Hungary, Indonesia, Japan, Malaysia, Mexico, New Zealand, Poland, Russia, Singapore, South Africa, South Korea, Sweden, Switzerland, Taiwan, Thailand, and United States.

 

There are three scenes in total in this Big Mac Index VR: 1) Intro, 2) Map, 3) Comparison Mode.

INTRO

With a bunch of graphics and a little text, the opening shows how to navigate and interact in the virtual world, and gives an intro to the Index. It’s like a traditional motion-graphics voice-over video, yet it’s 3D and interactive.

 

MAP

The Map is the major part of Big Mac Index VR. There are two kinds of data: the first is “How many Big Macs can you buy with the equivalent of $10 in the local currency?”, and the second is “How is the currency valued compared with the US dollar?”. Each one is represented by a burger tower and the color of the map. Across different time periods, the height of the burger tower and the color of the map change.
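As a rough sketch of that mapping (hypothetical field names, not the actual project code), each country's value can drive both the number of burgers in its tower and a color lerp on its map tile:

using UnityEngine;

// Hypothetical sketch: map one country's value to a burger-tower height and a map-tile color.
public class CountryMarker : MonoBehaviour {
	public GameObject burgerPrefab;   // one burger layer
	public Renderer mapTile;          // the country's tile on the map
	public float burgerHeight = 0.1f;

	public void SetValue (float burgersPerTenDollars, float maxValue) {
		// Stack one burger per Big Mac the $10 equivalent can buy (cleanup of old burgers omitted)
		int count = Mathf.RoundToInt(burgersPerTenDollars);
		for (int i = 0; i < count; i++) {
			Vector3 pos = transform.position + Vector3.up * burgerHeight * i;
			Instantiate(burgerPrefab, pos, Quaternion.identity);
		}
		// Tint the tile from red (low) to green (high) relative to the largest value
		mapTile.material.color = Color.Lerp(Color.red, Color.green, burgersPerTenDollars / maxValue);
	}
}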


There are also two modes for displaying the data: one is automatic time progression (the default), and the other lets you designate a period. By tilting the headset 90 degrees clockwise, a menu is toggled up to choose from. Once out of auto mode, the time periods appear in the sky to be chosen, and the map shows data only for the selected year.
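A minimal sketch of the tilt gesture (my own illustration, assuming a menu GameObject): read the camera's roll angle every frame and open the menu when the head is rolled roughly 90 degrees clockwise.

using UnityEngine;

// Hypothetical sketch: toggle a menu when the headset is rolled about 90 degrees clockwise.
public class TiltMenu : MonoBehaviour {
	public GameObject menu;

	void Update () {
		// Roll of the head-tracked camera around its forward axis, in degrees
		float roll = Camera.main.transform.eulerAngles.z;
		// A clockwise tilt reads as roll near -90 degrees; allow some tolerance
		bool tilted = Mathf.Abs(Mathf.DeltaAngle(roll, -90f)) < 20f;
		menu.SetActive(tilted);
	}
}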

 

COMPARISON

Still in progress :) It’s for clear comparison of a small group of selected countries.


 

Summer Throwback One: Facebook Land

Originally a birthday present for Andy S., it ended up being a Facebook post exploration journey. Below are the instructions:

  • Start the journey
  • Log into Facebook with token
  • Hear happy birthday wishes from Yelling Kid
  • Your latest 20 posts from the news feed show up, including: name, contents, and picture (roughly the Graph API call sketched after this list)
  • Be checked out by Andy the Candy
  • Press Space key to have confusing conversations with Andy the Candy
  • Andy the Candy starts walking
  • Follow Andy the Candy to a post
  • Press Space key to view the content
  • Andy the Candy greets you when you are back
  • Andy the Candy leads you to another post
  • (Loop)
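For reference, pulling those 20 posts boils down to one Graph API request. Here is a hedged sketch using Unity's WWW class; the endpoint, fields, and token handling are my assumptions, not the exact code of the project:

using UnityEngine;
using System.Collections;

// Hypothetical sketch: fetch the latest 20 feed posts with a user access token.
public class FeedLoader : MonoBehaviour {
	public string accessToken;   // the token the player logs in with

	IEnumerator Start () {
		string url = "https://graph.facebook.com/me/feed?limit=20&fields=from,message,picture&access_token=" + accessToken;
		WWW request = new WWW(url);
		yield return request;
		// request.text is JSON: each post carries the poster's name, contents, and picture URL
		Debug.Log(request.text);
	}
}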

A very fun experiment! Possible next steps: the more posts you browse, the darker the environment becomes, until eventually the application shuts down :>

Just don’t fall for FB too much yo ^_<

 

Materials –> 3D-like 2D gif images

Virtual Reality Tour of Met

For my internship during the Spring 2014 semester in the Media Lab of The Metropolitan Museum of Art, I hooked up

    1. 3D models of the Met from the Architecture Department
    2. the official Audio Guide
    3. 3D models of art pieces in the Greek and Roman galleries, made by 3D-scanning photos
    4. Unity as the game engine
    5. the virtual reality head-mounted display Oculus Rift as the controller

and created an immersive virtual reality tour of the Met!

With Oculus Rift, users can wander around the museum, listening to the audio guide and admiring art pieces, walk upstairs, watch butterflies, get blocked by a huge bowl, and step inside the surreal mash-up models (credits to Decho <horse> and Rui <uncolored triangulars>).

IDEA

With a background as a VFX artist in 3D animation and post-production, I have always been interested in 3D and how it can be made interactive in creative ways. Once I got the chance to intern in the Media Lab of the Met and learned we could access the 3D models of the museum, I wanted to use the Oculus Rift to walk inside a fantasy version of the Met and enjoy the immersive experience of the space.

 

PROJECT_DEVELOPMENT

Virtual Met Museum –> Fantasy Experiment –> Art piece + Audio Guide

 

BASIC_SETUP_HOW_TO

First of all, tons of basic knowledge about Unity can be found here. And how to set up a project from scratch, here.

 

✓ Import BIM 3D models into Unity

Basically, just put the fbx file into the Assets folder of the project you just created. It’s not too complicated, but there’s one thing you should be aware of: the SCALE. It’s good practice to set up the scale correctly in the modeling application before importing the model into Unity; the associated details are described below:

  • 1 Unity unit = 1m
  • the fewer GameObjects the better. Also, use 1 material if you can
  • useful link: wiki unity3d

 

✓ Oculus Rift Plugin in Unity 3d Setup

Just follow the clear instructions on YouTube!

 

✓ Add collider to meshes

In order to prevent the player from walking through meshes (e.g. walls, stairs), we need to add a Collider component to the models, with the steps below (a scripted version is sketched after the list):

  • select model
  • @inspector
  • Add Component –> Physics –> Box Collider or Mesh Collider
  • A Mesh Collider is more precise than a Box Collider, but it is also more expensive to use.
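If a model has many meshes, the same thing can be done in one go from a script. A minimal sketch (my illustration, not part of the original project setup):

using UnityEngine;

// Hypothetical sketch: add a MeshCollider to every mesh under this model's root.
public class AddColliders : MonoBehaviour {
	void Start () {
		foreach (MeshFilter mf in GetComponentsInChildren<MeshFilter>()) {
			if (mf.GetComponent<Collider>() == null) {
				// MeshCollider is more precise than BoxCollider but also more expensive
				mf.gameObject.AddComponent<MeshCollider>();
			}
		}
	}
}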


 

✓ Occlusion Culling

Occlusion culling means that things you aren’t looking at aren’t rendered, so the game will run faster.

  •  geometry must be broken into sensibly sized pieces.
    • if you have one object that contains all the furniture, either all or none of the entire set of furniture will be culled.
  • tag all scene objects that you want to be part of the occlusion to Occluder Static in the Inspector.
  • Bake!
  • useful link: unity3d manual

 

✓ Import 3D-Scanned Models from 123D Catch

  • Take about 20 photos around the object you want to 3D scan (360 degrees!).
  • Upload the photos to 123D Catch.
  • Yeah, now you’ll have both an .obj model file and a texture file!
  • Just download the files, and drag the whole folder into the Assets folder of Unity!

 

POSSIBILITIES

  • Give access to people who can’t visit the museum in person.
  • Installation design simulation.

 

Thank_to

It was really a good experience interning at the Media Lab of the Met. I knew I wanted to keep working in 3D and also step into the virtual reality world with the Oculus Rift, and it was a great match that I could take this topic as my own project while also meeting the needs of the Met! From this internship, I gained valuable resources from the museum and got to know amazing mentors and colleagues from the Lab. This project led me to the world of virtual reality, and I’m glad and thankful to have been a Spring ’14 intern at the Media Lab of The Metropolitan Museum of Art.