Final Project Documentation: Trash Cyclone

PROJECT NAME :: Trash Cyclone

DESCRIPTION :: Trash Cyclone aims to bring awareness to how much trash we create in our daily lives. Using Tango's Motion Tracking feature, the user is marked as the center of the universe with one main gravitational force: the user acts as the attractor and pulls in all the trash in the app. The user has to move / run away from this trash; otherwise, the user's view slowly goes dark and the user dies.


BACKGROUND :: SKYBOX

The skybox feature in Unity is amazing!!!! I love it!!! I cannot believe how seamlessly each of the planes (top, right, left, bottom, and back) connects. There are no visible edges!
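For my own reference, here is a minimal sketch of assigning a skybox material from a script (the material path below is a made-up placeholder, not my actual asset; normally the skybox is simply set in Unity's Lighting settings):

```csharp
using UnityEngine;

// Minimal sketch: assign a skybox material at runtime.
// "Skyboxes/SpaceSkybox" is a placeholder path, not the actual asset from this project.
public class SkyboxSetter : MonoBehaviour
{
    void Start()
    {
        Material skyboxMaterial = Resources.Load<Material>("Skyboxes/SpaceSkybox");
        if (skyboxMaterial != null)
        {
            RenderSettings.skybox = skyboxMaterial;
        }
    }
}
```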

Here, I use the Tango AR Delta Camera. The Delta Camera lets me see my app in First Person view, Third Person view, and Top view.


BACKGROUND :: REAL WORLD

Now I switched to the Tango AR Camera to use the real world as the background. The Tango AR Camera only has a First Person view; it doesn't have the three view modes (First Person, Third Person, and Top) that the Delta Camera has.
 


RIGID BODY :: FORCE :: GRAVITY

Rigidbody, Force, and Gravity in Unity are amazing tools once I got them working! It took me a while to understand (still not totally) Unity's terminology and its techniques. I really like that I have the control to simulate Earth's gravity, or I can make my app exist in a zero-gravity world, just like in space!
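For reference, here is a minimal sketch of that switch (not my exact project code; the zeroGravity flag is just an illustration):

```csharp
using UnityEngine;

// Sketch: toggle between Earth-like gravity and a zero-gravity world.
public class GravityToggle : MonoBehaviour
{
    public bool zeroGravity = true;

    void Start()
    {
        // Physics.gravity is the global gravity applied to all Rigidbodies that use gravity.
        Physics.gravity = zeroGravity ? Vector3.zero : new Vector3(0f, -9.81f, 0f);
    }
}
```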

 

The spawned trash prefabs (bananas, boxes, etc.) cannot have any gravity; Unity's gravity has to be turned off on each of them, with only the camera's "gravity" (the attraction force toward the user) acting on them. Otherwise, they would all fall and gather at the bottom of the sphere.
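Roughly, the spawner side looks something like this sketch (the prefab list, count, and radius are placeholder values, not my actual settings):

```csharp
using UnityEngine;

// Sketch: spawn trash prefabs around the user with Unity gravity disabled,
// so only the camera's attraction force will act on them.
public class TrashSpawner : MonoBehaviour
{
    public GameObject[] trashPrefabs;   // banana, box, etc. (assigned in the Inspector)
    public int count = 20;              // placeholder value
    public float spawnRadius = 5f;      // placeholder value

    void Start()
    {
        for (int i = 0; i < count; i++)
        {
            GameObject prefab = trashPrefabs[Random.Range(0, trashPrefabs.Length)];
            Vector3 position = transform.position + Random.insideUnitSphere * spawnRadius;
            GameObject trash = Instantiate(prefab, position, Random.rotation);

            Rigidbody body = trash.GetComponent<Rigidbody>();
            if (body != null)
            {
                body.useGravity = false;  // otherwise everything falls to the bottom of the sphere
            }
        }
    }
}
```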

 

The camera has to be linked to each of the trash prefabs. Otherwise, they would not follow the camera and would not be attracted to the user as the user moves in the physical world.
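The linking itself is roughly the sketch below (a simplified version; it assumes the spawner hands the camera's transform to each trash piece, and the force value is a placeholder):

```csharp
using UnityEngine;

// Sketch: attached to each trash prefab; pulls it toward the camera (the user)
// every physics step, so the trash follows the user as they move in the real world.
[RequireComponent(typeof(Rigidbody))]
public class AttractToCamera : MonoBehaviour
{
    public Transform cameraTransform;   // set by the spawner when the trash is created
    public float attractionForce = 2f;  // placeholder value

    Rigidbody body;

    void Start()
    {
        body = GetComponent<Rigidbody>();
    }

    void FixedUpdate()
    {
        if (cameraTransform == null) return;
        Vector3 direction = (cameraTransform.position - transform.position).normalized;
        body.AddForce(direction * attractionForce);
    }
}
```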


BUGS?? GLITCHES??

During my app development process, I found a bug: when I built my app from Unity to Tango, I could see my trash prefabs being created on my laptop (in the Unity editor) but not on the Android phone (the Tango device).


WHAT'S NEXT?
ADDITIONAL FEATURE :: VIBRATION :: SOUND

I really want to include a vibration feature in my app. I imagined each prefab having a vibration and sound mode, so that as the prefabs are attracted to the center (me / the camera), the phone would vibrate and make some noise each time a prefab hits me. The user experience would be much more realistic. As Rui suggested, I tried using a customized Android Vibration Plugin from the Unity 3D Asset Store for only $3. In the end I didn't use this feature because I think I did not add the vibration permission to the Android manifest file correctly; I couldn't build my app and kept getting a 'Build Failure' error. I'll try to add this in afterwards.
 

Android Vibration Plugin

Unity Normal Setting Vibration (1 sec)
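For reference, a bare-bones version of the feedback I was going for could look like the sketch below. It uses Unity's built-in Handheld.Vibrate() instead of the paid plugin, the "Player" tag and AudioSource setup are placeholders, and on Android the manifest also needs the VIBRATE permission (the part I think I got wrong):

```csharp
using UnityEngine;

// Sketch: vibrate and play a sound when a trash prefab hits the user (the camera).
// Uses Unity's built-in Handheld.Vibrate() rather than the Asset Store plugin.
// The "Player" tag is a placeholder, not my actual project setting.
// The Android manifest would also need:
//   <uses-permission android:name="android.permission.VIBRATE" />
[RequireComponent(typeof(AudioSource))]
public class HitFeedback : MonoBehaviour
{
    AudioSource hitSound;

    void Start()
    {
        hitSound = GetComponent<AudioSource>();
    }

    void OnCollisionEnter(Collision collision)
    {
        if (collision.gameObject.CompareTag("Player"))
        {
            Handheld.Vibrate();   // default short vibration on the device
            hitSound.Play();
        }
    }
}
```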

 

Final Thoughts..... If I ever take this project further, I want to develop an app that brings awareness to how much trash each of us creates each day, month, year, or over our entire lives, and have that trash follow each user through the app. Also, instead of using Unity's 3D models, I would use a special 3D camera to scan actual trash objects and import them into Unity to get a more realistic look.


Final Project Proposal: Trash Cyclones


INSPIRATION: Trash Trash Trash

One day I was walking to school at ITP and it was very windy. There was this plastic bag flying right towards me, and I was pretty certain that it would not get me and that I was not in its path. Ohhhh, I was wrong... that plastic bag hit me straight in the face. Imagine me doing the Keanu Reeves move in The Matrix.

^^^^^ Me trying to get away from the plastic bag.


FIRST PROPOSAL: Thoughts | Ideas | Sketches

For my final project, I am going to create a virtual world of NYC in Unity that is undergoing a natural disaster (cyclones, to be exact) due to the excessive quantity of waste and trash in New York.

I am going to develop a prototype app using Tango's Motion Tracking feature along with Unity, as well as implementing DSNY's Refuse and Recycling Disposal Networks API. The Tango camera will represent the user (at the center of the universe), and all the trash and waste (movers) in the virtual world will be attracted to the user (attractor). The user has to shake or run away from this junk; otherwise it will block the user's point of view, the world will gradually turn dark, and the user will eventually become one of the junk. The amount of trash flying around is determined by the JSON data.

Wind / wave patterns have always fascinated me (just like the flocking and Perlin noise example sketches that Shiffman showed), so I want to simulate the wind/cyclone forces. What will make this simulation interesting is that the boring particles and spheres we have been making in p5.js sketches since I started ITP will be replaced by what we throw into the trash every day, such as:

  • paper: newspaper
  • metal: light poles, pans
  • glass: jars
  • food: pizza, lettuce

These important trash keywords are also present in the API JSON I found! Just what I need!
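To sketch out the wind/cyclone force I have in mind (just a starting point, not a final implementation; the strength and noise-scale values are placeholders), I could sample Perlin noise over time and push each trash Rigidbody around, similar in spirit to the p5.js noise examples:

```csharp
using UnityEngine;

// Sketch: a Perlin-noise-driven wind force applied to a trash Rigidbody.
// windStrength and noiseScale are placeholder values.
[RequireComponent(typeof(Rigidbody))]
public class WindForce : MonoBehaviour
{
    public float windStrength = 1.5f;
    public float noiseScale = 0.3f;

    Rigidbody body;

    void Start()
    {
        body = GetComponent<Rigidbody>();
    }

    void FixedUpdate()
    {
        float t = Time.time * noiseScale;
        // Map noise from [0,1] to [-1,1] so the wind changes direction over time.
        float x = Mathf.PerlinNoise(t, 0f) * 2f - 1f;
        float z = Mathf.PerlinNoise(0f, t) * 2f - 1f;
        body.AddForce(new Vector3(x, 0f, z) * windStrength);
    }
}
```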

 

Source

 

Speaking of APIs, as I was researching New York's trash and waste data, I came across a well-researched article written by The Guardian on the subject.

Click on the article's image to go to the source link.

 

The article also references several sources and links (including the API page) at the bottom, which I am going to use for my final project. In addition, the data is fairly recent: 2016.

But now..., what's confusing to me is making sense of the big data and extracting only the parts I need.
 

"Baby Steps…baby steps…baby steps….”

 

My next step is bringing the API's JSON data into Unity.....
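One possible first sketch of that step, assuming I start by saving the downloaded JSON as a TextAsset in the project (the field names below are made-up placeholders, not the real DSNY schema):

```csharp
using UnityEngine;

// Sketch: parse a downloaded JSON file (saved as a TextAsset in the project)
// and use it to decide how much trash to spawn.
// The TrashData fields are placeholders, not the actual DSNY data format.
public class TrashDataLoader : MonoBehaviour
{
    [System.Serializable]
    public class TrashData
    {
        public int paper;
        public int metal;
        public int glass;
        public int food;
    }

    public TextAsset trashJson;   // JSON file assigned in the Inspector

    void Start()
    {
        if (trashJson == null) return;

        TrashData data = JsonUtility.FromJson<TrashData>(trashJson.text);
        int total = data.paper + data.metal + data.glass + data.food;
        Debug.Log("Trash objects to spawn: " + total);
        // TODO: pass these counts to the spawner so the amount of trash matches the data.
    }
}
```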