I am an undergraduate student in the School of Art at Carnegie Mellon University.


email: aman at bad-data.com
github: https://github.com/aman-tiwari

Soylent Saviour

Soylent Saviour is a collaborative project with Aprameya Mysore. It consists of a cast agar topography of Pittsburgh, host to a number of slime molds. The slime molds are robotically fed with Soylent at locations of high gentrification. The slime responds to the feedings, creating a hypothetical map of the gentrification of Pittsburgh.
In addition, an audience can interact with the map by assuming the personas of gentrifiers, directing future feedings and creating an imagined future projection of gentrification in Pittsburgh.
Soylent Saviour will be shown at ACADIA 2016, as part of the Contingent Landscapes project.

More documentation of Soylent Saviour, including process (warning: big pdf!)


Terrapattern

Terrapattern is a collaborative project aiming to democratize geospatial insight previously available only to the privileged few. I acted as the "machine learning wrangler" on the Terrapattern team, building the machine learning model as well as the backend used to power terrapattern.com. Terrapattern enables people to discover strange patterns, delightful coincidences and mundane commonalities in satellite imagery.

Experience Terrapattern


Obituaryoet is a twitterbot.
It mashes together obituaries from people's Wikipedia pages.
It tries its hardest.

github // twitter


The SEARCHSOUND project aims to capture and express the disconnected flows of emotion on the internet.
SEARCHSOUND consumes 1% of the recent messages on twitter. It analyses the semantic distance between the words in each tweet and words commonly associated with emotions. It converts this semantic knowledge into a series of tones, and plays them.
more information & source
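A toy sketch of that pipeline, with made-up three-dimensional word vectors standing in for whatever embedding SEARCHSOUND actually uses (none of these words, numbers or function names come from the project itself):

```python
import math

# Toy word vectors -- purely illustrative, not SEARCHSOUND's real model.
EMOTION_VECTORS = {
    "joy":     (0.9, 0.1, 0.2),
    "sadness": (0.1, 0.9, 0.3),
    "anger":   (0.2, 0.3, 0.9),
}
WORD_VECTORS = {
    "sunshine": (0.8, 0.2, 0.1),
    "rain":     (0.2, 0.8, 0.4),
}

def cosine(a, b):
    """Cosine similarity between two vectors: 1.0 means identical direction."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def tone_for_word(word, base_hz=220.0, span_hz=440.0):
    """Map a word to a tone: the semantically closer it sits to an
    emotion word, the higher the frequency. Unknown words are silent."""
    vec = WORD_VECTORS.get(word)
    if vec is None:
        return None
    closeness = max(cosine(vec, e) for e in EMOTION_VECTORS.values())
    return base_hz + span_hz * closeness
```

In the real piece each tweet would be tokenized and its words played back as a sequence of such tones.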

uarm experiments

These were two experiments done with the UArm robotic arm.


In the first, the UArm communicates using a Zen garden. It rakes its garden, inviting the user to express themselves with a brush. The UArm responds to the user, and a conversation ensues until the garden is too messy to continue. The UArm rakes, maintaining peace.

This experiment consists of an openframeworks app, a zen garden, a webcam and the UArm.
The openframeworks app uses ofxCv to look at the zen garden and decide how messy it is, as well as how to respond in a (primitive) way. Based on the state of the garden, it passes instructions to the UArm — whether to rake, or to 'respond'.
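A rough sketch of that decision loop, with a hypothetical pure-Python messiness measure standing in for the ofxCv analysis (the function names and threshold are illustrative, not the app's actual code):

```python
def messiness(frame):
    """Estimate how messy the garden is as the fraction of horizontally
    adjacent pixels whose brightness differs sharply. A freshly raked
    garden is smooth and regular, so this stays low; brush strokes
    raise it. (The real app uses ofxCv; this is a toy stand-in.)"""
    h, w = len(frame), len(frame[0])
    edges = total = 0
    for y in range(h):
        for x in range(w - 1):
            total += 1
            if abs(frame[y][x] - frame[y][x + 1]) > 40:
                edges += 1
    return edges / total

def decide_action(frame, threshold=0.25):
    """Rake when the garden gets too messy; otherwise respond to the user."""
    return "rake" if messiness(frame) > threshold else "respond"
```

The chosen action would then be sent to the UArm as motion instructions.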

friendly baby

The second was part of a collaborative project: the creation of a sculpture. The sculpture was a wire-mesh torso and arms, covered with faux-tex to simulate amniotic/developing skin. The UArm was the head. The body has a webcam embedded in it. When a person approaches, friendly baby sees them. It is affectionate but picky, a persona created by its programming. video and pictures to be added soon

js experiments

These are all hosted on bad-data.com, my other website.


A harmonograph implemented using p5js.
Use the mouse & the arrow keys to change the pendulums' parameters.
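The underlying model is the classic harmonograph: the pen position is a sum of damped sinusoids, one per pendulum. A minimal sketch (parameter names are mine, not taken from the p5js source):

```python
import math

def harmonograph(t, pendulums):
    """Sum of damped sinusoids -- the standard harmonograph model.
    Each pendulum is (amplitude, frequency, phase, damping)."""
    return sum(a * math.sin(f * t + p) * math.exp(-d * t)
               for a, f, p, d in pendulums)

def trace(x_pendulums, y_pendulums, steps=2000, dt=0.01):
    """Sample the pen position over time, yielding the drawn curve."""
    return [(harmonograph(i * dt, x_pendulums),
             harmonograph(i * dt, y_pendulums))
            for i in range(steps)]
```

Changing a pendulum's parameters mid-trace (as the mouse and arrow keys do in the sketch) warps the resulting figure.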


boids as brushes

Experiments in using particles as brushes to create faux-3D structures.


Windows XP Simulator

emscripten flocking

A proof-of-concept that a naively coded flocking simulation in openframeworks works when compiled by emscripten to js. The performance of this web-app is roughly half to two-thirds that of the original, natively compiled one (i.e., it can simulate only half to two-thirds as many boids). not embedded due to large size, but you can see it at:

Click and drag to rotate the camera.
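The simulation follows Reynolds' classic boids rules: steer toward the flock centre, match neighbours' velocity, and push away from boids that get too close. A minimal Python stand-in for the openframeworks original (weights and structure are illustrative, and the naive all-pairs loop is what makes it O(n^2)):

```python
def flock_step(boids, dt=0.1, w_coh=0.01, w_ali=0.05, w_sep=0.5, min_dist=1.0):
    """One naive O(n^2) flocking update. Each boid is (px, py, vx, vy)."""
    new = []
    for i, (px, py, vx, vy) in enumerate(boids):
        cx = cy = ax = ay = sx = sy = 0.0
        others = len(boids) - 1
        for j, (qx, qy, ux, uy) in enumerate(boids):
            if i == j:
                continue
            cx += qx; cy += qy          # cohesion: pull toward flock centre
            ax += ux; ay += uy          # alignment: match average heading
            dx, dy = px - qx, py - qy
            if dx * dx + dy * dy < min_dist * min_dist:
                sx += dx; sy += dy      # separation: push apart when too close
        if others:
            vx += w_coh * (cx / others - px) + w_ali * (ax / others - vx) + w_sep * sx
            vy += w_coh * (cy / others - py) + w_ali * (ay / others - vy) + w_sep * sy
        new.append((px + vx * dt, py + vy * dt, vx, vy))
    return new
```

Every boid checks every other boid each frame, which is exactly the kind of naive inner loop whose cost shows up once the code is running as js.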