For our wedding, I designed, coded, and hand-built our website, save-the-dates, and centerpieces.
In the Fall of 2011, SUBWAY® introduced the SUBWAY Fresh Artists™ filmmakers series to students at NYU and USC with the challenge to develop a scripted, episodic web series around the idea of “Every Breakfast Sandwich Tells A Story.” More than 50 student teams submitted creative treatments for judging by a panel of representatives from SUBWAY® Restaurants and Content & Co., industry insiders, and faculty and alumni of NYU and USC.
Our team of four created TECH UP and was selected as the winner, premiering the series at SXSW 2012 and being featured on MyDamnChannel, IFC, and Indiewire, as well as at the Promotion Pictures Gala at Tribeca Cinema on 4/10/2012, where the interactive portion of the series was brought to life in a sandwich-sizing photo booth.
Anson, an aspiring tech entrepreneur, endeavors to launch a start-up for his latest invention, running his bootstrap operation out of a Subway restaurant, with the help of sandwich artist, Marlene, and to the dismay of franchise owner, Omar.
In the interest of embedding interactivity into the web series, I took it upon myself to be Anson, building his inventions (both the fiscal growth measurer and the sandwich sizer) for use during the filming of the episode. The series was written around the technology and the idea that aspiring entrepreneurs and artists in big cities (like NYC) constantly struggle to (1) find space to work and (2) get people to understand their ideas.
Anson’s demonstrations in the third episode are all generated using a software application I created in Processing that uses the Microsoft Kinect to measure the distance between hands. The technology was then brought off-screen for people to use at the Gala to size up their own sandwiches.
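The core of the sandwich sizer is just the distance between the two tracked hands. The original application was written in Processing against the Kinect; below is a hypothetical Python sketch of the measurement logic, with made-up hand coordinates and an assumed 8-inch cutoff between a 6-inch and a footlong.

```python
import math

def hand_distance(left, right):
    """Euclidean distance between two 3D hand positions (millimeters)."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(left, right)))

def sandwich_size(distance_mm):
    """Classify the span between the hands as a sandwich size.
    The 8-inch cutoff is an assumption for illustration."""
    inches = distance_mm / 25.4
    return "6-inch" if inches < 8 else "footlong"

# Example: hands held roughly a foot apart (x, y, z in mm from the Kinect)
left_hand = (-150.0, 0.0, 800.0)
right_hand = (155.0, 5.0, 810.0)
print(sandwich_size(hand_distance(left_hand, right_hand)))
```

In the real sketch the hand positions would come from the Kinect's skeleton tracking each frame; the classification step is the same idea.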
All software was written in Processing, and the Kinect-related troubleshooting was resolved with the help of Greg Borenstein’s Making Things See. All of the code (Processing, Arduino, and the Kinect middleware) is on my Github account here.
The booth was fabricated completely by hand. All of the supplies and files for creating the booth are below.
3 sheets of 18×32″, 1/4″ translucent white plexi (Illustrator files for box fabrication)
16 ft USB A/B cord (for trigger button)
High resolution web cam (I used the Logitech C910)
Green screen kit
1 Tota light with lightstand and umbrella
Fujifilm ASK 2500 dye-sub printer
Industrial strength velcro (white, preferably)
Forever is an immersive experience for one person, created with Paul May for the Recurring Concepts in Art class at ITP.
In our Recurring Concepts in Art class we were asked to take a previous project made at ITP and re-conceptualise/remake it without using technology.
Paul May and I worked together to take two projects that involved tactility, physical sensation, and strong emotion, merging them into one project that could serve as an exploration of Spectacle, in which we both shared an interest.
Our goal was to create an immersive, overwhelming experience for one person without using the traditional technologies of Spectacle.
We set out to create an environment that could physically envelop a participant. The key technical challenge was building a space big enough for a human. We investigated using sewer pipe and telephone kiosks; eventually we decided to build the “vessel” ourselves. We constructed a 4 ft × 4 ft × 6 ft vessel and used steel cables and rigging to suspend it from the lighting grid at ITP.
Originally, we thought that making this an automated/self-directed experience made sense – but we were keen to avoid complex technologies, sensors etc. The more we thought about having to interact with the participant directly, the more we liked the idea – the experience would remain personal and individual, but be directed from outside; this seemed appropriate. We added a layer of inflatable seating/bedding to the inside of the vessel, inflated by two high-powered electric air pumps.
Forever is an interactive art installation. The participant is asked to put on ear-protection and dark glasses. The participant enters a large vessel which is suspended from the ceiling. Two operators use pumps to inflate the interior walls of the vessel around the participant. After a period of time the walls are deflated and the participant exits. The next person is made ready, and enters the vessel.
We were asked to reflect on the project. We chose to make an audio recording of our thoughts about it. The recording was made on 14 November 2011, one month before the ITP Winter Show.
Forever was presented for the two days of the ITP Winter Show 2011. We estimate that over 100 people took part in the experience.
Thanks to all of the inspiration and guidance we received from our Recurring Concepts in Art class, and of course, Georgia Krantz.
The LED Cube was created for our Media Controller project. Working with Doug Thistlethwaite and Miguel Bermudez, we decided we wanted to use the classic PinArt toy to control a 3D LED cube.
Using this Instructable we found online and these instructions on how to use a multiplexer as a starting point, we assembled a cube and used force-sensitive resistors underneath the pins of the PinArt to communicate with the LEDs.
Using this Instructable, we built a 3D LED cube.
We then tested out how to use a multiplexer.
We then taped the pins down and applied 6 FSRs to the bottom of the pin toy.
All 6 FSRs needed to be calibrated to read the pressures evenly.
The code must loop through all 64 LEDs and decide whether each should be on or off. With the various animations we created, this scan happens too quickly for the human eye to detect it addressing one LED at a time.
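That scanning loop can be sketched roughly as follows. This is a hypothetical Python outline of the multiplexing logic (the real code ran on a microcontroller): `set_columns` and `select_layer` stand in for whatever pin and multiplexer writes the hardware needs, and `frame_on` is a made-up animation.

```python
import time

CUBE = 4  # 4 x 4 x 4 = 64 LEDs

def frame_on(x, y, z):
    """Hypothetical animation: light alternating LEDs in a checkerboard."""
    return (x + y + z) % 2 == 0

def scan_once(set_columns, select_layer, dwell=0.002):
    """One full refresh: for each horizontal layer, shift out the 16
    column states, then enable that layer briefly. At ~2 ms per layer
    the whole cube refreshes ~125 times per second, far faster than
    the eye can follow, so all layers appear lit at once."""
    for z in range(CUBE):
        columns = [frame_on(x, y, z) for y in range(CUBE) for x in range(CUBE)]
        set_columns(columns)   # drive the 16 column lines
        select_layer(z)        # sink current for this layer only
        time.sleep(dwell)
```

The animation just swaps in a different `frame_on`; the scan itself never changes.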
Ideally, we would like to build this bigger (we took some of our inspiration from the current installation in Madison Square Park), but we’d need to revisit the construction to make that feasible.
Matt Tennie and I built a high five tracker. Our assignment was to collect data from the ITP floor over a period of a few days. We decided that we should track awesomeness. So we asked that anyone on the floor who experienced a moment of Awesomeness (mind blown in class, a happy accidental code discovery, the LED finally blinking, etc.) log it by high-fiving the tracker on the way down the hall.
The tracker consists of a framed picture of Michael Jackson with a simple switch connected to an Arduino. The Arduino sends serial commands to a Processing sketch, which takes a photo with a webcam and uploads it to this Flickr stream.
The interface consists of a piece of 1/4″ plexiglass in a frame, with an impatiently cut-out picture of Michael Jackson with a giant sparkly glove taped to it. The frame-and-plexi combo is intended to make the interface durable enough to get punched in the face 100 times a day.
The frame is attached to a cable trough with Velcro and a piece of cardboard. A joint in the cardboard let us mount a simple switch that bends on impact. The frame then swings freely until it settles back into place.
The switch is read by an Arduino Uno which is programmed to print serial data to the control computer.
Serial communication gets established and pins get assigned ports.
The switch is read, and the state of the pin is printed to serial as 0 or 1 (with a contingency in case nothing comes through).
There is a 4 second delay after the switch is tripped so the swinging frame can settle itself to avoid false positives.
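The trigger logic (fire on a “1”, then ignore the switch while the frame settles) can be condensed into a few lines. The real version was split between the Arduino and the Processing sketch; this is a hypothetical Python sketch of the rule, operating on a pre-recorded list of timestamped serial readings.

```python
def count_high_fives(readings, settle=4):
    """Count distinct high fives from (timestamp, state) serial readings.
    A '1' fires only if at least `settle` seconds have passed since the
    last fire, so the rebounding frame can't re-trigger; None models the
    contingency where nothing came through on serial."""
    fires = []
    for t, state in readings:
        if state is None:
            continue  # no serial data this poll
        if state == "1" and (not fires or t - fires[-1] >= settle):
            fires.append(t)
    return len(fires)
```

In the live sketch each fire would also trigger the webcam photo; here the count stands in for that side effect.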
Processing code is here.
All 500+ high fives from the ITP show can be viewed here.
All 211 awesome high fives can be viewed here.
To create a fashionable ankle accessory that provides daily proprioceptive ankle training, helping the working city woman walk faster and more comfortably.
Final presentation can be found here.
Gait analysis code is here.
According to Patricia Ladis and Tracey Vincel of the KIMA Center for Physiotherapy and Wellness, proprioceptive (or dermatological) training is an excellent way to prevent the poor ankle and calcaneus (heel) positioning that women’s pumps cause. They see many women whose anterior-lateral tendency when wearing heels throws off the biomechanics of not only the ankle, but also the knee and hip. Simply taping the skin makes it easier to walk in heels and prevents women from training their ankles into that position.
In terms of really preventing sprain, the most important factor is pulling the heel in the opposite direction of the sprain. In other words, you want to counter the direction of the foot without affecting gait. This is why wearing something flexible, with some boning behind the ankle, is ideal. A good resource for this is Soma Simple, which focuses on inverting the heel to prevent pronation. Tracey emphasized the importance of “hugging the calcaneus”.
This article from the Clinical Journal of Sport Medicine, Volume 21(3), May 2011, pp. 277-278 (also here), demonstrated how proprioception training can be used to prevent ankle injuries. The trial studied professional basketball players, as ankle injuries are the most common injury in the sport. By making the players aware of their ankles, which can be done by simply taping the area to provide proprioceptive feedback to the brain, the players reduced their incidence of ankle injury.
Joint injuries are common in sports, so taping and bracing athletes’ joints is common practice. However, this type of proprioceptive training is not applied to everyday life, where joint strains and sprains occur with relative frequency (need citation here), especially in women wearing high heels. Some shoe styles focus on ankle support, but little attention is paid to the ankle beyond its relation to shoes. Our idea for our final project is to create a line of “joint jewelry” that provides daily proprioceptive training while also serving as a fashionable accessory. Visually, we are looking at Alexander McQueen as inspiration for this line.
3D Scanning for Joint Jewelry
Using this code, we scanned our ankles and imported the STL files into the CAD program Rhino for Mac. Our files are below. We are still working out the kinks in this process and may switch over to Meshlab from Rhino.
Inspired by Amy Sly’s Am I Wearing Pants? flowchart, my friend Keeli and I created the spin-off Am I Wearing a Dress? to help girls avoid leaving the house in dresses that aren’t quite, well, dresses. After seeing a girl at a Lone Star Chili Cook-off who seemed to think her tunic was actually a dress, we felt the need to sound the alarm to confused women everywhere. Yes, I live in NYC, where almost anything goes and the outfits rarely fail to entertain, but I can’t stand by and watch flashes of lady parts as girls parade down Park Ave in “dresses” that don’t quite cover them (oh, it’s happened).
Apparently GLHD is something ladies everywhere feel they can relate to (except for these guys, who apparently are unclear on the concept of “humor”). Check out all the great sites that felt the issue of GLHD should be highlighted:
and even this lovely Diary of a First Year Teacher
Help keep your partially dressed female friends at home until they’ve covered their tushies. Also, if you see them, document and post! We’d love to see!
Using 3D cameras and face-tracking, GAGE provides a cheap, accessible system for objectively automating existing tests for motion and gait disorders, like Parkinson’s disease, all from the comfort of the patient’s home.
iStride is a wearable sensor that keeps running and biking cadence at an optimal range, preventing injury and increasing efficiency.
The concept behind the project is to monitor runners and cyclists as they train, to ensure efficiency and avoid injury. Optimal cadence is 80-95 strides per minute per foot: for the most efficient stride (and the best way to avoid injury), a runner or cyclist’s foot should hit the ground 80-95 times in 60 seconds. Our device measures in 5-second intervals so the athlete gets immediate aural feedback when they go below or above optimal cadence.
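The cadence math is simple extrapolation from each 5-second window. Here is a hypothetical Python sketch of it; the tone names are made up stand-ins for the device's aural feedback.

```python
def cadence_per_minute(steps_in_window, window_seconds=5):
    """Extrapolate strides-per-minute (per foot) from a short window."""
    return steps_in_window * (60 / window_seconds)

def feedback(cadence, low=80, high=95):
    """Return an audio cue: a low or high tone nudges the athlete back
    into the optimal 80-95 range; None means stay quiet."""
    if cadence < low:
        return "low_tone"
    if cadence > high:
        return "high_tone"
    return None

# Example: 7 footfalls in a 5-second window is 84 strides/min, in range
print(feedback(cadence_per_minute(7)))
```

The short window is the point of the design: waiting a full minute to report cadence would make the feedback useless mid-run.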
The final breadboard layout includes the speaker linked to a 555 timer and a digital potentiometer, which lets us add the speaker/audio as output. The timer turns on and off extremely fast, creating a long tone (the speed at which it turns on and off determines the pitch of the tone, which is dictated in the code).
I am a horror movie enthusiast and as such decided to recreate the shower scene from Psycho as a 3D virtual play space.
By hacking a Wii Remote and writing some code in Processing, I made a projection that simulates the 30-second stabbing scene from Psycho, where the stabber gets to stab his/her friend behind an actual shower curtain, causing bloody handprints to appear upon each stab.
Using either OSCulator or DarwiinRemote, applications that let your Wii Remote sync with a Mac, you can use the remote to act as keystrokes on your computer. This allows the remote to “talk to” other applications like Processing.
By finding the ranges of the pitch, yaw, and roll of the remote, you can map those to the desired output; in this case, a forward stabbing motion should generate one bloody handprint projected at a time. (Note: the highest acceleration value is 0.3996528; the lowest, not pictured, is 0.10354.)
Below is documentation of how the silhouette of the “friend” in the shower would look against the projection onto the shower curtain.
I finally ended up changing the port through which the Wii data was communicating (from 12000 to 8080) and synced the Bluetooth detection of the Wiimote manually on my computer before starting up OSCulator. Since the data coming through OSC routing from OSCulator to Processing was inconsistent, I mapped acceleration to a specific keystroke (the “1″ key): every time the acceleration goes over a certain threshold (visible in the “quick view” section of OSCulator), the computer detects that the “1″ button is being pressed. I then changed the code so that an image of a bloody hand appears only when “1″ is “pushed”. I also set one of the Wiimote’s buttons to act as the START button, so the hands don’t start appearing until the stabber is ready to start stabbing. Here is a video of Valentina testing out the Stab-A-Friend.
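The thresholding that OSCulator performs amounts to rising-edge detection on the acceleration value. Below is a hypothetical Python sketch of that rule, using the observed acceleration range from testing and a made-up 0.30 cutoff.

```python
# Observed range: ~0.104 at rest, ~0.400 at the peak of a stab.
# The 0.30 cutoff between them is an assumption for illustration.
STAB_THRESHOLD = 0.30

def handprints(accels, armed=True):
    """Count handprints to draw: one per stab, i.e. per rising edge of
    the acceleration crossing the threshold, and only once the START
    button has armed the sketch. Holding the remote above threshold
    fires once, not continuously."""
    count, above = 0, False
    for a in accels:
        now_above = a > STAB_THRESHOLD
        if armed and now_above and not above:
            count += 1
        above = now_above
    return count
```

Edge detection is what keeps one stab from smearing into a stream of handprints while the remote decelerates.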
The final code (for the wiimote) is here.
This short video, featuring live stop-motion, is a recreation of a Missed Connection we found entitled "Pickles Aren't For Everyone".
Shortly after we published our video, the project became far more interesting when we discovered that the ‘true story’ we had fictionalized was in fact a fictional story we had reenacted.
Begin forwarded message:
Thanks for your email.
And it certainly is a strange convergence of events. I found the video through a google alert I have set up for the Ships That Pass project. The missed connection is indeed fake, written by Emma Straub: http://www.emmastraub.net/
I hope that knowledge doesn’t take away from the wonderful short film you made. And maybe drove a little traffic your way.
All the best,
On Wed, Nov 24, 2010 at 12:20 AM, Spike McCue <firstname.lastname@example.org>wrote:
Weird, I was one of Lily’s partners for the video (aka “that dude in the ridiculous glasses,” aka “that’s my desktop in the beginning of the movie”). We worked with two other partners to source, create, shoot, and edit the video.
I’m just curious how you found it? Also, it’s fake? It was quite well written for a missed connection…
For our Understanding Networks class, we wanted to create a usable, scalable system that would allow future ITP students to build applications using ITP data. We then built our own application on top of it, Floorsquare, as a way for students to “check in” to The Floor (where ITP is based) during the end-of-semester student show. When each student arrived, they swiped their ID through this USB card swiper, which logged them in. This let the show’s curators ensure that students arrived on time for show set-up and also gave guests an interactive directory of projects. During the student show, the only map of student projects is a physical, paper print-out; Floorsquare let guests see which student was linked with which projects, by touching a student’s picture to see the projects they had done. Of course, this is only one iteration of a possible application that could be built using this API, and we hope to see many more get built on it.
All code is available here on GitHub.