Judgemental Mirror

What if, for every mirror you looked into, someone else was looking right back at you? For our midterm, Anastasios and I decided to make a judgemental mirror. Essentially, we wanted to build something that detected when an individual was in front of this “mirror.” Then we’d take a snapshot and crowdsource descriptions of the image using the Mechanical Turk API. After collecting what people said about what you’re wearing, we’d play it back to you. With Halloween right around the corner, we thought about this in the context of judging people’s costumes. Here’s a sketch of our original idea:

[Sketch of our original idea]

Realizing we didn’t have much time for the project, we focused on the experience we wanted to create and then broke it into smaller, doable parts. We also wanted to make sure we incorporated a Halloween theme into it.

Our biggest challenge for this project was using the TTL Serial Camera, which neither of us had worked with before. We were unsure of its output over serial, so figuring that out was our first task.


After looking at a few tutorials, it was apparent that most of them assumed we’d want to save images to an SD card module, which we did not want to do. This was partly because we wanted a standalone project, and also because we wanted the sketch to live on a webpage that could save the inputs from our judgers. I quickly coded up what this webpage could be:

[Mockup of the judging webpage interface]
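To give a sense of what that page could do, here’s a minimal p5 sketch of the idea: show the latest snapshot and collect a description from whoever is judging. The `snapshot.jpg` filename and the `/judgements` endpoint are placeholders for illustration, not what we actually built.

```javascript
// Minimal sketch of the judging page idea (placeholder filenames/endpoint).
let snapshot, descriptionInput, submitButton;

function preload() {
  // 'snapshot.jpg' is a hypothetical filename for the latest capture.
  snapshot = loadImage('snapshot.jpg');
}

function setup() {
  createCanvas(640, 480);
  image(snapshot, 0, 0, width, height);

  // A text field and button for the judger's description of the costume.
  descriptionInput = createInput('');
  submitButton = createButton('Judge this costume');
  submitButton.mousePressed(() => {
    // '/judgements' is a made-up endpoint standing in for wherever
    // the descriptions would actually get saved.
    httpPost('/judgements', 'json', { text: descriptionInput.value() });
  });
}
```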

Anastasios was able to figure out a way to take the JPEG images from the TTL serial camera, run a script, and display them in a p5 sketch (a rough sketch of that step follows the photos below). With that biggest challenge solved, we then started thinking about LEDs and proximity sensors that would add to the interaction. The following images are of us testing the different components and eventually putting it all together.

[Photos of us testing the individual components and assembling the project]
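Coming back to the display step: the rough shape of it is a p5 sketch that periodically reloads whatever image the camera script last saved. The `snapshot.jpg` filename and the five-second interval below are assumptions for illustration, not the exact setup we ended up with.

```javascript
// Poll for the most recent capture from the camera script and display it.
// 'snapshot.jpg' is a placeholder for whatever file the script writes out.
let latest;

function setup() {
  createCanvas(640, 480);
  // Reload the image every 5 seconds so new captures show up automatically.
  setInterval(() => {
    loadImage('snapshot.jpg', (img) => { latest = img; });
  }, 5000);
}

function draw() {
  background(0);
  if (latest) {
    image(latest, 0, 0, width, height);
  }
}
```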

In the end, we presented a variation of our idea. Neither Anastasios nor I took fabrication this semester, so we found that part to be the hardest. Handling the multiple wires and soldering them was challenging for us. Overall, it was a good learning experience, and we’re happy with what we accomplished in a short amount of time.

First comes Pcomp, then comes ICM, now they’re married

I was one of the unfortunate ones who missed synthesis because I was officiating my best friend’s wedding. But here I was once again marrying two things: pcomp and ICM.

Code for p5 can be found here: https://alpha.editor.p5js.org/projects/H1wosAX1l

In my last post tagged ICM, I talked about Jamie XX and how his music has been therapeutic to listen to while I work. I wanted to keep things simple for this, so I replicated the cover of his album In Colour in p5. Using constructors, objects, and arrays this time, I was able to come up with a working sketch.

Afterwards, I thought about hooking this up to the synthesis lab with a simple potentiometer. After a few tweaks to the code and the mapping, I got something workable. I even took the liberty of overlaying a snippet of the Jamie XX song “Sleep Sound” on the video.
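For anyone curious about the shape of that hookup, here’s a minimal sketch of the idea, assuming the Arduino is simply printing analogRead values (0–1023) over serial and the p5.serialport library is loaded. The port name and the parameter being mapped (a circle’s diameter here) are placeholders; the real sketch is at the link above.

```javascript
// Minimal potentiometer-to-p5 sketch (assumes the p5.serialport library and
// an Arduino printing analogRead values 0-1023, one per line).
let serial;
let potValue = 0;

function setup() {
  createCanvas(400, 400);
  serial = new p5.SerialPort();
  serial.open('/dev/cu.usbmodem1411'); // port name will differ per machine
  serial.on('data', gotData);
}

function gotData() {
  const line = serial.readLine();
  if (line.length > 0) {
    potValue = Number(line);
  }
}

function draw() {
  background(20);
  fill(255);
  noStroke();
  // Map the 0-1023 pot reading to a diameter between 10 and 300.
  const d = map(potValue, 0, 1023, 10, 300);
  ellipse(width / 2, height / 2, d, d);
}
```

The mapping is the whole trick: whatever range the sensor gives you, map() translates it into whatever visual range the sketch cares about.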

Visualizing Sleep Sound by Jamie XX

I’ve been terrible at blogging lately. Partly it’s because I’m trying to wrap my head around what I want to build here at ITP. One constant in my workflow, though, has always been music. I’m embarrassed to say this, but I can listen to one song over and over. I find it therapeutic. One song in particular has been this track by Jamie XX.

After a few listens, I thought about how Jamie XX’s music has so many layers and so much texture to its production. Inspired by this, I began to try and code his album cover in p5:

What’s great about this exercise is that I’m slowly piecing together everything I’m learning (e.g. objects, constructor functions, color). Right now I’m taking baby steps and coded these shapes literally. With help from Stephanie Koltun, I’ve decided to slowly build this project out with more interaction and maybe even hook it up to a pcomp project.
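To give a flavor of the approach without reproducing the actual album-cover sketch, here’s a bare-bones example of the constructor-function-plus-array pattern I’ve been practicing; the Stripe object and its random colors are made up for illustration.

```javascript
// Bare-bones constructor-function-plus-array example (not the album cover):
// each Stripe stores its own position and color, and draw() renders them all.
let stripes = [];

function setup() {
  createCanvas(500, 500);
  noStroke();
  // Build a handful of colored stripes with random colors.
  for (let i = 0; i < 12; i++) {
    stripes.push(new Stripe(i * 40, color(random(255), random(255), random(255))));
  }
}

function draw() {
  background(255);
  for (let i = 0; i < stripes.length; i++) {
    stripes[i].display();
  }
}

// Constructor function: each stripe knows its y position and its color.
function Stripe(y, c) {
  this.y = y;
  this.c = c;
  this.display = function () {
    fill(this.c);
    rect(0, this.y, width, 30);
  };
}
```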

Stay tuned.

The MTA experience dissected

About 8 million people live in NYC, and almost every single one of them either has a MetroCard or has used one at least once. When you think of images of NYC, it’s almost impossible not to think of its subway system. What I’m focusing on in this blog post is specifically how people pay their fare and enter the subway system.

Considering that 2016 is the most connected to technology we’ve ever been, it’s amazing to see how low-tech the MTA’s fare system is. A quick Google search for ‘MTA plastic card’ yields the following top two headlines:

Why fix something that isn’t broken?

Here’s how the current system works, based on my assumptions. In order to ride the subway, riders must first purchase a fare. Fares can be purchased at automated kiosks found at every station. Riders can choose to add Time Value or Cash Value to their cards depending on how frequently they ride. Once a fare is purchased, the rider is given a paper card with a magnetic stripe, which can then be swiped at a turnstile that unlocks to allow entrance to the subway. Simple, right? Maybe 50 years ago. Let’s break down this experience a little further.

I’ll start with already having a card loaded with a fare. From my observations and experience, swiping is the most basic action one can take to enter the subway. However, I’ve encountered a number of people who have to swipe multiple times because the angle at which they slid the card produced an error. Interestingly, the tone that is triggered for this error doesn’t distinguish between swiping incorrectly and having insufficient funds to enter. This can be a confusing experience for tourists at times. If we also consider that there is only a limited number of turnstiles that riders can access, this whole experience can create a chain of delays as people move through. Errors aside, however, swiping successfully and moving through the turnstile is fairly easy and intuitive.

Let’s now consider loading the card with money. It’s convenient that there are kiosks at every subway station that allow riders to top up their existing card or purchase a new one. However, considering it is 2016, I’m a bit perplexed as to why the MTA hasn’t switched to a plastic card. Comparing the paper card to a plastic card alone could be its own blog post. Today, if I were to lose my paper MTA card, I would effectively lose money equivalent to the dollar value I had put on it. There’s no convenient way for me to report it lost or stolen. We can assume that if plastic cards were issued, chips could be embedded in them that would allow cards to be registered. This alone would open up a new system that would provide an improved experience.

Producing plastic cards with chips in them would open up new experiences as well. The MTA could, in theory, allow customers to purchase fare for their MTA cards online at home or via a mobile device. The unique ID on each card would be an important identifier for the rider, allowing the MTA to load funds onto their account. While purchasing fare at the kiosks right now is fairly easy, there’s only a fixed number of them, and riders who need to add value have to queue in line, causing more delays to one’s commute.

Overall, there’s plenty of room for improvement in the existing MTA system. I believe it would be possible to shave minutes off riders’ commutes relative to what they encounter today.