All Eyes On You
“All Eyes On You” by Britzpetermann is a set of variously sized eyes that follow passers-by. The team used a high-quality projector and semi-matt foil to project the eyes onto the window. The eyes are rendered by a WebGL frontend using a shader-based sphere effect; detection is performed with openFrameworks and a Kinect.
Similar to Audience (http://www.creativeapplications.net/openframeworks/audience-openframeworks/): two iPhones on servo motors that swivel to follow a user. The Arduino receives positioning information via OSC from a custom iOS 5 app, which uses CIDetector for face detection.
Surveillance camera that can track, zoom, and follow subjects. The camera is, however, a little insecure. Easily startled by sudden movements, it is shy around strangers and tends to avoid direct eye contact. This reversal of the relationship between the surveillance system and its subjects gives the machine an element of human personality and fallibility that is by turns endearing, tragic, and slightly disturbing.
Watch the video; it's a great demonstration of a robotic system that gives a kind of “uninterested” response.
The Quantum Parallelograph
Uses a PHP script to pull information about you from Google. It's a (dubious) look at quantum theory and parallel universes, but the interesting element here is the printout: a small printer attached to an Arduino. Hanley brought this up; it could be an interesting interaction between computers and users.
I'm assuming that if you want to build a robotic head, you'll have some experience using an Arduino, servos, and some other language that can interface between a webcam and the Arduino. I used Processing as my interface between the Arduino and my webcam, along with the OpenCV library for Processing.
I'll describe the overarching structure of the code that creates the illusion of a robot following a person's face. At a high level, there are only a few steps to have a webcam follow a person's face:
Detect the face. Take a look at the OpenCV examples on Processing's website.
Grab the X-Y pixel coordinates of the face.
Calculate the pixel distance between the center of the face and the center of the webcam's view. In other words, take each frame from the webcam and compute the distance between the center of that image and the center of the detected face.
Write an algorithm that minimizes the distance between the webcam's center and the face's center.
- This algorithm will also control the servos.
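The steps above amount to a simple proportional controller. I used Processing for my version; the Python below is just a language-agnostic sketch of the logic, and the frame size, gain, sign conventions, and servo range are all assumptions, not values from the original project.

```python
# A minimal sketch of the face-tracking loop: compute the pixel error
# between the detected face center and the image center, then nudge the
# pan/tilt servo angles to shrink that error. Values below are assumed.

FRAME_W, FRAME_H = 640, 480   # assumed webcam resolution
GAIN = 0.05                   # proportional gain: degrees per pixel of error

def track_step(face_x, face_y, pan, tilt):
    """One iteration: move the servos so the face center
    drifts toward the center of the webcam image."""
    # Step 3: pixel distance (per axis) from the image center
    error_x = face_x - FRAME_W / 2
    error_y = face_y - FRAME_H / 2
    # Step 4: proportional update, clamped to a typical 0-180 degree
    # servo range. Signs depend on how the servos are mounted.
    pan = max(0.0, min(180.0, pan - GAIN * error_x))
    tilt = max(0.0, min(180.0, tilt + GAIN * error_y))
    return pan, tilt

# A face detected right of center pulls the pan servo off its 90-degree rest:
pan, tilt = track_step(480, 240, 90.0, 90.0)
```

In the real build, the new `pan` and `tilt` values would be sent to the Arduino over serial each frame, and the Arduino would write them to the servos.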
The Desire of Codes. A nightmarish interactive work.
It places the “individual” visitor in a double role, as a subject of both expression and observation.
A large number of devices resembling tentacles with built-in small cameras are placed across a huge wall (Part 1), while six robotic “search arms” equipped with cameras and projectors are suspended from the ceiling (Part 2). Each device senses the positions and movements of visitors with insect-like wriggling movements, and turns toward detected persons in order to observe their actions. In addition, a giant round screen that looks like an insect's compound eye is installed at the back of the exhibition space (Part 3). Visual data transmitted from each camera, along with footage recorded by surveillance cameras installed at various places around the world, is stored in a central database and ultimately projected onto the screen in complex images mixing past and present, the venue itself and points around the globe. The compound eye visualizes a new reality in which fragmentary aspects of space and time are recombined, while the visitor's position as a subject of both expression and surveillance points to new forms of human corporeality and desire.
Ninety devices are distributed across the wall. As soon as a visitor enters the area in front of it, the devices' heads start blinking, and together they move in the visitor's direction like an insect's tentacles. Highly sensitive cameras and microphones, able to detect motion and sound beyond human perception, record the visitor's actions.
Like facets of an insect’s compound eye, countless hexagonal parts make up one large screen.
Superdirective microphones installed at several points in the exhibition hall record every sound occurring in the space. Voices and other noises generated by visitors, as well as the artworks' own mechanical sounds, mix on a recombined time axis to create the installation's soundtrack. The behaviors and states of all three parts' elements trigger all previously recorded sound data accumulated up to the present point, and these components keep forming a constantly updated sonic environment.
Technological Dream Series: No. 1, Robots
One day, in the future, robots will do everything for us. It’s a dream that refuses to go away. Over the coming years, robots are destined to play a significant part in our daily lives — not as super smart, functional machines, nor as pseudo life forms, but as technological cohabitants. But how will we interact with them? What new interdependencies and relationships might emerge in relation to different levels of robot intelligence and capability? These objects are meant to spark a discussion about how we’d like our robots to relate to us: subservient, intimate, dependent, equal?
Robot 1: This one is very independent. It lives in its own world, getting on with its work. We don't really need to know what it does as long as it does it well. It could, for instance, be running the computers that manage our home. It has one quirk: it needs to avoid strong electromagnetic fields, as these might cause it to malfunction. Every time a TV or radio is switched on, or a mobile phone is activated, it moves itself to the electromagnetically quietest part of the room. As it is ring-shaped, the owner could, if they liked, place their chair in its centre, or stand there and enjoy the fact that this is a good space to be in.
Robot 2: In the future, products/robots might not be designed for specific tasks or jobs. Instead they might be given jobs based on behaviours and qualities that emerge over time. This robot is very nervous — so nervous, in fact, that as soon as someone enters a room it turns to face them and analyses them with its many eyes. If the person approaches too closely, it becomes extremely agitated and even hysterical. Home security makes good use of this robot's neurosis.
Robot 3: More and more of our data, even our most personal and secret information, will be stored in digital databases. How do we ensure that only we can access it? This robot is a sentinel: it uses retinal-scanning technology to decide who accesses our data. In films, iris scanning is always based on a quick glance. This robot demands that you stare into its eyes for a long time; it needs to be sure it is you.
Robot 4: This one is very needy. Although extremely smart, it is trapped in an underdeveloped body and depends on its owner to move it about. Neediness is designed into very smart products to maintain a feeling of control. Manufacturers would originally have made robots speak human languages, but over time they will evolve a language of their own. You can still hear human traces in its voice.