Eye Catcher

Using a combination of industrial robotics and high-power magnets, a seemingly inconspicuous frame on a wall magically comes to life. Through a series of experimental films, photography and physical prototypes, the project investigates the primitive effects of eye (and eye-like) stimuli.

HOW FACIAL TRACKING WORKS


A wireless pinhole camera in the frame transmits video footage of onlookers back to our software (built in Processing and using FaceOSC), which analyses 12 facial-expression values such as the width of the mouth, the height of the eyebrows and the height of the eyes. That information then drives the reciprocal expressions of the frame's fluid "eyes", controlled by four servo-driven magnets manipulating ferrofluid.
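To make that pipeline more concrete, the sketch below is a minimal, illustrative Processing example, not the installation's actual code. It assumes FaceOSC is streaming to its default port 8338, uses the oscP5 library to read three of the gesture values (mouth width, left eyebrow height, left eye openness), and remaps them to 0-180 degree servo angles written over a hypothetical comma-separated serial link; the numeric ranges and the serial format are assumptions.

import oscP5.*;
import processing.serial.*;

OscP5 oscP5;
Serial servoPort;

float mouthWidth  = 0;   // from /gesture/mouth/width
float eyebrowLeft = 0;   // from /gesture/eyebrow/left
float eyeLeft     = 0;   // from /gesture/eye/left (openness)

void setup() {
  oscP5 = new OscP5(this, 8338);                        // FaceOSC's default OSC port
  servoPort = new Serial(this, Serial.list()[0], 9600); // assumed link to the servo driver
}

void oscEvent(OscMessage m) {
  if (m.checkAddrPattern("/gesture/mouth/width"))  mouthWidth  = m.get(0).floatValue();
  if (m.checkAddrPattern("/gesture/eyebrow/left")) eyebrowLeft = m.get(0).floatValue();
  if (m.checkAddrPattern("/gesture/eye/left"))     eyeLeft     = m.get(0).floatValue();
}

void draw() {
  // Remap raw FaceOSC values (approximate empirical ranges) to servo angles
  // that position the magnets shaping the ferrofluid "eyes".
  int mouthServo   = int(map(constrain(mouthWidth,  10, 18), 10, 18, 0, 180));
  int eyebrowServo = int(map(constrain(eyebrowLeft,  6, 10),  6, 10, 0, 180));
  int eyeServo     = int(map(constrain(eyeLeft,      2,  5),  2,  5, 0, 180));

  // Hypothetical frame format: "mouth,eyebrow,eye\n"
  servoPort.write(mouthServo + "," + eyebrowServo + "," + eyeServo + "\n");
}

In the real installation four servo/magnet pairs shape the ferrofluid and twelve expression values are tracked, so the mapping would fan out to more channels than the three shown here.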

HOW THE FRAME RESPONDS TO THE USER


To start with, the height of passers-by is measured by ultrasonic sensors embedded in the ceiling. This measurement is remapped to a robotic arm hidden behind the wall (controlled using the Lab's open-source Scorpion controller), which magnetically drives the frame to align "face to face" with onlookers.
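As a rough sketch of that remapping step (again in Processing, and again with assumed values rather than the installation's code), the example below reads a ceiling-to-head distance in centimetres from the sensor microcontroller over serial, estimates eye level from an assumed ceiling height, and clamps the result to an assumed travel range for the arm. The interface to the Scorpion controller itself is not shown.

import processing.serial.*;

Serial sensorPort;

final float CEILING_HEIGHT_CM = 280;   // assumed ceiling height
final float EYE_OFFSET_CM     = 12;    // assumed head-top-to-eye-level offset
final float ARM_MIN_CM        = 120;   // assumed lower travel limit of the arm
final float ARM_MAX_CM        = 190;   // assumed upper travel limit of the arm

void setup() {
  sensorPort = new Serial(this, Serial.list()[0], 9600);
  sensorPort.bufferUntil('\n');        // one reading per line, e.g. "105\n"
}

void serialEvent(Serial p) {
  String line = p.readStringUntil('\n');
  if (line == null) return;
  line = trim(line);
  if (line.length() == 0) return;

  float distance  = float(line);                        // sensor-to-head distance
  float eyeHeight = CEILING_HEIGHT_CM - distance - EYE_OFFSET_CM;

  // Clamp to the arm's travel so the frame always lands "face to face".
  float armTarget = constrain(eyeHeight, ARM_MIN_CM, ARM_MAX_CM);
  println("arm target height: " + armTarget + " cm");
  // In the installation this target would be handed to the Scorpion
  // controller, which moves the magnet-tipped arm behind the wall.
}

void draw() { }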

Principal Researchers: Lin Zhang, Ran Xie

Supervisors: Ruairi Glynn and Dr Christopher Leung with William Bondin


See more videos and process: Interactive Architecture

Read an interview with the researchers: We make money not art