Motion Microscope

Since ancient times, people have wanted to see things far smaller and subtler than the naked eye can perceive. Although the first use of the lens remains something of a mystery, it is widely believed that two Dutch spectacle-makers, Hans and Zacharias Janssen, invented the first compound microscope in the 1590s.

Over the centuries since, microscopes have revolutionized our world. They reveal objects, structures, and forms of life far too small for us to see with our naked eyes, and they have contributed significantly to science and technology, where great discoveries have been made with their help.

Perception Enhancing Lenses

At a recent TEDxBeaconStreet talk, Michael Rubinstein, a research scientist at Google, played jaw-dropping footage of barely perceptible movements, such as a person's pulse and heartbeat. The footage was produced with a "Motion Microscope," which does not use optics like a regular microscope to make small objects look bigger. Instead, it uses a video camera and image processing to reveal the tiniest motions and color changes in objects and people, changes that are impossible to see with the naked eye. It is a microscope that lets us look at the world in a completely new way.

The Motion Microscope, developed at MIT in collaboration with Microsoft and Quanta Research, can even extract intelligible audio by analyzing the minute motions that sound waves induce in objects. It reveals tiny changes: the rippling of a woman's belly as a baby moves inside, the way blood actually flows through the human body, how fast your heart is beating, the wobble of your eyeballs inside your skull, the vibrations of individual guitar strings. Such changes are so subtle that when you look for them with the naked eye, you see nothing at all.

In a brilliant video demonstration, Rubinstein took a short clip from "Batman Begins" just to show Christian Bale's pulse. Although Bale is presumably wearing makeup, the motion microscope was able to recover his pulse and measure his heart rate.

This is done with an algorithm that amplifies minute color changes in video, allowing the color variation caused by blood flow to be visualized. The algorithm uses a signal-processing approach: it analyzes the video over time to obtain a very accurate measurement of the color at each pixel.
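To make the idea concrete, here is a minimal sketch of that per-pixel temporal filtering in Python. A crude frequency-domain band-pass stands in for the pyramid-based filtering of the published method, and the synthetic frames, frame rate, pass-band, and amplification factor are all illustrative assumptions:

```python
import numpy as np

def magnify_color(frames, fps, f_lo, f_hi, alpha):
    """Amplify temporal color changes in the band [f_lo, f_hi] Hz.

    frames: array of shape (T, H, W) holding pixel intensities over time.
    Each pixel's time series is band-pass filtered in the frequency
    domain, scaled by alpha, and added back to the original video.
    """
    T = frames.shape[0]
    freqs = np.fft.rfftfreq(T, d=1.0 / fps)
    spectrum = np.fft.rfft(frames, axis=0)
    band = (freqs >= f_lo) & (freqs <= f_hi)
    filtered = spectrum * band[:, None, None]   # keep only the chosen band
    variation = np.fft.irfft(filtered, n=T, axis=0)
    return frames + alpha * variation           # add the amplified signal back

# Synthetic "video": a 1 Hz pulse far too faint to see (amplitude 0.002)
fps, T = 30, 90
t = np.arange(T) / fps
frames = 0.5 + 0.002 * np.sin(2 * np.pi * 1.0 * t)[:, None, None] * np.ones((T, 4, 4))
out = magnify_color(frames, fps, f_lo=0.8, f_hi=1.2, alpha=100)
```

The invisible 0.002-level oscillation becomes an obvious swing of roughly 0.2 in the output, while everything outside the pass-band is left untouched.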

The technique applies an Eulerian framework, familiar from fluid-flow analysis, to video, and is known as the Eulerian video magnification algorithm. The detected changes are amplified to create enhanced videos or images in which the changes are actually visible. The algorithm can produce color-amplified videos of objects and people, and motion-magnified videos of pregnant bellies, small facial expressions, and mechanical movements such as vibrations in engines. It can help engineers diagnose machinery problems and see how buildings and towers sway in the wind and react to natural forces. It has biological applications too: it can show how the veins and arteries pulse in our bodies.

Reverberating Sources to See the Unseen

The motion microscope is somewhat analogous to the equalizer in a stereo sound system, which boosts some frequencies and attenuates others. In this case, though, the relevant frequency is the frequency of color changes across a sequence of video frames, not the frequency of an audio signal. It also works in real time and can display both the original video and the new version with the changes amplified.

We all know that an opera singer's piercing voice can break a wine glass by hitting the right note. In another video demonstration, Rubinstein showed a note being played at the resonance frequency of a nearby glass. With the motions magnified 250 times, one could clearly see the glass vibrating and resonating with the sound.
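One simple way to locate such a resonance, assuming we have the brightness trace of a single pixel on the glass's rim from a high-speed recording, is to look for the strongest peak in the trace's spectrum. The 420 Hz tone, frame rate, and noise level below are made-up values for illustration:

```python
import numpy as np

def dominant_frequency(trace, fps):
    """Return the strongest non-DC frequency (Hz) in a 1-D intensity trace."""
    trace = trace - trace.mean()                  # remove the DC component
    spectrum = np.abs(np.fft.rfft(trace))
    freqs = np.fft.rfftfreq(len(trace), d=1.0 / fps)
    return freqs[np.argmax(spectrum)]

# Synthetic trace: a glass ringing at 420 Hz, filmed at 2000 fps,
# with the vibration buried in sensor noise five times its amplitude.
fps, n = 2000, 4000
t = np.arange(n) / fps
rng = np.random.default_rng(0)
trace = 0.01 * np.sin(2 * np.pi * 420 * t) + 0.05 * rng.standard_normal(n)
f = dominant_frequency(trace, fps)
```

Because the sinusoid's energy concentrates in a single frequency bin while the noise spreads across all of them, the 420 Hz resonance stands out clearly even though it is invisible in the raw trace.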

The motion microscope has striking audio applications too. Rubinstein and the team at MIT experimented with a bag of chips lying on a table. They filmed the bag with a high-speed video camera and analyzed the tiny, imperceptible motions that sound waves in the room created in it. Looking at the video, one could not tell that anything was happening at all, yet the team was able to recover the audio by analyzing those tiny motions, which can be as small as a thousandth of a millimeter. This "visual microphone" can also recover audio from other objects, such as plants, or from footage captured by CCTV cameras.
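The published visual-microphone pipeline is far more sophisticated, but the core idea of turning sub-pixel motion back into a waveform can be sketched in one dimension. Here a single Lucas–Kanade-style step, a standard technique swapped in for illustration, estimates each frame's tiny shift relative to a reference frame from the brightness-constancy equation; the texture and the 0.02-pixel "sound" signal are synthetic assumptions:

```python
import numpy as np

def recover_signal(frames, reference):
    """Estimate a tiny 1-D displacement per frame (one Lucas-Kanade step).

    A frame shifted right by d satisfies f ≈ I0 - d * dI0/dx for small d,
    so d is solved by least squares against the spatial gradient.
    The sequence of shifts is the recovered "audio" waveform.
    """
    grad = np.gradient(reference)                 # spatial derivative dI/dx
    denom = np.sum(grad * grad)
    return np.array([-np.sum((f - reference) * grad) / denom for f in frames])

# Synthetic chip-bag texture and sound-driven sub-pixel motion
rng = np.random.default_rng(1)
x = np.arange(256, dtype=float)
texture = np.cumsum(rng.standard_normal(256))     # rough random pattern

true_signal = 0.02 * np.sin(2 * np.pi * 5 * np.arange(100) / 100)  # shifts in pixels
frames = [np.interp(x - d, x, texture) for d in true_signal]       # shifted copies

recovered = recover_signal(np.array(frames), texture)
```

Even though no frame moves by more than a fiftieth of a pixel, the recovered waveform tracks the driving signal closely, which is exactly why a silent video of a chip bag can carry intelligible sound.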

It could be a huge improvement for interrogation: simply by watching the changes in a person's facial expression, an interrogator could detect a lie without needing eagle eyes or wiring up the suspect. The technology could also prove genuinely useful in solving criminal cases by recovering sounds from video recordings. It's a wild, fascinating way to turn everyday objects into microphones!

There is currently much debate in Europe about whether a person declared brain-dead is in fact alive. The techniques could be put to noble use here, revealing whether a patient in a vegetative state reacts to sensory or verbal prompts.

Opportunities for Astronomical Discoveries

The development is also well suited to astronomical observation. Astronomers regularly discover planets around other stars by observing infinitesimal changes in the stars' positions caused by the gravitational pull of orbiting planets. The techniques the team developed could help reveal all the stars that have planets at once. It will be interesting to see what opportunities for astronomical discovery await us when we point the microscope at objects in space.

The algorithm is open source, so anyone fascinated by the new development can use it and experiment with their own videos. Quanta Research has put up a website where users with no prior experience in computer science or programming can upload their videos and process them with the new microscope online. It's a new tool, a new way to look at the world.

The technology is a gentle reminder of how limited our human senses are at perceiving the nature of reality. It is vivid evidence that everything in the universe is a symphony of vibrating strings.