MIT Researchers Debut New Video Amplification Algorithm to Monitor Small Changes in Movement

By Tracey E. Schelmetic, TechZone360 Contributor | February 28, 2013

Researchers at the Massachusetts Institute of Technology have developed a new algorithm for video amplification that can yield clues about movements invisible to the naked eye.

The new system, called Eulerian Video Magnification, works by taking a video and homing in on the pixels whose values change during movement. By analyzing these changes frame by frame, the system identifies tiny variations in color and amplifies them by as much as 100 times, yielding important clues about the motion of the object in the video, the New York Times reported today.
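The article does not include the researchers' code, but the core idea it describes can be sketched in a few lines. The Python/NumPy snippet below is a rough illustration under simple assumptions, not MIT's actual implementation: it treats a grayscale clip as an array of stacked frames, measures how far each pixel drifts from its own average value over time, and exaggerates that drift by a factor alpha (the function name and defaults are hypothetical; the 100x default echoes the amplification the article cites).

```python
import numpy as np

def amplify_pixel_changes(frames: np.ndarray, alpha: float = 100.0) -> np.ndarray:
    """Exaggerate tiny per-pixel changes across a stack of video frames.

    frames: float array of shape (T, H, W), values in [0, 1].
    alpha:  amplification factor; the article cites boosts of up to 100x.
    """
    mean = frames.mean(axis=0, keepdims=True)  # each pixel's average over time
    deviation = frames - mean                  # the tiny frame-to-frame changes
    amplified = mean + alpha * deviation       # exaggerate those changes
    return np.clip(amplified, 0.0, 1.0)        # keep values in displayable range
```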

Though the process was originally developed to monitor premature babies without touching them or using invasive monitors – a sample video of a newborn run through the process shows the infant’s face changing color with each pulse – researchers from MIT’s Computer Science and Artificial Intelligence Laboratory say it has a number of other applications.

Team leader Professor William T. Freeman sees an application for the technology in search and rescue, for example, allowing rescue teams to determine from a distance whether someone is still breathing.

“Once we amplify these small motions, there’s like a whole new world you can look at,” he said.

It could also find an application in manufacturing, measuring stress and movement in machines and determining in advance which parts are likely to break down.

According to the research team, the system is somewhat akin to the equalizer in a stereo sound system, which boosts some frequencies and cuts others; here, though, the pertinent frequency is the frequency of color changes in a sequence of video frames rather than the frequency of an audio signal.
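Continuing the equalizer analogy, one way to "boost a slider" is to bandpass-filter each pixel's brightness over time and amplify only the chosen band. The sketch below is again an illustrative approximation rather than the team's method; the 0.8-3.0 Hz passband and the amplification factor are assumed values, chosen because that band covers typical human pulse rates (roughly 48 to 180 beats per minute).

```python
import numpy as np

def bandpass_amplify(frames: np.ndarray, fps: float,
                     low_hz: float = 0.8, high_hz: float = 3.0,
                     alpha: float = 50.0) -> np.ndarray:
    """Boost one band of temporal frequencies, as an equalizer boosts audio.

    frames: float array of shape (T, H, W), values in [0, 1].
    fps:    frame rate of the clip, needed to map FFT bins to hertz.
    """
    spectrum = np.fft.rfft(frames, axis=0)                 # per-pixel spectrum over time
    freqs = np.fft.rfftfreq(frames.shape[0], d=1.0 / fps)  # frequency of each FFT bin, in Hz
    band = (freqs >= low_hz) & (freqs <= high_hz)          # the "slider" being boosted
    spectrum[~band] = 0.0                                  # cut every frequency outside the band
    filtered = np.fft.irfft(spectrum, n=frames.shape[0], axis=0)
    return np.clip(frames + alpha * filtered, 0.0, 1.0)    # add the boosted band back in
```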

“It could be used to compare different images of the same scene, allowing the user to easily pick out changes that might otherwise go unnoticed. In one set of experiments, the system was able to dramatically amplify the movement of shadows in a street scene photographed only twice, at an interval of about 15 seconds,” according to the CSAIL team.

Edited by Braden Becker
