SciTech

Snapchat uses Lenses to analyze faces

Credit: Theodore Teichman/Assistant Photo Editor

Have you ever tried to spice up your Snapchat streaks with the hilarious, face-distorting filters? Have you ever felt like ripping your hair out after seeing upwards of twenty flower-crown-filtered faces on your Instagram or Facebook feed? Have you ever wondered how these filters, called “lenses,” even work?

Five years after first launching in July 2011 under the name Picaboo, Snapchat, an image messaging app, has emerged as one of the most popular social media platforms among millennials, with 10 billion daily video views as of April 2016.

Snapchat continues to update its app with new features, such as the 24-hour Story (which was recently copied by Instagram), temporary text messages, and instant video chatting.

In September 2015, Snapchat purchased Looksery, a Ukrainian company specializing in real-time video modification on mobile devices, for about $150 million in order to create Lenses.

Even people who have not downloaded the app have most likely been exposed to the famous Snapchat filters: during the 2016 New York Fashion Week, a Spanish clothing brand used makeup to recreate the most popular Lenses on its models. Lenses have made their way into Facebook profile pictures, Instagram posts, and Twitter feeds, even though Snapchat’s uniqueness comes from its ephemeral nature.

For almost a year, Snapchat users sent billions of pictures and videos altered by Lenses without knowing how the technology worked. In late June 2016, Vox, the self-described “general interest news site for the 21st century,” posted a YouTube video explaining the mechanics of Lenses; although Vox was not allowed to talk to the engineers behind the feature, it examined the patents available online.

The filters tap into a field of tech known as computer vision, which explores how a computer can use a camera’s pixel data to detect objects. The pixel data a computer “sees” is just a grid of numbers, one value per pixel representing its color, so detecting a face comes down to locating areas of contrast and matching those patterns against the typical light-and-dark patterns of a face.
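To make “pixels as numbers” concrete, here is a minimal Python sketch using NumPy and OpenCV. It is not Snapchat’s code; the file name and the hard-coded regions are placeholder values for illustration only.

```python
import cv2

# Load a photo and convert it to grayscale: the result is just a 2-D array
# of numbers, one brightness value (0-255) per pixel.
image = cv2.imread("selfie.jpg")            # placeholder file name
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)

print(gray.shape)        # e.g. (720, 1280): rows x columns of numbers
print(gray[100, 200])    # the brightness of a single pixel, e.g. 147

# "Contrast" between two regions is simply a difference of average brightness.
# These coordinates are made up; a real detector scans many such regions.
bridge_of_nose = gray[300:340, 620:660]
left_cheek     = gray[300:340, 540:580]

print(bridge_of_nose.mean() - left_cheek.mean())  # positive if the bridge is brighter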

For example, the bridge of a nose is often lighter than the surrounding areas, and the center of a forehead is lighter than its sides. This method, known as the Viola-Jones algorithm, only works on front-facing views; if someone looks or tilts their head to the side, the computer cannot detect the face. The algorithm’s advantages include its speed and the way it scales its detection window across the image rather than resizing the image itself. However, it is sensitive to lighting and, as mentioned above, to the angle of the face.
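A ready-made implementation of Viola-Jones ships with OpenCV as Haar cascades. The sketch below (again, not Snapchat’s code) uses the frontal-face cascade bundled with the library; the image file name is a placeholder.

```python
import cv2

# OpenCV bundles pre-trained Viola-Jones (Haar cascade) face detectors.
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

image = cv2.imread("selfie.jpg")                      # placeholder file name
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)

# detectMultiScale slides a window over the image at several scales and
# returns a bounding box (x, y, width, height) for each frontal face found.
faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

for (x, y, w, h) in faces:
    cv2.rectangle(image, (x, y), (x + w, y + h), (0, 255, 0), 2)

cv2.imwrite("faces_detected.jpg", image)
```

As the article notes, this only works well on frontal, evenly lit faces; tilt your head far enough to the side and the detector simply returns nothing.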

Detecting that a face is present is only the first step, though: to place its filters accurately, Snapchat must also know exactly where each feature of the face is.

According to Vox, people behind the scenes manually marked the facial features of thousands of different faces in order to build a general template, an average face shape that can be laid over any new face and then adjusted, first by rotating and scaling it into place.
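Snapchat’s hand-labeled model is proprietary, but the same idea, a template of landmark points trained on thousands of annotated faces, is available off the shelf in libraries like dlib. A sketch, assuming the standard pre-trained 68-point landmark model has been downloaded separately and the image file name is a placeholder:

```python
import cv2
import dlib

# A frontal face detector plus a landmark predictor trained on thousands of
# hand-annotated faces (the standard 68-point model, downloaded separately).
detector = dlib.get_frontal_face_detector()
predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")

image = cv2.imread("selfie.jpg")                 # placeholder file name
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)

for face in detector(gray):
    # The predictor starts from an average face shape and regresses it,
    # rotating, scaling, and nudging points, until it fits this face.
    landmarks = predictor(gray, face)
    for i in range(68):
        point = landmarks.part(i)
        cv2.circle(image, (point.x, point.y), 2, (0, 0, 255), -1)

cv2.imwrite("landmarks.jpg", image)
```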

After this, Snapchat analyzes the pixel data around each facial feature once more to make minor tweaks to the mesh (the digital mask); contrasting pixels mark edges such as jawlines, eyebrows, and lips.
Following all of this data analysis and fine-tuning, Snapchat can finally position the filter with confidence, and voilà: you are now a dog, a lion, or a snowboarder with a mustache!
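A rough picture of those last two steps, hedged heavily: use the image gradient to snap a landmark to the nearest strong edge, then paste a transparent sticker at the refined position. The sticker file, the rough landmark coordinates, the search window, and the offset are all made-up illustration values, not anything from Snapchat’s pipeline.

```python
import cv2
import numpy as np

image = cv2.imread("selfie.jpg")                     # placeholder file name
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)

# Gradient magnitude: large values mark contrast edges (jawline, lips, brows).
gx = cv2.Sobel(gray, cv2.CV_64F, 1, 0)
gy = cv2.Sobel(gray, cv2.CV_64F, 0, 1)
edges = np.sqrt(gx ** 2 + gy ** 2)

def refine(point, window=10):
    """Nudge a rough landmark to the strongest edge in a small neighborhood."""
    x, y = point
    patch = edges[y - window:y + window, x - window:x + window]
    dy, dx = np.unravel_index(np.argmax(patch), patch.shape)
    return (x - window + dx, y - window + dy)

nose_tip = refine((640, 360))        # made-up rough landmark position

# Overlay a transparent PNG (e.g. a mustache) centered below the nose tip.
sticker = cv2.imread("mustache.png", cv2.IMREAD_UNCHANGED)   # has an alpha channel
h, w = sticker.shape[:2]
x0, y0 = nose_tip[0] - w // 2, nose_tip[1] + 10              # made-up offset

alpha = sticker[:, :, 3:4] / 255.0
roi = image[y0:y0 + h, x0:x0 + w]
image[y0:y0 + h, x0:x0 + w] = (alpha * sticker[:, :, :3] + (1 - alpha) * roi).astype(np.uint8)

cv2.imwrite("with_filter.jpg", image)
```

A real Lens does this for every point of the mesh, on every frame of video, which is where the engineering difficulty described below comes in.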

For all of this seemingly new and innovative tech, detecting faces and altering them is not what makes Lenses incredible; the processing speed required to do it in real time on your mobile device is.

So the next time Snapchat lags when you are trying to show someone a funny filter, just think: it is already so amazing that we can do this at all, and things will only get better from here.