An algorithm to click a pic that fits the mood

Bloomberg

Instead of hiring a professional photographer for an important marketing assignment last year, Massimo Portincaso handed the work to computer software.
Portincaso, the head of marketing at the Boston Consulting Group, was tasked with overhauling his company’s website. It was a straightforward assignment: give the site a fresh look that highlighted the business’s capabilities. When the time came to select pictures for the new homepage, Portincaso didn’t want to entrust a photographer with the job. He turned over some of the decision-making to an algorithm he trained to crawl an internet photo database and find what he wanted.
That system, created by Berlin-based startup EyeEm, began suggesting shots matching Portincaso’s taste, much as Pandora’s algorithm adapts music recommendations to a person’s listening preferences. Rather than combing through stock photos from services such as Getty Images, Portincaso entered terms such as “young woman,” “smiling” and “escapism” into EyeEm’s search field, and a list of pictures that fit the look he was going for emerged from the database. At first, the algorithm suggested lots of random photos of people and color schemes. Over time, it learned he wanted more abstract shots: no pictures of people from the front and more uniform colors. “It’s almost frightening,” he said.
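The loop Portincaso describes, search by keyword, then learn from which results the user keeps, resembles classic relevance feedback. EyeEm has not published its implementation; the class below is a hypothetical sketch that filters by tags and nudges a learned “taste” vector toward liked photos, using made-up names such as PreferenceSearch and give_feedback.

```python
import numpy as np

class PreferenceSearch:
    """Hypothetical sketch of keyword search plus preference learning.

    `photos` maps a photo id to a feature vector (e.g. from an image
    model); `tags` maps a photo id to a set of keyword tags. This
    illustrates relevance feedback, not EyeEm's actual system.
    """

    def __init__(self, photos, tags):
        self.photos = photos      # {photo_id: np.ndarray}
        self.tags = tags          # {photo_id: set of keywords}
        self.profile = None       # learned taste vector, starts empty

    def search(self, keywords, top_k=10):
        # Keyword filter first, then rank by similarity to the
        # learned taste profile once feedback has been given.
        matches = [pid for pid, t in self.tags.items()
                   if set(keywords) <= t]
        if self.profile is None:
            return matches[:top_k]
        return sorted(matches, key=lambda pid: -self._score(pid))[:top_k]

    def give_feedback(self, liked_ids):
        # Move the taste profile toward the photos the user kept.
        liked = np.mean([self.photos[pid] for pid in liked_ids], axis=0)
        if self.profile is None:
            self.profile = liked
        else:
            self.profile = 0.7 * self.profile + 0.3 * liked

    def _score(self, pid):
        # Cosine similarity between a photo and the taste profile.
        v = self.photos[pid]
        return float(np.dot(v, self.profile) /
                     (np.linalg.norm(v) * np.linalg.norm(self.profile) + 1e-9))
```

Each round of feedback pulls the ranking toward the user’s taste, which is why the early, seemingly random suggestions would give way to the abstract, uniformly colored shots Portincaso wanted.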
The website job was relatively minor, but it represents a broader shift in computers’ ability to understand images and adopt human-like preferences. EyeEm began as an Instagram-like photo-sharing app about five years ago. With backing from investors including billionaire Peter Thiel, the company has evolved to be at the cutting edge of the technology industry’s race to effectively organize the trillions of images online.
The company has developed ways to quickly identify what is in a picture. Ramzi Rizk, the co-founder and chief technology officer at EyeEm, demonstrated the capabilities with an internal version of the software. As he moved his iPhone camera around a room, the code scrolling across his phone’s screen identified objects in real time: window, desk, computer, shelf, book. When he turned the phone on himself, the description read “handsome.” “That’s clearly a bug,” he said.
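Rizk’s demo is internal, but the same per-frame labeling loop can be approximated with off-the-shelf tools. The sketch below (an illustration, not EyeEm’s code) classifies each webcam frame with a pretrained MobileNet from torchvision and prints the top label; the camera index 0 is an assumption.

```python
import cv2
import torch
from torchvision import models, transforms

# Off-the-shelf classifier standing in for EyeEm's internal model.
weights = models.MobileNet_V2_Weights.DEFAULT
model = models.mobilenet_v2(weights=weights).eval()
preprocess = weights.transforms()      # resize/normalize expected by the model
labels = weights.meta["categories"]    # human-readable class names

cap = cv2.VideoCapture(0)              # default camera (assumption)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    # OpenCV delivers BGR; the model expects RGB channel-first tensors.
    rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
    batch = preprocess(torch.from_numpy(rgb).permute(2, 0, 1)).unsqueeze(0)
    with torch.no_grad():
        probs = model(batch).softmax(dim=1)
    top = probs.argmax(dim=1).item()
    print(labels[top])                 # e.g. "desk", "bookcase"
cap.release()
```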
People can search for photos by typing in terms like “dinner last December in San Francisco” or “Hawaii sunsets.” “It’s not enough to say what’s in a photo — you have to filter out the most relevant,” Rizk said.
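A query like “dinner last December in San Francisco” mixes content (“dinner”) with metadata (time and place), and Rizk’s point about filtering suggests the survivors are then ranked. The function below is a hypothetical sketch of that two-stage search; the Photo fields and the upstream query parsing are assumptions, not EyeEm’s pipeline.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Photo:
    tags: set          # content labels from an image model
    taken: datetime    # capture timestamp, e.g. from EXIF
    place: str         # geocoded location
    score: float       # model's confidence in its top tag

def search(photos, tags, month=None, year=None, place=None, top_k=20):
    # A query like "dinner last December in San Francisco" would be
    # parsed upstream into tags={"dinner"}, month=12, year=...,
    # place="San Francisco". Illustration only.
    hits = [p for p in photos
            if tags <= p.tags
            and (month is None or p.taken.month == month)
            and (year is None or p.taken.year == year)
            and (place is None or p.place == place)]
    # "It's not enough to say what's in a photo": rank the survivors
    # by how confident the model is that the tag really applies.
    return sorted(hits, key=lambda p: -p.score)[:top_k]
```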
