Thanks to artificial intelligence and new machine-learning algorithms, facial recognition has moved beyond the security applications popularized by Hollywood as an easy way to spot bad guys. New facial recognition methods can not only identify people but also determine their age, gender, ethnicity, and emotional state, and whether the face in front of the camera is “live” and present in person.
One company leading these developments is Ever AI, which today announced enhancements to its facial recognition product suite. Among the new features is “liveness detection,” which determines whether the face presented to a camera belongs to a live person as opposed to a photo.
Ever AI said attackers have tried to bypass facial recognition authentication by holding up a photo of someone’s face, printed on paper or displayed on a mobile device’s screen, in an attempt to gain access to a computer or phone.

Doug Aley, CEO of Ever AI, said the prevalence of social media has made it a lot easier for attackers to spoof facial recognition systems.
“You can easily get a printout of somebody’s face, and if the facial recognition technology doesn’t have liveness detection on it, you could probably spoof that face recognition by holding up a piece of paper to a camera,” Aley said.
In addition, the liveness detection can help simplify security systems that aim to prove “liveness” by having customers move their heads from side to side or “hold up a peace sign” with their hands, Aley said.
“That’s something that in the past people have relied on; it’s not the smartest technology and creates more friction for customers,” he said.
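To make that concrete, here is a minimal Python sketch of how a passive liveness score might gate a face-authentication decision. The data types, thresholds, and function names are illustrative assumptions, not Ever AI’s actual API; the point is simply that a strong face match is rejected whenever the frame does not look like a live person.

```python
from dataclasses import dataclass

# Illustrative data shapes -- not Ever AI's actual API, just a sketch of how a
# liveness score could gate a face-authentication decision.

@dataclass
class FaceMatch:
    subject_id: str
    similarity: float       # 0.0-1.0, higher means a closer match

@dataclass
class LivenessResult:
    is_live: bool
    confidence: float       # 0.0-1.0

SIMILARITY_THRESHOLD = 0.85   # assumed thresholds, chosen for the example
LIVENESS_THRESHOLD = 0.90

def authenticate(match: FaceMatch, liveness: LivenessResult, expected_subject: str) -> bool:
    """Grant access only if the face matches AND the frame looks like a live
    person rather than a printed photo or a replayed screen image."""
    if not liveness.is_live or liveness.confidence < LIVENESS_THRESHOLD:
        return False  # likely spoof attempt: paper printout, phone screen, etc.
    return match.subject_id == expected_subject and match.similarity >= SIMILARITY_THRESHOLD

# A strong face match that fails the liveness check is still rejected.
print(authenticate(FaceMatch("alice", 0.97),
                   LivenessResult(is_live=False, confidence=0.40),
                   expected_subject="alice"))   # -> False
```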
Ever AI, which began as a consumer photo and video storage and organization app, trained its AI with a data set of more than 13 billion photos and videos. The company’s models can be accessed as self-hosted application programming interfaces (APIs) or through an Android software development kit (SDK). The company does not give customers access to Ever’s entire photo and video data set, Aley said.
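For a sense of what self-hosted access can look like in practice, here is a hedged sketch of a client calling a REST-style recognition endpoint. The base URL, route, and response fields are assumptions made for illustration; Ever AI’s actual API and SDK interfaces may differ.

```python
import requests

# Hypothetical self-hosted endpoint and response fields -- a sketch of how a
# REST-style recognition service might be called, not Ever AI's real API.
API_BASE = "https://faces.internal.example.com/v1"

def identify_face(image_path: str) -> dict:
    """Upload an image to the self-hosted service and return its best match.
    The route and JSON fields are assumptions for illustration."""
    with open(image_path, "rb") as f:
        resp = requests.post(f"{API_BASE}/identify", files={"image": f}, timeout=5)
    resp.raise_for_status()
    # e.g. {"subject_id": "emp-1042", "similarity": 0.93, "liveness": 0.97}
    return resp.json()

if __name__ == "__main__":
    print(identify_face("visitor.jpg"))
```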
Getting robots to recognize humans
One big customer of the company’s models is SoftBank Robotics, maker of the Pepper humanoid robot. The company “relies on Ever AI’s face recognition technology to create more personalized experiences for our retail and hospitality customers,” said Omar Abdelwahed, head of studio for SoftBank Robotics America, in a statement.
Service robots, particularly humanoid robots, will need facial recognition to communicate better with humans, Aley said. The company recently showed off its technology at a San Francisco event at which up to 600 attendees checked in with the Pepper robot.
The robot would scan an attendee’s face and identify the person against a LinkedIn image the attendee had uploaded in advance. Once confirmed, the robot would greet the attendee by name, Aley said. “It was a wow-type moment,” he said.
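A check-in flow like that can be sketched as a one-to-one comparison between a freshly captured frame and the embedding enrolled from the attendee’s pre-uploaded photo. The embedding stub, threshold, and greeting logic below are assumptions for the sketch, not SoftBank’s or Ever AI’s implementation.

```python
import numpy as np

# Sketch of a 1:1 check-in flow: one reference embedding per attendee, one
# comparison per captured frame. The embedding stub and threshold are made up.

def embed(image: np.ndarray) -> np.ndarray:
    """Stand-in for a real face-embedding model: returns a 128-d unit vector."""
    vec = np.resize(image.astype(np.float64).ravel(), 128)
    return vec / (np.linalg.norm(vec) + 1e-9)

MATCH_THRESHOLD = 0.80   # assumed cosine-similarity cutoff

def greet(captured_frame: np.ndarray, reference_embedding: np.ndarray, name: str) -> str:
    score = float(np.dot(embed(captured_frame), reference_embedding))
    return f"Welcome, {name}!" if score >= MATCH_THRESHOLD else "Hi! Please check in at the desk."

# Dummy "photo" standing in for both the attendee's pre-uploaded image and the
# frame the robot captures at the door.
photo = np.random.default_rng(0).random((64, 64))
reference = embed(photo)                 # enrolled ahead of the event
print(greet(photo, reference, "Alex"))   # -> "Welcome, Alex!"
```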
A big advantage for robots is their ability to remember people.
“Take my Starbucks, for example,” Aley said. “Downstairs, they get about 600 customers a day, and I know two of the baristas incredibly well, they know my name and my drink.”
“But the other six to 10 baristas that rotate in and out of that Starbucks don’t, and that’s frustrating as a customer,” he said. “It’s understandable – how can they possibly keep track of the 600 people a day that come in?”
“Robots don’t have that problem [with human recall],” Aley added. “They’re not at the point yet where they can provide the completely personalized experience, but they are at a place where they can greet you by name, and greet everyone by name.”
Other facial recognition applications
In addition to humanoid robots, facial recognition could be used with security robots that need to authenticate employees or visitors entering a building.
In the industrial robotics space, Ever AI is working with customers in Europe, Aley noted. Manufacturers are adding facial recognition to their production lines to identify employees working on machines for better auditing and safety purposes.
Facial recognition has also been useful in autonomous vehicles. AI firm Affectiva calls its technology “Emotion AI” and uses it to help detect whether drivers are drowsy or distracted.
Aley said facial recognition could also be used to help identify people within cars to see what seat they’re in for insurance purposes – such as who was driving when an accident happened, or what their emotional state was during the event.
The biggest challenge for facial recognition in robotics is speed, which determines whether a robot can query a facial database over the network or needs the database stored at the edge. Aley said his company’s goal is to make the technology “as fast as robotically possible” for systems that don’t have speedy connectivity to networks or cloud services.
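One way to avoid a slow round trip is to sync the embedding gallery to the device ahead of time and do the matching locally. The sketch below assumes a small made-up gallery of 128-dimensional embeddings and a cosine-similarity threshold; it illustrates the edge-storage trade-off Aley describes, not any particular vendor’s implementation.

```python
import numpy as np

# Sketch of edge-side matching: the embedding gallery is synced to the robot
# ahead of time, so recognition does not wait on a cloud round trip.
# Names, dimensions, and threshold here are made up for illustration.

rng = np.random.default_rng(42)

def unit(v: np.ndarray) -> np.ndarray:
    return v / np.linalg.norm(v)

# Local gallery stored on the device: one 128-d unit embedding per person.
gallery_names = ["alice", "bob", "carol"]
gallery = np.stack([unit(rng.standard_normal(128)) for _ in gallery_names])

def identify_on_device(probe: np.ndarray, threshold: float = 0.6):
    """Return the best-matching enrolled name, or None if nothing is close.
    One matrix-vector product -- no network call in the hot path."""
    scores = gallery @ unit(probe)
    best = int(np.argmax(scores))
    return (gallery_names[best], float(scores[best])) if scores[best] >= threshold else None

# A probe close to "bob"'s stored embedding resolves locally.
probe = gallery[1] + 0.05 * rng.standard_normal(128)
print(identify_on_device(probe))   # -> ('bob', ~0.99)
```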