“This is a new technique we developed in-house that wraps a face with our AI algorithms,” said Alethea AI CEO Arif Khan. “Avatars are fun to play with and develop, but these ‘masks/skins’ are a different, more potent, animal to preserve privacy.”
The Los Angeles-based startup launched in 2019 with a focus on creating avatars that content creators could license out for revenue. The idea comes as deepfakes, or manipulated media that can make someone appear to be doing or saying almost anything, become more accessible and widespread.
According to a 2019 report from Deeptrace, a company that detects and monitors deepfakes, there were more than 14,000 deepfakes online that year, and more than 850 people were targeted by them. Alethea AI wants to let creators use their own synthetic media avatars for marketing purposes, in a sense letting people leverage deepfakes of themselves for money.
Khan compares the current proliferation of facial recognition data to the Napster-style explosion in music piracy in the early 2000s. Companies like Clearview AI have already harvested large amounts of facial data from people without their consent and resold it to security services, complete with all the bias inherent in facial recognition algorithms, which are generally less accurate on women and people of color.
Clearview AI has marketed itself to law enforcement and scraped billions of images from websites like Facebook, YouTube, and Venmo. The company is currently being sued for doing so.
“We will get to a point where there needs to be an iTunes sort of layer, where your face and voice data somehow gets protected,” said Khan.
One part of that is creators licensing out their likeness for a fee. Crypto entrepreneur Alex Masmej was the first to offer such an avatar: for $99, you can hire it to say 200 words of whatever you want, provided the real Masmej approves the text.
“There are a lot of Black Mirror scenarios when we think of deepfakes but if my personal approval is needed for my deepfakes and it’s then time-stamped on a public blockchain for anyone to verify the videos that I actually want to release, that provides a protection that deepfakes are currently lacking,” said Masmej.
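The verification idea Masmej describes can be illustrated with a short sketch: hash an approved clip and record that fingerprint with a timestamp, so anyone can later check whether a circulating video matches one the subject actually signed off on. This is not Alethea AI’s implementation; the file name and the step of publishing the record to a public blockchain are assumptions, shown here only to make the concept concrete.

```python
# Minimal sketch of on-chain approval verification (not Alethea AI's code).
# Hash an approved deepfake video, attach a timestamp and approver name;
# the resulting record is what would be published to a public blockchain.
import hashlib
import json
import time

def approval_record(video_path: str, approver: str) -> dict:
    """Fingerprint an approved video for later verification."""
    with open(video_path, "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()
    return {
        "approver": approver,
        "video_sha256": digest,
        "approved_at": int(time.time()),  # timestamp that would go on-chain
    }

def matches_record(video_path: str, record: dict) -> bool:
    """Anyone holding the public record can re-hash a clip and compare."""
    with open(video_path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest() == record["video_sha256"]

if __name__ == "__main__":
    # "approved_clip.mp4" is a placeholder path for an approved video.
    record = approval_record("approved_clip.mp4", approver="Alex Masmej")
    print(json.dumps(record, indent=2))
```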
The privacy pilot takes this idea one step further, not only creating a deepfake to license out, but also preventing companies, or anyone else, from grabbing your facial data from a recording.
There are two parts to the privacy component. The first, currently being piloted, works on pre-recorded videos. Users upload a video and identify where and which face skin they would like superimposed on their own; Alethea AI’s algorithms then map the key points on the user’s face and wrap the mask around that key-point map. The finished video is sent back to the client.
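The general shape of that pipeline, detecting facial key points in each frame and warping a replacement skin onto them, can be sketched with off-the-shelf tools. The following is a minimal illustration using MediaPipe FaceMesh and OpenCV, not Alethea AI’s algorithm; the skin image, its reference points, and the file names are placeholder assumptions.

```python
# Minimal sketch of keypoint-based face "skin" overlay on a pre-recorded video.
# Not Alethea AI's pipeline: it only illustrates mapping facial key points per
# frame and wrapping a replacement skin image around them.
# Assumes opencv-python, mediapipe, numpy; "skin.png" and its anchor
# coordinates are placeholders you would supply yourself.
import cv2
import mediapipe as mp
import numpy as np

face_mesh = mp.solutions.face_mesh.FaceMesh(static_image_mode=False, max_num_faces=1)

# Three stable FaceMesh landmark indices: outer eye corners and chin.
ANCHOR_IDS = [33, 263, 152]

skin = cv2.imread("skin.png")  # the mask/skin to superimpose (placeholder asset)
# Where those same three anchors sit inside skin.png (placeholder coordinates).
skin_pts = np.float32([[120, 180], [380, 180], [250, 460]])

cap = cv2.VideoCapture("input.mp4")
w = int(cap.get(cv2.CAP_PROP_FRAME_WIDTH))
h = int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT))
fps = cap.get(cv2.CAP_PROP_FPS) or 30
out = cv2.VideoWriter("masked.mp4", cv2.VideoWriter_fourcc(*"mp4v"), fps, (w, h))

while True:
    ok, frame = cap.read()
    if not ok:
        break
    results = face_mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.multi_face_landmarks:
        lm = results.multi_face_landmarks[0].landmark
        # Map the three anchor landmarks from normalized coords to pixels.
        face_pts = np.float32([[lm[i].x * w, lm[i].y * h] for i in ANCHOR_IDS])
        # Affine transform carrying the skin's anchors onto the face's anchors.
        M = cv2.getAffineTransform(skin_pts, face_pts)
        warped = cv2.warpAffine(skin, M, (w, h))
        # Composite: wherever the warped skin has pixels, cover the real face.
        mask = warped.sum(axis=2) > 0
        frame[mask] = warped[mask]
    out.write(frame)

cap.release()
out.release()
```

A production system would do far more, such as blending seams, tracking pose, and ensuring no original facial geometry leaks through, but the per-frame keypoint-then-wrap loop is the core idea the pilot describes.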
Alethea AI also wants to enable face masking during real-time communications, such as over a Zoom call, but Khan says computing power doesn’t quite allow that yet. He hopes it will be possible within a year.