The maker of a defunct cloud photo storage app that pivoted to selling facial recognition services has been ordered to delete user data and any algorithms trained on it, under the terms of an FTC settlement.
The regulator investigated complaints that the Ever app -- which earlier gained notoriety for using dark patterns to spam users' contacts -- had applied facial recognition to users' photographs without properly informing them what it was doing with their selfies.
Under the proposed settlement, Ever must delete photos and videos of users who deactivated their accounts and also delete all face embeddings (i.e. data related to facial features which can be used for facial recognition purposes) that it derived from photos of users who did not give express consent to such a use.
Moreover, it must delete any facial recognition models or algorithms developed with users’ photos or videos.
This full suite of deletion requirements -- covering not just the data but anything derived from it or trained on it -- is causing great excitement in legal and tech policy circles, with experts suggesting it could have implications for other facial recognition software trained on data that wasn't lawfully processed.
Or, to put it another way, tech giants that surreptitiously harvest data to train AIs could find their algorithms in hot water with the US regulator.
This is revolutionary - and fascinating to see the US beats the EU in drawing this consequence https://t.co/20evtGaZM5
— Mireille Hildebrandt (@mireillemoret) January 12, 2021
That could require deleting the core ML models underlying Facebook Newsfeed or Google Search
— ashkan soltani (@ashk4n) January 12, 2021
The quick background here is that the Ever app shut down last August, claiming it had been squeezed out of the market by increased competition from tech giants like Apple and Google.
However, the move followed an investigation by NBC News -- which in 2019 reported that app maker Everalbum had pivoted to selling facial recognition services to private companies, law enforcement and the military (under the brand name Paravision) -- apparently repurposing people's family snaps to train face-reading AIs.
One commissioner, Rohit Chopra, issued a standalone statement in which he warns that current-generation facial recognition technology is "fundamentally flawed and reinforces harmful biases", saying he supports "efforts to enact moratoria or otherwise severely restrict its use".
"Until such time, it is critical that the FTC meaningfully enforce existing law to deprive wrongdoers of technologies they build through unlawful collection of Americans’ facial images and likenesses," he adds.
Chopra's statement highlights the fact that commissioners have previously voted to allow data protection law violators to retain algorithms and technologies that "derive much of their value from ill-gotten data", as he puts it -- flagging an earlier settlement with Google and YouTube under which the tech giant was allowed to retain algorithms and other technologies "enhanced by illegally obtained data on children".
And he dubs the Ever decision "an important course correction".
Ever has not been fined under the settlement -- something Chopra describes as "unfortunate" (saying it's related to commissioners "not having restated this precedent into a rule under Section 18 of the FTC Act").
He also highlights the fact that Ever avoided processing the facial data of a subset of users in US states which have laws against facial recognition and the processing of biometric data -- citing that as an example of "why it's important to maintain States' authority to protect personal data". (NB: Ever also avoided processing the biometric data of users in the EU, another region with data protection laws.)
"With the tsunami of data being collected on individuals, we need all hands on deck to keep these companies in check," he goes on. "State and local governments have rightfully taken steps to enact bans, moratoria, and other restrictions on the use of these technologies. While special interests are actively lobbying for federal legislation to delete state data protection laws, it will be important for Congress to resist these efforts. Broad federal preemption would severely undercut this multifront approach and leave more consumers less protected.
"It will be critical for the Commission, the states, and regulators around the globe to pursue additional enforcement actions to hold accountable providers of facial recognition technology who make false accuracy claims and engage in unfair, discriminatory conduct."
Paravision has been contacted for comment on the FTC settlement.
Update: Paravision sent us this statement via email:
The FTC Consent Order reflects a change that has already taken place. The Ever service was closed in August 2020 and the company has no plans to run a consumer business moving forward. In September 2020, Paravision released its latest-generation face recognition model which does not use any Ever users’ data. The consent order mirrors the course we had already set and reinforces a mindful tone as we look ahead.
Face recognition and computer vision technology have the potential to improve our lives in profound ways and we take the gravity of its impacts extremely seriously. Paravision has been repeatedly recognized by the U.S. Government through NIST as the most accurate provider of face recognition from the U.S., UK, and Europe. We look forward to maintaining this position with our latest generation model, and are deeply committed to the ethical development and use of this technology.