Accused of using users' photos without permission to develop facial recognition technology, the now-defunct photo storage app Ever (whose parent company has been renamed Paravision AI) reached a settlement with the United States Federal Trade Commission (FTC) last Monday (11).
The decision requires Everalbum Inc., the app's owner, to delete all photos and videos belonging to users who deactivated their accounts, as well as any facial recognition algorithms developed from that media.
All “face embeddings” (data representations used for facial recognition purposes) derived from photos obtained without users' consent must also be deleted.
In addition, the company must be more transparent about how it collects and uses user information, and about how the app protects users' privacy.
If it sells the software commercially, the company will need customers' consent before using any biometric information it collects, whether to create new face embeddings or to develop facial recognition technologies.
Long-standing problem
The Ever app launched in 2013 as a cloud storage service. However, after its administrators realized the app would not take off in that market, it was repositioned in 2017 as a provider of facial recognition technology.
The big problem is that, according to a report published by NBC News in 2019, Ever was using users' private photos to train its facial recognition algorithm before selling the technology to customers.
Although this was true, the company later promised to delete media from deactivated accounts. According to the FTC, however, it did not fulfill that promise until mid-October 2019, culminating in the dispute (now resolved) with the agency.
In response to the case, a Paravision AI spokesperson stated that the FTC decision reflects a "change that has already occurred" and that Paravision's facial recognition technology does not use any data from Ever users.
Via: The Verge