Photographs are a great way for people to document their lives. As photos have gone digital, so have photo albums, thanks to apps like Ever.

“Ever is a company dedicated to helping you capture and rediscover your life’s memories,” the app’s website says. However, it seems Ever has done more than simply help people “capture” memories.

A new report by NBC News has revealed that Ever uses the photos people share to train the company’s facial recognition system. Then, Ever offers to sell that technology to private companies, law enforcement, and the military.

Ever’s practices were nowhere made clear on its website or in its app until NBC News reached out to the company in April. Only then was a brief reference added to its privacy policy, stating:

“To organize your Files and to enable you to share them with the right people, Ever uses facial recognition technologies as part of the Service. Your Files may be used to help improve and train our products and these technologies. Some of these technologies may be used in our separate products and services for enterprise customers, including our enterprise face recognition offerings, but your Files and your personal information will not be. Your Files, and any personal information contained in them, are not shared with or provided to third parties.”

Jason Schultz, a law professor at New York University, said Ever AI needs to do more to inform people about how their photos are being used. He told NBC News that burying the language in a 2,500-word privacy policy isn’t enough.

“They are commercially exploiting the likeness of people in the photos to train a product that is sold to the military and law enforcement,” Schultz told NBC News. “The idea that users have given real consent of any kind is laughable.”

Even with its policy update, Ever itself doesn’t say much about what’s going on, but the company’s AI website is far clearer.

“Our customers trust Ever AI’s technology to deliver mission-critical solutions for surveillance & monitoring, physical access control, and digital authentication,” the company writes.

Ever’s AI website also claims “government agencies can improve outcomes for surveillance and bolster building security and access for classified areas.”

In a news release, the company confirmed it has an “ever-expanding private global dataset of 13 billion photos and videos” and uses them to offer “best-in-class face recognition technology.” According to NBC News, Ever’s CEO Doug Aley confirmed that those photos come from Ever users.

The company also says its technology can estimate emotion, ethnicity, gender, and age. Those are all troubling claims. It’s not possible to determine someone’s gender from their face, for example, and the assertion that it is reveals how deeply transphobia is ingrained in technology.

In addition, AI Now’s 2018 report denounced “affect recognition,” or “a subclass of facial recognition that claims to detect things such as personality, inner feelings, mental health, and ‘worker engagement’ based on images or video of faces.”

“These claims are not backed by robust scientific evidence, and are being applied in unethical and irresponsible ways that often recall the pseudosciences of phrenology and physiognomy,” AI Now wrote.

Although Ever claims that no personal information is shared with third parties, privacy advocates still see the entire setup as a huge violation.

“This looks like an egregious violation of people’s privacy. They are taking images of people’s families, photos from a private photo app, and using it to build surveillance technology. That’s hugely concerning,” Jacob Snow, a technology and civil liberties attorney at the American Civil Liberties Union (ACLU) of Northern California, told NBC News.

This isn’t the first time reports have revealed companies using people’s photos to train facial recognition without consent. Earlier this year, NBC News reported that IBM had taken nearly a million photos from Flickr for the same purpose, again without anyone’s permission.

Slate also ran a report revealing how the government uses images of vulnerable people to train facial recognition technology. Those included images of immigrants, abused children, and dead people, all without consent.

As facial recognition grows in popularity, issues around consent are becoming more relevant. After all, if your own face is used without permission to train facial recognition, then how is it ever possible to opt out?