A security researcher claims UK health club and gym chain Total Fitness bungled its data protection responsibilities by failing to lock down a database chock-full of members' personal data.
The researcher told us more than 474,000 images of both members and staff – including men, women, and children – were stored in a database that was left unprotected and publicly accessible without the need for a password.
The database totaled 47.7GB, according to cybersecurity researcher Jeremiah Fowler, who reported the findings to vpnMentor. It also included a cache of images revealing individuals' identity documents, bank and payment card information, as well as phone numbers and, in some rare cases, immigration records, Fowler claimed.
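Neither Fowler's write-up nor Total Fitness has named the storage technology involved, but the core failure described is simple: the store answered requests carrying no credentials at all. As a purely illustrative sketch, with a hypothetical endpoint URL standing in for the real one, a defender could check whether an address is readable without authentication along these lines:

```python
# Illustrative sketch only: the endpoint URL is hypothetical, and the actual
# database technology behind the Total Fitness exposure has not been disclosed.
import requests

ENDPOINT = "https://storage.example.com/member-images/"  # hypothetical address

def is_publicly_readable(url: str) -> bool:
    """Return True if the URL serves content to a request with no credentials."""
    try:
        resp = requests.get(url, timeout=10)
    except requests.RequestException:
        return False
    # A 200 response to an unauthenticated GET means anyone who finds the
    # address can read it; a 401 or 403 indicates some form of access control.
    return resp.status_code == 200

if __name__ == "__main__":
    if is_publicly_readable(ENDPOINT):
        print("Endpoint responds without credentials: lock it down.")
    else:
        print("Endpoint refused the unauthenticated request.")
```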
"This raises privacy concerns regarding how companies collect images of members or customers, how they are stored, how long they are kept, and who has access to them," said the researcher. "Many people choose to stay private online and do not publicly share images of themselves, their friends, families, or children.
"Nearly all social media accounts offer users the ability to have a private profile and have strict control over who can access their content. However, this doesn't seem to be the case for member-uploaded images on Total Fitness platforms. It is hypothetically possible that the images stored in the backend database are potentially retained even after being deleted by the member. This would potentially explain why the database contained images of sensitive documents."
The database, which is now locked down, was populated with various images including shots of members' faces, submitted either by the members themselves when registering for memberships online, or by staff signing members up on-site.
Total Fitness said members' images comprised only a "subset" of the total cache, with other files including shots of things like merchandise and commercial imagery. As for the volume of images in the database, the health club chain stressed that only a very small number contained personally identifiable information (PII).
The explanation is at odds with Fowler's account of things. He claims the database was almost entirely populated with member images to the tune of roughly 97 percent – not merely a "subset" as Total Fitness said.
Total Fitness told The Reg that the images were collected and stored for "legitimate business purposes" and that they're used to ensure memberships aren't misused and to identify specific members around the facility, where necessary.
The company has 15 health clubs across northern England and Wales which serve more than 100,000 members and 600 staff. Those affected will be contacted by the company today, it said.
From the company's initial response to our questions, it doesn't sound like those who had only their images exposed will be notified, but El Reg chased for clarification on the matter.
"While an initial investigation found no data that could in [and] of itself identify an individual, other than a photo of the member, we continued our analysis and conducted a more thorough examination, including an image-by-image check," a company spokesperson told us.
"This forensic examination identified a small subset of 114 images provided by members that included information that could be used to identify a member. We immediately disabled this folder and these images have been removed.
"From a review of our logs, we can see no evidence of anyone accessing these files prior to the alert."
The spokesperson also said the UK's Information Commissioner's Office (ICO) was informed of the situation and that Total Fitness would be supporting it with any investigations it may launch.
"We have made it a priority to ensure that images of members are not combined with other data that can identify the member through updates to staff and will make this clearer in our communication to new members who join via the app which does not involve staff intervention on upload of images," the spokesperson added.
Potential for abuse
Total Fitness said its logs showed no evidence that any unauthorized party, other than Fowler, had accessed the database. It's not known how long it was left unprotected, but the researcher reckons he spotted files that appeared to have been uploaded as far back as March 2021.
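Total Fitness hasn't described its logging setup, so any reconstruction of that review is guesswork. As a hypothetical sketch, assuming standard web-server access logs in combined log format and a short allow-list of known addresses, the check might look something like this:

```python
# Hypothetical sketch of a log review: flag any request for the exposed image
# paths that did not come from a known address. The log file name, log format,
# and allow-list are assumptions; Total Fitness has not described its logging.
import re

LOG_FILE = "access.log"                       # hypothetical
KNOWN_IPS = {"203.0.113.10", "203.0.113.11"}  # e.g. the researcher and internal hosts
LINE_RE = re.compile(r'^(?P<ip>\S+) \S+ \S+ \[(?P<ts>[^\]]+)\] "(?P<req>[^"]*)"')

def unexpected_hits(path_fragment: str = "/member-images/"):
    """Yield (ip, timestamp, request) for hits on the image paths from unknown IPs."""
    with open(LOG_FILE, encoding="utf-8", errors="replace") as fh:
        for line in fh:
            m = LINE_RE.match(line)
            if not m:
                continue
            if path_fragment in m.group("req") and m.group("ip") not in KNOWN_IPS:
                yield m.group("ip"), m.group("ts"), m.group("req")

if __name__ == "__main__":
    for ip, ts, req in unexpected_hits():
        print(f"{ts}  {ip}  {req}")
```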
Regardless, some customers will doubtless be peeved at the privacy violation, especially with images of children being stored with no protection.
Fowler stressed the seriousness of the matter by raising the growing issue of AI and deepfake technology.
Images can be pumped into an online reverse image search tool to reveal real identities, which criminals could then theoretically use to victimize affected people with various flavors of cybercrime.
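Commercial reverse image search engines keep their matching internals to themselves, but the general idea can be illustrated with perceptual hashing, where visually similar images produce hashes that differ by only a few bits. A minimal sketch using the third-party Pillow and imagehash Python libraries follows; the file names and the distance threshold are assumptions for illustration:

```python
# Illustrative only: perceptual hashing is one way near-duplicate images can be
# matched, which is broadly how a leaked photo gets linked to copies of it
# elsewhere on the web. File names here are hypothetical.
from PIL import Image
import imagehash  # third-party: pip install imagehash pillow

leaked = imagehash.phash(Image.open("leaked_member_photo.jpg"))
candidate = imagehash.phash(Image.open("public_profile_photo.jpg"))

# Subtracting two hashes gives a Hamming distance: small values mean the
# images are visually similar.
distance = leaked - candidate
print(f"Hamming distance: {distance}")
if distance <= 8:  # threshold is a judgment call, not a standard
    print("Images are likely the same or near-duplicates.")
```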
Deepfake imagery in general has been abused in sextortion scams for some time, for example. The FBI issued an alert about some concerning cases last year, and as recently as April 2024 the UK's National Crime Agency (NCA) also issued similar warnings.
There also exists the possibility that these images could be used by fake social media or dating app profiles to carry out romance scams.
In one of the limited cases Fowler investigated, a reverse image search of a Total Fitness member led to their identification as a content creator on the OnlyFans platform. Feasibly, such imagery could be used to target existing fans with financial scams.
It doesn't just affect those with an especially public profile, either; anyone can be the victim of identity theft in these cases, including Fowler himself.
He said: "This is a personal issue for me. I recently received several messages from people that I do not know informing me that my information and pictures were being used to contact multiple women in an attempted romance scam. I have no idea why they chose me when they could have probably had a better success rate using pictures of Brad Pitt or Jeremy Clarkson, but luckily, as a public person in my role as a cybersecurity researcher, I am easy to contact.
"When the potential victims became suspicious, they reached out to me directly using an official channel, and I confirmed that I was being impersonated. I know firsthand that knowing someone is attempting to use your identity and images to harm others can be quite upsetting. This is why awareness is an important first step to protect digital identities online." ®