05 May 2026
Every year, millions of people cross finish lines. They pin numbers to their chests, attach timing chips to their shoes, and move through courses lined with cameras. It feels like sport. It is also, in ways that have never been properly examined, one of the largest unregulated mass data collection operations in the United Kingdom.
I recently reviewed privacy policies of major UK sporting event organisers. What I found was alarming: not because the organisations appeared malicious, but because the gaps were so systematic and so unremarked upon. Medical data collected without a specified legal basis. International data transfers undisclosed. No separate consent mechanism for special category data. Vague retention periods. No group-specific data protection notice. A privacy framework designed for a simpler world, applied without revision to something considerably more complex.
But the privacy policy failures, significant as they are, are not the deepest problem. The deepest problem is something the policy does not address at all, because nobody has asked it to.
Here is what happens when you enter a mass participation sporting event.
At registration, you provide your name, phone number, postal address, email address, date of birth or age category, and in many cases additional personal details including emergency contact information and medical data. You agree to terms and conditions that include, somewhere in the text, an acknowledgement that you may be photographed.
On race day, your name is linked to a participant number. That number is printed on a bib and attached to the front of your body, visible to everyone around you for the duration of the event.
Your result — your name, your finishing time, your age category — is published publicly. Most events make results searchable by participant number, by name, or both. This database is open to anyone with an internet connection.
Official photographs are taken throughout the event by professional photographers. These images are processed and published in a publicly accessible gallery, typically organised by participant number. For a modest fee, anyone can download a high-resolution image, or in some cases an entire gallery of dozens of images.
Many events also offer live GPS tracking, publishing participants' real-time location data on a publicly accessible map, linked to their name.
Now consider what this means in practice. A person standing at the finish line with a smartphone can photograph a participant crossing. They have a face and a participant number. They spend thirty seconds on the results page. Now they have a name and an age category. They pay for access to the official gallery. Now they have a high-resolution photograph. They did not interact with the data controller. They did not agree to any terms. They did not trigger any data protection mechanism.
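The re-identification chain above amounts to a trivial join over two publicly accessible datasets. A minimal sketch, using invented names, bib numbers, and URLs purely for illustration:

```python
# Illustrative only: synthetic stand-ins for two publicly accessible datasets.
# All names, bib numbers, and URLs are invented.

# The searchable results page: bib number -> name, age category, finish time.
results = [
    {"bib": 1042, "name": "A. Example", "age_cat": "V40", "time": "01:42:17"},
    {"bib": 1043, "name": "B. Sample", "age_cat": "SEN", "time": "01:38:05"},
]

# The official photo gallery, indexed by bib number.
gallery = [
    {"bib": 1042, "photo_url": "https://gallery.example/1042_finish.jpg"},
]

def identify(bib_seen_on_course):
    """Join a bib number read off a photograph against the public results."""
    person = next((r for r in results if r["bib"] == bib_seen_on_course), None)
    if person is None:
        return None
    photos = [g["photo_url"] for g in gallery if g["bib"] == bib_seen_on_course]
    return {**person, "photos": photos}

# One bib number observed in public now yields a name, age category,
# finishing time, and high-resolution imagery.
profile = identify(1042)
```

The point of the sketch is that no step requires the data controller: both inputs are published, and the join key is printed on the participant's chest.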
The participant consented to being photographed. They did not consent to having their image linked to their name, age, and performance data and made available for download by any member of the public. Those are not the same thing.
The legal framework here is less ambiguous than the events industry appears to assume.
A photograph is not automatically biometric data. It becomes special category biometric data under Article 9 at the point it is processed using specific technical means to uniquely identify a person. Race photography galleries do not, in isolation, cross that threshold. Facial recognition matching systems, which several major providers now offer as standard, do — unambiguously and without exception. The ICO has confirmed this position. Processing special category biometric data requires not just a lawful basis under Article 6, but an explicit condition under Article 9(2): explicit consent, substantial public interest, or one of a narrow set of other grounds.
Events typically rely on consent as their lawful basis for photography. That consent is obtained at registration, bundled into general terms and conditions, in language such as "you may be photographed during the event and images may be used for promotional purposes."
This is inadequate for several reasons.
First, consent under UK GDPR must be specific, informed, and unambiguous. A general acknowledgement buried in registration terms does not constitute specific, informed consent to having identifiable images linked to personal data and published in a searchable public database.
Second, the consent obtained at registration describes a different processing activity than what actually occurs. "You may be photographed" describes capturing an image. It does not describe linking that image to a participant number, publishing it in a searchable gallery indexed by name, making it available for download, or running it through facial recognition software.
Third, the combination of separately collected data points creates something the consent framework was never designed to address. The results database is not photographs. The official gallery is not a results database. But combined, they constitute a biometric identification system that any member of the public can access and use, without the knowledge or involvement of the data controller.
This is not a theoretical concern about future technology. It is a description of current commercial practice.
Several major race photography providers now offer facial recognition matching as a standard feature. Participants upload a selfie after the event. The system scans thousands of race photographs, matches facial geometry, and returns the images in which that participant appears. The feature is marketed as a convenience: no more searching through thousands of images by number. It is also, without ambiguity, biometric processing under Article 9.
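The matching step these providers describe is, in essence, a nearest-neighbour search over facial-geometry embeddings. The sketch below uses synthetic four-dimensional vectors as stand-ins for real embeddings (which a provider would derive from images with a face-recognition model); the vectors, file names, and threshold are all invented for illustration.

```python
import numpy as np

# Synthetic vectors standing in for facial-geometry embeddings.
# A real provider derives these from photographs with a trained model;
# the matching step itself is a similarity search like this one.
gallery_embeddings = {
    "photo_0001.jpg": np.array([0.9, 0.1, 0.3, 0.2]),
    "photo_0002.jpg": np.array([0.1, 0.8, 0.2, 0.7]),
    "photo_0003.jpg": np.array([0.85, 0.15, 0.35, 0.25]),
}

def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_selfie(selfie_embedding, threshold=0.98):
    """Return the gallery photos whose embeddings are close to the selfie's."""
    return [name for name, emb in gallery_embeddings.items()
            if cosine(selfie_embedding, emb) >= threshold]

# The uploaded selfie, embedded the same way as the gallery.
selfie = np.array([0.88, 0.12, 0.32, 0.22])
matches = match_selfie(selfie)
```

Every comparison in that loop is a processing operation on facial geometry. Whatever the marketing language, this is the "specific technical means to uniquely identify a person" that Article 9 describes.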
The legal basis for this processing is rarely made explicit in event registration terms. Participants are not typically told at registration that their facial geometry may be processed by a third-party photography provider using automated recognition technology. The data processing agreement between the event organiser and the photography provider is not disclosed, and there is no evidence in any of the privacy policies I reviewed that such agreements are uniformly in place.
This matters because Article 13 of UK GDPR requires data controllers to provide specific information about processing at the point of collection, including the identity of any third-party processors, the purposes and legal basis for processing, and whether automated decision-making or profiling is involved. Facial recognition is automated processing. It involves biometric data. The legal basis must be explicit. In none of the registration processes I examined was this disclosed.
Photography providers process and store images on servers outside the United Kingdom. Under Article 13, participants must be informed of any transfer to a third country at the point of registration, including the legal basis for that transfer. This disclosure was absent from every registration process I examined. The transfer itself may be lawful: EU and EEA countries, along with New Zealand and, for commercial organisations, Canada, for example, hold adequacy status under UK GDPR. But adequacy does not remove the obligation to disclose. Participants cannot give informed consent to processing they have not been told is happening.
Mass participation sporting events have existed for decades. The data infrastructure surrounding them has changed beyond recognition in the past fifteen years. Online registration, searchable results databases, GPS tracking, professional photography at scale, facial recognition matching: these are developments of the last decade or so, layered onto a consent framework that was designed for a world where you filled in a paper entry form and a local photographer took a few shots at the finish.
The consent model has not been updated to reflect what is actually being collected, combined, and published. Every privacy policy I reviewed was inadequate and showed no sign of having been updated to reflect current practice. The ICO has not issued specific guidance on mass sporting event data collection. No regulatory body has mapped the full data chain from registration through results publication through photography through facial recognition.
This is not a story about bad actors. The vast majority of event organisers are not attempting to build surveillance infrastructure. They are using tools that have become standard in the industry, under consent frameworks that were never designed to cover them, without having been asked by any regulator to examine what those tools actually do with participant data.
That absence of scrutiny is the problem. Not the events themselves.
This post is not a call to stop photographing sporting events. Photography is part of the culture of mass participation sport. Results are published because participants want to find their times. Live tracking exists because supporters want to follow their people on course.
The argument is not that these things should not happen. The argument is that the consent and governance framework surrounding them has not kept up with what they have become.
Adequate compliance would require, at minimum, the following.
A registration process that specifically and clearly describes what data is collected, how it is combined, who processes it, and what participants are agreeing to. Not buried in general terms, but explicit and specific.
A genuine opt-out mechanism for official photography or public display of personal data that does not require participants to forgo participation. If the lawful basis for photography is consent, that consent must be freely given. Consent that cannot be withdrawn without losing access to the activity is not freely given.
Explicit disclosure of facial recognition processing, the identity of the third-party provider, and the legal basis for biometric data processing at the point of registration, not in a separate document that participants are unlikely to find.
Documented data processing agreements between event organisers and photography providers, timing partners, and any other third parties who handle participant data.
Retention policies that reflect the actual purpose of each data category. There is no legitimate reason to retain participant photographs in a publicly accessible gallery indefinitely. There is no legitimate reason to retain GPS tracking data after the event has concluded.
None of this is technically complex. All of it is legally required. None of it is currently standard practice in the UK events industry.
The Information Commissioner's Office has published guidance on biometric data, on consent, on special category data, and on the obligations of data controllers processing personal data at scale. What it has not done is apply that guidance specifically to the mass participation sporting events sector.
This post is an invitation to do so.
The data chain described here — registration data linked to participant numbers, linked to publicly searchable results, linked to publicly accessible photography, processed through facial recognition by third-party providers — exists at hundreds of events across the United Kingdom every year. It involves millions of participants. It processes special category biometric data. It operates under consent frameworks that do not adequately describe what participants are agreeing to.
That seems worth a look.