Why Athletes, Coaches, and Federations Need a Governance Plan
- Jeremy Kerner
- Jun 25
- 3 min read

The Finish Line Just Moved
Imagine this: your coaching team is trialing a new AI system. It doesn’t just track heart rate variability or oxygen efficiency. It runs full-system simulations using real-time inputs from blood
glucose monitors, cortisol levels, GPS data, and psychological profiling.
The system predicts not only the best pace strategy for race day but adjusts in real time,
mid-race, whispering decisions to your bone-conduction headset. Push now. Hold back. Surge
in 1.3 kilometers. It doesn’t just know your training. It knows your metabolic tipping point, your
risk of cramping, your brain’s stress response to altitude.
You win.
You shave eight minutes off your personal best.
Your competitors? They had no idea what hit them.
Now imagine that system was trained on incomplete data. Imagine it performs worse for women or older athletes, or underestimates the risk of cardiac stress in non-white participants because it was trained on skewed profiles.
Suddenly, it’s not just an edge. It’s a liability.
Innovation Without Guardrails Is a Risk
Across multisport and endurance athletics, AI is becoming core infrastructure. Not just for race-day tactics, but for sleep tracking, coaching decisions, nutrition programming, injury prevention, mental resilience tools, sponsorship analytics, and even fan engagement.
But here’s the uncomfortable truth: almost no one has a governance plan. That means no oversight of what AI is doing, who it is doing it for, or what harm it might cause in the process.
Elite sport is highly competitive. That’s precisely why it needs a structure to make sure AI is
safe, lawful, and fair.
Regulation Is Not a Side Quest
The EU AI Act applies to AI systems used for biometric analysis, physical or mental health assessment, or automated decision-making that significantly affects a person.
In practical terms, this means:
- AI tools used for performance prediction or injury-risk management are likely high-risk systems
- Any system profiling athletes or tracking behavior in Europe must comply with GDPR, especially if it includes biometric, health, or mental-status data
- Federations, clubs, and tech vendors are legally accountable for the explainability, fairness, and auditability of their AI tools
Even outside the EU, these standards are becoming global benchmarks. Olympic bodies,
national teams, and sports tech investors will all be expected to comply.
Athletes Are Not Test Subjects
Endurance athletes operate at the limits of human capability. That’s what makes this space so
inspiring, and so vulnerable.
When AI is used without guardrails, you risk amplifying gender bias, masking overtraining,
misclassifying injuries, or creating systems that only work well for a small subset of elite profiles.
Athletes deserve better. So do the coaches, physiologists, sports scientists, and data teams
supporting them.
What a Governance Plan Must Include
Whether you run an Ironman team, a national triathlon program, or an elite marathon club, you
need a clear AI governance strategy. This should include:
- An AI inventory: What systems are being used? Where? By whom?
- Risk classification and documentation: Is the tool high-risk? Does it need human oversight?
- Data and consent policies: Athletes must be informed, and their data handled under lawful terms.
- Technical review and bias testing: Tools must be reviewed for performance across age, gender, race, and physiology.
- Contracts and accountability: Vendors must provide clarity on what their systems do, how they were trained, and how errors are handled.
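The inventory and risk-classification steps above can be sketched as a simple record per tool. This is a minimal illustration only: the schema, field names, and the risk heuristic are assumptions of mine, not anything prescribed by the EU AI Act, and flagging a tool here is a prompt for formal legal review, not a legal determination.

```python
from dataclasses import dataclass

# Categories of athlete data that typically trigger stricter rules (illustrative).
SENSITIVE_DATA = {"biometric", "health", "mental_status"}

@dataclass
class AIToolRecord:
    """One entry in a team's AI inventory (hypothetical schema)."""
    name: str                  # e.g. "pacing-predictor"
    vendor: str                # who supplies and maintains the tool
    used_by: list[str]         # coaches, physios, analysts, ...
    data_categories: set[str]  # kinds of athlete data the tool ingests
    affects_athletes: bool     # does its output drive decisions about a person?
    human_oversight: bool      # is a human reviewing its outputs?

    def likely_high_risk(self) -> bool:
        # Heuristic flag, not legal advice: a tool that processes sensitive
        # data AND drives decisions about athletes warrants a formal review.
        return self.affects_athletes and bool(self.data_categories & SENSITIVE_DATA)

# Example inventory entry (hypothetical tool and vendor)
tool = AIToolRecord(
    name="pacing-predictor",
    vendor="ExampleSportsAI",
    used_by=["head coach"],
    data_categories={"biometric", "gps"},
    affects_athletes=True,
    human_oversight=False,
)
print(tool.likely_high_risk())  # → True: needs documentation and oversight
```

Even a plain spreadsheet with these columns answers the first two checklist questions; the point is that the inventory exists before the audit, not after.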
The future of multisport is not just physical. It is computational.
If you are embracing AI without governance, you are building speed without brakes.
The real competitive advantage isn’t just performance. It’s foresight.