How France plans to use Artificial Intelligence to keep Paris 2024 Olympics safe
France this week tested the Artificial Intelligence-driven video surveillance technology it plans to deploy during the Olympic Games, using a Depeche Mode concert as a trial run, and called the exercise a success.
French legislation passed in 2023 permits the use of AI video surveillance for a trial period covering the Games to detect abnormal events or human behaviour at large-scale events.
Officials say the technology could be pivotal in thwarting an attack like the bombing at the 1996 Atlanta Olympics or the 2016 Nice truck attack.
Rights campaigners warn the technology poses a threat to civil liberties.
WHAT IS AI-POWERED SURVEILLANCE?
Algorithmic video surveillance uses computer software to analyse images captured by video surveillance cameras in real time.
Four companies — Videtics, Orange Business, ChapsVision and Wintics — have developed AI software that uses algorithms to analyse video streams from existing surveillance systems and help identify potential threats in public spaces.
The algorithms are trained to detect pre-determined “events” and abnormal behaviour and send alerts accordingly. Human beings then decide if the alert is real and whether to act on it.
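The division of labour described above — software raises alerts, humans make the final call — can be sketched in a few lines of Python. This is purely illustrative: the class and function names below are invented for this sketch and do not come from any of the vendors' actual systems.

```python
from dataclasses import dataclass

# Hypothetical sketch of an alert-triage step; no vendor API is implied.

@dataclass
class Alert:
    camera_id: str
    event: str         # e.g. "abandoned_object", "crowd_surge"
    confidence: float  # model score between 0 and 1

def triage(alerts, threshold=0.8):
    """Forward only high-confidence alerts for human review.
    The operator, not the software, decides whether to act."""
    return [a for a in alerts if a.confidence >= threshold]

alerts = [
    Alert("cam-12", "abandoned_object", 0.93),
    Alert("cam-07", "crowd_surge", 0.41),
]
for alert in triage(alerts):
    print(f"Operator review needed: {alert.event} on {alert.camera_id}")
```

The key design point, reflected in the French law, is that the software only filters and prioritises; a person remains in the decision loop.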
WHAT WILL THE ALGORITHMS BE LOOKING FOR?
The law allows eight categories of “events” to be flagged by AI surveillance software during the Games, including: crowd surges; abnormally heavy crowds; abandoned objects; presence or use of weapons; a person on the ground; a fire breaking out; and contravention of rules on traffic direction.
Within these categories, specific thresholds (number of people, type of vehicle, timing, etc.) can be set manually to suit each individual event, location or threat.
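Such per-location tuning might look like the sketch below. The structure, venue names and numbers are all assumptions made up for illustration; the real systems' configuration formats are not public.

```python
# Illustrative threshold configuration; values and venue names are invented.
thresholds = {
    # people per square metre before a crowd-density alert fires
    "abnormal_crowd_density": {"stade-de-france": 4, "concert-hall": 6},
    # how long (seconds) an object may sit unattended before alerting
    "abandoned_object": {"default_seconds": 60},
}

def density_alert(location, people_per_m2, config=thresholds):
    """Fire an alert when crowd density exceeds the venue-specific limit,
    falling back to a default limit for unlisted locations."""
    limit = config["abnormal_crowd_density"].get(location, 5)
    return people_per_m2 >= limit

print(density_alert("stade-de-france", 4.5))  # exceeds that venue's limit of 4
```

Keeping the limits in data rather than code is what lets operators retune the same detector for a stadium, a fan zone or a railway concourse without retraining the model.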