The EU’s emerging AI Act contains so little on workers’ rights that Europe will need a separate AI-labour law, trade unions say.
Last Wednesday (14 June), the European Parliament adopted its position on the Artificial Intelligence Act — a game-changing set of rules for AI-based products and services.
MEPs went somewhat further on labour rights than the original European Commission proposal.
But in the race to legislate potentially disruptive technologies, EU institutions are still largely overlooking the workers affected by incoming AI systems, unionists say.
AI is changing workers’ lives in many ways.
It’s already used to track drivers in apps such as Uber and Deliveroo, to help robots work alongside people on factory lines, and to manage patients’ medical data for doctors and nurses.
The EU Parliament has added a potential ban on risky AI emotion-recognition systems, which can warp the employer-employee relationship — but that’s as far as progress on employment issues goes.
“We need a specific directive on labour relations because there are many question marks that are not clarified in the AI law because that was not its [original] purpose,” said Aída Ponce, a senior researcher at the European Trade Union Institute (ETUI) in Brussels.
“Specific provisions are needed to set a clear and concrete limit to the use of AI systems in industrial relations, up to the point of monitoring,” she said.
The ETUI’s sister organisation, the European Trade Union Confederation (ETUC) in Brussels, has also said a separate AI-workers directive will be needed.
And it specifically criticised as a “major loophole” that only AI applications categorised as high-risk are regulated.
According to Article 6 of the EU Parliament’s position, such high-risk applications can only be restricted in the workplace if they pose a “significant” risk.
But while “[AI] providers will generally state that their applications may pose risks to workers, [they] will classify them as non-significant,” ETUC warned.
“This additional burden on workers is unacceptable and leaves their safety and rights open to abuse,” it said.