
UK trade unions call for legal curbs on AI in workplace


New legal protections are urgently needed to regulate the use of artificial intelligence in UK workplaces and prevent workers being hired and fired by algorithm, according to a new report by the Trades Union Congress.

The coronavirus pandemic has accelerated the adoption of high-tech management tools, with employers increasingly turning to AI to help them recruit remotely, sift candidates for redundancy, determine performance ratings, allocate work and monitor the productivity of home workers.

The TUC argues that employment law has failed to keep pace with the rollout of new technology and wants employers, tech companies and government to take action to plug the gaps.

Among other reforms, it wants a new legal right for workers to have a human review of decisions made by AI algorithms, so that they can challenge any decision that appears unfair or discriminatory.

“Increasingly employers are not doing things — it’s the machines doing it. Where you substitute a boss with an algo, you’re undermining the personal relationship [between employer and employee],” said Robin Allen, one of the lawyers commissioned by the TUC to write the report.

“Employers must never forget the personal relationship,” he added, warning that companies would lay themselves open to costly discrimination claims unless they observed a set of “red lines” in the way they deployed new technologies, including explaining and justifying their use to employees. “You have to use this in a human way. You mustn’t let it make decisions you can’t yourself justify.”

The reforms outlined in the report also include a legal duty on employers to consult trade unions on the use of “high risk” and intrusive technology in the workplace, and a new legal right to “switch off” from work, intended to help homeworkers draw boundaries between work and private life.

Gig workers and low paid employees in workplaces such as call centres are often assumed to be the most affected by algorithmic management. The TUC highlighted recent claims by couriers for Uber Eats, who say they were fired unfairly because of facial identification software that has been found to be unreliable when used with people from ethnic minority backgrounds.

But Andrew Pakes, at the professionals’ union Prospect, said white-collar workers were increasingly subject to high-tech monitoring, profiling and recruitment processes.

Referring to the spread of AI video-interviewing systems, he said: “I’m quite deaf. I always look like I’m gurning on a call. Does that make it look like I’m not very interested?”

“AI at work could be used to improve productivity and working lives. But it is already being used to make life-changing decisions,” said Frances O’Grady, general secretary of the TUC, adding: “Without fair rules, the use of AI at work could lead to widespread discrimination and unfair treatment.”

Where are the potential problems with AI at work?


Recruitment. The TUC report notes the increasing use of AI video interview systems, which analyse candidates’ words and other data such as facial movements and give the employer feedback that can be used as part of their decision-making process. This kind of technology has been criticised for discriminating against disabled applicants, but Robin Allen said it was increasingly used for recruitment and selecting staff for redundancy.

Monitoring. Many employers already use online tools to monitor when staff are available or offline, how many keystrokes they make per hour, their use of social media and the content of emails. Already contentious, this kind of surveillance could be even more problematic on privacy grounds when staff are compelled to work from home, depending on the steps employers take to explain the extent of monitoring and justify it on business grounds.

Disciplinary action. The TUC highlighted the case of a longstanding employee, on a last warning after several unauthorised absences, who was dismissed because an automated absence management system did not process a fit note from her doctor correctly. A manager at the dismissal hearing assumed the automated system was correct. A court would probably view this as unfair dismissal, but workers and employees with less than two years’ service have no protection under unfair dismissal legislation.


