With AI, workplace surveillance has ‘skyrocketed’, leaving Canadian laws behind

Anja Karadeglija, Canadian Press

Published Saturday, March 9, 2024 2:42 pm EST

OTTAWA: Technology that tracks your location at work and the time you spend in the bathroom. A program that takes random screenshots of your laptop screen. A tracking system that detects your mood during your shift.

These are just a few of the ways employee surveillance technology is being deployed, now turbocharged by the explosive growth of artificial intelligence.

Canada’s laws are not keeping up, experts warn.

“Any work device that your employer puts in your hand, you can assume has some way of monitoring your work and productivity,” said Valerio De Stefano, Canada research chair in innovation law and society at York University.

“Electronic monitoring is a reality for most workers.”

Artificial intelligence could also determine whether someone gets a job in the first place, or keeps it.

Automated hiring is already “extremely widespread,” with nearly every Fortune 500 company in the United States using AI to hire new workers, De Stefano said.

Unlike traditional monitoring, he added, AI makes “autonomous decisions about hiring, retention and discipline” or provides recommendations to the employer on such decisions.

Employee surveillance may look like a warehouse worker with a mini-computer on their arm that tracks their every move, said Bea Bruske, president of the Canadian Labour Congress.

“They’re building a pallet, but that particular mini-computer tracks every step, every movement of the wrist, so to speak,” Bruske said.

“They know exactly how many boxes are placed on that pallet, how long it takes, and how many additional steps that worker could have taken.”

There is little data documenting how widespread AI-powered worker surveillance technology could be in Canada. Unless employers are upfront about their practices, “we don’t necessarily know,” Bruske said.

In a 2022 study by the Future Skills Centre, pollster Abacus Data surveyed 1,500 employees and 500 supervisors working remotely.

Seventy percent reported that some or all aspects of their work were being monitored digitally.

About a third of employees said they experienced at least one instance of location tracking, video or webcam recording, keystroke monitoring, screenshots, or use of biometric information by the employer.

“There is a patchwork of laws governing privacy in the workplace that currently provide considerable leeway for employers to monitor employees,” the report notes.

Electronic monitoring in the workplace has been around for years. But the technology has become more intimate, taking on tasks such as listening to casual conversations between workers.

It has also become easier for businesses to use, more customized to their specific needs and more standardized, said McGill University associate professor Renee Sieber.

De Stefano said artificial intelligence has made electronic monitoring more invasive, as it “can process a lot more data and is more affordable.”

“Employer tracking has skyrocketed” since the advent of AI, he added.

However, those in the industry insist there is also a positive side.

Toronto-based FutureFit AI is creating an AI-powered professional assistant that CEO Hamoon Ekhtiari says can help people navigate workplaces that technology is rapidly changing.

AI can search for jobs, provide career guidance, find training programs, or generate a plan for next steps. In the hiring process, it can give applicants quick information about gaps in their applications, Ekhtiari said.

As artificial intelligence permeates Canadian workplaces, lawmakers are scrambling to introduce new rules.

The federal government has proposed Bill C-27, which would establish obligations for “high-impact” artificial intelligence systems.

That includes those dealing with “determinations regarding employment, including recruitment, recommendation, hiring, remuneration, promotion, training, apprenticeship, transfer or dismissal,” said Innovation Minister Francois-Philippe Champagne.

Champagne has flagged concerns that AI systems could perpetuate bias and discrimination in hiring, including in who sees job ads and how applicants are ranked.

But critics have taken issue with the fact that the bill does not explicitly include protections for workers. It also will not take effect immediately, only after regulations implementing the bill are developed.

In 2022, Ontario began requiring employers with 25 or more employees to have a written policy that describes electronic monitoring and states for what purposes they can use that information.

Neither the proposed legislation nor the Ontario law “offers sufficient protection for workers,” De Stefano said.

Activities such as reading employees’ emails and tracking time are allowed, as long as the employer has a policy and informs workers about what is happening, he added.

“That’s good to know, but if I don’t have recourse against using these systems, some of which can be extremely problematic, the protection really isn’t particularly meaningful.”

Ontario has also proposed requiring employers to disclose their use of AI in hiring. If passed, it would make the province the first Canadian jurisdiction to implement such a law.

In theory, provincial and federal privacy laws should offer some protections. But Canada’s privacy commissioners have warned that existing privacy legislation is woefully inadequate.

They said in October that “the recent proliferation of employee tracking software” has “revealed that laws protecting privacy in the workplace are outdated or non-existent altogether.”

Watchdogs in other countries have been cracking down. In January, France’s privacy watchdog fined Amazon $35 million for monitoring workers with an “excessively intrusive system.”

The issue has also been on the radar of unions. The Canadian Labour Congress is not happy with Bill C-27, and employees and their unions have not been adequately consulted, Bruske said.

De Stefano said the government should “stop making the adoption of these systems a unilateral choice for employers” and instead give workers the opportunity to be fully informed and express their concerns.

Governments should aim for something that distinguishes between monitoring performance and surveillance, putting bathroom breaks in the latter category, Sieber added.

An argument could be made for banning some technologies entirely, such as “emotional AI” tools that detect whether a worker in front of a computer screen or on an assembly line is happy, she said.

Emily Niles, senior researcher at the Canadian Union of Public Employees, said AI systems work on information such as time logs, the number of tasks completed during a shift, email content, meeting notes and cell phone usage.

“AI doesn’t exist without data, and it’s actually our data that it runs on,” Niles said.

“That’s an important point of intervention for the union, to assert workers’ voices and control over these technologies.”

This report by The Canadian Press was first published March 9, 2024.