At a time when the boundaries between work and personal life are increasingly blurred, the rise of surveillance and control technologies aimed at workers is creating a dystopian reality that demands urgent attention from policymakers.
Legislation filed on Beacon Hill, a bill promoting artificial intelligence accountability known as the FAIR Act, would provide Massachusetts workers with essential protections against reckless and harmful uses of "bossware." Employers use these electronic monitoring and algorithmic decision-making systems to automate management functions: determining whether workers get a job, tracking workers' locations and communications throughout (and sometimes even after) the workday, and deciding how much workers are paid and whether they are promoted, demoted, or fired.
Across sectors, in warehouses, offices, factories, and job sites across the country, these technologies and their large-scale risks are neither theoretical nor remote. Call center workers have long experienced continuous and intrusive monitoring, including from AI-powered cameras, and some have even been forced to sign consent forms for the collection of biometric information. The Trump administration and its Department of Government Efficiency (DOGE) have installed software that tracks federal workers' mouse movements and keystrokes and can even activate their webcams remotely.
Unchecked surveillance threatens workers' privacy, and research shows that it increases the risk of accidents, injuries, and a range of mental and physical disorders. Automated decision-making systems have caused workers to unjustly lose job opportunities and to face pay cuts and termination based on inaccurate or incomplete data, often because these systems are deeply flawed and biased.
The FAIR Act (SD.838/HD.1458) would establish crucial safeguards, notably limiting electronic surveillance to situations necessary for legitimate business purposes, such as quality control and network security. Employers would be required to notify workers of their electronic surveillance activities. The bill would also restrict the sale or transfer of employee data and prohibit the collection of sensitive information such as biometric data, including fingerprints and facial and iris scans.
A recent study by the Center for Democracy & Technology and Coworker.org found that workers want protections like those the FAIR Act offers. After participating workers discussed various types of workplace monitoring with one another, the results were striking: overwhelming majorities strongly supported requiring greater transparency about employers' surveillance and data collection practices, prohibiting off-the-clock monitoring, limiting location and biometric tracking, and preventing employers from surveilling workers' productivity or mental health.
Notably, this support transcended traditional political divisions. For example, nearly identical and overwhelming majorities of liberal (96%), moderate (95%), and conservative (94%) participants favored requiring employers to disclose the types of employee data they collect and how they collect it. The FAIR Act's electronic monitoring protections are therefore not just the right thing to do for workers; they also have the support of workers across the political spectrum.
Of course, new technologies can also help workers do their jobs, but only if workers have input and autonomy in how those technologies are used. The FAIR Act accordingly protects workers' role in decision-making by shielding employees from adverse action if they reasonably refuse to follow the output of an AI system because doing so would lead to a negative outcome. This ensures that decision-making authority remains with humans rather than machines, benefiting not only workers but also the customers, clients, and patients their employers serve.
Above all, the FAIR Act would also address automated decision-making systems by requiring independent impact assessments before employers deploy such systems, prohibiting their use to predict employee behavior or interfere with protected activities, and guaranteeing workers meaningful notice before employers use them to make key decisions about them.
Bossware practices, and the dystopian workplaces they create, do not align with Massachusetts values. Unfortunately, we cannot expect the Trump administration to adopt rules or regulations addressing these practices. But as technology reshapes our workplaces, our laws must evolve to protect workers' rights and dignity.
The FAIR Act represents a thoughtful, balanced approach to meeting the challenges posed by bossware systems. By adopting this legislation, Massachusetts can set a national standard for protecting workers in the digital era.
Chrissy Lynch is president of the Massachusetts AFL-CIO. Amanda Ballantyne is executive director of the AFL-CIO Technology Institute. Matthew Scherer is a senior policy advisor for workers' rights and technology at the Center for Democracy & Technology. Lynch is a member of the board of directors of MassINC, the nonprofit civic organization that publishes CommonWealth Beacon.