Microsoft Teams’ New Location-Based Status Sparks Major Privacy and Legal Concerns

 

Microsoft Teams is preparing to roll out a new feature that could significantly change how employee presence is tracked in the workplace. By the end of the year, the platform will be able to automatically detect when an employee connects to the company’s office Wi-Fi and update their status to show they are working on-site. This information will be visible to both colleagues and supervisors, raising immediate questions about privacy and legality. Although Microsoft states that the feature will be switched off by default, IT administrators can enable it at the organizational level to improve “transparency and collaboration.” 
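To make the mechanism concrete, the sketch below shows, in Python, the general idea of mapping a detected Wi-Fi connection to a work-location status. It is purely illustrative and not Microsoft's implementation: the network names, the update_presence helper, and the status labels are all assumptions.

```python
# Illustrative sketch only -- not Microsoft's code. The SSID list, the
# update_presence() helper, and the status strings are hypothetical.

CORPORATE_SSIDS = {"contoso-hq", "contoso-office"}  # assumed office networks

def update_presence(connected_ssid: str | None) -> str:
    """Map the currently connected Wi-Fi network to a work-location status."""
    if connected_ssid in CORPORATE_SSIDS:
        return "In the office"        # would be visible to colleagues and supervisors
    return "Location not shared"      # default when the feature is off or the user is off-site

if __name__ == "__main__":
    print(update_presence("contoso-hq"))    # -> In the office
    print(update_presence("my-home-wifi"))  # -> Location not shared
```

Even in this toy form, the status is derived from network metadata that an employee generates simply by connecting to the office Wi-Fi, which is the automatic processing step the legal discussion below turns on.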

The idea appears practical on the surface. Remote workers may want to know whether coworkers are physically present at the office to access documents or coordinate tasks that require on-site resources. However, the convenience quickly gives way to concerns about surveillance. Critics warn that this feature could easily be misused to monitor employee attendance or indirectly enforce return-to-office mandates—especially as Microsoft itself is requiring employees living within 50 miles of its offices to spend at least three days a week on-site starting next February. 

To better understand the implications, TECHBOOK consulted Professor Christian Solmecke, a specialist in media and IT law. He argues that the feature rests on uncertain legal footing under European privacy regulations. According to Solmecke, automatically updating an employee’s location constitutes the processing of personal data, which is allowed under the GDPR only when supported by a valid legal basis. In this case, two possibilities exist: explicit employee consent or a legitimate interest on the part of the employer. But as Solmecke explains, an employer’s interest in transparency rarely outweighs an employee’s right to privacy, especially when tracking is not strictly necessary for job performance. 

The expert compares the situation to covert video surveillance, which is only permitted when there is a concrete suspicion of wrongdoing. Location tracking, if used to verify whether workers are actually on-site, falls into a similar category. For routine operations, he stresses, such monitoring would likely be disproportionate. Solmecke adds that neither broad IT policies nor standard employment contracts provide sufficient grounds for processing this type of data. Consent must be truly voluntary, which is difficult to guarantee in an employer-employee relationship where workers may feel pressured to agree. 

He states that if companies wish to enable this automatic location sharing, a dedicated written agreement would be required—one that employees can decline without negative repercussions. Additionally, in workplaces with a works council, co-determination rules apply. Under Germany’s Works Constitution Act, systems capable of monitoring performance or behavior must be approved by the works council before being implemented. Without such approval or a corresponding works agreement, enabling the feature would violate privacy law. 

For employees, the upcoming rollout does not mean their on-site presence will immediately become visible: employers cannot simply switch the feature on without clear employee knowledge or consent. According to Solmecke, any attempt to automatically log and share employee location within the company would be legally vulnerable and open to challenge. Workers retain the right to refuse such data collection unless a lawful framework is in place. 

As companies continue navigating hybrid and remote work models, Microsoft’s new location-based status illustrates the growing tension between workplace efficiency and digital privacy. Whether organizations adopt this feature will likely depend on how well they balance those priorities—and whether they can do so within the boundaries of data protection law.

Clearview: Face Recognition Software Used by US Police


Clearview, a facial recognition company, has apparently conducted nearly a million searches on behalf of US police. Hoan Ton-That, Clearview's CEO, revealed to the BBC that the firm has scraped as many as 30 billion images from platforms including Facebook, taken without users' consent. 

The company has been repeatedly fined millions of dollars in Europe and Australia for privacy violations. Critics, however, argue that police use of Clearview puts everyone into a “perpetual police line-up.” 

"Whenever they have a photo of a suspect, they will compare it to your face[…]It's far too invasive," says Matthew Guariglia from the Electronic Frontier Foundation. 

Police have not yet confirmed the figure of nearly a million searches cited by Clearview. However, in a rare disclosure to the BBC, Miami Police admitted to using the software for all types of crime. 

How Does Clearview Work 

Clearview’s system lets a law-enforcement customer upload an image of a face and search for matches in the database of billions of images the company has amassed. It then provides links to where the corresponding images appear online. Clearview is regarded as one of the world's most powerful and accurate facial recognition companies. 
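To illustrate the kind of pipeline this description implies (and only that; Clearview's actual system is proprietary), here is a minimal Python sketch of an embed-then-match face search. The embed_face function is a hypothetical stand-in for a real face-embedding model, and the in-memory list stands in for a database of billions of scraped images.

```python
# Conceptual sketch of a face-search pipeline -- not Clearview's system.
# embed_face() is a hypothetical placeholder for a face-embedding model;
# the in-memory INDEX stands in for a database of scraped images.
import numpy as np

def embed_face(image_bytes: bytes) -> np.ndarray:
    """Hypothetical: return a unit-length embedding vector for a face image."""
    rng = np.random.default_rng(abs(hash(image_bytes)) % (2**32))  # deterministic stand-in
    v = rng.standard_normal(128)
    return v / np.linalg.norm(v)

# Each entry pairs a face embedding with the URL where the image was found.
INDEX: list[tuple[np.ndarray, str]] = []

def add_to_index(image_bytes: bytes, source_url: str) -> None:
    INDEX.append((embed_face(image_bytes), source_url))

def search(probe_image: bytes, top_k: int = 5) -> list[tuple[float, str]]:
    """Return the top_k most similar stored faces and their source URLs."""
    probe = embed_face(probe_image)
    scored = [(float(probe @ emb), url) for emb, url in INDEX]  # cosine similarity
    return sorted(scored, reverse=True)[:top_k]
```

At the scale described in the article, a linear scan like this would be replaced by an approximate nearest-neighbour index; the sketch only conveys the embed-and-match idea behind returning links to matching images.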

The firm is now banned from selling its services to most US companies after the American Civil Liberties Union (ACLU) accused Clearview AI of violating privacy laws. However, there is an exemption for police, with Mr. Ton-That saying that his software is used by hundreds of police forces across the US. 

Yet US police do not routinely disclose whether they use the software, and several US cities, including Portland, San Francisco, and Seattle, have banned police use of facial recognition altogether. 

Police frequently tell the public that facial recognition technology is used only for serious or violent offenses. 

Moreover, in an interview about the effectiveness of Clearview, Miami Police admitted to having used the software for all types of crime, from murder to shoplifting. Assistant Chief of Police Armando Aguilar said his team runs the software around 450 times a year and that it has helped solve murder cases. 

Yet, critics claim that there are hardly any rules governing the use of facial recognition by police.