
Hays Research Reveals Increasing AI Adoption in Scottish Workplaces


Adoption of artificial intelligence (AI) tools in Scottish companies has increased significantly, according to a new survey by recruitment firm Hays. The study, based on a poll that drew almost 15,000 responses from professionals and employers (including 886 from Scotland), shows the percentage of companies using AI in their operations rose from 26% to 32% over the previous six months.

Mixed Attitudes Toward the Impact of AI on Jobs

Despite the upsurge in AI adoption, the study reveals that professionals have differing opinions on how AI will affect their jobs. Although 80% of Scottish professionals do not currently use AI in their work, 21% think that AI tools will improve their ability to do their jobs. Notably, over the past six months the percentage of professionals expecting a negative impact has dropped from 12% to 6%.

However, the study also indicates concern among employees, with 61% believing that their companies are not doing enough to prepare them for the expanding use of AI in the workplace. This trend raises questions about the workforce's readiness to adopt and take full advantage of AI tools. Justin Black, a technology-focused business director at Hays, stresses the value of giving people sufficient training opportunities to develop their skills and become proficient with new technologies.

Barriers to AI Adoption 

One of the notable challenges impeding the mass adoption of AI is the reluctance of enterprises to expose their data and intellectual property to AI systems, citing concerns linked to compliance with the General Data Protection Regulation (GDPR). This reluctance is also influenced by concerns about trust. According to Black, demand for AI capabilities has outpaced the supply of skilled individuals in the sector, highlighting a skills deficit in the AI space.

The hesitation to expose sensitive data and intellectual property to AI systems stems from those GDPR concerns: businesses are cautious about the possible dangers of disclosing confidential data to AI systems. Professionals' scepticism about the security of AI systems, and about growing dependence on them, adds to the trust issues.

The study suggests that as AI becomes a crucial element of Scottish workplaces, employers should prioritize tackling skills shortages, encouraging employee readiness, and improving communication about AI integration. By doing so, businesses can ease concerns about GDPR and trust while fostering an environment that allows employees to take full advantage of AI technology's benefits.

Irish Regulator Fines WhatsApp $266 Million for Breaching EU Privacy Regulations

 

Facebook-owned WhatsApp has been directed to pay a 225 million euro ($266 million) fine for violating the EU's General Data Protection Regulation after it failed to inform users and non-users about what it does with their personal data.

The penalty was handed down by the Irish Data Protection Commission (DPC), the lead data privacy regulator for Facebook in the European Union, following an investigation that began in December 2018 after the DPC received multiple complaints from "individual data subjects" (both users and non-users) regarding WhatsApp's data processing activities.

"We examined whether WhatsApp has discharged its GDPR transparency obligations with regard to the provision of information and the transparency of that information to both users and non-users of WhatsApp's service. This includes information provided to data subjects about the processing of information between WhatsApp and other Facebook companies," DPC said.

In addition to the fine, the DPC's 266-page decision directs WhatsApp to bring its processing into compliance by taking eight remedial actions within the next three months. A WhatsApp spokesperson disputed the penalty, saying that the company had provided detailed information to its users and that the fine imposed by the DPC is "out of step with previous GDPR-related fines" levied against other technology giants.

"We have worked to ensure the information we provide is transparent and comprehensive and will continue to do so. We disagree with the decision today regarding the transparency we provided to people in 2018, and the penalties are entirely disproportionate," said the spokesperson. 

The DPC says it found that WhatsApp's practices violated four specific articles of the GDPR:

• Article 5, covering principles relating to the processing of personal data;

• Article 13, covering information to be provided when personal data gets collected from a data subject;

• Article 14, covering information to be provided when personal data has not been obtained from a data subject; 

• Article 15, which concerns a data subject's right to access their personal data from a controller. 

The fine imposed on WhatsApp is the second-highest issued under the GDPR to date, outranked only by an $885 million fine against Amazon, according to Jonathan Armstrong, a compliance and technology lawyer with London-based law firm Cordery.

The DPC says it initially proposed a penalty in the range of 30 million to 50 million euros. But the European Data Protection Board reviewed the WhatsApp case and, on July 28, issued a binding decision instructing the DPC to reassess and increase its proposed fine. Based on the board's instructions, the DPC increased the fine to 225 million euros.

"An eye-catching aspect of that process was the increase in the size of the fine from a range of 30 million to 50 million euros first proposed by the DPC. The fine highlights the importance of compliance with the GDPR's rules on transparency in the context of users, non-users, and data sharing between group entities," says John Magee, who heads law firm DLA Piper's privacy, data protection, and security practice in Ireland.