
EU Claims Meta’s Paid Ad-Free Option Violates Digital Competition Rules

 

European Union regulators have accused Meta Platforms of violating the bloc’s new digital competition rules by compelling Facebook and Instagram users to either view ads or pay to avoid them. This move comes as part of Meta’s strategy to comply with Europe's stringent data privacy regulations.

Starting in November, Meta began offering European users the option to pay at least 10 euros ($10.75) per month for ad-free versions of Facebook and Instagram. This was in response to a ruling by the EU’s top court, which mandated that Meta must obtain user consent before displaying targeted ads, a decision that jeopardized Meta’s business model of personalized advertising.

The European Commission, the EU’s executive body, stated that preliminary findings from its investigation indicate that Meta’s “pay or consent” model breaches the Digital Markets Act (DMA) of the 27-nation bloc. According to the commission, Meta’s approach fails to provide users the right to “freely consent” to the use of their personal data across its various services for personalized ads.

The commission also criticized Meta for not offering a less personalized service that is equivalent to its social networks. Meta responded that its ad-free subscription model follows the direction of the highest court in Europe and complies with the DMA, and said it intends to engage in constructive dialogue with the European Commission to resolve the investigation.

The investigation was launched soon after the DMA took effect in March, aiming to prevent tech “gatekeepers” from dominating digital markets through heavy financial penalties. One of the DMA's objectives is to reduce the power of Big Tech firms that have amassed vast amounts of personal data, giving them an advantage over competitors in online advertising and social media services. The commission suggested that Meta should offer an option that doesn’t rely on extensive personal data sharing for advertising purposes.

European Commissioner Thierry Breton, who oversees the bloc’s digital policy, emphasized that the DMA aims to empower users to decide how their data is used and to ensure that innovative companies can compete fairly with tech giants regarding data access.

Meta now has the opportunity to respond to the commission’s findings, with the investigation due to conclude by March 2025. The company could face fines of up to 10% of its annual global revenue, potentially amounting to billions of euros. Under the DMA, Meta is classified as one of seven online gatekeepers, with Facebook, Instagram, WhatsApp, Messenger, and its online ad business listed among two dozen “core platform services” that require the highest level of regulatory scrutiny.

This accusation against Meta is part of a series of regulatory actions by Brussels against major tech companies. Recently, the EU charged Apple with preventing app makers from directing users to cheaper options outside its App Store and accused Microsoft of violating antitrust laws by bundling its Teams app with its Office software.


Changing Methods of Tracking and Sharing Healthcare Data

 


As artificial intelligence (AI) becomes increasingly prevalent in healthcare, there is a growing need to manage its development just as rapidly. AI technologies are owned and controlled by private companies and organizations. Because of the way AI is implemented, corporations, clinics, and government bodies could play a much larger role than is traditionally the case in determining what health information about patients is gathered, used, and protected. The privacy concerns this raises around data security and implementation need to be considered.

Earlier this year, a patient of Baltimore, Maryland-based MedStar Health System filed a lawsuit against Meta Platforms, seeking damages on behalf of all U.S. patients injured by the company's practices. The case is being heard in the U.S. District Court for the Northern District of California.

A plaintiff in the class action lawsuit alleged that Meta, the parent company of Facebook, used its Pixel tracking technology, embedded in hospitals' and health systems' websites and patient portals, to track patients' information. Meta has since faced at least two more class action lawsuits alleging that the company improperly collected information about its users.

Several major health systems have also been named as co-defendants (Dignity Health, UCSF) or faced lawsuits of their own (Northwestern Memorial Hospital) for alleged misuse or misconfiguration of the Pixel tool.

Multiple recent studies have revealed that third-party tracking occurs on nearly all hospital websites, reinforcing recent media coverage of consumers losing privacy as they browse online for health information.

As it turns out, nearly all U.S. hospital websites share visitor information, potentially including sensitive medical details, with tech companies, data brokers, and advertising firms, according to a recent University of Pennsylvania analysis published in Health Affairs.
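The studies described above generally work by scanning hospital pages for scripts loaded from known tracking domains. As a minimal sketch of that idea, the snippet below checks a page's HTML for a few well-known tracker hosts; the sample HTML and the domain list are illustrative assumptions, not data from any real hospital site.

```python
import re

# Hypothetical sample of a hospital page's HTML; the script tags are
# illustrative, not taken from any real hospital website.
SAMPLE_HTML = """
<html><head>
<script src="https://connect.facebook.net/en_US/fbevents.js"></script>
<script src="https://www.googletagmanager.com/gtag/js?id=G-XXXX"></script>
<script src="/js/app.js"></script>
</head><body>Find a doctor</body></html>
"""

# Known third-party tracking domains (a small, illustrative subset).
TRACKER_DOMAINS = {
    "connect.facebook.net": "Meta Pixel",
    "www.googletagmanager.com": "Google Tag Manager",
    "snap.licdn.com": "LinkedIn Insight Tag",
}

def find_trackers(html: str) -> list[str]:
    """Return names of known trackers loaded via external script tags."""
    hosts = re.findall(r'<script[^>]+src="https?://([^/"]+)', html)
    return [TRACKER_DOMAINS[h] for h in hosts if h in TRACKER_DOMAINS]

print(find_trackers(SAMPLE_HTML))  # -> ['Meta Pixel', 'Google Tag Manager']
```

Note that the locally hosted `/js/app.js` is ignored: only scripts fetched from an external host can send the visitor's browsing data to a third party.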

A first set of concerns involves the complexity of accessing, using, and controlling patient data under private ownership. In some recent public-private partnerships for implementing artificial intelligence, privacy has been poorly protected, leading to poor results. Research using big data for health purposes has been criticized for lacking systematic oversight. To protect patient privacy and other rights, appropriate safeguards must be implemented, and private custodians of data should be given structural incentives to prevent unauthorized or alternative uses of those data.

Another concern about AI-driven methods is the possibility that they could expose people's private information to external threats. New algorithms have successfully reidentified data that had been deidentified or anonymized, meaning those safeguards may be compromised or rendered ineffective altogether.
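The reidentification risk described above is often demonstrated with a "linkage attack": joining a deidentified dataset to a named public record on shared quasi-identifiers such as ZIP code, birth date, and sex. The sketch below illustrates the idea; all names, records, and field values are invented for the example.

```python
# Deidentified medical visits: names removed, but quasi-identifiers remain.
deidentified_visits = [
    {"zip": "21201", "dob": "1985-03-14", "sex": "F", "diagnosis": "HPV"},
    {"zip": "21230", "dob": "1990-07-02", "sex": "M", "diagnosis": "asthma"},
]

# A public, voter-roll-style record carrying the same quasi-identifiers.
public_records = [
    {"name": "Jane Doe", "zip": "21201", "dob": "1985-03-14", "sex": "F"},
]

def reidentify(visits, public):
    """Match deidentified rows to named public records on quasi-identifiers."""
    matches = []
    for person in public:
        for visit in visits:
            if all(person[k] == visit[k] for k in ("zip", "dob", "sex")):
                matches.append((person["name"], visit["diagnosis"]))
    return matches

print(reidentify(deidentified_visits, public_records))
# -> [('Jane Doe', 'HPV')]
```

Because the three quasi-identifiers together are often unique to one person, stripping names alone does not anonymize a dataset; this is why the safeguards mentioned above can fail.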

Under a private custodianship, the risk of data exposure to unauthorized persons could rise significantly. 

As a result of these developments, hospitals and health systems must now ask hard questions about how their websites and apps are designed, and how third parties may, inadvertently or not, put patients' protected health information at risk through these tools.

A Facebook post from January 2014 contained Frances' full name, along with the revelation that she had a genital wart and human papillomavirus, a sexually transmitted infection associated with genital warts and cancer. The post also included her date of birth and ended with a plea asking friends to help "expose this hoe."

The following day, a friend told Frances that a former friend who lived nearby had shared a secret that only Frances and a former boyfriend knew about.

Frances had been treated at the local hospital where the Facebook poster, by then no longer her friend, worked as a patient care technician.

After Frances complained to a nurse supervisor, the hospital sent her a letter of apology in March 2014, stressing that it takes such situations very seriously. Without specifying what was done, the letter stated, "We took action according to our policies and procedures."

As far as the disclosures to Meta/Facebook are concerned, what concerns most people is not so much that their data is shared, but that it may be shared broadly, for advertising and tracking purposes, without their consent or knowledge.

Under HIPAA, covered entities, including certain providers and insurance plans, as well as certain business associates and vendors, must adhere to privacy and security regulations for health information and respect the rights of individuals.

Organizations must notify patients of how their protected health information (PHI) may be used and disclosed, obtain valid authorization for certain types of uses and disclosures, and secure certain assurances before sharing PHI with vendors.

The Executive Order issued earlier this summer also requires the Department of Health and Human Services to consider actions and guidance to strengthen security and privacy protections specifically for reproductive healthcare providers. Organizations should focus on the legislation, rules, and risks that apply today, while also paying close attention to pending legislation and ongoing enforcement actions.