
Passkeys vs Passwords: The Future of Online Authentication

 

In the realm of online security, a shift is underway as passkeys gain traction among tech giants like Apple, Google, Microsoft, and Amazon. 

These innovative authentication methods offer a more seamless login experience and bolster cybersecurity against threats like malware and phishing. However, traditional passwords still hold their ground, allowing users to retain control over their security preferences.

A password is a unique combination of characters, including upper- and lower-case letters, numbers, and symbols, used to verify a user's identity. Although passwords were originally meant to be memorized or written down, they can now be stored securely online with tools like NordPass.

Passkeys, the technological successors to passwords, rely on PINs, swipe patterns, or biometric data (such as fingerprints or facial scans) for identity verification. They use the WebAuthn standard for public-key cryptography, generating a unique key pair on the user's device, so there is no shared secret to steal and nothing for the user to forget.
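To make the WebAuthn flow concrete, below is a minimal browser-side sketch of passkey registration in TypeScript. The relying-party details, user fields, and algorithm choices are illustrative placeholders rather than any particular site's implementation.

```typescript
// Minimal sketch of passkey registration via the browser WebAuthn API.
// All names (example.com, the user handle, etc.) are illustrative placeholders.
async function registerPasskey(challenge: Uint8Array): Promise<Credential | null> {
  const options: CredentialCreationOptions = {
    publicKey: {
      challenge,                                   // random bytes issued by the server
      rp: { name: "Example Site", id: "example.com" },
      user: {
        id: new TextEncoder().encode("user-1234"), // opaque, stable user handle
        name: "alice@example.com",
        displayName: "Alice",
      },
      // Ask the authenticator for an ECDSA P-256 (-7) or RSA (-257) key pair.
      pubKeyCredParams: [
        { type: "public-key", alg: -7 },
        { type: "public-key", alg: -257 },
      ],
      authenticatorSelection: { userVerification: "required" }, // biometric or PIN
      timeout: 60_000,
    },
  };
  // The private key never leaves the authenticator; only the public key and
  // credential ID are returned for the server to store.
  return navigator.credentials.create(options);
}
```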

Passkey vs Password: Security Comparison

Passkeys and passwords vary fundamentally in design, approach, and effectiveness in securing accounts. Here are some key distinctions:

Cybersecurity:

Passwords are susceptible to cracking and phishing, especially those with fewer than 10 characters. Passkeys, by contrast, combine on-device biometric or PIN verification with public-key cryptography, drastically reducing that exposure: an attacker would need both the user's authenticator device and their biometric or PIN to breach a passkey.
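Authentication is the same exchange in reverse. The sketch below, again with placeholder names and a hypothetical /webauthn/verify endpoint, shows the browser asking the authenticator to sign a fresh server challenge; because the server stores only a public key and the signature is bound to the site's origin, neither a stolen credential database nor a phishing page gives an attacker anything usable.

```typescript
// Sketch of a passkey login. Endpoint and field names are illustrative only.
const toBase64 = (buf: ArrayBuffer): string =>
  btoa(String.fromCharCode(...Array.from(new Uint8Array(buf))));

async function signInWithPasskey(challenge: Uint8Array): Promise<void> {
  const credential = (await navigator.credentials.get({
    publicKey: {
      challenge,                    // one-time random bytes from the server
      rpId: "example.com",
      userVerification: "required", // biometric or device PIN
      timeout: 60_000,
    },
  })) as PublicKeyCredential | null;
  if (!credential) throw new Error("Authentication was cancelled");

  const response = credential.response as AuthenticatorAssertionResponse;
  // The server verifies the signature with the stored public key and checks
  // that clientDataJSON echoes its own challenge and expected origin.
  await fetch("/webauthn/verify", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      credentialId: credential.id,
      clientDataJSON: toBase64(response.clientDataJSON),
      authenticatorData: toBase64(response.authenticatorData),
      signature: toBase64(response.signature),
    }),
  });
}
```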

Convenience:

Creating, recalling, and managing complex passwords can be arduous and time-consuming, leading to 'password fatigue.' Passkeys, once set up, facilitate quick and seamless authentication, eliminating the need to remember multiple passwords.

Login Success Rate:

Passkeys have a significantly higher success rate than passwords. Recent data from Google revealed that while passwords succeeded only 13.8% of the time, passkeys achieved a 63.8% success rate.

Popularity:

Although passkeys are gaining traction, they are not yet universally supported. Familiarity with passwords and concerns over passkey error handling and biometric privacy contribute to their slower adoption.

The Evolution of Authentication

While passkeys represent a significant leap forward in security and user-friendliness, the demise of passwords is a gradual process. The established dominance of passwords, spanning over half a century, requires a patient transition. Behavioral habits and the need for technological refinement play pivotal roles in this shift.

At present, passkey use is seldom mandatory, so users can choose their preferred verification method. For sites that support only passwords, using a password manager is advisable, and various free tools are available to assess password strength.

In conclusion, the future of online authentication is evolving towards passkeys, offering a more secure and user-friendly experience. However, the transition from passwords will be a gradual one, shaped by technological advancements and user behavior.

New Privacy Policy: X Plans on Collecting Users’ Biometric Data


According to a new privacy policy introduced by X (formerly known as Twitter), it will soon be collecting its users’ biometric data. 

The policy says that the company intends to compile individuals' employment and educational histories. According to the policy page, the modification will take effect on September 29. 

The updated policy reads, “Based on your consent, we may collect and use your biometric information for safety, security, and identification purposes.” While biometric data usually involves an individual’s physical characteristics, such as their face or fingerprints, X has not yet specified what data it will collect or how it plans to collect it.

In a conversation with Bloomberg, the company noted that biometric collection will apply only to premium users, who will have the opportunity to submit their official ID and a photograph in order to add an additional layer of verification. According to Bloomberg, biometric information can be extracted from both the ID and the image for matching purposes.

“This will additionally help us tie, for those that choose, an account to a real person by processing their government issued ID[…]This will also help X fight impersonation attempts and make the platform more secure,” X said in a statement to Bloomberg.

Last month, X was named in a proposed class action lawsuit accusing it of illicitly capturing, storing, and using Illinois residents’ biometric data, including facial scans. The lawsuit says X “has not adequately informed individuals” that it “collects and/or stores their biometric identifiers in every photograph containing a face.”

In addition to the modified details of the biometric collection, X’s updated policy reveals its intention of storing users’ employment and education history. 

“We may collect and use your personal information (such as your employment history, educational history, employment preferences, skills and abilities, job search activity and engagement, and so on) to recommend potential jobs for you, to share with potential employers when you apply for a job, to enable employers to find potential candidates, and to show you more relevant advertising,” the updated policy reads.

The move seems to be related to the beta functionality of X, which enables verified companies on the network to publish job postings on their accounts. The prominent social networking platform has also established a legitimate @XHiring account. The hiring drive is a component of Musk's plans to make X an "everything app."  

Germany Admits Investigating Worldcoin’s Eye-Scanning Orb

Privacy concerns about the Worldcoin cryptocurrency project, a venture by OpenAI CEO Sam Altman, have been mounting since the announcement of its official launch. Several countries are now weighing its potential risks and examining the project closely.

Adding to this, Germany has become the third European country to admit investigating Worldcoin, after France and the UK. It therefore looks like a tough regulatory road lies ahead for the venture.

The head of the Bavarian State Office for Data Protection Supervision, the German data protection authority handling the case, recently noted that it has been investigating Worldcoin since November 2022 over suspicion that the venture could access "sensitive data at a very large scale."

Despite being officially launched only last week, Worldcoin has been collecting iris scans from people all over the world for the past two years to build its database. The company claims this will let users verify their identity as humans in the emerging age of artificial intelligence by tying human identity to unique biometric data. Intriguing as the idea may be, it has drawn concern from critics.

For instance, when reporters were dispatched to the project to have their irises scanned, Gizmodo and Futurism both reported that Orb operators did not ask for any prior identification or confirmation that participants were who they claimed to be. In developing countries, participants in the project's pilot program have said they felt duped by the exchange. Furthermore, since a blockchain is involved, it is unclear whether an individual can ask to have their data removed from the company's database.

Neither the European data watchdogs nor Ethereum co-founder Vitalik Buterin, whose blockchain Worldcoin relies on, is persuaded that this type of "proof-of-personhood" venture is ready for widespread adoption.

In a blog post regarding Worldcoin, Buterin claimed that "if even one Orb manufacturer is malicious or hacked, it can generate an unlimited number of fake iris scan hashes, and give them World IDs."
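As a toy illustration of Buterin's point, the sketch below derives an identifier by hashing an iris template. The derivation is invented for illustration and does not reflect Worldcoin's actual protocol; the issue it shows is that a compromised Orb can feed arbitrary bytes into the same pipeline and mint identities that look genuine to everyone downstream.

```typescript
import { createHash, randomBytes } from "node:crypto";

// Hypothetical derivation: an identifier is just a hash of the iris template,
// so the chain only ever sees the hash, never the scan itself.
function idFromIrisTemplate(template: Buffer): string {
  return createHash("sha256").update(template).digest("hex");
}

// An honest Orb hashes a real scan...
const genuineId = idFromIrisTemplate(Buffer.from("iris-template-of-a-real-person"));

// ...but a malicious or hacked Orb can hash random bytes instead. The result
// is indistinguishable from a genuine identity to anyone verifying downstream.
const fabricatedId = idFromIrisTemplate(randomBytes(64));

console.log({ genuineId, fabricatedId });
```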

This leads to one conclusion: we will not be convinced until Worldcoin reveals exactly what it does with the data it collects.

Companies Appeal for Relief From Biometric Privacy Act


In June, on a public radio talk show hosted by Brian Mackey, Illinois Senate President Don Harmon said that leaders of some of the state's most prominent business associations had “punched us in the nose” after Senate Democrats came up with what he called a "good faith solution" to issues brought on by the state's highly controversial Biometric Information Privacy Act.

The Senate Democratic proposal that the business groups opposed, according to Harmon, is "very friendly to the business community that has been asking for these changes."

Biometric Information Privacy Act (BIPA) 

Companies these days regularly collect biometric data, such as facial recognition scans and fingerprints. In Illinois, however, it is illegal for companies to collect any such data unless they obtain informed consent.

The Biometric Information Privacy Act, which took effect on October 3, 2008, regulates how private entities may acquire, use, and handle biometric identifiers and information. Notably, government organizations are not covered by the Act. The only other states with comparable biometric safeguards are Texas and Washington, but BIPA is the strictest. Under the Act, each violation carries a $1,000 fine, rising to $5,000 if the violation is willful or reckless. BIPA's damages clause has given rise to numerous class action lawsuits.

Privacy Issues

Many lawsuits have already been filed over the collection of biometric data, and as a result, many companies are seeking relief.

In its ruling against White Castle earlier this spring, the Illinois Supreme Court urged the General Assembly to rethink the law. As a result of that decision, the burger chain's eventual liability for collecting employee fingerprint scans could reach $17 billion.

Near the originally scheduled conclusion of the spring legislative session, the Illinois Retail Merchants Association, the Illinois Manufacturers' Association, and the Illinois Chamber of Commerce held a press conference with other business leaders to vehemently oppose the Senate Democrats' proposal.

At the press conference, Illinois Manufacturers' Association President and CEO Mark Denzler argued the measure would exacerbate the issue, and Denzler is not exactly known for hostility. The legislation, according to Denzler, “will only increase abuse of this law by trial attorneys” who have made thousands of claims under the law.

The three business groups either declined to comment on Harmon's remarks or did not respond at all.

The Supreme Court held that the legislative intent of BIPA was to penalize every single acquisition of employee biometric data. "That's how we ended up with a $17 billion" fine, said Senate President Pro Tempore Bill Cunningham, since a significant number of White Castle employees were subjected to multiple scans every day for five years. Under the Democratic plan, the law would have been changed so that the fine is based on the number of employees rather than the number of scans. The proposal also raised the fine from $1,000 to $1,500, which the business associations likewise criticized.
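A back-of-the-envelope calculation shows why the accrual rule matters so much. The workforce size and scan counts below are hypothetical round numbers, not White Castle's actual figures, but they illustrate how per-scan damages dwarf per-employee damages under the same $1,000 fine.

```typescript
// Hypothetical comparison of per-scan vs. per-employee BIPA damages.
const employees = 9_500;         // assumed number of affected workers
const scansPerDay = 2;           // e.g. fingerprint clock-in and clock-out
const workDaysPerYear = 250;
const years = 5;
const finePerViolation = 1_000;  // $1,000 negligent; $5,000 willful/reckless

const perScanDamages = employees * scansPerDay * workDaysPerYear * years * finePerViolation;
const perEmployeeDamages = employees * finePerViolation;

console.log(`Per-scan accrual:     $${perScanDamages.toLocaleString()}`);     // $23,750,000,000
console.log(`Per-employee accrual: $${perEmployeeDamages.toLocaleString()}`); // $9,500,000
```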

Opponents of BIPA argue that the original state statute has very little to do with reality. The law's purpose, however, is to safeguard individuals from having their biometric information kept, shared, or otherwise used without their knowledge or consent; after all, people can change a compromised password, but they cannot simply change their fingerprints.

Cunningham, on the other hand, says he has heard a theory from a Republican lawmaker that the corporate defense bar has assured the business groups that it can get the state law overturned by the U.S. Supreme Court, leaving no reason to compromise at the state level. “I have no idea if that's true or not[…]But it's a better explanation than I can come up with,” he says.

Promoting Trust in Facial Recognition: Principles for Biometric Vendors

 

Facial recognition technology has gained significant attention in recent years, with its applications ranging from security systems to unlocking smartphones. However, concerns about privacy, security, and potential misuse have also emerged, leading to a call for stronger regulation and ethical practices in the biometrics industry. To promote trust in facial recognition technology, biometric vendors should embrace three key principles that prioritize privacy, transparency, and accountability.
  1. Privacy Protection: Respecting individuals' privacy is crucial when deploying facial recognition technology. Biometric vendors should adopt privacy-centric practices, such as data minimization, ensuring that only necessary and relevant personal information is collected and stored. Clear consent mechanisms must be in place, enabling individuals to provide informed consent before their facial data is processed. Additionally, biometric vendors should implement strong security measures to safeguard collected data from unauthorized access or breaches.
  2. Transparent Algorithms and Processes: Transparency is essential to foster trust in facial recognition technology. Biometric vendors should disclose information about the algorithms used, ensuring they are fair, unbiased, and capable of accurately identifying individuals across diverse demographic groups. Openness regarding the data sources and training datasets is vital, enabling independent audits and evaluations to assess algorithm accuracy and potential biases. Transparency also extends to the purpose and scope of data collection, giving individuals a clear understanding of how their facial data is used.
  3. Accountability and Ethical Considerations: Biometric vendors must demonstrate accountability for their facial recognition technology. This involves establishing clear policies and guidelines for data handling, including retention periods and the secure deletion of data when no longer necessary. The implementation of appropriate governance frameworks and regular assessments can help ensure compliance with regulatory requirements, such as the General Data Protection Regulation (GDPR) in the European Union. Additionally, vendors should conduct thorough impact assessments to identify and mitigate potential risks associated with facial recognition technology.
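As a small illustration of the retention principle in point 3, the sketch below flags stored biometric records for deletion once their purpose is fulfilled or a retention window lapses. The record shape and the one-year window are assumptions made for illustration; a real schedule would follow the vendor's published policy and applicable law.

```typescript
// Illustrative retention check for stored facial templates.
interface BiometricRecord {
  subjectId: string;
  capturedAt: Date;          // when the template was collected
  purposeFulfilled: boolean; // e.g. verification completed, account closed
}

const RETENTION_DAYS = 365;  // assumed retention window

function isExpired(record: BiometricRecord, now: Date = new Date()): boolean {
  const ageDays = (now.getTime() - record.capturedAt.getTime()) / 86_400_000;
  return record.purposeFulfilled || ageDays > RETENTION_DAYS;
}

// Records past their purpose or window are queued for secure deletion.
function recordsToDelete(records: BiometricRecord[]): BiometricRecord[] {
  return records.filter((r) => isExpired(r));
}
```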
As facial recognition technology spreads, biometric businesses must address these concerns and foster trust in their products and services. By embracing privacy protection, transparency, and accountability, vendors can help ease concerns around facial recognition technology. Adhering to these principles can not only increase public trust but also make it easier to create regulatory frameworks that balance innovation with the defense of individual rights. Ultimately, the development of facial recognition technology will be greatly influenced by the ethical standards upheld by the biometrics sector.






Military Device Containing Thousands of People's Biometric Data Sold on eBay


The last time the U.S. military used its Secure Electronic Enrollment Kit (SEEK II) devices was more than ten years ago, near Kandahar, Afghanistan. The bulky black rectangular device, used to scan fingerprints and irises, was switched off and put away.

That is, until Matthias Marx, a German security researcher, purchased the device on eBay for $68 in August 2022 (a steal, at roughly half the listed price). For less than $70, Marx had unintentionally acquired sensitive, identifying information on thousands of people. According to The New York Times, the device held the fingerprint and iris scans of 2,632 people, along with names, nationalities, photographs, and extensive descriptions.

From the war zone to the government equipment sale to the eBay delivery, it seems that not a single Pentagon official had the foresight to remove the memory card from the particular SEEK II that Marx ended up with. The researcher told the Times, “The irresponsible handling of this high-risk technology is unbelievable […] It is incomprehensible to us that the manufacturer and former military users do not care that used devices with sensitive data are being hawked online.”

According to the Times, the majority of the data on the SEEK II was gathered on people whom the American military had designated as terrorists or wanted persons. Others, however, were ordinary citizens who had been detained at Middle Eastern checkpoints, or even people who had aided the American government.

All of that information could be used to locate someone, making the devices and the data they hold exceedingly dangerous if they fall into the wrong hands. The Taliban, for instance, may have a personal motive for tracking down and punishing anyone who cooperated with U.S. forces in the area.

Marx and his co-researchers from the Chaos Computer Club (CCC), which claims to be the largest hacker group in Europe, purchased the SEEK II and five other biometric capture devices, all from eBay. The group then analyzed the devices for potential flaws, prompted by a 2021 report by The Intercept about military biometric technology seized by the Taliban.

Although Marx had set out from the start to assess the risks connected with biometric devices, he was nonetheless alarmed by the extent of what he discovered. The Times reports that a second SEEK II purchased by CCC, last used in Jordan in 2013, contained data on U.S. troops, likely gathered during training, in addition to the thousands of individuals identified on the single SEEK II device last used in Afghanistan.