Facial Recognition Technology is Transforming in Texas

The Facial Recognition Act, a measure that places stringent restrictions on law enforcement's use of facial recognition surveillance, was introduced on September 28. 

The proposed legislation would establish regulations that address both the risks of facial recognition's failures, such as algorithmic bias and erroneous arrests, and the risks of its successes, such as the possibility of widespread surveillance and abuse.

Errors in facial recognition can have drastic consequences. In one case, a woman in Texas had her application for unemployment benefits rejected, leaving her unable to pay her rent; in another, police in New Jersey arrested a Black man, an arrest that could limit his options for housing and work.

State laws have not shielded citizens from needless facial identification. In Texas, businesses may not collect your biometric data without your permission, but refusing leaves you with no alternative: a tenant, for example, has little choice but to grant an apartment manager's request for consent.

Too much time and money have already been poured into these systems to expect the industry to turn back on its own. In most of the U.S., there are even fewer limits on the use of biometric data. Without regulation, businesses sell biometric information to advertisers and governments, and it can then be used by state, federal, and private entities to chill our speech, track our preferences, and keep us from exercising our fundamental rights.

To gather evidence against renters, at least one city has even installed facial recognition–capable cameras outside a public housing complex. Despite its flaws and potential for harm, facial recognition keeps spreading. Equifax, for example, has introduced a facial recognition product aimed at leasing offices.

Socure and other companies market services that pair facial recognition with software meant to predict whether a customer will pay for their purchases. ODIN markets a facial recognition product that it claims can identify people experiencing homelessness and hand police their personal information.

That information includes any outstanding arrest warrants, which often serve only to criminalize poverty and make housing harder to obtain, as well as claims about past behavior, which could put armed officers on edge and make effective outreach more difficult. Notwithstanding ODIN's assertion that its system can remotely check people into shelters using biometric identification and location tracking, there is no reason such capabilities are required for that work. Facial recognition does not work as advertised, and we cannot rely on it for crucial decisions about housing, credit, or law enforcement.

Since the foundation of America, a lot has happened. Urbanization has brought us closer together, and technology has linked everyone on a scale that was previously unimaginable.

By Fooling a Webcam, Hackers Were Able to Get Past Windows Hello

Biometric authentication is a critical component of the IT industry's plan to eliminate the need for passwords. However, a new method for fooling Microsoft's Windows Hello facial recognition technology demonstrates that a little hardware tinkering can make the system unlock when it shouldn't.

Face-recognition authentication has become more prevalent in recent years thanks to services like Apple's FaceID, with Windows Hello driving usage even further. Face recognition by Hello is compatible with a variety of third-party webcams. 

Windows Hello facial recognition works only with webcams that have an infrared sensor in addition to the conventional RGB sensor. It turns out, however, that the system never consults the RGB data. The researchers found that a single straight-on infrared image of the target's face, paired with a black frame, was enough to unlock the victim's Windows Hello–protected device. By manipulating a USB webcam to deliver an attacker-chosen image, they fooled Windows Hello into believing the device owner's face was present and unlocking.
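The weakness can be illustrated with a deliberately simplified toy model. Everything here is invented for illustration (the array shapes, the exact-match check, and the `hello_authenticate` function); it is not Microsoft's implementation, only a sketch of why an authenticator that trusts raw frames from any USB device is spoofable:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for the enrolled owner: in reality Windows Hello stores
# a derived representation of the owner's infrared face data.
enrolled_ir = rng.random((8, 8))

def hello_authenticate(ir_frame, rgb_frame):
    """Toy model of the flaw: only the IR frame is consulted, so the
    RGB frame is ignored entirely and can be anything, e.g. all black."""
    return bool(np.allclose(ir_frame, enrolled_ir))

# A spoofed USB camera replays a captured IR image of the owner,
# paired with an all-black RGB frame.
black_rgb = np.zeros((8, 8, 3))
print(hello_authenticate(enrolled_ir.copy(), black_rgb))  # True: unlocked
print(hello_authenticate(rng.random((8, 8)), black_rgb))  # False: stranger
```

The point of the toy is only that when the whole decision rests on one input channel supplied by an untrusted peripheral, an attacker who controls that peripheral controls the decision.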

“We tried to find the weakest point in the facial recognition and what would be the most interesting from the attacker’s perspective, the most approachable option,” says Omer Tsarfati, a researcher at the security firm CyberArk. “We created a full map of the Windows Hello facial-recognition flow and saw that the most convenient for an attacker would be to pretend to be the camera because the whole system is relying on this input.”

Microsoft dubbed the discovery a "Windows Hello security feature bypass vulnerability" and patched the problem on Tuesday. The company also recommends that users enable "Windows Hello enhanced sign-in security," which employs Microsoft's "virtualization-based security" to encrypt Windows Hello facial data and process it in a protected area of memory.

Tsarfati, who will present the findings at the Black Hat security conference in Las Vegas next month, says the CyberArk team focused on Windows Hello's facial-recognition authentication because there has already been a lot of research into PIN cracking and fingerprint-sensor spoofing in the industry. 

He adds that the team was also drawn by Windows Hello's large user base. Microsoft said in May 2020 that the service had more than 150 million users, and in December it announced that 84.7 percent of Windows 10 users sign in with Windows Hello.

A hack that fools Face Recognition AI into false identification


Face recognition AI is increasingly used at airports and other security checkpoints, especially during the pandemic, when it lets staff identify people while maintaining social distancing. But a recent discovery by the cybersecurity firm McAfee shows that these systems are far from perfect.

Researchers at McAfee tested a face recognition system similar to those used at airports for passport verification. They fed it an image, created by machine learning, that looks like one person to the human eye but is recognized as someone else by the software. This could allow someone on a no-fly list to board a flight under the identity of another passenger who holds the booking.

“If we go in front of a live camera that is using facial recognition to identify and interpret who they're looking at and compare that to a passport photo, we can realistically and repeatedly cause that kind of targeted misclassification,” said Steve Povolny, a researcher at McAfee.

To trick the face recognition algorithm, the researchers used CycleGAN, an image-to-image translation model that can, for example, restyle a photo to look like a Monet painting or turn a summer scene into a winter one.

The team fed 1,500 photos of the project leads into CycleGAN, and after hundreds of attempts it produced an image that the face recognition system identified as someone other than the person the human eye perceived.
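The trial-and-error search can be sketched with a toy stand-in: a fixed random linear map plays the role of the face-embedding network, and random hill-climbing plays the role of iterating CycleGAN outputs. All names and numbers here are invented for illustration; this is not McAfee's pipeline:

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy "embedding network": a fixed random linear map from a flattened
# 64-pixel image to a 16-dimensional identity embedding.
PROJ = rng.standard_normal((16, 64))

def embed(img):
    return PROJ @ img

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

attacker_img = rng.random(64)   # the face a human would see
victim_img = rng.random(64)     # the identity the matcher should reject
victim_emb = embed(victim_img)

# Keep any small random tweak that moves the attacker image's embedding
# closer to the victim's, analogous to iterating generated images until
# the matcher misidentifies one.
adv = attacker_img.copy()
for _ in range(2000):
    candidate = np.clip(adv + rng.normal(0.0, 0.02, 64), 0.0, 1.0)
    if cosine(embed(candidate), victim_emb) > cosine(embed(adv), victim_emb):
        adv = candidate

print(cosine(embed(attacker_img), victim_emb))  # similarity before the search
print(cosine(embed(adv), victim_emb))           # similarity after the search
```

The sketch shows the shape of the attack: a matcher that reduces a face to a similarity score can be gamed by anything that nudges that score, whether crude perturbations as here or a generative model like CycleGAN.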

But there are two caveats to the study. First, the researchers tested a face recognition system similar to, but not the same as, those used in airport security. “I think for an attacker that is going to be the hardest part to overcome, where [they] don’t have access to the target system,” said Povolny. Second, CycleGAN takes considerable time to create such an image, and the software requires a high-end system to run.

The researchers intended the study to point out the vulnerability of face recognition systems and the danger of relying solely on such checks.

"AI and facial recognition are incredibly powerful tools to assist in the pipeline of identifying and authorizing people,” Povolny says. “But when you just take them and blindly replace an existing system that relies entirely on a human without having some kind of a secondary check, then you all of a sudden have introduced maybe a greater weakness than you had before.”

San Francisco to ban facial recognition


Lawmakers in San Francisco have voted to ban the use of face recognition technology by city agencies, including the police department, amid worries over privacy.

The new bill, the Stop Secret Surveillance Ordinance, was introduced by San Francisco Supervisor Aaron Peskin. The ordinance states that any plan to buy new surveillance technology must now be approved by city administrators.

"With this vote, San Francisco has declared that face surveillance technology is incompatible with a healthy democracy and that residents deserve a voice in decisions about high-tech surveillance," said Matt Cagle from the American Civil Liberties Union in Northern California.

"We applaud the city for listening to the community, and leading the way forward with this crucial legislation. Other cities should take note and set up similar safeguards to protect people's safety and civil rights."

Face recognition technology uses an algorithm that scans a person's face and matches it against a database of stored faces. The technology is now commonly used in smartphones, laptops, and other digital devices.
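That matching step can be sketched in a few lines. Here toy 4-dimensional vectors stand in for the high-dimensional embeddings a real network would produce, and the names, values, and threshold are all invented:

```python
import numpy as np

# Toy database: name -> stored face embedding (real systems store
# high-dimensional vectors produced by a neural network).
database = {
    "alice": np.array([0.9, 0.1, 0.0, 0.4]),
    "bob":   np.array([0.1, 0.8, 0.6, 0.0]),
}

def identify(probe, threshold=0.9):
    """Return the best-matching name, or None if nothing is close enough."""
    def cosine(a, b):
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
    best = max(database, key=lambda name: cosine(probe, database[name]))
    return best if cosine(probe, database[best]) >= threshold else None

print(identify(np.array([0.88, 0.12, 0.05, 0.38])))  # a face close to alice's
print(identify(np.array([0.5, 0.5, 0.5, 0.5])))      # no confident match
```

The threshold is the crux: set it too low and strangers match someone; set it too high and legitimate users are rejected, which is exactly the trade-off behind the errors and spoofs described in the posts above.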

San Francisco is the first US city to ban facial recognition.