After raising $5 million in funding, Parfait is on its way to disrupting the wig industry with the help of technology. The women-led startup is said to be the first to integrate facial recognition and artificial intelligence to provide buyers with customizable wig products. According to a press release sent to AfroTech, the seed round was spearheaded by Upfront Ventures and Serena Ventures. Ulu Ventures, Unshackled Ventures, Contrary Capital, Visible Hands, TRUE Capital’s Culture Fund, Omar Johnson, Chamillionaire, Tristan Walker, and Upland Workshop also participated in the round. “Parfait’s mission to leverage AI to solve core issues for both the tech industry and communities of color is something we, at Serena Ventures, have believed in since the beginning,” said Serena Williams, Managing Partner at Serena Ventures, in a statement. She went on to say, “It’s been inspiring to witness their incredible achievements so far, and we’re proud to invest in this next phase of Parfait’s...
On Feb. 12, 2021, the Minneapolis Police Department announced that its officers are banned from using facial recognition software when apprehending suspects. According to TechCrunch, the problematic police department — best known as the home of the officers who killed George Floyd last summer — is known for having a “relationship” with Clearview AI, a firm with a record of “scraping” images from social media networks and selling them, wholesale, to police departments and federal law enforcement agencies. All 13 members of the city council voted in favor of banning the use of facial recognition software, with no opposition. Minneapolis is just the latest city to adopt such a ban, joining Boston and San Francisco in this landmark move. However, the bans don’t cover the sale of images to private companies, which many privacy experts cite as a growing concern. But there’s another, more salient reason why facial recognition software is facing...
Facial recognition technology has been a known issue in police conduct for years now, and the problem has yet to be resolved. New Jersey man Nijeer Parks, 33, has unfortunately become the third known Black man to be falsely identified and wrongfully arrested because of a faulty facial recognition match, according to The New York Times. The incident occurred in February 2019, when Parks was accused of shoplifting candy and trying to hit a police officer with a car at a Hampton Inn in Woodbridge, NJ. Engadget reports that officers were called to the Hampton Inn, where the alleged shoplifter presented them with a Tennessee driver’s license that they eventually confirmed to be fake. State agencies then utilized facial recognition systems to analyze the photo from the ID and found an apparent match in Parks’ state ID. Despite Parks being 30 miles away from the incident at the time, officers still identified him as the suspect. As a result, Parks spent a total of 10 days in jail last year, and $5,000 to...
This past Wednesday, Amazon announced in a blog post that it is placing a one-year ban on police use of its facial recognition technology. Part of the statement reads: “We’ve advocated that governments should put in place stronger regulations to govern the ethical use of facial recognition technology, and in recent days, Congress appears ready to take on this challenge. We hope this one-year moratorium might give Congress enough time to implement appropriate rules, and we stand ready to help if requested.” The announcement could be in response to the protests and uprisings that have occurred in the wake of recent police killings and brutality. Although the statement halts police use of the technology for a year, it doesn’t address what will happen once the ban expires. On the heels of the announcement, many were outraged at the time limit placed on the ban, expressing that it’s not enough to simply forbid the sale of...
The safety of protesters on the frontlines of the rebellions across the country has been a grave concern for many. Protesters have urged one another to either leave their phones at home or turn off their location services, as police officers have been known to use tracking technology to trace protesters and scan their faces in surveillance videos. BuzzFeed News reported that protesters in Minneapolis are being watched by law enforcement agencies that have trialed or deployed a variety of surveillance tools. To combat the surveillance tools used to target these protesters, Adobe shared that it offers several editing programs that help blur faces out of people’s content, USA Today reports. Unlike Photoshop, which only includes effects that help blur unwanted faces in still images, video editing apps such as Adobe Premiere and After Effects help conceal the identities of people who appear in videos. According to USA Today, Adobe offers a variety of prices for its monthly subscriptions that...
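The Adobe apps mentioned above automate this kind of anonymization. As a rough illustration of the general technique they rely on (and not Adobe’s actual implementation), the sketch below uses the open-source OpenCV library to detect faces in each frame of a clip and blur them before writing out a new file. The file names and detection settings are placeholders, and a real workflow would want a more robust detector than this classic Haar cascade, which can miss faces at an angle.

```python
# Minimal sketch: blur detected faces in a video, frame by frame.
# Assumes opencv-python is installed; input/output file names are hypothetical.
import cv2

face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

cap = cv2.VideoCapture("protest_footage.mp4")  # placeholder input clip
fps = cap.get(cv2.CAP_PROP_FPS)
width = int(cap.get(cv2.CAP_PROP_FRAME_WIDTH))
height = int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT))
out = cv2.VideoWriter(
    "blurred_output.mp4", cv2.VideoWriter_fourcc(*"mp4v"), fps, (width, height)
)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Find candidate face regions in this frame.
    faces = face_detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        # Replace each detected face region with a heavily blurred copy.
        frame[y:y + h, x:x + w] = cv2.GaussianBlur(
            frame[y:y + h, x:x + w], (51, 51), 0
        )
    out.write(frame)

cap.release()
out.release()
```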
Facial recognition has been quietly unfolding across the United States for years. Now, increased public awareness has turned facial recognition into a hot political issue that may enter the presidential race. Earlier this month, Bernie Sanders became the first 2020 presidential candidate to call for a ban on police use of facial recognition technology. While some of the impacts of facial recognition cannot be reversed, its growing political significance may at least help communities of color escape some of its worst effects. Sanders’ position on facial recognition draws from bans that have occurred this year in San Francisco, Oakland, and Somerville, Massachusetts. In each city, activists focused on pointing out facial recognition’s potential for introducing widespread, mass surveillance of already vulnerable communities, like what may be occurring in the cities of Chicago and Detroit. A spokesperson for Sanders’ campaign told Recode: “Police use of facial recognition software is...
This article was originally published on 07/03/2019. In June, Somerville, Massachusetts became the second city in the United States to ban facial recognition technology. Originally introduced back in May, the “Face Surveillance Full Ban Ordinance” places a moratorium on government use of facial recognition and other “remote biometric surveillance systems,” until the state develops a framework for its use. “[T]he benefits of using facial surveillance, which are few and speculative, are greatly outweighed by its harms, which are substantial,” the bill says. The bill touches on concerns previously cited by advocates, comparing the broad application of face surveillance in public spaces to requiring everyone to carry around and display a photo I.D. at all times. Under the bill, any data collected through facial recognition would be considered “unlawfully obtained.” That means it can’t be used in trials and should be deleted immediately. In addition, if the banned technology is used on...
As awareness around facial recognition continues to grow, a primary concern has been its potential to open up new frameworks for mass surveillance. That concern grew even more pressing as people realized that facial recognition could potentially be used in body cameras, essentially creating roving, real-time surveillance systems on the chests of police. On Thursday, Axon — the company that created the Taser and supplies 47 of the 69 largest police agencies in the United States with body cameras and software — announced a ban on the use of facial recognition on its devices. Although this can certainly be considered a temporary victory, Axon’s announcement must be carefully analyzed: within its social context, through the words the company used, and against its own history. Axon’s decision comes from the first report of the AI and Policing Technology Ethics Board that the company originally formed in April 2018. The board was created to guide Axon in ethically developing products and...
Facial recognition has been widely criticized for the risks it poses to vulnerable communities. The technology typically reinforces pre-existing social biases, as seen in its inability to accurately identify anyone who isn’t a white man. It also poses severe privacy risks. Facial recognition makes it easy for government agencies to develop continuous, mass surveillance of vulnerable communities. With all of these risks, it’s not a technology that most people would want to use on kids. Despite that, Lockport City School District in New York is trying to test a facial and object recognition system called “Aegis.” In September, the district used $1.4 million of the $4.2 million it received in funding through the Smart Schools Bond Act to install the system, the Lockport Journal reported. Superintendent Michelle Bradley announced plans to begin testing the system on June 3. Bradley described the test as an “initial implementation phase.” That means the school wanted to test the system for any necessary...
Facial recognition has the potential to introduce continuous, mass surveillance throughout the United States. Vulnerable communities — including Black people, religious minorities, and other communities of color — are especially likely to be harmed by facial recognition’s deployment. Amazon is perhaps one of the most infamous participants in facial recognition software development. But on Wednesday, Amazon shareholders failed to pass two resolutions concerning the company’s facial recognition software, Rekognition. Although the proposals were non-binding — meaning Amazon could have rejected the vote’s results — passing them would still have sent a message. The first proposal called for stopping sales of Rekognition to the government, and the second demanded an independent review of the program’s civil and human rights impacts. Unfortunately, the vote doesn’t come as a huge surprise. As noted by TechCrunch, CEO Jeff Bezos retains 12 percent of the company’s stock. He also has the...
Across the United States, local governments have held discussions about facial recognition. Last week, San Francisco banned government use of the technology, while cities like Oakland, California and Somerville, Massachusetts are exploring doing the same. Each of those cities began looking closely at facial recognition because of the danger it poses to Black and brown communities. San Francisco’s own bill stated, “The propensity for facial recognition technology to endanger civil rights and civil liberties substantially outweighs its purported benefits, and the technology will exacerbate racial injustice and threaten our ability to live free of continuous government monitoring.” It’s important that local communities are starting conversations about facial recognition tech and the harms that come with it, but the conversation also needs to occur at a higher level. After all, the risks that civil rights and privacy advocates highlight around facial recognition include continuous, mass surveillance...
Most users on Facebook are familiar with how the platform can find you in people’s photos and suggest that you be tagged. That’s done using face recognition, but a big concern with the technology is whether or not people are losing their ability to consent. Your face is personal, after all, and it shouldn’t be recorded or analyzed without your consent. In December 2017, Facebook’s Deputy Chief Privacy Officer Rob Sherman published a blog post to address those concerns before introducing Facebook’s new on/off switch. “When it comes to face recognition, control matters,” Sherman wrote. A video on the page informed viewers that anyone could completely “opt out” of the facial recognition technology by adjusting their Facebook Account Settings. Now, it seems that’s not entirely true. A study conducted by Consumer Reports found that out of 31 Facebook users in the United States, eight accounts — roughly 25 percent — lacked the Face Recognition setting. The study may seem small, but it’s alarming...
Body cameras were intended to be a way of encouraging police accountability. The idea was that if encounters were recorded, police would be unable to lie about events that took place. Now, there’s a rising concern that police may add facial recognition software to body cameras — essentially transforming them into roving face surveillance systems. A coalition of privacy and civil rights groups is taking action in California to ensure that doesn’t happen. The Body Cam Accountability Act — otherwise known as AB 1215 — was proposed by Assemblymember Phil Ting (D-San Francisco), who described it as “an important civil rights measure that will prevent exploitation of vulnerable communities.” The bill has gained support from the American Civil Liberties Union, Color of Change, the Council on American-Islamic Relations, Data for Black Lives, and more. In a statement obtained by AfroTech, the ACLU wrote: “The people of California were promised that body cameras would guard against police...
There have been some big moves made lately in the fight against facial recognition technology. Last week, San Francisco banned local government use of the technology completely. Now, the focus has turned to Amazon’s infamous Rekognition program. At the company’s annual meeting on Wednesday, Amazon’s own shareholders will vote on two proposals: one to stop the sale of Rekognition to the government, and the other to require an independent review of its civil and human rights impacts. In preparation for the vote, a coalition of privacy and civil rights advocates has written an open letter to Amazon’s shareholders. On May 22, the American Civil Liberties Union (ACLU) will present the open letter at Amazon’s meeting by invitation of shareholders. The presentation will mark one year since the ACLU first revealed how far Amazon’s Rekognition program had gone. Now, the letter is open for other groups and individual consumers to sign on. Within it, the groups focused on addressing Amazon’s...