Technology

Microsoft won't sell facial recognition to police unless Congress adopts a new privacy law

Microsoft has joined competitors Amazon and IBM in limiting how it offers facial recognition technology to third parties, particularly law enforcement agencies.
 
The company says it does not currently provide the technology to police, and it now says it won't do so until federal law regulates how it can be deployed safely, without violating human rights or civil liberties.
 
IBM said it would end sales, development, and research of the controversial technology entirely. Amazon said it would stop providing the technology to police for one year, giving Congress time to implement "stronger regulations to govern the ethical use of facial recognition technology."
 
Microsoft President Brad Smith aligned his company most closely with Amazon's approach to facial recognition on Thursday when he outlined the new policy: not ruling out that it would one day sell the technology to police, but calling for regulation first.
 
"We do not sell facial recognition technology in the United States to police forces today because of the values we have developed," said Smith in The Washington Post. "But I believe that this is a time when we really need to listen more, learn more and do more.
 
Given that, Smith said, the company has decided not to offer facial recognition to US police departments until there are national laws, grounded in human rights, governing the technology.
 
Microsoft will apparently continue providing facial recognition to human rights organizations fighting trafficking and other abuses, just as Amazon has said it will keep doing with its Rekognition program.
 
Smith said the company "is also preparing a number of additional review factors" to evaluate other potential uses of the technology beyond what it already has.
 
Ongoing protests across the US and around the world against racism and police brutality, along with a national conversation about racial inequality, have forced the tech industry to reckon with its own role in supplying law enforcement agencies with unregulated, potentially racially biased technology.
 
Investigations have demonstrated that facial recognition systems have significant difficulty identifying, and even determining the gender of, people with darker skin, because the systems are trained on data sets composed mostly of white men.
 
For years, artificial intelligence researchers, activists, and lawmakers have sounded the alarm over selling the technology to police, warning not only of racial bias but also of human rights and privacy violations from technology that could fuel the growth of surveillance states.
 
While Microsoft has sold police departments access to the technology, the company's approach has become more principled over time. Citing human rights concerns, Microsoft denied a California law enforcement agency access to its facial recognition technology last year.
 
It also said back in March that it would no longer invest in third-party companies developing the technology, following allegations that AnyVision, an Israeli firm Microsoft had invested in, was supplying the Israeli government with technology to spy on Palestinians.
 
(Microsoft later said its internal investigation concluded that AnyVision's technology "has not previously and does not currently power a mass surveillance program in the West Bank," but it divested from the company nonetheless.)
 
Microsoft has been a vocal supporter of federal regulation that would determine how such systems can be used, how privacy would be protected, and how discrimination would be prevented.
 
Since at least 2018, Smith has publicly raised concerns about the dangers of unregulated facial recognition. But last year, the company was also found to be hosting a dataset of more than 10 million facial recognition images, including photos of many people who never knew about or consented to their inclusion. The company took the dataset offline only after an inquiry from the Financial Times.
 
Just this year, according to the American Civil Liberties Union (ACLU), Microsoft backed California legislation that would have allowed police departments and private companies to buy and use these systems.
 
 

 





