
Microsoft Takes Action: Azure OpenAI Service Blocked for US Police Facial Recognition

Updated: Jun 6

The updated code of conduct for the Azure OpenAI Service specifies that police in the United States are prohibited from using its facial recognition capabilities to match individuals against criminal databases.


Microsoft has updated its guidelines to restrict the use of its Azure OpenAI Service for facial recognition by police in the United States. Specifically, officers may not use the service with body-worn or dash-mounted cameras to identify individuals against a database of suspects or previous inmates. The update also bans law enforcement agencies worldwide from using real-time facial recognition on mobile cameras in uncontrolled settings.


Previously, Microsoft's policy prohibited any user, including U.S. state or local police, from identifying or verifying identities using facial recognition technology in media that includes people's faces. However, this earlier version did not address the use of the technology by global law enforcement agencies.


This is not the first time Microsoft has restricted U.S. police departments' access to its facial recognition technology. In 2020, Brad Smith, the company's president, announced that Microsoft would not sell the technology to U.S. police until a federal law grounded in human rights was in place to govern its use. Microsoft has now extended that restriction to cover real-time use of the technology by law enforcement agencies around the world. It is worth noting, however, that despite these restrictions Microsoft has sold its facial recognition software to at least one American prison in the past.


Why it Matters:

While law enforcement agencies view facial recognition as a useful tool for fighting crime, the technology has an accuracy rate of about 80% when matching faces, which can lead to errors. For example, in 2022, a businessman from Noida was mistakenly detained in Abu Dhabi because facial recognition software incorrectly matched him to a wanted criminal.
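To see why even a seemingly high match rate matters at scale, here is a rough back-of-the-envelope sketch in Python. The 80% figure comes from the claim above, but the scan volume and hit rate are purely illustrative assumptions, not measurements from any real deployment.

```python
# Rough illustration of how an imperfect face matcher behaves at scale.
# The scan volume and hit rate below are illustrative assumptions.

match_accuracy = 0.80           # assumed probability that a flagged match is correct
faces_scanned_per_day = 10_000  # assumed volume for a busy camera network
hit_rate = 0.01                 # assumed fraction of scans that trigger a "match"

matches_per_day = faces_scanned_per_day * hit_rate
false_matches_per_day = matches_per_day * (1 - match_accuracy)

print(f"Matches flagged per day: {matches_per_day:.0f}")
print(f"Expected misidentifications among them: {false_matches_per_day:.0f}")
# Under these assumptions, roughly 1 in 5 flagged individuals would be
# wrongly matched -- the kind of error behind the detention described above.
```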


Moreover, AI-powered facial recognition can introduce additional problems, including bias. These biases arise because the data used to train such systems often does not represent all groups of people equally. A UNESCO discussion last year highlighted this issue, noting that such systems can exhibit what it called a "white guy problem." For instance, drowsiness-detection systems trained on unrepresentative data have wrongly judged people to be asleep simply because of the shape of their eyes, illustrating the flaws that biased training data can produce.



FAQs

Q1. What is facial recognition technology?

Facial recognition technology is an advanced form of artificial intelligence designed to identify or verify individuals by analyzing their facial features. This technology works by capturing an image or video of a person's face and comparing the unique characteristics of that face, such as the shape of the cheekbones, the distance between the eyes, and the contours of the lips, against a database of known faces. It is used in various applications, from unlocking smartphones and enhancing security systems to aiding law enforcement and border control efforts. Despite its widespread use, facial recognition technology raises significant concerns related to privacy, security, and potential biases.
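As a rough illustration of the matching step described above, here is a minimal sketch using the open-source face_recognition Python library: each face is reduced to a numeric encoding (an embedding), and two faces are treated as a match when their encodings are close enough. The file names and the tolerance value are placeholders, and real systems are considerably more complex.

```python
# Minimal sketch of face matching: encode each face as a vector, then compare
# the vectors. Uses the open-source face_recognition library; file names and
# the tolerance value are illustrative placeholders.
import face_recognition

# Build a tiny "database" of one known face (path is a placeholder).
known_image = face_recognition.load_image_file("known_person.jpg")
known_encoding = face_recognition.face_encodings(known_image)[0]

# Encode the face found in a new photo or video frame.
probe_image = face_recognition.load_image_file("camera_frame.jpg")
probe_encodings = face_recognition.face_encodings(probe_image)

if probe_encodings:
    # Distance between encodings: smaller means more similar faces.
    distance = face_recognition.face_distance([known_encoding], probe_encodings[0])[0]
    # A match is declared when the distance falls under the chosen tolerance.
    is_match = face_recognition.compare_faces(
        [known_encoding], probe_encodings[0], tolerance=0.6
    )[0]
    print(f"distance={distance:.3f}, match={is_match}")
else:
    print("No face found in the probe image.")
```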


Q2. What is the difference between Azure OpenAI and OpenAI?

Azure OpenAI is a collaboration between Microsoft and OpenAI that integrates OpenAI's advanced artificial intelligence models with Microsoft's Azure cloud platform. This partnership provides businesses with powerful AI tools and the robust cloud infrastructure of Azure. In contrast, OpenAI itself is an independent research organization focused on developing AI technology and promoting safe AI practices across various applications.
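For developers, the difference shows up mainly in how a client is configured. The sketch below uses the official openai Python SDK (v1.x); the endpoint, deployment name, API version, and keys are placeholders, so treat it as an illustration rather than a drop-in configuration.

```python
# Sketch of the same chat request made against OpenAI directly vs Azure OpenAI,
# using the official openai Python SDK (v1.x). Keys, endpoint, API version, and
# deployment name below are placeholders.
from openai import OpenAI, AzureOpenAI

messages = [{"role": "user", "content": "Hello"}]

# 1) OpenAI directly: authenticate with an OpenAI API key and call the model by name.
openai_client = OpenAI(api_key="OPENAI_API_KEY")
resp = openai_client.chat.completions.create(model="gpt-4o-mini", messages=messages)

# 2) Azure OpenAI: the same models, hosted in your own Azure resource.
#    The client points at your Azure endpoint and calls your *deployment*.
azure_client = AzureOpenAI(
    api_key="AZURE_OPENAI_KEY",
    api_version="2024-02-01",
    azure_endpoint="https://my-resource.openai.azure.com",  # placeholder endpoint
)
resp = azure_client.chat.completions.create(
    model="my-gpt4o-deployment",  # Azure deployment name, not the raw model id
    messages=messages,
)
```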


Q3. Has facial recognition technology ever wrongly identified someone?

Yes, there have been cases where individuals were mistakenly identified by facial recognition technology, such as a businessman who was wrongly detained while traveling due to a misidentification.
