Microsoft to law enforcement: No using Azure OpenAI for facial recognition



Several street surveillance cameras in front of a glass building.

Microsoft is more explicitly banning police departments from using its AI models to identify suspects, according to new code of conduct language for its Azure OpenAI Service.

The new language explicitly prohibits using its AI model services “for facial recognition purposes by or for a police department in the United States.” It also bars any law enforcement agency globally from using mobile cameras “in the wild” for facial recognition, such as when patrolling officers use body-worn or dash-mounted cameras to verify identities, and it disallows identifying individuals against a database of suspects or prior inmates.

The company’s Azure OpenAI Service, which provides API access to OpenAI’s language and coding models through Microsoft’s cloud platform, recently added GPT-4 Turbo with Vision, OpenAI’s advanced text and image analysis model. In February, the company announced it was submitting its generative AI services for use by federal agencies.

Microsoft’s Code of Conduct already prohibited using the Azure OpenAI system to:

  • identify or verify individual identities based on people’s faces or other physical, physiological, or behavioral characteristics; or

  • identify or verify individual identities based on media containing people’s faces or other physical, physiological, or behavioral characteristics.

The new language outlines more specific bans on police agencies using AI systems for data collection. A recent ProPublica report documented the extent to which police departments around the country are deploying similar machine learning tools, including AI-powered systems that examine millions of hours of footage from traffic stops and other civilian interactions. “Much of the data compiled by these analyses and the lessons learned from it remains confidential, with findings often bound up in nondisclosure agreements,” the publication wrote. “This echoes the same problem with body camera video itself: Police departments continue to be the ones to decide how to use a technology originally meant to make their activities more transparent and hold them accountable for their actions.”

While some companies have taken steps to shield user data from law enforcement inquiries, including Google’s recent location data privacy protections, others lean into the possibility of collaboration. Last week, police camera and cloud storage provider Axon unveiled Draft One, an AI model that automatically transcribes audio from body cameras to “significantly enhance the efficiency of police report writing.”

