Privacy fears as Indian city readies facial recognition to spot harassed women

Reuters | Updated: 22-01-2021 14:11 IST | Created: 22-01-2021 13:41 IST

By Rina Chandran Jan 22 (Thomson Reuters Foundation) - A plan to monitor women's expressions with facial recognition technology to prevent street harassment in a north Indian city will lead to intrusive policing and privacy violations, digital rights experts warned on Friday.

In Lucknow, about 500 kilometres (310 miles) from the nation's capital New Delhi, police identified some 200 harassment hotspots that women visit often and where most complaints are reported, said police commissioner D.K. Thakur. "We will set up five AI-based cameras which will be capable of sending an alert to the nearest police station," he said, referring to the artificial intelligence-based technology.

"These cameras will become active as soon as the expressions of a woman in distress change," he told reporters this week, without giving further details on which expressions would trigger an alert. Facial recognition technology is being increasingly deployed in airports, railway stations and cafes across India, with plans for nationwide systems to modernise the police force and its information gathering and criminal identification processes.

But technology analysts and privacy experts say the benefits are unclear, and that the systems could breach people's privacy or lead to greater surveillance, with little clarity on how the technology works, how the data is stored, and who can access it. "The whole idea that cameras are going to monitor women's expressions to see if they are in distress is absurd," said Anushka Jain, an associate counsel at digital rights non-profit Internet Freedom Foundation.

"What is the expression of someone in distress - is it fear, is it anger? I could be talking to my mother on the phone and get angry and make a face - will that trigger an alert and will they send a policeman?" A more feasible solution would be to increase police patrol numbers, Jain told the Thomson Reuters Foundation, adding that the technology is untested, and could lead to over-policing and the harassment of women who trigger alerts.

India is one of the world's most dangerous places for women, with a rape occurring every 15 minutes, according to government data. Uttar Pradesh, where Lucknow is located, is the least safe state, with the highest number of reported crimes against women in 2019. Police often turn away women who go to register complaints or fail to take action, said Roop Rekha Verma, a women's rights activist in Lucknow.

"And they want us to believe they will take action watching our facial expressions," she said. India launched a slew of legal reforms after a fatal 2012 gang rape, including easier mechanisms to report sex crimes, fast-track courts and a tougher rape law with the death penalty, but conviction rates remain low.

While there is a growing backlash against facial recognition technology in the United States and in Europe, Indian officials have said it is needed to bolster a severely under-policed country, and to stop criminals and find missing children. But digital rights activists say its use is problematic without a data protection law, and that it threatens the right to privacy, which was declared to be a fundamental right by the Supreme Court in a landmark ruling in 2017.

"The police are using the technology to solve a problem without considering that this will simply become a new form of surveillance, a new form of exercising power over women," said Vidushi Marda, a researcher at human rights group Article 19. "AI is not a silver bullet, and no amount of 'fancy' tech can fix societal problems," she said.

(This story has not been edited by Devdiscourse staff and is auto-generated from a syndicated feed.)

