The Spanish government announced this week a major overhaul of a program in which police rely on an algorithm to identify potential victims of domestic violence, after facing questions about the system's effectiveness.
The program, VioGén, requires police to ask a victim a series of questions. The answers are fed into a software program that produces a risk score — ranging from no risk to high risk — designed to identify the women most vulnerable to abuse. The score helps determine what police protection and other services a woman can receive.
A New York Times investigation last year found that the police relied heavily on the technology, almost always accepting the decisions made by the VioGén software. Some women whom the algorithm identified as at little or no risk of further harm later experienced more abuse, including dozens who were killed, according to The Times.
Spanish officials said the changes announced this week were part of a long-planned overhaul of the system, which was introduced in 2007. They said the software had helped police departments with limited resources protect vulnerable women and reduce the number of repeat attacks.
In the updated system, VioGén 2, the software will no longer be able to flag women as facing no risk. Police must also enter more information about each victim, which officials said will lead to more accurate predictions.
Other changes are intended to improve cooperation among the government agencies involved in cases of violence against women, including by making it easier to share information. In some cases, victims will receive a personalized protection plan.
“Machismo is knocking at our door and doing it with violence unlike anything we’ve seen in a long time,” Ana Redondo, the minister of equality, said at a news conference on Wednesday. “This is not the time to back down. It is time to move forward.”
Spain’s use of an algorithm to guide the treatment of gender-based violence is a far-reaching example of how governments are turning to algorithms to make important public decisions, a trend that is expected to accelerate with the adoption of artificial intelligence. The system has been discussed as a potential model for governments elsewhere trying to combat violence against women.
VioGén was created with the belief that an algorithm based on mathematical models can act as an impartial tool to help the police find and protect women who may be at risk. The yes-or-no questions include: Was a weapon used? Were there economic problems? Has the aggressor shown controlling behavior?
Victims classified as high-risk received more protection, including regular inspections of their homes, access to shelters and police monitoring of their abusers. Those with lower scores received less aid.
As of November, Spain had more than 100,000 active cases of women who had been evaluated by VioGén, and about 85 percent of those victims were classified as facing little risk of being harmed by their abusers again. Police officers in Spain are trained to overrule VioGén’s scores if the evidence warrants it, but The Times found that the risk scores were accepted about 95 percent of the time.
Victoria Rosell, a judge in Spain and a former government delegate focused on gender-based violence, said a period of “self-criticism” was needed for the government to improve VioGén. She said the system could be more accurate if it pulled information from additional government databases, including those of the health and education systems.
Natalia Morlas, the president of Somos Más, a victims’ rights group, said she welcomed the changes, which she hoped would lead to better risk assessments by the police.
“Calibrating the victim’s risk properly is so important because it can save lives,” Ms. Morlas said. She added that close oversight of the system was essential because victims “must be handled by people, not machines.”