This story is part of CNET's complete coverage of Apple's annual developer conference.
Apple has announced a new Safety Check feature to help potential victims in abusive relationships.
Why it matters
It's the latest example of the tech industry grappling with thorny personal-safety issues that don't have clear or easy answers.
What's next
Apple is working with victim advocacy organizations to identify additional features that could help people in crisis.
Among the long-awaited and popular new features Apple plans to bring to the iPhone this fall is one that, for the people who use it, could mean the difference between life and death.
Apple announced Safety Check on Monday to help victims of domestic violence. The feature, which arrives with iOS 16 this fall, is designed to help someone quickly sever digital ties with a potential abuser. Safety Check does this either by helping a person quickly see who they're automatically sharing information with, such as their location or photos, or by cutting off access and information sharing from all devices other than the one in their hand.
Notably, the feature also includes a prominent button in the upper-right corner of the screen labeled Quick Exit. As the name suggests, it's designed to help potential victims quickly hide that they were viewing Safety Check, in case their abuser doesn't allow them privacy. If the abuser reopens the Settings app where Safety Check is housed, it opens to the default General settings page, effectively covering the victim's tracks.
“A lot of people share passwords and access to their devices with a partner,” said Katie Skinner, an Apple privacy manager, at the company's WWDC event on Monday. “However, in an abusive relationship, this can threaten personal safety and make it harder for victims to get help.”
Safety Check, and the careful way it's been built, is part of a broader effort by technology companies to keep their products from being used as tools of abuse. It's also the latest sign of Apple's willingness to build technology that addresses sensitive issues. And while the company says it's serious in its approach, some of its actions have drawn criticism. Last year, the company announced efforts to detect images of child abuse on some of its phones, tablets and computers, a move that alarmed critics.
Yet advocates say Apple is one of the few large companies to work publicly on the issue. While many technology giants, including Microsoft, Facebook, Twitter and Google, have built systems to police abusive behavior on their respective sites, fewer have tried to create tools that stop abuse as it's happening.
Unfortunately, the abuse has worsened. A November 2020 survey of domestic violence practitioners found that 99.3% had clients who had experienced “technology-facilitated stalking and abuse,” according to the Women's Services Network, which worked with Curtin University in Australia. The organizations also learned that reports of victim tracking had increased by more than 244% since the previous survey in 2015.
Amid all this, technology companies like Apple are increasingly working with victims' organizations to understand how their tools can be misused by abusers and how they can help potential victims. The result is features such as Safety Check's Quick Exit button, which advocates say show Apple building these features in what they call a “trauma-informed” way.
“Most people can’t appreciate the sense of urgency,” said Renee Williams, executive director of the National Center for Victims of Crime. “Apple was very receptive.”
Some of the technology industry's biggest wins have come from identifying abusive material. In 2009, Microsoft helped create image-recognition software called PhotoDNA, which is now used by social networks and websites around the world to identify images of child abuse when they're uploaded to the internet. Similar programs have since been created to help identify live broadcasts of violence and other material that big technology companies try to keep off their platforms.
As technology becomes more prevalent in our lives, this effort becomes more important. And unlike adding new video technology or increasing computer performance, these social issues don’t always have clear answers.
In 2021, Apple made one of its first public shifts toward victim-focused technology when it announced new features for its iMessage service to analyze messages sent to children. If the system flagged a suspect image, it would blur the attachment and warn the recipient before showing it, confirming they wanted to see it. Apple would also point children to resources that could help them if they were being victimized.
At the time, Apple said it had built the scanning technology with privacy in mind. However, advocates worried that Apple's system was also designed to alert a parent if their child chose to view a flagged image anyway. According to some critics, that could invite abuse from a potentially dangerous parent.
Apple's further efforts to detect potential child abuse images that could be synced to its photo service via iPhones, iPads and Macs have also drawn criticism from security experts.
Victims' advocates nevertheless acknowledged that Apple is one of the few device makers working on tools to support victims of potential abuse as it occurs. Microsoft and Google didn't respond to requests for comment about plans to introduce Safety Check-like features to help potential victims using Windows and Xbox software for computers and video game consoles, or Android mobile software for phones and tablets.
Learning, but a lot of work to do
The technology industry has been working with victim organizations for more than a decade, looking for ways to build safety thinking into its products. Advocates say the last few years in particular have brought change within the technology giants, in some cases driven by people hired from the nonprofit world who had worked on the very problems the industry is grappling with.
Apple began consulting with some of the victims’ advocates last year to ask for input and ideas on how best to build the system.
“We are beginning to see the recognition that there is a corporate or social responsibility to ensure that your applications can't be too easily misused,” said Karen Bentley, CEO of Wesnet. She said the problem is especially difficult because as technology evolves to become easier to use, it also gains the potential to become a tool of abuse.
That's one of the reasons she says Apple's Safety Check is “brilliant”: It can quickly and easily separate someone's digital information and communications from an abuser. “If you're experiencing domestic violence, you're probably experiencing some kind of violence in technology,” she said.
Although Safety Check has moved from idea to test software and will be widely available with the iOS 16 software update for iPhones and iPads this fall, Apple said it plans to do more work on these issues.
Unfortunately, Safety Check doesn't cover the ways abusers can track people using devices the victim doesn't own, such as when someone slips one of Apple's $29 AirTag trackers into a coat pocket or a car to stalk them. Safety Check also isn't designed for phones set up under children's accounts for people under the age of 13, though the feature is still in testing and could change.
“Unfortunately, abusers are persistent and constantly updating their tactics,” said Erica Olsen, project director of Safety Net, a National Network to End Domestic Violence program that trains companies, community groups and governments on how to improve victims' safety and privacy. “There will always be something to do in this area.”
Apple said it's expanding training for employees who interact with customers, including retail staff in its stores, so they know how features like Safety Check work and can teach customers about them if needed. The company has also developed guidelines for its support staff to help them identify and assist potential victims.
In one case, for example, AppleCare teams are taught to listen for when an iPhone owner calls and expresses concern that they don't have control over their own device or their iCloud account. In another, AppleCare can guide someone through removing their Apple ID from a family group.
Apple also updated its personal safety user guide in January to teach people how to reset and regain control of an iCloud account that may have been compromised or used as a tool of abuse.
Craig Federighi, Apple's software engineering chief, said the company will continue to expand its personal safety features as part of a broader commitment to its customers. “Protecting you and your privacy is and will always be at the heart of what we do,” he said.