Technology, Security & How To Engage with "Girls Around Me": A Conversation with Deanna Zandt

A story on the blog Cult of Mac last week shed light on a smartphone/tablet app (and trend) that’s causing fear and existential dread in the hearts of many, and poking holes both in the idea that the Internet is an equal-opportunity playing field and in the idea that technology is the root of all our problems.

Essentially, using publicly shared information from Facebook, Twitter, Foursquare and other social media sites, a new app called “Girls Around Me” lets a person see which women (or men, but it is called “Girls Around Me”) have checked in at nearby social spaces, pull up their names and pictures, and then approach them with whatever information is already publicly shared in mind. After the Cult of Mac story came out, Foursquare cut the app’s access to its API data, stating that this use was against its policies. I spoke to WAM!bassador of Technology Deanna Zandt about “Girls Around Me,” the fallout & the future.
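The mechanics here are almost banally simple: the app invents no surveillance of its own, it just joins data that users have already made public, keyed by location. A minimal sketch of that kind of aggregation, using entirely hypothetical data structures rather than any real Foursquare or Facebook API call, might look like this:

```python
# Hypothetical sketch of public-data aggregation. No real APIs are called;
# the point is that each field is individually public, and the harm comes
# from joining profiles to check-ins by venue.

from dataclasses import dataclass

@dataclass
class CheckIn:
    user_id: str
    venue: str

@dataclass
class PublicProfile:
    user_id: str
    name: str
    photo_url: str

def people_at_venue(venue, checkins, profiles):
    """Join public check-ins with public profiles for one venue."""
    by_id = {p.user_id: p for p in profiles}
    return [by_id[c.user_id] for c in checkins
            if c.venue == venue and c.user_id in by_id]

checkins = [CheckIn("u1", "Cafe X"), CheckIn("u2", "Bar Y")]
profiles = [PublicProfile("u1", "Alice", "http://example.com/a.jpg")]
print([p.name for p in people_at_venue("Cafe X", checkins, profiles)])
```

Note that nothing in the sketch is a privacy breach in isolation; this is exactly why Foursquare could only respond by revoking API access, not by claiming the data was private.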

To Zandt, the majority of the reaction to the Cult of Mac exposé has seemed like “everybody being like ‘oh my god, women shouldn’t be using Foursquare.’ When publishers of apps create their APIs they should be taking the appropriate precautions so that users have the ability to protect themselves, that people know that their lives weren’t going to be in danger in some way.” She herself had an exchange (click here for the Storify) with Dr. NerdLove, a dating-advice columnist, about the good doctor’s victim-blaming reaction. Saying women should stay away from this kind of technology is, as Zandt said to Dr. NerdLove, like saying women shouldn’t wear short skirts. The problem is not individual women engaging with technology; the problem is a culture where abusing public information like this is permissible, if not explicitly encouraged.

“Technology becomes an easy target to place all of our hopes and fears onto,” she said. “Rather than addressing women’s public sexuality, or any of those [issues around violence] culturally, we point to technology and say ‘how terrible.'”

Another contemporary example, Zandt said, was when Tyler Clementi, a Rutgers student, committed suicide after his roommate used a webcam and Twitter to “out” Tyler. “I was being interviewed on CNN [about technology’s role in Clementi’s death],” Zandt said, “and in the pre-interview, the reporter asked, ‘Doesn’t this incident show us the dark side of technology?’ And I said, ‘No, this isn’t about the dark side of technology, this is about the dark side of humanity.’”

While the challenges that come along with apps and iChat are new, this kind of technological panic has numerous precedents. “I think about how people treated the telephone in a certain way, like it was going to stop people from visiting each other, that it would be the end of society as we know it,” she said. “When the telephone [was first made widely available], prank calls and dirty calls were all the rage in one way or another, and people didn’t tell women to stop answering the phone. But we always seem to get general instructions about what to do, and parents understood and could hold our hands through it. These newer technologies are not as culturally ingrained, so people automatically fear them.”

Technology, for this generation, is very much a given. As a broader culture, and especially for folks who have lived their whole lives with the Internet, we don’t fear new technology so much as accept it. We want to try new things, and give applications our information. This means, however, that for developers as well as consumers, issues of security, as with “Girls Around Me,” are often addressed only after an app is published, rather than before. Zandt attributes much of this to the relative homogeneity of development teams.

“In terms of personal safety, developers don’t have a breadth of diversity and equity coming to the development table to take their blinders off to see where these things could go horribly awry,” Zandt said. “When Google Buzz launched as a new feature, you had to click 80 million screens just to get to Gmail, but one of the things that it did as part of the integration package was automatically share things that you read on Google Reader with your most contacted person in your Gmail. For your normal average person and normal average experience, your most contacted people are your partner or your best friend. For one person, it was her abusive stalker ex-husband who was threatening to kill her, and the technology was passively, without her knowledge, sharing things with this person. If you look at the Google Buzz launch team, too, it is largely these young dudes, most of them white, maybe one or two women and a couple of Asian guys, coming from roughly the same kind of experiential background. That’s where, for me, a lot of the problems arise after the fact: there isn’t the diversity of experience at the table to prevent these kinds of things from happening.”

It’s dangerous, too, when developers don’t consider the experiences of those using their interfaces abroad. Zandt talked about an app called Path, which synchronizes all of a person’s social media networks into one place, making it easier to share updates or photos with as many people as possible. “While most apps scan your address book in some way, Path would upload your address book to its servers and process your contact information,” Zandt said. “It’s not just that it could be evil and marketing smarmy, but especially in countries that are using mobile technologies to fight dictatorship, they could lose their lives if the regime could get access to the features somehow.”
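One commonly discussed mitigation for the Path problem is to never ship raw contact data off the device at all: hash each entry locally and upload only the digests, so the server can still find mutual contacts without ever holding a readable address book. Here is a minimal sketch of that idea — my illustration of the general technique, not Path’s actual fix:

```python
# Sketch of client-side contact hashing: only salted digests leave the
# device, so the server (or anyone who compromises or subpoenas it)
# never sees raw phone numbers.
# Caveat: phone numbers are low-entropy, so plain salted hashing is still
# brute-forceable; real designs need stronger protections on top of this.

import hashlib

APP_SALT = b"app-wide-salt"  # hypothetical value, for illustration only

def hash_contact(phone_number: str) -> str:
    """Return a hex digest of a contact entry; deterministic so matches work."""
    digest = hashlib.sha256(APP_SALT + phone_number.encode("utf-8"))
    return digest.hexdigest()

# The client uploads hashes, never the address book itself.
address_book = ["+15550001111", "+15550002222"]
upload_payload = [hash_contact(n) for n in address_book]

# Server-side matching: two users know each other if their hash sets overlap.
other_user = {hash_contact("+15550002222"), hash_contact("+15550009999")}
mutual = set(upload_payload) & other_user
```

The design trade-off is exactly the one Zandt points to: the feature (finding friends) still works, but a regime that seizes the server gets digests instead of a ready-made list of dissidents’ contacts.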

So then, what do we do? How do we, as a culture, keep ourselves safe, while still being able to share and talk and meet up and enjoy all the spoils of the new technological frontier, which only keeps expanding?

To Zandt, much of this work is being done by advocacy groups, notably the Electronic Frontier Foundation and the ACLU of Northern California, which are “publishing great sets of guidelines on developing for privacy,” she said. “There are a bunch of people that are pressuring for certain adoption standards. Right now, for an individual to take action or hold a company accountable, all the company has to do is point to its Terms of Service.”

Because those accountability mechanisms don’t really exist right now, historically marginalized folks are told to “stay in line and do XYZ and not get hurt,” she said. “It’s important to develop standards.”

While the real issues here are around culture, and the big work must be done by larger organizations, these things don’t change overnight. For now, read up on how you can better protect your data, look into the advocacy work that’s already being done, and keep the conversation going. While “Girls Around Me” is a particularly heinous example, I’m certain it’s not the only app of its kind, and certainly not the only instance of technology being used for exploitative means. We must remain vigilant, we must remain vocal and we must remain skeptical, but we also can’t let this scare us off from engaging with social media, because that’s how the creeps win.
