PHOTO ILLUSTRATION BY Clare Jensen


The Capital I Internet

It’s fashionable (and often accurate!) to approach the Capital I Internet as a toxic, terrible place where misplaced good intentions were commandeered by capitalists, trolls, politicians and foreign actors who not only failed to democratize these spaces but used them to push propaganda and further destabilize society. Deep breaths. Those things do exist, to be sure. But today we wanted to give you a different perspective by sharing our conversation with author Joanne McNeil.

McNeil writes about technology, art and the intersection of the two, among other things. One of the first times she crossed our desks was in 2015, for a New York Times op-ed about the proliferation of feminized virtual assistants (à la Alexa and Siri). By then, the female virtual assistant had been so normalized that McNeil’s observations felt simultaneously revelatory and already internalized as fact – a hard feat to accomplish in cultural critique. In her work, McNeil often interrogates the human implications (or motivations) of technology. It’s a POV that stands out in a category of journalism dominated by the same demographic that dominates the industry itself (men).

Earlier this year, McNeil published a book titled Lurking: How a Person Became a User. The title suggests this human approach to the subject, and indeed, reviews call it “a people’s history of the Internet.” What unfolds is an investigation – neither naive nor cynical – of our digital spaces: where we’ve been, and where we should go if we want to make them safer, more inclusive places.

We talked to Joanne during the first week of June 2020. In many ways the past few weeks have felt a lot like that week, and in some ways it feels like ages ago. We hope you get as much out of this as we did.


*Lightly edited for length and clarity.

 
PHOTO COURTESY OF Joanne McNeil 



WRITING ABOUT TECH

It's always hard for me to point to where it got started. The writing I'm doing now started about the time I started a blog, which was in the late aughts, and I got the attention of a few editors in New York. I moved to New York and became an editor at an art and tech focused publication. I always had one foot in culture and one foot in the tech industry. Blogging was something that felt like you could move from media back to tech and be part of all these conversations at once. And that has a lot to do with the size of the community, which was much smaller then. 

When I look back ten, fifteen years, the sense of approaching tech from a cultural angle was somewhat unusual. The interest wasn’t really there. Tech was sometimes covered as its own subculture, as opposed to part of general culture. And generally speaking, the resources weren’t there. There weren’t a lot of tech reporters. These companies exploded in those years. And the number of people actually chronicling the transition from small startups to enormous, powerful conglomerates with influence beyond what companies have ever had … there are a lot of great accounts, but for a transition of this nature, not nearly enough.

WHY AI HAS A GENDER PROBLEM

Having all of these examples of AI personalities represented as women felt, at that time, so reflective of the teams that were developing them, because even the small startups creating these apps were almost all white men. There are plenty of other examples in technology, like cameras that have not been able to represent Black people. We risk seeing more androids or female AI than actual women in the room.

“WOMEN IN TECH”

Looking back at 2010, there might have been one woman working at these tech companies, or two or three. As these companies have grown, the percentage may be as bad as it was, but there are still more women working in tech. The question is who has more power and authority and respect in a company: the engineering team, or the customer service side, or even UX – the groups that are more diverse and include more women? A lot of the time that work involves more ‘care work,’ and that really hasn’t changed much in the last ten years.

A book I talk about a lot, because it’s very representative of the lack of change, is The Boy Kings by Kate Losse. It’s a memoir of her time at Facebook, published in 2012. The date is crucial to know: she was a woman in a role working directly with the company’s users, work that should be very well regarded (as it should be for a social media platform). But that work just did not have the respect that the engineering teams had, and in addition to that, the prevalence of sexual harassment – a lot of that has not changed. It’s just that the companies have grown bigger, and there are more women who are subject to this discrimination and culture.

Awareness of these problems goes both ways. You have a lot of companies that now understand the optics. We have seen a lot of that with corporate responses to the George Floyd killing: carefully worded statements that are more likely to address race in general as opposed to police brutality against Black people in America, and it’s not quite the same. Even the most beautifully worded statement is not the same as having a company culture where people of color can thrive. Also, hiring a bunch of young people now doesn’t really change the chain of command at a company. And then, of course, this summer, we’ve seen a lot of Silicon Valley layoffs. Because junior staff is more diverse than senior staff, we’re seeing a lot of people of color getting laid off after all of the work that was done to make the industry more inclusive – work that has been so slow going precisely because the industry is so resistant to change.

 
 

“We risk seeing more androids or female AI than actual women in the room.”

Joanne McNeil

 
 


THE EXCLUSIVE IDEALS OF UTOPIA AND MERITOCRACY IN TECH

I have to say, with the World Wide Web, which is where I start with this book, there is something utopian built into it. And Tim Berners-Lee is one of the good progressive thinkers, comparatively speaking, in the tech industry. But if you look back at the nineties, there were a number of companies founded by people of color and built as communities of color. I want to say that in every single story I came across, the issue was that they could not find funding. You could find funding to create a channel for a Black community on an existing service, but if you were trying to start your own company, the investors would be very hesitant to trust a founder with that responsibility. In this book, I am always very careful to let people know that the Internet has never been a utopia, but there have also been many missed opportunities to make it better. And all of the things that we do love about the Internet could be much more immediate, and could thrive through decentralized tech like the web and email, if development had gone a few different ways.

WHAT DO WE DO NOW? 

I’ll start with what we can do. It’s really important to think: ‘do I need to broadcast something, or do I need this for my community?’ I’m sometimes very cynical about the actor-broadcaster, especially on Twitter. But I saw a bunch of people matching donations to the Minnesota Freedom Fund, providing bail for Minneapolis protesters. That is incredible. And that is something that happened because people used the retweet function. I would say that 99% of the time, that function is very likely to create a very toxic experience online. But users took advantage of this element of amplification and brought a windfall to the bail funds.

If you’re using the Internet to communicate or create a community and have that experience, there are tools that work, like creating a private Slack channel or group DMs, and letting people find the group easily so that it’s not like a secret society. People do figure it out. There are a lot of ways to meet people on Facebook and Twitter and then send them a DM and say, ‘we have this private mailing list, do you want to join our community?’ Then these communities are based on the communication itself, as opposed to the amplification and the virality of what you’re doing. Then you can have a real, honest conversation.

What the tech companies should be doing is thinking smaller. I don’t see community. I don’t see people from communities representing people on the community level. At the tech companies from the nineties, the founders were from these communities and were active in them. They would have meetups; they would be accountable on a social level as well as on a corporate level. These were their friends. They had to mediate disputes. They had something at stake.

But Mark Zuckerberg is so alienated from two billion users. He doesn’t have people on the level of the communities that he takes credit for. Having that representation – advocates for these users, and respect for these users – is really crucial for building a better Internet. I don’t know how these companies make money doing this, but I do think we can all agree that if we made the Internet better, it would benefit all of us as a society.

IMAGE COURTESY OF MCD/Joanne McNeil



HARASSMENT, ANONYMITY, AND EVERYTHING THAT COMES WITH IT

Harassment was not taken seriously as an issue to build around until far too late. We had to go through years of targeting and dog-piling. When you look back at some stories – like that of Kathy Sierra, or the Black women who were impersonated just before Gamergate, when users on 4chan created a fake feminist hashtag called End Father’s Day while impersonating women of color – these were all incredibly damaging experiences that could have been addressed when they were small. But then all of a sudden, this became the trade-off to even being on Twitter: constantly subjecting yourself to harassment of this nature. And it also speaks to how these groups – women, people of color, trans people – were very marginalized in tech. They’re still marginalized now, but they were incredibly marginalized in the tech community in those years. The harassment was peaking along with the sense that they weren’t the tech community. That you’re not real tech if this is happening to you. That someone with the seniority and respect of Kathy Sierra could endure this harassment really shows how tech uses harassment to further marginalize people.

Early on in the discussion of harassment, anonymity was always pointed to as the cause. You had little accounts that might have ten followers (remember the egg profile photos, anyone?), but now we see people, including our president, who say incredibly hostile and toxic things with their name attached. There are enough people who are public figures for their toxicity that the question of anonymity has become more complex.


Anonymity gives people the freedom to speak up, be a whistleblower, or expand elements of their identity that they might feel some sort of discomfort with at the moment. The platform that intrigues me the most – probably because I don’t really use it, only to lurk – is Reddit, because as hostile as it is, that’s where Gamergate really took hold. There are also these platforms where you’ll see people who work at Amazon warehouses giving other workers resources and lists of tactics, or flagging if they see Covid-19 at the workplace. Without that slight bit of anonymity, it could be quite harmful for people who cannot speak about certain elements of their experience without repercussion.

WHERE DO WE GO FROM HERE?
TAKE CARE, THINK SMALL. A LESSON FROM LIBRARIANS. 

The thing that makes the Internet unusual is that you have this receipt of your experience. You have the photos in your inbox. You have all of this content that you’re creating while you’re just having an experience. And it’s not the same as having a box of letters under your bed that you have control over. These companies – your email provider, for example – and their control of that data are part of the equation.

That’s why, at the end of the book, I wanted to direct people’s attention to what’s happening and to the work that librarians are doing at libraries, because I feel like they have far more expertise about this than I do – about what could be done with the information, and the sense that some of it is very historically important.

Taking care of the experience like you would take care of the content is something that I’m seeing happen on a group level. Like a shared newsletter, where everyone is a co-author. I’m seeing people have these conversations: ‘how many can we realistically put out? Do we archive them?’ Or in Slack channels: ‘can I post a screenshot to Twitter?’ There is this level of trust that comes through when you have these conversations about how private we want to be, how permanent we want to be, what our goals are as a community or publication. That feels like real progress. I don’t think these conversations were happening as much five years ago at all.

It’s also about finding control where you can have control. If you have a group chat or DM on Twitter, you can decide to delete that. It’s a small thing, but it's something you have control of as a user. These are certainly imperfect platforms, many of them toxic, but we could go into them with the perspective of ‘what can I do with this?’ And if ultimately, the goal is to keep in touch with people, you have plenty of options and it just becomes about finding the one that’s right for you. 

 
IMAGE CREDIT Lift conference


WHAT SHE'S READING

Carl Sagan's Contact, with Marie NDiaye's Ladivine, Rita Indiana's Tentacle, and Isabel Waidner's We Are Made of Diamond Stuff all up next.

FEMALE/BIPOC TECH WRITERS WE SHOULD BE READING 

Julia Carrie Wong, Susie Cagle, Kashmir Hill, and Nitasha Tiku are all phenomenal reporters. Three books I'd recommend to anyone working in the tech industry are Charlton D. McIlwain's Black Software, Alice Marwick's Status Update, and Katherine Losse's The Boy Kings. Also, there is a wonderful new podcast called Tech Won't Save Us, which is a good place to find out about the work of various tech critics.

A WOMAN WHO INSPIRES HER 

The artist Lynn Hershman Leeson, for her prescience and resilience. I recently wrote an essay for Filmmaker magazine further explaining why.