Uncanny Valley: Why Facebook's Art Patronage Deserves More Ethical Scrutiny Than You Think
This week, extending the debate about ethical arts funding into Silicon Valley…
TALENT BOOKING
On Thursday, my artnet News colleague Taylor Dafoe reported that Facebook is expanding its support of the arts via two high-profile hires and a high volume of upcoming commissions. But given the ongoing intellectual melee over whose patronage is too problematic for artists and institutions to accept today, it’s time to take a longer look at Facebook’s complications than we in the art media generally have so far.
First, though, the recap: Dafoe relayed that Facebook hired Tina Vaz, formerly the deputy director for global communications at the Solomon R. Guggenheim Foundation, to lead its artist-in-residence initiative and its Analog Research Lab, an in-house workshop where staffers can create their own prints and posters. Jennie Lamensdorf, formerly head of art-in-buildings for real-estate firm Time Equities, will work to implement Vaz’s vision in the Bay Area specifically.
The pair will be responsible for growing Facebook’s already-500-strong portfolio of artworks commissioned for its offices and surrounding communities. The social media titan aims to commission another 200 pieces by year’s end. Vaz and Lamensdorf will accomplish this with the help of an existing team of 25 curators and administrators at the artist-in-residence program and the Analog Research Lab.
In his piece, Dafoe rightly implies that Facebook’s new announcements could be read as an attempt to art-wash, or at least distract from, the firm’s most widely registered black eye:
The hires come at a time when Facebook is trying to fix its public image as it faces increasing scrutiny for its role in the spread of misinformation. This week, Nancy Pelosi, the speaker of the US House of Representatives, again placed blame on Facebook for being “exploited by the Russians” in the 2016 presidential election.
However, I’d argue that Facebook’s effect on the 2016 US presidential election, whether through Cambridge Analytica’s data-driven voter profiling or activity by the Kremlin’s Internet Research Agency, was both less meaningful and less distressing than other issues in Mark Zuckerberg’s empire that have gotten far less traction with the general populace.
I say this because both independent analysts and the mathematician actually hired to build Cambridge Analytica’s “personality algorithms” from Facebook data have confirmed that those algorithms were functionally useless to the Donald Trump campaign (and other far-right clients). I’m also in the minority camp (albeit one that includes some well-regarded experts on tech and Russia) that believes the Kremlin’s disinformation efforts on Facebook had far less impact on the election than, say, homegrown disinformation on network TV.
If we push past those controversies, what other ethical red flags have been raised by who Facebook is and what it does? Let me offer three sins of commission to consider, all discussed in a recent WIRED feature memorably titled “15 Months of Fresh Hell Inside Facebook.”
FACE FACTS
First, in the fallout from the Cambridge Analytica scandal, Facebook hired a right-leaning opposition-research firm called Definers Public Affairs to push out dubiously sourced negative coverage on its biggest critics, including seeding a conspiracy theory that some of said critics were funded by liberal financier and Holocaust survivor George Soros.
For the uninitiated, nationalist politicians and outright hate groups have used Soros’s name and fictive involvement as an anti-Semitic dog whistle for years. And given that, according to the November 2018 New York Times feature that exposed its work for Facebook, Definers’ in-house news network frequently has its coverage picked up by far-right outlets like Breitbart, it’s difficult not to read something sinister into the consultancy’s choice to pursue the Soros angle—and Facebook’s apparent willingness to run with it for a bit.
(Facebook terminated its relationship with Definers after it was made public by the Times. Since Facebook did not offer an explanation for the move, it remains possible that COO Sheryl Sandberg simply felt Definers didn’t lean in hard enough.)
Second, committed liberals should probably know that Joel Kaplan, Facebook’s vice president of global public policy, traveled to Washington to be present for close friend and then-embattled Supreme Court nominee Brett Kavanaugh’s Senate hearing, centered on allegations of sexual assault brought by multiple women. Kaplan also threw a lavish party celebrating Kavanaugh’s confirmation the day after it became official. Despite Facebook’s later public statements in support of women’s rights and safety, Kaplan’s personal relationships give ethical hard-liners enough evidence to argue that patronage from Facebook is patronage from the patriarchy’s worst.
Third and most troubling of all, Facebook and its products—particularly the messaging platform WhatsApp, which it acquired in 2014—have been linked to targeted violence in at least five different countries around the world. And “targeted” does not necessarily mean “small-scale,” as two deeply reported (but not necessarily widely known) examples prove.
In September 2018, BuzzFeed News published an exposé accusing Philippine strongman Rodrigo Duterte and his administration of disseminating propaganda used as evidence to jail a prominent political opponent and distort public opinion about the “extrajudicial executions of more than 12,000 Filipinos suspected of selling or using drugs in the country.” A month later, the New York Times reported that hundreds of military personnel in Myanmar had been using Facebook for at least five years to spread lies and directly enable what the United Nations labeled “a textbook example of ethnic cleansing” against the country’s Rohingya Muslim minority. (Although Facebook eventually took action to try to combat these atrocities, critics have characterized those actions as woefully insufficient and late-arriving.)
I don’t want to downplay either the necessity or the complexity of the debate around ethical arts funding. But my point is this: anyone committed to vetting patrons’ affiliations and income sources should not, if they want to be consistent, stop with physical weaponry or deadly opioids. They should also turn their moral microscopes toward Facebook and the rest of Big Tech. Because in 2019 and beyond, the business of information may have an even greater effect on our lives than the businesses of war and drugs.
That’s all for this week. ‘Til next time, remember: Whether offline or online, the path to wealth—and patronage—isn’t normally paved by good deeds.