Two government officials yesterday admitted what many Big Tech critics have argued for years: Lawmakers and regulators have done a bad job of managing monopolies.
David Cicilline, chairman of the House Judiciary Committee’s antitrust subcommittee, and Federal Trade Commission member Rohit Chopra spoke at a virtual event hosted by the American Economic Liberties Project, an advocacy group focused on battling monopoly power. They argued that the U.S. needs new laws to prevent companies from engaging in anticompetitive behavior, as well as stronger regulatory enforcement to crack down on companies that break the existing rules.
“We need to end the era of weak enforcement,” Chopra said on the Zoom call.
Both Congress and regulators have become increasingly concerned with Big Tech’s growing power and its history of squashing or scooping up rivals. And the battle against Big Tech has bipartisan interest, though in some cases for different reasons.
Already, the antitrust subcommittee had released a report urging sweeping regulatory changes following its 16-month investigation into Apple, Amazon, Facebook, and Google. The committee suggested breaking up the companies, prohibiting them from giving themselves or others preferential treatment, and strengthening antitrust laws and the agencies that enforce them.
Now, a couple of weeks later, the Department of Justice has filed its landmark lawsuit against Google over the company’s dominance of search and search ads. The suit likewise raises potential remedies, such as splitting up the company or prohibiting certain anticompetitive practices.
But investigations aside, Cicilline said Congress has fallen behind on its duty to create modern laws and ensure regulatory agencies have enough resources to do their jobs. And Chopra said regulators should be doing more within the confines of the current laws.
Recent enforcement actions have worked out in tech companies’ favor, Chopra added. For example, Google and YouTube agreed to pay the FTC a $170 million settlement for collecting personal information from children on YouTube without parental consent. Weighing the fine against the value of the data collected, Chopra believes YouTube ultimately made money from its actions.
Likewise, the FTC’s $5 billion settlement with Facebook last year released CEO Mark Zuckerberg and COO Sheryl Sandberg from liability for wrongdoing, Chopra said. “A settlement with a fine and some paperwork is not going to fix the problem,” he said. “It’s only going to incentivize bad behavior.”
Congress' antitrust investigation is its first in 50 years. If regulators and lawmakers want to make the sweeping changes that Cicilline and Chopra suggest, they have quite the tall task ahead.
On the latest episode of our Brainstorm podcast, we explore the impact of social media on the U.S. election. Russian interference in the 2016 election was a wake-up call for many. As Brainstorm host Michal Lev-Ram says, it was the first time the public realized social media platforms could do more harm than good in the world.
Fast forward to today. What have the platforms done to battle misinformation? How effective have their changes been? And how will all of this impact the 2020 election?
Lev-Ram and co-host Brian O’Keefe turn to three experts for answers: tech investor turned activist Roger McNamee; MIT professor Sinan Aral, author of “The Hype Machine”; and Data Sheet’s own Danielle Abril.
Listen to the episode here.
RIP Quibi. Streaming mobile video service Quibi, led by Meg Whitman and Jeffrey Katzenberg, once touted itself as an exciting new form of entertainment built on “quick bites”: shows and movies told in five- to 10-minute chapters. But the service, which raised $1.8 billion from investors including 21st Century Fox and MGM Studios, never lived up to its hype. Quibi had hired a restructuring firm to explore its options, and on Wednesday, the worst outcome came to pass: Quibi is shutting down.
TikTok tackles hate. The social media network that became popular for its teen dance challenges is now dealing with a problem familiar to some of its more mature competitors: hateful content. On Wednesday, TikTok announced that it’s expanding its hate speech policies, which already prohibit neo-Nazism and white supremacy, to include “neighboring ideologies” like white nationalism and genocide theory. The service also said it will crack down on coded language and symbols TikTokers may use to spread hate speech.
¡Qué horror! Conspiracy theories have made their way into Spanish media and are being spread to Spanish speakers across the nation on social media, The New York Times reports. That includes messages aimed at pitting Latino and Black voters against each other, using racist language to villainize people who protest police brutality. And as they do with misinformation in English, Internet services are struggling to police the harmful content.
New neighbor. Facebook is developing a new feature called Neighborhoods that will give users a private space to connect with their neighbors. Sound familiar? That’s probably because Nextdoor, which is considering filing for an initial public offering, already does that. Facebook told Bloomberg it’s testing the feature in Calgary, Canada. True to form, the social media giant aims to recreate the success of a soon-to-be rival on its own service.
Difficulty with diversity. A month after the rise of the racial justice movement, Microsoft announced that it planned to double the number of Black managers and senior leaders in the U.S. over the next five years. But in its latest diversity report, the company revealed that it still has a long way to go to reach that goal. The report shows that in the past year, Microsoft increased the representation of Latinos and Black people by only 0.3 percentage points, with both groups heavily underrepresented in management roles.
FOOD FOR THOUGHT
Racism and hate speech have been an ongoing problem for Facebook. But the issue became particularly troubling following the murder of George Floyd in Minneapolis, according to a report from the Anti-Defamation League. Derogatory anti-Black posts on Facebook quadrupled in June, from 20 per month to 80 per month, and stayed at that level until Sept. 1.
The report also revealed that, between January and September, derogatory comments accounted for 7% of posts referring to Black Americans and 9% of posts referring to Jewish Americans on Facebook pages and public groups.
“Tech companies should design their products to prevent hateful content from multiplying inside defined online spaces like Facebook groups as well as hate movements from ‘breaking out’ into public social media,” the report reads. “This is not simply a question of stricter moderation, but will likely require additional safety features and changes to existing aspects of these platforms.”
IN CASE YOU MISSED IT
Tesla beats the Street again as profits, deliveries, and margins surge By David Z. Morris
PayPal will soon offer Bitcoin and other cryptocurrency purchase options By David Z. Morris
The blockchain industry faces a moment of truth as high-profile projects go live By Jeff John Roberts
BEFORE YOU GO
Some encouraging news during an exhausting election season: Thousands of young women are stepping up to serve as poll workers after the pandemic created a shortage, according to nonprofit news outlet The 19th. Many of these young women are 16 and 17 years old, not yet old enough to vote, and they’re pulling 14- and 16-hour shifts. Talk about dedication to democracy.
Emily Ramshaw, co-founder and CEO of The 19th, tweeted the story on Wednesday, asking: “Want to get goosebumps?” Goosebumps indeed, Emily.