Shannon Bond

Shannon Bond is a business correspondent at NPR, covering technology and how Silicon Valley's biggest companies are transforming how we live, work and communicate.

Bond joined NPR in September 2019. She previously spent 11 years as a reporter and editor at the Financial Times in New York and San Francisco. At the FT, she covered subjects ranging from the media, beverage and tobacco industries to the Occupy Wall Street protests, student debt, New York City politics and emerging markets. She also co-hosted the FT's award-winning podcast, Alphachat, about business and economics.

Bond has a master's degree in journalism from Northwestern University's Medill School and a bachelor's degree in psychology and religion from Columbia University. She grew up in Washington, D.C., but is enjoying life as a transplant to the West Coast.

Last year, in the middle of the pandemic, Sinead Boucher offered $1 to buy Stuff, New Zealand's largest news publisher.

Boucher was already the company's chief executive and was worried that its Australian media owner would shut down the publisher. Things had started to look really grim: The economy had ground to a halt and advertising revenue had evaporated.

"I knew that they ... would potentially just decide to wind us up," said Boucher. "So it was just a punt."

Facebook is making changes to give users more choice over what posts they see in their news feeds, as the social media company defends itself from accusations that it fuels extremism and political polarization.

The changes, announced Wednesday, include making it easier for people to switch their feeds to a "Most Recent" mode, where the newest posts appear first, and allowing users to pick up to 30 friends or pages to prioritize. Users can now limit who can comment on their posts.

Tech workers say they have experienced more harassment based on gender, age and race or ethnicity while working remotely during the pandemic, according to a survey from a nonprofit group that advocates for diversity in Silicon Valley.

The increases were highest among women, transgender and nonbinary people, and Asian, Black, Latinx and Indigenous people.

Support for the siege on the U.S. Capitol. Bogus promises of COVID-19 cures. Baseless rumors about vaccines.

Who should be held accountable for the spread of extremism and hoaxes online?

Lina Khan, a prominent antitrust scholar who advocates for stricter regulation of Big Tech, may be about to become one of the industry's newest watchdogs.

President Biden on Monday nominated Khan to the Federal Trade Commission, an agency tasked with enforcing competition laws. She is the splashiest addition to Biden's growing roster of Big Tech critics, including fellow Columbia Law School professor Tim Wu, who announced earlier this month he would join the National Economic Council.

If there's one business that has come out ahead after a very hard year, it's Zoom.

The Silicon Valley upstart has become synonymous with video chat over the course of the pandemic. It has fulfilled our need to see and be with each other, even when we can't do that in person. And it has beaten out some of the biggest names in tech along the way.

Kelly Steckelberg, the company's chief financial officer, can pinpoint the day when everything changed: March 15, 2020.

"Almost overnight, the demand grew exponentially," she told NPR.

Facebook is failing to enforce its own rules against falsehoods about COVID-19, vaccines, election fraud and conspiracy theories when it comes to posts in Spanish, according to a coalition of advocacy groups.

"There is a gap, quite an enormous gap, in fact, in English and Spanish-language content moderation," Jessica González, co-CEO of the advocacy group Free Press, told NPR.

Instagram recommended false claims about COVID-19, vaccines and the 2020 U.S. election to people who appeared interested in related topics, according to a new report from a group that tracks online misinformation.

"The Instagram algorithm is driving people further and further into their own realities, but also splitting those realities apart so that some people are getting no misinformation whatsoever and some people are being driven more and more misinformation," said Imran Ahmed, CEO of the Center for Countering Digital Hate, which conducted the study.

There's a saying in Silicon Valley: Solve your own problems. Tracy Chou didn't have to look further than her social media feeds to see those problems.

"I've experienced a pretty wide range of harassment," she said. "Everything from the casual mansplaining-reply guys to really targeted, persistent harassment and stalking and explicit threats that have led me to have to go to the police and file reports."

On Feb. 1, the editor of an award-winning Indian magazine got a call from his social media manager: The magazine's Twitter account was down.

"I said, 'Are you sure? Can you just refresh, and check again?' " recalled Vinod K. Jose, executive editor of The Caravan, which covers politics and culture. "But she said, 'No, no, it's real.' "

Copyright 2021 NPR. To see more, visit https://www.npr.org.

AILSA CHANG, HOST:

All right. Well, for more on this dilemma facing Twitter in India, we're going to turn now to NPR tech correspondent Shannon Bond.

Hey, Shannon.

SHANNON BOND, BYLINE: Hey, Ailsa.

Twitter users aren't known for staying quiet when they see something that's flat-out wrong, or with which they disagree. So why not harness that energy to solve one of the most vexing problems on social media: misinformation?

With a new pilot program called Birdwatch, Twitter is hoping to crowdsource the fact-checking process, eventually expanding it to all 192 million daily users.

"I think ultimately over time, [misleading information] is a problem best solved by the people using Twitter itself," CEO Jack Dorsey said on a quarterly investor call on Tuesday.

Facebook is expanding its ban on vaccine misinformation and highlighting official information about how and where to get COVID-19 vaccines as governments race to get more people vaccinated.

"Health officials and health authorities are in the early stages of trying to vaccinate the world against COVID-19, and experts agree that rolling this out successfully is going to be helping build confidence in vaccines," said Kang-Xing Jin, Facebook's head of health.

January brought a one-two punch that should have knocked out the fantastical, false QAnon conspiracy theory.

After the Jan. 6 attack on the U.S. Capitol, the social media platforms that had long allowed the falsehoods to spread like wildfire — namely Twitter, Facebook and YouTube — got more aggressive in cracking down on accounts promoting QAnon.

Facebook's oversight board on Thursday directed the company to restore several posts that the social network had removed for breaking its rules on hate speech, harmful misinformation and other matters.

The decisions are the first rulings from the board, which Facebook created last year as a kind of supreme court to cast the final votes on the hardest calls about what the company does and does not allow users to post.

The alternative social network MeWe had 12 million users at the end of 2020. Barely three weeks into 2021 — and two since a right-wing mob attacked the U.S. Capitol — the company says it has now passed 16 million.

CEO Mark Weinstein says this popularity is a testament to the reason he launched MeWe in 2016 as an alternative to Facebook. MeWe markets itself as privacy-forward: it doesn't harness users' data to sell ads or decide what content to show them.

Two weeks ago, Facebook indefinitely suspended former President Donald Trump from its social network and Instagram, after a mob of his supporters stormed the U.S. Capitol. CEO Mark Zuckerberg said the risks of allowing Trump to keep using the social network were "too great."

Now, Facebook wants its newly formed independent oversight board to weigh in and decide whether it should reinstate Trump.

Willy Solis never saw himself as an activist.

"I'm an introvert, extreme introvert," he said. "That's my nature."

But 2020 changed that — like so many other things.

Los Angeles County Supervisor Sheila Kuehl's district sweeps from the beaches of Santa Monica to the San Fernando Valley. Among the two million people she represents are Latino communities hit especially hard by the coronavirus pandemic.

"Many essential workers, many market and pharmacy and food service and restaurant and hotel workers and a lot of health care workers," she said. "So a lot of people just had to go to work."

When you search on Google, do you get the best results? Or the results that are best for Google?

That question is at the heart of the latest lawsuit to challenge the tech giant's dominance over Internet search and advertising.

On Thursday, a bipartisan group of 38 attorneys general hit Google with the company's third antitrust complaint in less than two months, zeroing in on its role as "the gateway to the Internet."

This week, the Federal Trade Commission and 48 attorneys general unveiled blockbuster lawsuits accusing Facebook of crushing competition and calling for the tech giant to be broken up.

The twin complaints together run to nearly 200 pages documenting how Facebook became so powerful — and how, according to the government, it broke the law along the way.

Kolina Koltai first heard about the coronavirus back in January, but not from newspapers or TV. Instead, she read about it in anti-vaccination groups on Facebook.

"They were posting stories from China like, 'Hey, here's this mysterious illness,' or 'Here's this something that seems to be spreading,'" she said.

The Federal Trade Commission and 48 attorneys general across the nation filed much-anticipated lawsuits against Facebook on Wednesday, accusing the social media giant of gobbling up competitive threats in a way that has entrenched its popular apps so deeply into the lives of billions of people that rivals can no longer put up a fight.

Facebook is banning claims about COVID-19 vaccines that have been debunked by public health experts, as governments prepare to roll out the first vaccinations against the virus.

That includes posts that make false claims about how safe and effective the vaccines are, and about their ingredients and side effects.

Google illegally fired two employees involved in labor organizing last year, the National Labor Relations Board alleged in a complaint on Wednesday.

The tech giant also violated federal labor law, the agency said, by surveilling employees who viewed a union organizing presentation, interrogating others, unfairly enforcing some rules and maintaining policies that "discourage" workers from protected organizing activities.

Facebook users saw hate speech about once in every 1,000 pieces of content they viewed on the social network between July and September, the company said on Thursday.

This is the first time Facebook has publicly estimated the prevalence of hate speech on its platform, giving a sense of the scale of the problem. It published the new metric as part of its quarterly report on how much content it removed from Facebook and Instagram for breaking rules ranging from violence to child exploitation to suicide and self-harm.

More than 200 Facebook workers say the social media company is making content moderators return to the office during the pandemic because the company's attempt to rely more heavily on automated systems has "failed."

Twitter said on Thursday that it would maintain some of the changes it had made to slow down the spread of election misinformation, saying they were working as intended.

Before Election Day, Twitter, Facebook and other social networks had announced a cascade of measures billed as protecting the integrity of the voting process.

Last week, millions of Americans turned to cable news to watch election returns pour in. Some refreshed their Twitter feeds to get the latest tallies. And nearly 300,000 others kept an eye on the YouTube channel of 29-year-old Millie Weaver, a former correspondent for the conspiracy theory website Infowars, who offered right-wing analysis to her followers in a live-stream that carried on for almost seven hours the day after the election.

At times, her pro-Trump commentary veered into something else: misinformation.

Facebook removed a group filled with false claims about voter fraud and calls for real-world protests over vote counting that had gained more than 360,000 members since it was created on Wednesday.
