Congress grills social media companies on efforts to prevent violence

WASHINGTON (Nexstar) — A mass shooting is livestreamed on Facebook. The plan for another mass murder is posted online before the shots ring out.

In light of these recent connections between social media and violent acts, a Senate committee put Google, Facebook and Twitter in the hot seat Wednesday.

These platforms say they have become a way to communicate with loved ones during tragedies, organize events for people to gather and grieve, and raise money to support victims.

But they are also working to prevent these incidents altogether.

“We just don’t allow the violence, period,” said Monika Bickert, Facebook’s head of global policy management.

Facebook, Twitter and Google told lawmakers they have made progress toward protecting users from online hate and violence.

The companies say over the past two years they have removed millions of violent posts, suspended thousands of accounts, and hired hundreds of employees to search out hate speech that might encourage terrorism.

U.S. Sen. Roger Wicker, a Mississippi Republican, said the social media giants still have work to do. “This is a matter of serious importance to the safety and well-being of our nation’s communities.”

U.S. Sen. Rick Scott, a Florida Republican, pointed to a major failure.

“Someone with the username Nicholas Cruz had posted a comment on a YouTube video that said, ‘I am going to be a professional school shooter,’” Scott said.

Less than six months later, a gunman with the same name killed 17 people at a high school in Parkland, Florida, while Scott was governor.

He asked the representative from Google, which owns YouTube, what happened.

“We strive to be vigilant, to invest heavily, to proactively report when we see an imminent threat. I don’t have the details on the specific facts you’re describing,” said Derek Slater, Google’s global director of information policy.

But, the companies said, Parkland and other tragedies highlight why they have made it a priority to work more closely with each other and with law enforcement.

“Removing content alone will not stop those who are determined to cause harm,” said Nick Pickles, Twitter’s public policy director.

The companies and agencies now have better information sharing and technology, but they say the content often just moves to darker corners of the internet.

“The threat environment that we are in today as a country has changed and evolved in the past 24 to 36 months,” said George Selim, the Anti-Defamation League’s senior vice president of programs.

As the platforms work to keep up, members of Congress plan to ask the Department of Justice for its help, too.

Some lawmakers are hoping the department will police hate crimes in the darker corners of the web as it has done with child pornography in recent years.