Facebook’s Zuckerberg Says He’s Got Election Stuff Under Control

It’s the evening of November 3rd. Election Day 2020. The polls have closed, and in-person vote totals are being reported, but millions of mail-in ballots, which skew heavily Democratic, won’t be counted for days or weeks. Donald Trump, unsurprisingly, doesn’t care to wait for that to happen. He’s leading the in-person vote in the decisive swing states. He takes to Facebook to declare premature victory and insist that ballots stop being counted.

This hypothetical chain of events has come up a lot recently, as an unprecedented number of Americans prepare to vote by mail. The Democratic data firm Hawkfish calls it the “red mirage”—an apparent Trump landslide on election night, leading to a fight over the millions of outstanding ballots that makes Bush v. Gore look like a tea party. Which raises an important question: How will the social media platforms where so many Americans get their news respond?

On Wednesday morning, we got some answers to that question. In a blog post, Mark Zuckerberg laid out Facebook’s latest election-related policies, including its plan to deal with the possibility that a winner won’t be officially declared on Election Day. The company plans to use its new Voting Information Center “to prepare people for the possibility that it may take a while to get official results.” On Election Day, the information center will include authoritative information from Reuters and the National Election Pool. And if a candidate claims victory prematurely, Zuckerberg says Facebook will “add a label to their post educating that official results are not yet in and directing people to the official results.” (Posts that could trick people out of having their vote counted—or use Covid-19 scaremongering to deter them from voting—will be subject to removal.)

These are good ideas, in theory. The question, as with every Facebook policy announcement, is how well they will be executed. “We’ve already strengthened our enforcement against militias,” Zuckerberg’s blog post notes, less than a week after The Verge reported that Facebook failed to act on multiple user warnings about militia-related events prior to the shooting in Kenosha, Wisconsin, that left two people dead. The new policies leave similar room for uncertainty. Will a false claim of victory by a politician be clearly and decisively debunked? Or will misinformation simply be presented next to a vague link to “Get voting information”? The latter is what initially happened with Trump’s strange Wednesday post attempting to retroactively clean up his suggestion that North Carolina Republicans illegally vote twice. Facebook later updated the post with a different label that says, “Voting by mail has a long history of trustworthiness in the US and the same is predicted this year. (Source: Bipartisan Policy Center.)” That’s a shade more helpful—but the change underscores how unpredictable this policy implementation can be. The generic label remains on other posts in Trump’s feed, as well as on posts by Joe Biden that discuss election issues.

That disclaimer, meanwhile, links to Facebook’s Voting Information Center, which is at the heart of the company’s ambitious plan to register 4 million new voters. It provides lots of helpful links to things like voter registration, mail ballot applications, and—in a particularly inspired move, given the barriers to in-person voting—ways to volunteer to be a poll worker. But will all that authoritative information actually make its way to people’s eyeballs? Facebook has emphasized that the Voting Information Center will appear at the top of people’s News Feeds, but three weeks after its rollout, I still don’t see it in my feed on Facebook’s desktop site. To be fair, it does appear on mobile, which more people use, but in my experience it takes a few seconds to pop up—by which point I don’t see it, because I’ve already scrolled down far enough to where Facebook’s recommendation algorithm is suggesting new QAnon groups for me to join. (I recently joined a few for research purposes.)
