Does the Tech Industry Want Federal Regulations for Facebook?

A whistleblower and a trove of leaked documents have raised questions about Facebook’s business practices, including its alleged prioritization of engagement over user safety. Pundits and analysts are questioning whether the social-networking giant should be broken up or regulated in some way.

According to Blind, which surveys anonymous technologists about a range of issues, some 45 percent of Facebook employees also think that regulation would be a good thing. However, the sample size of 482 “verified” Facebook employees is small, considering the company employs around 60,000 people.

Although the tech industry is often fearful of regulation, it seems that employees at other tech companies also don’t mind if the federal government hits Facebook with some new rules. “Among all professionals surveyed by Blind, five out of seven (72 percent) answered affirmatively to the question: ‘Should the federal government impose regulations on Facebook?’” read a note accompanying Blind’s data. “Professionals at Facebook’s rival social media platforms, including LinkedIn, Pinterest, Snap, Twitter, and TikTok-developer ByteDance were among the most enthusiastic for new rules for the $900-billion Internet giant.”

If you can’t beat ‘em, get the Feds to regulate ‘em, in other words. But Facebook’s current difficulties (which include a spectacular system crash in early October and fines for alleged discrimination against U.S. workers) might only multiply if its valuable technologists decide they don’t want to work for the company anymore. While Facebook pays its engineers, data scientists, and other professionals quite a bit, money is only part of the equation when it comes to retention; people want to feel good about their employer.

In order to keep its talent in place, Facebook’s managers (and Facebook CEO Mark Zuckerberg himself) will need to convince the workforce that the company ultimately has its users’ best interests at heart. Senior leadership must also show it has a plan to tackle Facebook’s external crises—and that probably goes beyond a name change.

One Response to “Does the Tech Industry Want Federal Regulations for Facebook?”

  1. Terry Webb

    I want to say a lot of the extremism occurs because people can’t think of a test where they could be wrong. That form of extremism is endemic to human thinking. It can cause situations where people think that violence and shutting up the opposition are the only way to proceed.

    Facebook’s echo-chamber methodology to drive user engagement actually magnifies this problem (as Frances Haugen testified). But the same thing happens with liberal media outlets, which have shut off public comments—and those comments were often a mixed bag of conservative and liberal viewpoints. Conservatives have been actively shamed at many companies (e.g., Facebook, Twitter, Google). So conservatives cannot comment in call-ins; we have to comment on anonymous written boards, ideally in rebuttal to biased or flawed reporting. Hey, I don’t want the violence of ANTIFA to visit me at work. It kind of did at one point in an office meeting where I was pressured to support Hillary Clinton; I had to abstain.

    What Facebook is doing is no different from what many organizations do (for example, NPR and KQED-FORUM): they are controlling the message for the sake of the almighty dollar.

    Now, one common thing scientists do is subject their test results, procedures, and conclusions to review by their peers. They do this because they are seeking help, which can come as a confirming review, a non-confirming review, or some other kind (one that might suggest tests to disprove their conclusions). I would say CNN, NPR, KQED-FORUM, and Facebook should actually be seeking public review of their reports and distribution procedures, because that is the most frontal-cortex way to find potential tests that might disprove your conclusions (for example, conclusions about the H-1B program, among many others). But they don’t.

    Most of the conservative news services still allow public comments.

    Facebook is feeling ashamed, almost as if they bit the apple (I hope you see the pun here) and realized their lack of accoutrement. And these documents—hey, they are the apple.

    We see this with other sites as well (NPR and KQED, for example). NPR stopped allowing comments on its articles. KQED stopped allowing written comments on its Forum page. This controlling of the message was done to increase the loyalty of their paying base (the Eric Schmidt foundation, for example).

    The document trove points to a profit only motive by Facebook. Is that bad?

    It’s bad in an environment where people can’t think critically about what is thrown at them. And they can’t think critically under the following conditions:

    – When the situation/environment makes it impossible to see the truth. Facebook is not transparent about how it steers engaging (read: bias-confirming) stories at users. Fox and CNN do the exact same thing, but with them it is obvious. Political parties do the exact same thing. What makes Facebook so unusual is that this was never made obvious to users, and there appears to be no way to opt out of it on Facebook.

    – Facebook lies. They tried, initially, to discredit Haugen and her testimony. This reminds me a lot of when the Clinton administration threatened Monica Lewinsky with libel (she had admitted the affair with Clinton under oath)—until it was disclosed that there was a stained dress with President Clinton’s DNA on it. Then we got that sheepish apology from Clinton. A similar thing is happening to Facebook, and the way they are handling it shows something about the company: it will play dirty until it is shocked into realizing it can’t get away with that tactic.

    – Facebook will do anything to protect Mark Zuckerberg from having to take the witness stand, where there is a chance he might either have to perjure himself or ruin his reputation by actually telling the truth. Literally, Facebook paid 5 billion dollars to the FTC to keep Mark Zuckerberg off the witness stand. Well, now that the documents are out, and many contradict Mark’s stated actions and motives, he may as well start telling the truth. He can never be President, but he might at least help the situation instead of continuing to hurt it. He can be like Eric Schmidt and pull strings from behind the stage.

    Social media only needs regulation in the following areas:
    – User data must be kept private unless the user opts out of their privacy rights, and the government should strictly define those privacy rights.
    – Minors need to be confined to a kiddie version of social media, or kept off entirely if no such version is available.
    – Companies need to be transparent about what news they are filtering and what they are not. Those that enjoy Section 230 protections shouldn’t be able to bar people from their sites unless those people are actively sponsoring death, violence, or other illegal activity (on or off the site). The government can be the final arbiter on these things; in the U.S., it should require a court order, so that people’s rights are protected.
    – If a company (or any other organization) opts out of Section 230, who they put on their site and who they kick off is their own business. In such cases, journalistic integrity guidelines should be enough to protect them.