DP Etiquette

First rule: Don't be a jackass.

Other rules: Do not attack or insult people you disagree with. Engage with facts, logic and beliefs. Out of respect for others, please provide sources for the facts and truths you rely on if asked. If emotion is getting out of hand, get it back in hand. To limit dehumanizing people, don't call people or whole groups disrespectful names, e.g., stupid, dumb or liar. Insulting people is counterproductive to rational discussion; insults make people angry and defensive. All points of view are welcome, right, center, left and elsewhere. Just disagree, but don't be belligerent or reject inconvenient facts, truths or defensible reasoning.

Friday, October 29, 2021

The business of business is profit, not defending democracy, truth or anything else

Facebook: Clean-up on Aisle 3!
In other words, Houston, we've got a problem!



The business model - mining for minds
A whistleblower at Facebook, Frances Haugen, recently released the Facebook Papers.[1] Those internal company documents show that, before the 1/6 coup attempt, Facebook's algorithms were deliberately set to foment anger and discord because that was the most profitable thing to do. When a social media platform like Facebook carries content that makes people angry or otherwise emotionally whipped up, they spend more time on the platform. More time on the platform makes those people more valuable products to sell to advertisers, which increases the platform's profits.

Users of social media are the product the social media companies sell to advertisers. Specifically, what is being sold is their eyeballs on cell phone and computer screens. The more eyeballs the company can trap onto screens, and the longer it can trap them, the more money the company makes.[2] That is the business model, and it is smashingly successful. So smashing, in fact, that it arguably is a major factor helping to smash American democracy into some form of corrupt authoritarianism or fascism.
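To make the mechanism concrete, here is a minimal, purely hypothetical sketch of an engagement-ranked feed. This is not Facebook's actual code; the post fields, weights and scoring are invented for illustration. The point it shows is simple: if every reaction that predicts more time on the platform counts as engagement, including anger, then inflammatory posts rise to the top of the feed.

```python
# Hypothetical toy feed ranker -- NOT Facebook's actual code.
# All fields and weights below are invented for illustration.
from dataclasses import dataclass


@dataclass
class Post:
    text: str
    likes: int
    comments: int
    shares: int
    angry_reactions: int


def engagement_score(post: Post) -> float:
    """Score a post by weighted engagement signals.

    Any signal that predicts more time on the platform raises the
    score, and anger counts as engagement just like a 'like' does.
    """
    return (1.0 * post.likes
            + 3.0 * post.comments          # arguments keep users posting
            + 5.0 * post.shares            # shares spread the post further
            + 4.0 * post.angry_reactions)  # anger is engagement too


def rank_feed(posts: list[Post]) -> list[Post]:
    """Order a feed so the highest-engagement posts appear first."""
    return sorted(posts, key=engagement_score, reverse=True)


feed = rank_feed([
    Post("Cute cat photo", likes=120, comments=4, shares=2, angry_reactions=0),
    Post("Outrageous political claim", likes=40, comments=90,
         shares=60, angry_reactions=200),
])
for post in feed:
    print(f"{engagement_score(post):>7.1f}  {post.text}")
```

Run on those two sample posts, the outrage post outscores the cat photo roughly tenfold, so it is what the feed serves first. That, in toy form, is the dynamic the Facebook Papers describe.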

The New York Times reported:

Company documents show that the social network’s employees repeatedly raised red flags about the spread of misinformation and conspiracies before and after the contested November vote.

Sixteen months before last November’s presidential election, a researcher at Facebook described an alarming development. She was getting content about the conspiracy theory QAnon within a week of opening an experimental account, she wrote in an internal report.

On Nov. 5, two days after the election, another Facebook employee posted a message alerting colleagues that comments with “combustible election misinformation” were visible below many posts.

Four days after that, a company data scientist wrote in a note to his co-workers that 10 percent of all U.S. views of political material — a startlingly high figure — were of posts that alleged the vote was fraudulent.

In each case, Facebook’s employees sounded an alarm about misinformation and inflammatory content on the platform and urged action — but the company failed or struggled to address the issues. The internal dispatches were among a set of Facebook documents obtained by The New York Times that give new insight into what happened inside the social network before and after the November election, when the company was caught flat-footed as users weaponized its platform to spread lies about the vote.

Facebook has publicly blamed the proliferation of election falsehoods on former President Donald J. Trump and other social platforms. In mid-January, Sheryl Sandberg, Facebook’s chief operating officer, said the Jan. 6 riot at the Capitol was “largely organized on platforms that don’t have our abilities to stop hate.” Mark Zuckerberg, Facebook’s chief executive, told lawmakers in March that the company “did our part to secure the integrity of our election.”

But the company documents show the degree to which Facebook knew of extremist movements and groups on its site that were trying to polarize American voters before the election. The documents also give new detail on how aware company researchers were after the election of the flow of misinformation that posited votes had been manipulated against Mr. Trump.
The NYT article goes on to point out that Facebook’s employees believed the social network could have done more. Enforcement against Facebook groups arguing that the 2020 election was stolen was piecemeal rather than coordinated, and those lies were not stopped. Regarding QAnon, Facebook employees had warned for years about its potential to radicalize users, so Facebook cannot honestly argue it was unaware of what the radical right was doing. Facebook's algorithms steered a test account an employee had set up toward QAnon because the fake person behind the account, Carol Smith, presented herself as a conservative mom who followed sources of radical right propaganda and lies, Fox News and Sinclair Broadcasting.

That test was part of an internal Facebook research project called “Carol’s Journey to QAnon.” A key QAnon crackpot conspiracy held that the ex-president was valiantly opposing a shadowy cabal of Democratic pedophiles. According to a Facebook researcher, Carol Smith’s account feed devolved within three weeks into “a constant flow of misleading, polarizing and low-quality content.” The same thing happened with fake accounts set up to look like liberals. Facebook's algorithms were set to emotionally whip people up and polarize them to increase the time their minds stayed trapped on Facebook.
 

Questions: 
1. Should Facebook sue the whistleblowers who have leaked internal company documents?

2. From a free speech point of view, would it be a relatively non-toxic policy to assess corporate taxes, at least in part, at higher rates on revenues or profits that come from or are associated with objectively false content?


Footnotes: 
1. According to the AP, “the Facebook Papers represents a unique collaboration between 17 American news organizations, including The Associated Press.” AP writes:
Facebook the company is losing control of Facebook the product — not to mention the last shreds of its carefully crafted, decade-old image as a benevolent company just wanting to connect the world.

Thousands of pages of internal documents provided to Congress by a former employee depict an internally conflicted company where data on the harms it causes is abundant, but solutions, much less the will to act on them, are halting at best.

The crisis exposed by the documents shows how Facebook, despite its regularly avowed good intentions, appears to have slow-walked or sidelined efforts to address real harms the social network has magnified and sometimes created. They reveal numerous instances where researchers and rank-and-file workers uncovered deep-seated problems that the company then overlooked or ignored.  
“At the heart of these stories is a premise which is false. Yes, we’re a business and we make profit, but the idea that we do so at the expense of people’s safety or wellbeing misunderstands where our own commercial interests lie,” Facebook said in a prepared statement Friday. “The truth is we’ve invested $13 billion and have over 40,000 people to do one job: keep people safe on Facebook.”

Statements like these are the latest sign that Facebook has gotten into what Sophie Zhang, a former Facebook data scientist, described as a “siege mentality” at the company. Zhang last year accused the social network of ignoring fake accounts used to undermine foreign elections. With more whistleblowers — notably Haugen — coming forward, it’s only gotten worse.  
“Facebook has been going through a bit of an authoritarian narrative spiral, where it becomes less responsive to employee criticism, to internal dissent and in some cases cracks down upon it,” said Zhang, who was fired from Facebook in the fall of 2020. “And this leads to more internal dissent.”
No wonder Facebook is changing its name to Meta (what a 😜 stupid name). Companies in serious public relations trouble do that all the time. Cigarette companies, financial firms, and most everyone else in public relations hot water change their names to hide their sleaze, corruption and/or crimes. It is an effective way of laundering bad corporate behavior, and the personal and social damage it caused, out of the public memory.

2. That is like eyeballs on television screens or ears listening to radio. The more viewers or listeners a TV or radio broadcast has, the more money advertisers are willing to pay for their ads on those platforms. Advertisers want and pay for your valuable mental attention, and mental attention comes in via eyeballs and ears. Another analogy is casinos: the longer the average person stays and plays, the more money they will lose. That fact is inherent in the games' odds, and it is augmented by casinos rigging their games to increase the odds of people losing.

The point is simple: the average person's time and attention has commercial value. All advertisers want it and pay to get it -- that is the point of buying advertisements. Few advertisers ask or care how those consumer minds got trapped, or what political or social collateral damage the mining for minds might have caused. It's just business. Morals, social conscience, democracy and truth are irrelevant.
