Pragmatic politics focused on the public interest for those uncomfortable with America's two-party system and its way of doing politics. Considering the interface of politics with psychology, cognitive science, social behavior, morality and history.
DP Etiquette
First rule: Don't be a jackass.
Other rules: Do not attack or insult people you disagree with. Engage with facts, logic and beliefs. Out of respect for others, please provide some sources for the facts and truths you rely on if you are asked for that. If emotion is getting out of hand, get it back in hand. To limit dehumanizing people, don't call people or whole groups of people disrespectful names, e.g., stupid, dumb or liar. Insulting people is counterproductive to rational discussion. Insult makes people angry and defensive. All points of view are welcome, right, center, left and elsewhere. Just disagree, but don't be belligerent or reject inconvenient facts, truths or defensible reasoning.
Tuesday, December 3, 2019
Gun Violence Statistics
FOR THOSE WHO CARE:
https://lawcenter.giffords.org/facts/gun-violence-statistics/
Average Deaths per Year: Total 36,383
https://everytownresearch.org/gun-violence-america/
Monday, December 2, 2019
Marketing Unproven Medical Treatments
The Washington Post reports on a growing industry that uses hardball marketing tactics on patients with terminal diseases. The industry sells stem cell treatments for progressive lung disorders, Parkinson's disease and other untreatable diseases. Because the patients are desperate, they fall prey to the sales pitches. The sales tactics include telling patients how they can raise the needed money, e.g., by fundraising on GoFundMe. None of the treatments has been proven safe and effective by the FDA. Some people spend all of their remaining money on these treatments.
It is hard to imagine why such businesses are allowed to operate legally. It is bad enough that useless treatments and products such as nutritional supplements and homeopathy products are legal. These stem cell treatments are worse because they falsely claim to treat serious diseases. Nutritional supplements and homeopathy products all must carry a warning label stating that the product has not been shown to treat or improve any disease or symptom.
One of these fake medicine companies, the Lung Health Institute, doesn't show that disclaimer on its website. The only disclaimer is innocuous and in small print, “Each patient is different. Results may vary.”
Indeed, results will vary. They will vary from failure to failure coupled with bankruptcy and homelessness.
What is government for?
One can ask about the role of government here. It is clear that government isn’t concerned about companies selling fake treatments to sick people. In this instance, the role of government is mostly to protect companies and their business interests. Patient welfare is of little apparent concern although these companies presumably cannot poison their patients under current law. This is the face of modern anti-government conservative and populist ideology.
Question: Is it irrational or incorrect to assert that, for this industry, the role of government is to protect companies and their business interests more than to protect consumers from health treatment scams?
Friday, November 29, 2019
How to Spot Professional Trolls Online
Two professors at Clemson University have been analyzing the social media and propaganda tactics that professional Russian and other foreign trolls use to foment social discord and distrust online in Western democracies. They analyzed data and Tweets that Twitter has made public. They conclude that, regardless of where they are located, amateur trolls, the ones who are bigoted, narrow-minded, angry and/or out to provoke liberals, conservatives and minority groups just for the fun of it, “aren’t a threat to Western democracy.”
By contrast with amateur trolls, professional democracy attackers are much more subtle and effective. They start by posting or Tweeting positive, warm messages designed to build a social media following. Rolling Stone writes:
Professional trolls are good at their job. They have studied us. They understand how to harness our biases (and hashtags) for their own purposes. They know what pressure points to push and how best to drive us to distrust our neighbors. The professionals know you catch more flies with honey. They don’t go to social media looking for a fight; they go looking for new best friends. And they have found them.
Disinformation operations aren’t typically fake news or outright lies. Disinformation is most often simply spin. Spin is hard to spot and easy to believe, especially if you are already inclined to do so. While the rest of the world learned how to conduct a modern disinformation campaign from the Russians, it is from the world of public relations and advertising that the IRA learned their craft. To appreciate the influence and potential of Russian disinformation, we need to view them less as Boris and Natasha and more like Don Draper.
As good marketers, professional trolls manipulate our emotions subtly. In fall 2018, for example, a Russian account we identified called @PoliteMelanie re-crafted an old urban legend, tweeting: “My cousin is studying sociology in university. Last week she and her classmates polled over 1,000 conservative Christians. ‘What would you do if you discovered that your child was a homo sapiens?’ 55% said they would disown them and force them to leave their home.” This tweet, which suggested conservative Christians are not only homophobic but also ignorant, was subtle enough to not feel overtly hateful, but was also aimed directly at multiple cultural stress points, driving a wedge at the point where religiosity and ideology meet. The tweet was also wildly successful, receiving more than 90,000 retweets and nearly 300,000 likes.
This tweet didn’t seek to anger conservative Christians or to provoke Trump supporters. She wasn’t even talking to them. Melanie’s 20,000 followers, painstakingly built, weren’t from #MAGA America (Russia has other accounts targeting them). Rather, Melanie’s audience was made up of educated, urban, left-wing Americans harboring a touch of self-righteousness. She wasn’t selling her audience a candidate or a position — she was selling an emotion. Melanie was selling disgust. The Russians know that, in political warfare, disgust is a more powerful tool than anger. Anger drives people to the polls; disgust drives countries apart. (emphasis added)
The researchers, Darren Linvill, associate professor of communication, and Patrick Warren, associate professor of economics, discussed their research with KUOW, an NPR affiliate station, in a 9 minute interview. KUOW writes:
To stop trolls from exploiting existing tensions in American society, he says people need to question why we’re seeing certain messages and the consequences of sharing them before hitting retweet.
“I think that there’s a lot that you can do,” Warren says. “If you’re mindful of the origins of the information you’re sharing, it can make a big difference.”
Linvill: “... I think it doesn’t ultimately [matter] if it’s a Russian troll or an Iranian troll or a Chinese troll, I think one needs to be careful when you’re interacting with anonymous accounts not to retweet someone just because they use the same hashtag as you did and you agree with them, but also not accuse people of being Russian trolls just because you disagree with them. I think that’s one of the biggest impacts of Russian disinformation is that we don’t trust each other anymore and it’s really dangerous and it’s a lasting impact.”
Warren: “I think it’s important to realize that when you share something on social media, you’re doing two things. You’re sharing a message, but you’re also bringing prominence to the account you’re sharing. And so the question you should be asking yourself often on social media, in addition to the obvious question that we all start with, which is: Is this real or not? The next question you should be asking yourself is, why am I seeing this? Algorithms kind of rule our lives on social media. And what these guys are trying to do is get people who shouldn’t be central to the conversation to become more central to the conversation due to their gaming of the algorithm.”
Defensive disinformation vs. offensive disinformation
Defensive disinformation is used by professional government trolls to deny and distract from information the government wants to hide, distort or deny. For example, the Saudi Arabian government ran botnet trolls on Twitter that falsely denied the Saudi government murdered journalist Jamal Khashoggi.
By contrast, offensive disinformation is content specifically designed to manipulate emotions and attitudes by focusing on social stress points and playing on personal ideology. This kind of propaganda focuses on what is important to people in the target country, not in the troll farm's country. The goal is to reinforce differences in existing attitudes and beliefs and use those differences to foment social division and distrust in institutions, e.g., the professional media, in fellow citizens, and in out-groups.
The ideology target
In a previous discussion here, I attacked political ideologies as a factor that significantly contributes to, or directly causes, major social and political problems. Strongly held ideological beliefs make it much easier to reject inconvenient facts, truths and sound reasoning. The research discussed in this OP makes it clear that professional trolls intentionally reinforce and then target ideological differences to foment social distrust and discord.
For self-defense against troll manipulation, the researchers suggest asking some self-reflection questions when you are confronted with social media content from a source you are not familiar with. First, ask yourself: is this true? For ideologues, belief in lies is easy when the lie fits personal ideological belief. Second, ask: why am I seeing this? Trolls know how to manipulate the algorithm. Third, ask: what impact would sharing or upvoting this have on others? This asks for a measure of empathy, which is in a way the opposite of self-righteous belief, a belief that troll lies and manipulation can easily reinforce.
Tuesday, November 26, 2019
Here’s Everything The Mueller Report Says About How Russian Trolls Used Social Media
The Mueller report clearly describes how Russian trolls reached millions of people on Facebook, were quoted in major newspapers as real Americans, and even organized rallies.
https://www.buzzfeednews.com/article/ryanhatesthis/mueller-report-internet-research-agency-detailed-2016
Special counsel Robert Mueller’s report on Russian interference in the 2016 election and the Trump campaign provides one of the most detailed looks at how Russia’s Internet Research Agency — the infamous Kremlin-linked troll farm — tried to hijack the 2016 election and swing the vote in favor of Donald Trump.
The report, which concludes that Trump didn’t commit a crime but “also does not exonerate him [of obstruction],” gives us a clear and exhaustive look at the scope, focus, and results of the IRA’s efforts. The agency learned how to use platforms like Facebook and Twitter over the span of four years. By the end, it used analytical tools and the built-in network effect of massive social media platforms to create large artificial grassroots political organizations that were aggressively targeting both Republicans and Democrats.
The IRA was able to reach up to 126 million Americans on Facebook via a mixture of fraudulent accounts, groups, and advertisements, the report says. Twitter accounts it created were portrayed as real American voices by major news outlets. It was even able to hold real-life rallies, mobilizing hundreds of people at a time in major cities like Philadelphia and Miami. Fake online personas were able to communicate with members of the Trump campaign — who were unaware they were ever communicating with foreign nationals.
Here’s everything we know about Russian interference from the report.
It started in 2014.
According to Mueller’s report, the IRA began creating fake Facebook accounts and small groups as early as 2014.
“IRA employees operated social media accounts and group pages designed to attract U.S. audiences,” the report reads. “These groups and accounts, which addressed divisive U.S. political and social issues, falsely claimed to be controlled by U.S. activists."
This lines up with what we already knew about the IRA’s activity. One of its first large-scale misinformation projects was the Columbian Chemicals Plant explosion hoax in September 2014, when IRA members fabricated a completely fake explosion at a chemical plant in Louisiana. “The perpetrators didn’t just doctor screenshots from CNN; they also created fully functional clones of the websites of Louisiana TV stations and newspapers,” the New York Times wrote about the hoax.
The IRA consolidated all of its US operations into one department called the “Translator” department, which appears to have operated like a typical digital media startup with different agents focusing on specific platforms, monitoring analytics, and even graphic designers. About a dozen people, known as “specialists,” would run an account at a time.
The IRA’s activity wasn’t confined to social media, either. IRA employees traveled to the United States on intelligence-gathering missions in 2014.
“Four IRA employees applied to the U.S. Department of State to enter the United States, while lying about the purpose of their trip and claiming to be four friends who had met at a party,” the report reads. “Ultimately, two IRA employees, Anna Bogacheva and Aleksandra Krylova, received visas and entered the United States on June 4, 2014.”
The IRA was on pretty much every platform.
At first, the IRA focused its activity on Facebook, YouTube, and Twitter. Later, Tumblr and Instagram accounts were created. In the beginning, Russian trolls were manning only fake individual accounts. By 2015, however, they began creating larger groups and pages. Finally, they attempted to flex their network effect to hold real-life rallies.
According to Mueller’s report, the Facebook groups were particularly popular. By the time Facebook deactivated them in 2017, the Russia-controlled group "United Muslims of America" had over 300,000 followers, the "Don't Shoot Us" group had over 250,000 followers, the "Being Patriotic" Facebook group had over 200,000 followers, and the "Secured Borders" Facebook group had over 130,000 followers.
A post from an IRA-controlled Facebook page called "Secured Borders".
MUCH MORE TO THIS STORY HERE:
NOW LET'S HEAR FROM TRUMPERS THAT THE REAL INTERFERENCE WAS REALLY BY THE UKRAINE!!
Climate Change in India
A New York Times article, India’s Ominous Future: Too Little Water, or Far Too Much, points out that climate change has altered the monsoon season. Now the rains are less predictable and can be more prolonged when they arrive. That leaves large areas of the country in crippling drought or in floods. On top of that, decades of inept government policies leave millions of people mostly defenseless in the face of climate change and hopeless levels of pollution, garbage and plastic waste.
Dry creek bed choked with litter
Foam from industrial runoff passes worshipers in the Yamuna River
Collecting water for drinking and cooking
Flood in Mumbai
A recent United Nations report on climate change argues that too little is being done. The Washington Post reports:
The world has squandered so much time mustering the action necessary to combat climate change that rapid, unprecedented cuts in greenhouse gas emissions offer the only hope of averting an ever-intensifying cascade of consequences, according to new findings from the United Nations.
Amid that growing pressure to act, Tuesday’s U.N. report offers a grim assessment of how off-track the world remains. Global temperatures are on pace to rise as much as 3.9 degrees Celsius (7 degrees Fahrenheit) by the end of the century, according to the United Nations’ annual “emissions gap” report, which assesses the difference between the world’s current path and the changes needed to meet the goals of the 2015 Paris climate accord.
The sobering report comes at a critical moment, when it remains unclear whether world leaders can summon the political will to take the ambitious action scientists say is essential. So far, the answer has been no.
No doubt that climate change science deniers will trot out the usual arguments in defense of doing nothing, just like gun violence deniers trot out their arguments for less gun control after each mass slaughter of innocents. India looks to be well and truly hosed.
Monday, November 25, 2019
A Massive Data Hack: Google Cloud Server Was Unprotected
Tech-Xplore reports a massive database was left unprotected:
"The data left unprotected was actually a database, aggregating 1.2 billion users' personal information, e.g., social media accounts, email addresses and phone numbers. The incident was relayed on the Data Viper blog.
Bloomberg quoted Troia. "There are no passwords related to this data, but having a new, fresh set of passwords isn't that exciting anymore. Having all of this social media stuff in one place is a useful weapon and investigative tool."
After all, just nabbing names, phone numbers and account URLs delivers ample information to get attackers started."
Data Viper writes:
"On October 16, 2019 Bob Diachenko and Vinny Troia discovered a wide-open Elasticsearch server containing an unprecedented 4 billion user accounts spanning more than 4 terabytes of data.
A total count of unique people across all data sets reached more than 1.2 billion people, making this one of the largest data leaks from a single source organization in history. The leaked data contained names, email addresses, phone numbers, LinkedIN and Facebook profile information.
What makes this data leak unique is that it contains data sets that appear to originate from 2 different data enrichment companies.
For a very low price, data enrichment companies allow you to take a single piece of information on a person (such as a name or email address), and expand (or enrich) that user profile to include hundreds of additional new data points of information. As seen with the Exactis data breach, collected information on a single person can include information such as household sizes, finances and income, political and religious preferences, and even a person’s preferred social activities.
Each time a company chooses to “enrich” a user profile, they are also agreeing to provide what they know about the person to the enriching organization (thereby increasing the validity of the organization’s future results). Despite efforts from social media organizations like Facebook, the resulting data continues to be compounded, creating a situation with no oversight that ultimately allows all of a person’s social and personal information to be easily downloaded."
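To make the enrichment idea concrete, here is a minimal, hypothetical sketch in Python. The names, email address and data fields below are made up for illustration and are not any enrichment company's actual system; the point is only that records from separate sources can be merged on a single key, such as an email address, so that one lookup returns everything any source ever reported about a person.

```python
from collections import defaultdict

# Hypothetical records from two different "enrichment" sources (made-up data).
source_a = [
    {"email": "jane@example.com", "name": "Jane Doe", "phone": "555-0100"},
]
source_b = [
    {"email": "jane@example.com", "income_bracket": "75-100k",
     "linkedin": "linkedin.com/in/janedoe", "interests": ["hiking"]},
]

# Merge everything known about each person, keyed by email address.
profiles = defaultdict(dict)
for record in source_a + source_b:
    profiles[record["email"]].update(record)

# A single lookup now returns the combined profile.
print(profiles["jane@example.com"])
```

Scaled up to billions of records from many sources, merges like this are what produce the kind of combined profiles described above.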
"For well over a decade, identity thieves, phishers, and other online scammers have created a black market of stolen and aggregated consumer data that they used to break into people's accounts, steal their money, or impersonate them. In October, dark web researcher Vinny Troia found one such trove sitting exposed and easily accessible on an unsecured server, comprising 4 terabytes of personal information—about 1.2 billion records in all.
While the collection is impressive for its sheer volume, the data doesn't include sensitive information like passwords, credit card numbers, or Social Security numbers."
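For context on what "wide open" means here, the sketch below shows why an Elasticsearch server with no authentication is so dangerous: anyone who finds its address can list its indices and download records with ordinary HTTP requests. The server address and the "people" index name are placeholders I made up for illustration; this is not the researchers' actual tooling, just a minimal example using Python's requests library against Elasticsearch's standard REST endpoints.

```python
import requests

# Hypothetical address of an Elasticsearch server left open to the internet.
# No credentials are needed because authentication is not enabled.
BASE_URL = "http://203.0.113.10:9200"  # placeholder IP from a documentation range

# List every index on the server (name, document count, size on disk).
indices = requests.get(f"{BASE_URL}/_cat/indices?format=json", timeout=10).json()
for idx in indices:
    print(idx["index"], idx["docs.count"], idx["store.size"])

# Pull a few sample documents from a hypothetical "people" index. Names,
# emails and phone numbers would come back as plain JSON to anyone who asked.
sample = requests.get(f"{BASE_URL}/people/_search?size=5", timeout=10).json()
for hit in sample["hits"]["hits"]:
    print(hit["_source"])
```

Nothing in this sketch requires a password or special tooling, which is what makes an exposure like the one described above possible.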
One can only hope that other cloud servers, e.g., ones the Social Security Administration or the US military uses, aren't just left open like that for anyone to play with. Past performance has not been confidence-inspiring and it probably does predict future performance.