Wednesday, January 24, 2024

Various bits: MOAB discovered; Germain’s thoughts on the election; Building neuron pathways

The regular news sucks today. It’s repetitive to the point of nausea. The MSM is awash in clickbait headlines leading to mostly clueless blowhard opinionators spewing empty blither about not much of anything, while the authoritarian radical right media remains as crappy and vicious as it has been for many years now. So, other things seem worth a bit of time.

Techspot reports about the finding of a gigantic data breach dubbed MOAB, the Mother Of All Breaches:
Researchers have discovered a database composed of stolen user credentials and personally identifiable information so large that it's been dubbed the mother of all breaches (MOAB). The dataset contains no fewer than 26 billion records, making up 12TB of data from sites including Twitter/X, LinkedIn, Weibo, Tencent, and more.

As is the case with similar databases, most of the data in MOAB has been gathered together from previous leaks over the years. But the sheer number of records it contains suggests there will be new information that has never before appeared online. 

"Threat actors could leverage the aggregated data for a wide range of attacks, including identity theft, sophisticated phishing schemes, targeted cyberattacks, and unauthorized access to personal and sensitive accounts," the researchers write.

[Chart: most of the top leakers. Corporate motto: We take your information privacy very seriously! (Who is AdultFriendFinder?? What is MyFitnessPal??)]
___________________________________________________________________
 ___________________________________________________________________ 

After the unsurprising, completely expected NH primary result, it remains as clear as it has been for months and months that the election will be Biden vs. DJT. OK, maybe a bit clearer. 

Exit polls indicate that most Repub DJT voters in NH think he would be just fine even if convicted of felonies. So much for anyone ever complaining about an unqualified presidential candidate, whether for dementia, felonies, confusing Haley for Pelosi, corruption, or fornication with an illegal hush money payoff. For about half the country, qualification for Republican president is a non-issue as long as the candidate passes two tests, (1) foggy mirror (still breathing*) and (2) whatever the Constitution says (not much).

* I'm not dead yet! 

Despite (i) the corruption and lunatic authoritarianism that has engulfed most of America's political right, and (ii) Biden’s crappy public opinion polling and even worse messaging, I’m starting to sense a shift in general sentiment in favor of Biden. It’s apparently not all the GOP lies and moral rot that bug people the most. It’s something else. Several news items over the last couple of weeks hinted to me that there is a significant ongoing turn by the American public against GOP authoritarianism, and maybe against other things I cannot see in the data. 

Right now, and subject to changing events, it feels like Biden is probably going to win this thing (~55% chance) despite himself. Maybe this feeling will pass, but something feels different at the moment. There’s still a lot of time left for either candidate to really self-destruct or be taken down by an October surprise! 
___________________________________________________________________
 ___________________________________________________________________ 

Researchers have found that neurons are likely to self-organize into neural pathways instead of forming pathways dictated by genes. I posted here before about non-genetic phenomena that hint at self-organization of neurons. In a completely non-biological example, a post discussed how piles of tiny silver wires covered by a very thin layer of polymer could form electrical pathways that mimicked learning, a process called neuromorphic learning, a purely non-biological phenomenon. Now, similar self-organizing behavior has been found to occur in animal models of neural pathway formation. The authors postulate that the same process could be happening in humans. 

Heavy-tailed neuronal connectivity arises from Hebbian self-organization

The connections in networks of neurons are heavy-tailed, with a small number of neurons connected much more strongly than the vast majority of pairs. However, it remains unclear whether this heavy-tailed connectivity emerges from simple underlying mechanisms. Here we propose a minimal model of synaptic self-organization: connections are pruned at random, and the synaptic strength rearranges under a mixture of preferential and random dynamics. .... Extending our model to include neuronal activity and Hebbian plasticity, we find that clustering in the network also emerges naturally. We confirm these predictions in the connectomes of several animals, suggesting that heavy-tailed and clustered connectivity may arise from general principles of network self-organization rather than mechanisms specific to individual species or systems.
To understand how neurons form connections to one another, the researchers developed a model based on Hebbian dynamics, a term coined by Canadian psychologist Donald Hebb in 1949 that essentially says, “neurons that fire together, wire together.” This means the more two neurons activate together, the stronger their connection becomes. [just like what happens in non-biological neuromorphic learning]
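The fire-together-wire-together rule can be sketched in a few lines. This is a toy illustration, not the paper's actual model; the network size, firing probability, and learning rate are made-up values chosen just to show the mechanism:

```python
import numpy as np

rng = np.random.default_rng(0)

n = 50                      # toy number of neurons (assumed)
w = np.full((n, n), 0.1)    # synaptic weights, all equal to start
np.fill_diagonal(w, 0.0)    # no self-connections

eta = 0.05                  # learning rate (assumed)

for _ in range(200):
    # random binary activity: which neurons fire on this time step
    fires = rng.random(n) < 0.2
    # Hebbian rule: strengthen the connection of every pair that
    # fired together on this step ("fire together, wire together")
    co_fire = np.outer(fires, fires).astype(float)
    np.fill_diagonal(co_fire, 0.0)
    w += eta * co_fire
```

After the loop, pairs that happened to co-fire often have noticeably stronger weights than the rest, which is the basic positive-feedback ingredient the paper builds on.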

Across the board, the researchers found these Hebbian dynamics produce “heavy-tailed” connection strengths just like they saw in the different organisms. The results indicate that this kind of organization arises from general principles of networking, rather than something specific to the biology of fruit flies, mice, or worms.

The model also provided an unexpected explanation for another networking phenomenon called clustering, which describes the tendency of cells to link with other cells via connections they share. A good example of clustering occurs in social situations. If one person introduces a friend to a third person, those two people are more likely to become friends with each other than if they had met separately.

“These are mechanisms that everybody agrees are fundamentally going to happen in neuroscience,” Holmes said. “But we see here that if you treat the data carefully and quantitatively, it can give rise to all of these different effects in clustering and distributions, and then you see those things across all of these different organisms.”
As Palmer pointed out, though, biology doesn't always fit a neat and tidy explanation, and there is still plenty of randomness and noise involved in brain circuits. Neurons sometimes disconnect and rewire with each other -- weak connections are pruned, and stronger connections can be formed elsewhere. 
[unlike what happens in non-biological neuromorphic learning] This randomness provides a check on the kind of Hebbian organization the researchers found in this data, without which strong connections would grow to dominate the network.

The researchers tweaked their model to account for randomness, which improved its accuracy.

“Without that noise aspect, the model would fail,” Lynn said. “It wouldn't produce anything that worked, which was surprising to us. It turns out you actually need to balance the Hebbian snowball effect with the randomness to get everything to look like real brains.” 
In this data, one can start to glimpse the workings behind the stunning complexity and diversity of brain wiring, thinking and behavior. Our neurons are self-associating (Hebbian behavior) and self-disassociating (connection pruning or “anti-Hebbian” behavior) all the time. That phenomenology winds up being reflected in thinking and behavior. 
