(Just found the author of the study/studies: Sosis.)
: A commune being defined as a group of people who do not share kinship but decide to live and work together.
Yes, Dawkins is saying that "universal love" does not have an evolutionary component, which seems like a fairly uncontroversial claim.
It seems like your criticism of Dawkins is more a criticism of how other people have misunderstood him, rather than any criticism of the arguments in The Selfish Gene itself?
If you haven't, I highly suggest you read Jonathan Haidt's "The Righteous Mind". While it's at a popular level, it does a fairly good job at presenting a plausible framework for how moral behavior (like altruism) can emerge from evolutionary principles.  Haidt is probably one of the most influential moral psychologists today.
Particularly relevant to my own experience was the commentary on how politicians have become less cooperative with their rivals in other parties, and how political views/party associations have become more extreme/less tolerant overall.
This is true for pretty much everyone - don't go and count yourself as the exception. The more intelligent you are, the more refined your reasoning, but there's evidence to show that intelligence will not lower the bias. Counterarguments from others as intelligent or more intelligent will. One of the curses of being more intelligent is that if you hold a biased view, you usually need someone as smart as you to change your mind. The smarter you get, the fewer people there are who can help remove your bias.
Some people are more objective than others, but often only in a limited domain - not in their whole lives.
>However, 20 per cent of justifications were subjective and involved making a reference to one’s cultural identity, personal experience.
The book also touches on this. In my personal experience, fact-based reasoning is even rarer than this. There are many reasons people believe something, and attempting to discern the truth is usually in the minority. It is to be expected that all the other reasons will be more prevalent - they simply have more utility than merely gaining knowledge. It shouldn't surprise people that factual reasoning is rare: it has little utility in most spheres of life, much less than social cohesion and tribalism do.
Consider the issue of intelligence, and its spread across various groups (usually race and gender). It's very common to find a very well educated person insisting that everyone is born with the same mental/intelligence potential, and that differences exist merely in the extent to which they foster it. When asked for their rationale/evidence, the answer is usually a variant of "I choose to believe it" (usually for ideological or cultural reasons). I'm not referring only to ordinary folks, but also to university academics, etc.
(I'm not saying that they are factually wrong - merely the reasons they believe it are not based on any facts).
>whether they agreed with the scientific consensus on climate change, vaccines, genetically modified (GMO) foods and evolution
Two of those items (vaccines and GMO foods) touch on a strong cultural force: purity. The book shows that many people value purity (likely a genetic trait). They associate food consumption not just with physical health, but also with mental/spiritual health, so they are quite sensitive to "unnatural" or foreign agents going into their bodies.
Actually, even moral disgust is positively correlated with right-leaning politics. The Righteous Mind (https://www.amazon.com/Righteous-Mind-Divided-Politics-Relig...) is a great read that covers the research on the topic.
I used to think the same way, but part of the issue is that most people are not abstract or systems thinkers. They don't perceive abstraction or systems the way many HN readers do or would.
In addition, and related, most people have strong tribal identities that overwhelm their limited intellectual capabilities; Jonathan Haidt's The Righteous Mind is very good on this: https://www.amazon.com/Righteous-Mind-Divided-Politics-Relig... and there are others as well.
Climate change deniers, anti-vaxxers, and other conspiracy theorists share some key underlying traits.
Most of us, including me, also live in our own bubbles. You're likely in a rationalist and data-driven bubble, so you don't see the people to whom you'd have to explain an entire rationalist and data-driven worldview.
Kindle Version - $11.99
Paperback (Prime) - $9.32
So how is that working out?
Think of all the campaigns that have effected change. How often did shaming work? Sure, you have a few cases like the fight against Apartheid, but in general? Not effective.
Here's the thing. I'm as pro-science as they come. However, I've been blessed to come from communities that fall prey to anti-vaccine and other "nonsense". And one thing I know is that fact based ridicule and moralizing has a low success rate.
As someone who somewhat understands both communities, I am already not on your side. If a pro-vacciner like me is turned off by such rhetoric, imagine it from an anti-vacciner's side.
Think I'm an outlier? I'll hazard a guess that most pro-vacciners are close to someone who is not (family connections, etc.).
There is comfort in being "right". But being "right" does not in itself translate to right outcomes.
The Righteous Mind (https://www.amazon.com/Righteous-Mind-Divided-Politics-Relig...) is a very worthy read. A few things it points out:
1. On a polarized issue, facts will increase the polarization (and I'm guessing justifying shame with facts will exacerbate the issue)
2. To persuade someone, you will have a lot more success appealing to emotions than to the rational mind. This does not mean playing games where you manipulate people.
The article is almost cute. Like, "Did you know people are tribal? And did you ever think that might be a bad thing?" It's not a profound new idea, and it's one that's been better discussed elsewhere, from the SSC posts on the subject (see also the recent one on Albion's Seed) to Jonathan Haidt's The Righteous Mind to Joshua Greene's Moral Tribes to the many, many articles that have already been written about partisan polarization in the U.S. (and probably globally, if Europe is any indication).
I mean I'm glad that a random NYT column is provoking further discussion about an important subject, but there's so much more and better stuff that has been said about it than just what this touches on.
Prior to reading this, my politics aligned very closely with those of Sanders, and I thought everybody on the right was a selfish, evil, close-minded fool. After reading the book, my politics are still left of center (but definitely right of Sanders), but I think I understand and appreciate the politics of my right-leaning family and friends.
I'd also recommend Jonathan Haidt's The Righteous Mind (http://www.amazon.com/The-Righteous-Mind-Politics-Religion/d...). He makes a lot of interesting points, including that most people come to a conclusion about an issue, then look for reasoning to support it, and that most of us operate on instinct most of the time—logic is a more costly, difficult mode whose use can be cultivated but which is not at all the default.
If you're like me and love debates, this book is awesome. It'll show you how to find common ground and understand implicit values behind arguments.
Link for the lazy (non-affiliate): http://www.amazon.com/The-Righteous-Mind-Politics-Religion/d...
This reminds me to read a book that talks about this: "The Righteous Mind: Why Good People Are Divided by Politics and Religion," by Jonathan Haidt. Maybe someone else has read it and can comment.
I think it's because a lot of the people who work in nonprofits and public agencies are at best ambivalent about profit and commerce, and at worst openly hostile to them, and such feelings are based primarily on emotions. So asking "Why?" in a way designed to elicit logical / intellectual reasons is unlikely to yield a lot of productive answers (Jonathan Haidt discusses the role of non-logical emotions in cognition in his book The Righteous Mind, which is completely brilliant and ought to be read by everyone).
Incidentally, my family's consulting firm provides grant writing services for nonprofits and public agencies (see http://blog.seliger.com if you're curious), and we face a lot of the profit / commerce ambivalence too. I even wrote about the issue in a post about the grant funding system and the role specialization and gains from trade play: http://blog.seliger.com/2012/03/25/why-fund-organizations-th... . A lot of people feel like nonprofits and public agencies are not supposed to be like other businesses, even though, in reality, they are a lot like other businesses except, obviously for the profit drive.
So, like other businesses, a lot of nonprofits buy goods and services they can't productively make or do themselves. We use the analogy of a plumber: most nonprofits do not have one on staff, and, when their toilets clog, they hire someone to do the job. That's fairly straightforward. But many do feel that grant applications are something like a college admissions essay, in which hiring a consultant is somehow cheating. We obviously don't think so, but, nonetheless, a lot of people have that feeling and don't really think grant writing is like plumbing. But nonprofits and public agencies that submit better proposals tend to get funded more often than those that don't, so to some extent those feelings get weeded out by the "market," which still exists.
We've also argued before that there's no reason why a nonprofit grant writing agency can't exist, but in practice none do, and, if they did, the demand for their services would far outstrip supply, because grant writing is very boring, difficult, and tedious—a troika that makes for a great business, but doesn't give people the good feelings they might get from, say, doling out soup at a soup kitchen, or providing pro-bono legal work.
Notes:  http://www.amazon.com/Righteous-Mind-Divided-Politics-Religi...
 Actually, hiring an admissions essay person starts to make sense when one thinks about how much might be on the line, but that's another issue.
 EDIT: Found the post that discusses these issues: http://blog.seliger.com/2008/08/08/tilting-at-windmills-why-... .
It's a top-down theory/solution to what critics would argue is a bottom-up problem. Individuals must be responsible for what they say, how they regulate their emotional state, and how their experiences and cognitive distortions skew their thinking. CT/CRT, by my understanding, argues against this. Thus it seems reasonable to say it leads to a lack of accountability, if you define accountability as responsibility for one's actions and beliefs.
I’ve read a small bit on CT/CRT, intersectionality, and the modern culture of safetyism. Primarily from Haidt who has more peer reviewed sources on things than anyone could ever want.
I find CT/CRT to be compelling to a degree, but it brings along with it too much baggage in my opinion. You're likely not going to find or be given a specific source of data that says CRT leads to a lack of accountability (however you would measure that); it's an assumption made by the previous poster. You don't need one either to have a discussion, so don't fall back on the lack of academic evidence as an argument in itself.