Friday, June 26, 2015

Thinking about what the Dalai Lama said (3)


(image credit)
"Seek first to understand, then to be understood" is one of the Seven Habits of Highly Effective People. It requires listening, specifically active and empathic listening.
 

Wednesday, June 24, 2015

Thinking about what the Dalai Lama said (2)


(image credit)
Sure enough, just a night or two after I saw this post, a mosquito woke me from a dead sleep. The air was cool, so I had opened the window before I retired; it clearly buzzed right into my bedroom. I swatted at the air with increasing annoyance and force, as the tiny creature seemed intent on keeping me awake. I tucked my head under the covers and kept it there until I couldn't stand having my head covered. Was I scot-free of my nemesis? No: it was buzzing around my head again within a couple of minutes. I finally got out of bed and did this and that for about an hour. Only then was I able to return to sleep.

Wise words indeed, from the Dalai Lama (sigh).

 

Monday, June 22, 2015

Thinking about what the Dalai Lama said (1)


(image credit)
I'd like to believe that the Dalai Lama speaks only to a segment of humankind, those who fall into a quandary about how to live life. How big this segment is, I don't know. But I'd also like to imagine that there are many who earn a living, live reasonably happy, healthy lives, and worry about the future without being paralyzed into an empty, meaningless life.
 

Friday, June 12, 2015

Julia Galef: Bayes' Rule changed the way I think


Bayes' Rule is a simple formula that tells you how to weigh evidence and change your beliefs. I don't go around plugging numbers into a formula all the time, but nevertheless, becoming familiar with Bayes has shifted the way I think in some important ways.
My beliefs are grayscale - and my confidence in them changes as I learn new things. Well said, Julia Galef.

Her example about believing she is a good driver is instructive. A subsequent road accident may require altering that belief, but then again it may not. Galef's point is that we cannot stop at a superficial review of an incident or event vis-a-vis some preexisting belief. Instead, we need to probe more deeply into the incident or event and determine its cause.

If the accident were purely another driver's fault (e.g., misjudgment, carelessness), then she can viably maintain her belief that she is a good driver. But even if the accident were entirely her own fault, she may still have good reason to hold to her belief. For example, if she had a five-year history of driving with no accidents, a one-off accident now doesn't necessarily mean she is suddenly a bad driver. Still, to her point, that one-off accident may prompt even a minute decrease in her confidence in that belief. Of course, repeated driving mistakes resulting in frequent accidents may require significant alteration or dismissal of her belief: she is no longer a good driver; she is a bad driver.
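The driver example can be put in numbers. Here is a minimal Python sketch of the Bayesian update, with made-up likelihoods (the prior of 0.9 and the per-year accident probabilities are my own hypothetical figures, not Galef's):

```python
def update(prior, p_e_given_h, p_e_given_not_h):
    """One Bayesian update: P(H | E) via Bayes' Rule."""
    numerator = p_e_given_h * prior
    return numerator / (numerator + p_e_given_not_h * (1 - prior))

# Hypothetical numbers: prior belief "I am a good driver" = 0.9,
# P(an accident this year | good driver) = 0.2,
# P(an accident this year | bad driver) = 0.4.
after_one = update(0.9, 0.2, 0.4)
print(round(after_one, 2))  # one accident barely dents the belief

belief = 0.9
for _ in range(5):          # five accident-filled years in a row
    belief = update(belief, 0.2, 0.4)
print(round(belief, 2))     # repeated accidents erode it sharply
```

With these numbers, a single accident drops the belief only from 0.90 to about 0.82, while five in a row drop it to about 0.22 - the "minute decrease" versus "significant alteration" distinction, quantified.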

There is a name for beliefs that people hold as true, no matter the reality surrounding them, no matter a raft of opposing evidence: dogma. Bayes' Theorem or Bayes' Rule is an antidote :)

Wednesday, June 10, 2015

Julia Galef: Think Rationally via Bayes' Rule (2)



Recently I posted this talk by Julia Galef, along with the following comment on Google+:

I am working to steep myself in Bayes' Theorem, or Bayes' Rule. I very much like the fact that it helps weigh our beliefs against perhaps a host of evidence that confirms (or disconfirms) them to varying degrees. The principles underlying Bayes' Theorem help sharpen our thinking and strengthen our reasoning. In an optimal scenario - that is, if we are open and willing - we ought to modify our beliefs depending on the evidence we encounter. In other words, we ought to think logically or rationally about ourselves, people, and the world around us.

But the thing is, we as humans are not entirely logical or rational; we are also intuitive, subjective, and instinctive. For example, we may hold on to the belief that all teenagers are undisciplined and irresponsible, even though we meet some who are quite the opposite. So the tenacity with which some may uphold their beliefs, despite evidence to the contrary, defies Bayes' Theorem. This is the stuff of prejudice and discrimination.
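What Bayes' Theorem prescribes for the teenager belief can be sketched in a few lines of Python. The likelihoods and the prior here are purely illustrative assumptions of mine:

```python
def update(prior, p_e_given_h, p_e_given_not_h):
    """One Bayesian update: P(H | E) via Bayes' Rule."""
    numerator = p_e_given_h * prior
    return numerator / (numerator + p_e_given_not_h * (1 - prior))

# H = "nearly all teenagers are undisciplined and irresponsible".
# E = "I just met a disciplined, responsible teenager".
# Hypothetical likelihoods: P(E | H) = 0.1, P(E | not H) = 0.5.
belief = 0.8                  # a strongly held prior
for _ in range(3):            # three counterexamples met in person
    belief = update(belief, 0.1, 0.5)
print(round(belief, 2))
```

Under these assumptions, three counterexamples should collapse the belief from 0.80 to about 0.03. Holding it at 0.80 regardless is exactly the defiance of Bayes' Theorem described above.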

However, this example of defiance doesn't necessarily discredit the usefulness of Bayes' Theorem, in the ways that Julia Galef talks about it. Say we have a friend who's prejudiced or discriminatory against teenagers; we might try to firmly and/or gently persuade him to think otherwise via Bayes' Theorem.

Our tack may first draw on one of the Seven Habits of Highly Effective People, in particular: Seek first to understand, then to be understood. Maybe he's faced an assault or trauma in relation to teenagers; maybe he's a father who weathered a terrible storm with his teenage children. Using empathy - that is, the ability and willingness to walk in others' shoes or to sit in their seats - we might help our friend resolve such trauma or storm, so that perhaps he comes to see that, despite all he experienced, not all teenagers are undisciplined or irresponsible.

So: empathic understanding + Bayes' Theorem underpin not only more effective thinking and reasoning, but also better understanding and relationships between people.


My post generated a bit of a discussion:




+Gerallt G. Franke What you say makes very good sense, and it resonates with my working notion that we approach any thing - any phenomenon or context - as it is. That is, we do our best to avoid (or at least set aside) any preconceived idea, preexisting framework, or preset methodology. In short, we approach that thing with a blank slate and an empty "toolkit." Then, we let the thing itself point us to an idea, framework, or methodology for knowing it, analyzing it, and solving it.

I believe you're quite right: if, instead, we began with a prior logic, reasoning, inference, or deduction - perhaps because we have difficulty tolerating or grasping uncertainty - then there is a good likelihood that that logic will be off (at best) or break down (at worst). Alternatively, by examining something as it is, again without preconceived ideas, we are much more likely to arrive at the right logic for the thing itself.



Monday, June 8, 2015

Julia Galef: Think Rationally via Bayes' Rule (1)


Bayes' Rule is a formalization of how to change your mind when you learn new information about the world or have new experiences.

Transcript
I'd like to introduce you to a particularly powerful paradigm for thinking called Bayes' Rule. Back in the Second World War the then governor of California, Earl Warren, believed that Japanese Americans constituted a grave threat to our national security. And as he was testifying as much to Congress, someone brought up the fact that, you know, we haven't seen any signs of subterfuge from the Japanese American community. And Warren responded that, "Ah, this makes me even more suspicious. This is an even more ominous sign because that indicates that they're probably planning some major secret attack timed à la Pearl Harbor. And this convinces me even more that the Japanese Americans are a threat."

So this pattern of reasoning is what sustains most conspiracy theories. You see signs of a cover up -- well, that just proves that I was right all along about the cover up. You don't see signs of a cover up, well that just proves that the cover up runs even deeper than we previously suspected.

Bayes' Rule is probably the best way to think about evidence. In other words, Bayes' Rule is a formalization of how to change your mind when you learn new information about the world or have new experiences. And I don't think that the math behind -- the math of Bayes' Rule is crucial to getting benefit out of it in your own reasoning or decision making. In fact, there are plenty of people who use Bayes' Rule on a daily basis in their jobs -- statisticians and scientists for example. But then when they leave the lab and go home, they think like non-Bayesians just like the rest of us.

So what's really important is internalizing the intuitions behind Bayes' Rule and some of the general reasoning principles that fall out of the math. And being able to use those principles in your own reasoning.

After you've been steeped in Bayes' Rule for a little while, it starts to produce some fundamental changes to your thinking. For example, you become much more aware that your beliefs are grayscale, they're not black and white. That you have levels of confidence in your beliefs about how the world works that are less than one hundred percent but greater than zero percent. And even more importantly, as you go through the world and encounter new ideas and new evidence, that level of confidence fluctuates as you encounter evidence for and against your beliefs.

Also I think that many people, certainly including myself, have this default way of approaching the world in which we have our preexisting beliefs and we go through the world and we pretty much stick to our beliefs unless we encounter evidence that's so overwhelmingly inconsistent with our beliefs about the world that it forces us to change our minds and, you know, adopt a new theory of how the world works. And sometimes even then we don't do it.

So the implicit question that I'm asking myself, that people ask themselves as they go through the world, is when I see new evidence, can this be explained with my theory? And if yes, then we stop there. But, after you've got some familiarity with Bayes' Rule what you start doing is instead of stopping after asking yourself can this evidence be explained with my own pet theory, you also ask well, would it be explained better with some other theory or maybe just as well with some other theory? Is this actually evidence for my theory?
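Galef's closing question - would this evidence be explained just as well, or better, by some other theory? - is exactly what Warren's reasoning skipped. A minimal Python sketch, with made-up probabilities of my own choosing:

```python
def update(prior, p_e_given_h, p_e_given_not_h):
    """One Bayesian update: P(H | E) via Bayes' Rule."""
    numerator = p_e_given_h * prior
    return numerator / (numerator + p_e_given_not_h * (1 - prior))

# H = "the community is a threat"; E = "no signs of sabotage".
# Hypothetical likelihoods: silence is still fairly plausible under H
# (a secret plan), say P(E | H) = 0.6, but it is near-certain under
# not-H, say P(E | not H) = 0.99.
prior = 0.5
posterior = update(prior, 0.6, 0.99)
print(round(posterior, 2))  # the evidence should LOWER the belief
```

Because "no signs of sabotage" is likelier under the innocent theory than under the threat theory, observing it must lower P(threat) - it cannot, as Warren claimed, raise it. With these numbers the belief falls from 0.50 to about 0.38.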