
Mental Models


galumay:
"Worry about the what ares, not the what ifs."

The smartest thing my dad ever said - and he said a lot of smart things!

I would take it a step further, "Work out which category your concerns/issues fall into, then do something about the what ares, and ignore the what ifs."

galumay:
Charles Munger gives an illuminating example on the issue of stealing:
A very significant fraction of the people in the world will steal if (A) it's very easy to do
and, (B) there's practically no chance of being caught. And once they start stealing, the
consistency principle will soon combine with operant conditioning to make stealing
habitual. So if you run a business where it's easy to steal because of your methods, you're
working a great moral injury on the people who work for you ...
It's very, very important to create human systems that are hard to cheat. Otherwise you're
ruining your civilization because these big incentives will create incentive-caused bias and
people will rationalize that bad behavior is OK.
Then, if somebody else does it, now you've got at least two psychological principles:
incentive-caused bias plus social proof. Not only that, but you get Serpico effects: If enough
people are profiting in a general social climate of doing wrong, then they'll turn on you
and become dangerous enemies if you try and blow the whistle.

galumay:
Follow these three pieces of advice from Charles Munger:
(1) I don't want you to think we have any way of learning or behaving so you
won't make a lot of mistakes. I'm just saying that you can learn to make fewer
mistakes than other people - and how to fix your mistakes faster when you do make
them. But there's no way that you can live an adequate life without [making] many
mistakes. In fact, one trick in life is to get so you can handle mistakes. Failure to
handle psychological denial is a common way for people to go broke: You've made
an enormous commitment to something. You've poured effort and money in. And
the more you put in, the more that the whole consistency principle makes you
think, "Now it has to work. If I put in just a little more, then it'll work."
And deprival super-reaction syndrome also comes in: You're going to lose the
whole thing if you don't put in a little more. People go broke that way - because
they can't stop, rethink and say, "I can afford to write this one off and live to fight again. I don't have to pursue this thing as an obsession - in a way that will break me." Part of what you must learn is how to handle mistakes and new facts that change the odds. Life, in part, is like a poker game, wherein you have to learn to quit sometimes when holding a much-loved hand.
(2) I've gotten so that I now use a kind of two-track analysis. First, what are the factors that really govern the interests involved, rationally considered? And second, what are the subconscious influences where the brain at a subconscious level is automatically doing these things - which by and large are useful, but which often malfunction. One approach is rationality - the way you'd work out a bridge problem: by evaluating the real interests, the real probabilities and so forth. And the other is to evaluate the psychological factors that cause subconscious conclusions - many of which are wrong.
(3) Take all the main models from psychology and use them as a checklist in reviewing outcomes in complex systems. No pilot takes off without going through his checklist: A, B, C, D ... And no bridge player who needs two extra tricks plays a hand without going down his checklist and figuring out how to do it ... And, to repeat for emphasis, you have to pay special attention to combinatorial effects that create lollapalooza consequences.

galumay:
"Look at where the bullet holes are and put extra armor every place else."
During World War II, the statistician Abraham Wald tried to determine where one should add extra armor to airplanes. Based on the patterns of bullet holes in returning airplanes, he suggested that the parts not hit should be protected with extra armor. How could he reach that conclusion? Because he also considered planes that didn't return. Assume that all planes had been hit more or less uniformly. Some planes hit in marked areas were still able to return. This means that planes that didn't return were most likely hit somewhere else - in unmarked places. These were the areas that needed more armor.
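Wald's conditioning argument can be illustrated with a small simulation (a hypothetical sketch with made-up sections and hit counts, not his actual data): if hits land uniformly across the plane but hits to certain sections are fatal, then the planes you get to inspect show holes almost exclusively in the non-fatal sections.

```python
import random

random.seed(42)

# Hypothetical plane sections; hits to "engine" or "cockpit" down the plane.
SECTIONS = ["engine", "cockpit", "fuselage", "wings", "tail"]
FATAL = {"engine", "cockpit"}

def fly_mission(num_hits=3):
    """Hits land uniformly at random; any fatal-section hit downs the plane."""
    hits = [random.choice(SECTIONS) for _ in range(num_hits)]
    survived = not any(h in FATAL for h in hits)
    return survived, hits

# Count bullet holes only on the planes that made it back.
observed = {s: 0 for s in SECTIONS}
for _ in range(10000):
    survived, hits = fly_mission()
    if survived:
        for h in hits:
            observed[h] += 1

# The returning sample shows zero engine/cockpit holes - not because those
# sections were never hit, but because planes hit there never came back.
print(observed)
```

Armoring where the observed holes cluster would protect exactly the sections that were already survivable; the unmarked sections are where the armor belongs.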

galumay:
At a press conference in 2001, when Warren Buffett was asked how he evaluated new business ideas, he said he used 4 criteria as filters.
- Can I understand it? If it passes this filter,
- Does it look like it has some kind of sustainable competitive advantage? If it passes this filter,
- Is the management composed of able and honest people? If it passes this filter,
- Is the price right? If it passes this filter, then we write a check.
What does Warren Buffett mean by "understanding"? Predictability: "Our definition of understanding is thinking that we have a reasonable probability of being able to assess where the business will be in 10 years."
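The four filters above form a short-circuiting sequence: a business must pass each one before the next is even considered. A minimal sketch of that structure (the predicate names and sample record are made up for illustration):

```python
# Hypothetical sketch of a sequential filter chain: reject at the first
# failed check, and only "write a check" if every filter passes in order.

def passes_filters(business, filters):
    """Apply (name, check) pairs in order; stop at the first failure."""
    for name, check in filters:
        if not check(business):
            return False, f"rejected at: {name}"
    return True, "write a check"

# Illustrative business record (all field names are assumptions).
business = {
    "understandable": True,
    "moat": True,
    "honest_management": True,
    "price_attractive": False,
}

FILTERS = [
    ("Can I understand it?", lambda b: b["understandable"]),
    ("Sustainable competitive advantage?", lambda b: b["moat"]),
    ("Able and honest management?", lambda b: b["honest_management"]),
    ("Is the price right?", lambda b: b["price_attractive"]),
]

ok, verdict = passes_filters(business, FILTERS)
print(verdict)  # rejected at: Is the price right?
```

The ordering matters: a business that fails the first filter never gets evaluated on price, which is the point of using the criteria as filters rather than as a weighted score.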
