Spock’s emotional state is always set to “calm,” even when wildly inappropriate. He often gives many significant digits for probabilities that are grossly uncalibrated. (E.g., “Captain, if you steer the Enterprise directly into that black hole, our probability of surviving is only 2.234%.” Yet nine times out of ten the Enterprise is not destroyed. What kind of tragic fool gives four significant digits for a figure that is off by two orders of magnitude?) Yet this popular image is how many people conceive of the duty to be “rational” - small wonder that they do not embrace it wholeheartedly.
That’s just a sample of the remarkable facility possessed by Eliezer Yudkowsky, a self-taught AI researcher and the author of Harry Potter and the Methods of Rationality.
I’ve been slowly working my way through his writing over the past few months and can’t recommend him enough.