idoh > notes > Causal Opacity

Causal opacity is the idea that there is a barrier to understanding a system once it passes some threshold of complexity. Because of causal opacity, you can neither make accurate predictions about the system nor reconstruct how it got into its current state. Another way of saying it: events are hidden behind "the shroud of causal opacity."

Obviousness of the Observation

I could write a lot here about whether systems are complex or simple, about emergence, or about the number of edges in a graph, but fundamentally predictions are a crapshoot, and everyone knows it.
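As a rough sketch of the graph-edges point: the number of possible pairwise interactions among a system's components grows quadratically with the number of components. This is just the standard n(n-1)/2 count for a complete graph, not anything specific to this note, but it gives a feel for how quickly the causal web thickens.

```python
# Possible pairwise interactions (edges in a complete undirected graph)
# among n components. Generic combinatorics, used here only to
# illustrate how fast the space of causal links grows.

def possible_interactions(n: int) -> int:
    """Number of undirected pairs among n components: n*(n-1)/2."""
    return n * (n - 1) // 2

for n in (10, 100, 1000):
    print(f"{n} components -> {possible_interactions(n)} possible links")
```

Ten components already have 45 possible links; a thousand have nearly half a million, before even counting indirect or time-delayed effects.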

Nobody can reliably pick stocks, call the next election, or foresee which conflicts will turn into wars, and on and on. Even at a smaller scale, nobody really knows whether a company will succeed or whether a relationship will break up or flourish.