Penrose’s theory and all other Gödel-related refutations of strong AI are stupid: “The human mind is different from a computer because humans are capable of detecting logically inconsistent theories and logical paradoxes, and of thinking outside the box in order to recognize that they are paradoxes.”
This is not true at all. Our mind does nothing more than what a computer operating system does when it sees a process that occupies a lot of memory without producing a result: it kills the process (or, in our case, the thought that leads to the paradox). We escape infinite loops not because we are more capable than computers, but because we are equipped with heuristics for abandoning a situation that does not benefit us in any way (and even those only work sometimes).
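The watchdog heuristic described above can be sketched in a few lines of Python (a hypothetical illustration of the analogy, not anything from Penrose or Hofstadter): give a computation a fixed budget of steps, and abandon it if it never produces a result.

```python
def run_with_budget(thought, max_steps=1000):
    """Drive a (possibly non-terminating) generator of partial results.
    If it yields an answer within the budget, return it; otherwise
    'kill the process', i.e. give up on the unproductive thought."""
    for step, result in enumerate(thought):
        if result is not None:
            return result          # the thought converged on an answer
        if step >= max_steps:
            return None            # heuristic: abandon the endless loop
    return None

def liar_paradox():
    """'This sentence is false' as an endless truth-value flip."""
    value = True
    while True:
        value = not value          # never settles on a truth value
        yield None                 # no result yet

print(run_with_budget(liar_paradox()))  # -> None: the paradox gets 'killed'
```

The point of the sketch is that nothing here requires stepping "outside the system": the budget check is just another mechanical rule.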
Jencel’s principle of triviality: any self-consistent system of knowledge can, at least in principle, be reduced to a small number of clear elementary postulates and what follows from them. So any system for which it is not immediately clear what these postulates are is not self-consistent.
Simply put: if you cannot explain a system using very elementary language and constructs, then there is probably an inconsistency in it somewhere.
Example: most religious doctrines.
Re-reading “I Am a Strange Loop” and finding a lot that I missed originally. I like the author’s idea of an organism’s concept of self as the central concept in its system of thought, the one that binds all other concepts together.
That is, we consider a concept true and real only insofar as it relates to our concept of ourselves. Our concept of ourselves is the realest thing there is for us (although in actuality it is a completely ordinary, objective phenomenon).
I think that the principle of the banality of evil is also valid the other way around: not only is evil banal, but all banal things are evil. So something is banal if and only if it is evil.