
Intelligence Isn't Speed

I explained on Reddit [one typo is fixed in this post] that intelligence isn't a matter of computing hardware speed.


Sounds like the IQ vs Universality thing is just two camps talking past each other.

Suppose we do believe in the basic premise of universality, that all computers are equally "powerful" in a specific way, namely that there's no problem a sophisticated computer can solve that a simple computer cannot, provided we just give the simple computer a long enough time frame to solve it in.

Fair enough. But surely we're also interested in how fast the computer can solve the problems. That's not a trivial factor, especially when we consider that human computers are prone to getting bored, frustrated, confused, or forgetful.

So maybe when we talk about IQ we're not talking about computational power, but maybe something like computational speed. Or, more likely, computational speed combined with some other personality traits.

I think computational universality helps change the primary point of interest (re intelligence) to software that is created and modified after birth. You think maybe it makes hardware speed the key place to look re intelligence. FYI, your view is something I've already considered and taken into account.

You also think some other (genetic) personality traits may be important to intelligence. I don't think so partly because of a different type of universality: universal intelligence (or universal learning, universal knowledge creating, universal problem solving, same things). Universalities are discussed in The Beginning of Infinity by David Deutsch. It's important, in these discussions, to keep the two types of universalities separate (universal computer; universal learning/thinking software). I won't go into this point further right now. I'm going to talk about the hardware speed issue.

Suppose my brain is 100% faster than yours (which sounds like an unrealistically high difference). You will still outperform me, by far, if you use a better algorithm than I do. E.g. if you use an O(N) algorithm to think about something while I'm using O(N^2).

That's called Big O notation, which basically means how many steps it takes to complete the algorithm. N is the number of data points. In this example, you need time proportional to the amount of data. I need time proportional to the square of the amount of data. So for decent sized data sets, you win even if my hardware is twice as fast. E.g. with 10 data points, you win by a factor of 5. Taking 2 seconds per step, you need 10 * 2 = 20 seconds. I, doing steps in 1 second, need 10^2 = 100 seconds. How does it scale? With 100 data points, you need 200 seconds and I need 100^2 = 10,000 seconds. Now you won by a factor of 50. That factor will go up if there's more data. And the world has a lot of data.
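The worked example above can be sketched in a few lines of code. This is just an illustration of the arithmetic in the post, with the same hypothetical step times (2 seconds per step for the linear thinker, 1 second per step for the quadratic one):

```python
def linear_time(n, secs_per_step=2):
    """Total time for an O(N) algorithm: one step per data point."""
    return n * secs_per_step

def quadratic_time(n, secs_per_step=1):
    """Total time for an O(N^2) algorithm: N squared steps."""
    return n ** 2 * secs_per_step

for n in (10, 100, 1000):
    fast = linear_time(n)
    slow = quadratic_time(n)
    print(f"{n} data points: {fast}s vs {slow}s (factor {slow / fast:.0f})")
# 10 data points: 20s vs 100s (factor 5)
# 100 data points: 200s vs 10000s (factor 50)
# 1000 data points: 2000s vs 1000000s (factor 500)
```

The gap keeps growing with the data: at 1,000 data points the better algorithm wins by a factor of 500 despite running on hardware that's half as fast per step.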

Large differences in Big O complexity between algorithms – sometimes exponential ones – are common and routinely make a huge difference in processing time, far more than CPU speed does. In software we write, lots of work goes into using algorithms that are only sub-optimal by a constant or linear factor.

If people think at different speeds, you should probably blame their thinking method (software) rather than their hardware for well over 99% of the difference. Especially because hardware variation between humans is pretty small.

But most differences in intelligence are not speed differences anyway. For example, often one human solves a problem and another doesn't solve it at all. The second guy doesn't solve it slower, he fails. He gets stuck and gives up, or won't even begin because he knows he doesn't understand how to do it. This is partly because of what knowledge people have or lack (learned information that wasn't inborn), and partly because of thinking methods (e.g. algorithms which could be fast or exponentially slow depending on how well they're designed). With bad algorithms, the time to finish can be a million years while a good algorithm can do the same task in minutes on a slower CPU.
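Here's a minimal illustration of that last point (my example, not from the post): the same task, computing Fibonacci numbers, done with an exponentially slow method and a linear one. The slow version isn't merely slower; past a modest input size it effectively never finishes, which matches the "million years vs minutes" contrast above:

```python
def fib_slow(n):
    """Naive recursion: roughly O(2^N) calls. Unusable for large n."""
    if n < 2:
        return n
    return fib_slow(n - 1) + fib_slow(n - 2)

def fib_fast(n):
    """Iteration: O(N) steps. Trivial even for large n on slow hardware."""
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

# Both give the same answer for small inputs...
assert fib_slow(20) == fib_fast(20) == 6765
# ...but fib_slow(100) would take longer than a human lifetime on any
# CPU, while fib_fast(100) finishes instantly.
```

No hardware speedup rescues the bad algorithm; only a better method (better software) does.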

There are other crucial non-hardware issues too, e.g. error correction. If you make a thinking mistake, can you recover from that, identify that something has gone wrong, find the problem, and fix it? Some ways of thinking can accomplish that pretty reliably for a wide variety of errors. But some ways of thinking are quite fragile to error. This leads to wildly different thinking results that aren't due to hardware speed.

I'll close with an explanation of these issues from David Deutsch, from my interview with him:

David: As to innate intelligence: I don't think that can possibly exist because of the universality of computation. Basically, intelligence or any kind of measure of quality of thinking is a measure of quality of software, not hardware. People might say, "Well, what hardware you have might affect how well your software can address problems." But because of universality, that isn't so: we know that hardware can at most affect the speed of computation. The thing that people call intelligence in everyday life — like the ability of some people like Einstein or Feynman to see their way through to a solution to a problem while other people can't — simply doesn't take the form that the person you regard as 'unintelligent' would take a year to do something that Einstein could do in a week; it's not a matter of speed. What we really mean is the person can't understand at all what Einstein can understand. And that cannot be a matter of (inborn) hardware, it is a matter of (learned) software.


Elliot Temple on December 3, 2019

Messages (4)

#14664 he deleted the comment you replied to after he replied saying he agreed with you. the reddit culture of deleting things sucks, why do they delete so many things? is it to not get bad karma?


Anonymous at 10:42 PM on February 26, 2020 | #15651

#15651

- scrub personal history / accidentally doxxed self, etc

- scrub it before deleting the account for another reason to avoid things coming up in future (otherwise you can _never_ delete the post)

- clean up evidence of sock puppetry?

those are the ones that come to mind, though maybe if someone were protesting some reddit decision they might do it too (like deleting social media accounts) ¯\_(ツ)_/¯

If you ever need to find an old comment there are multiple reddit-history sites that sometimes have posts, and ofc web.archive.org too.


Anonymous at 4:38 PM on June 20, 2020 | #16753

#15651 People mostly delete stuff because they're worried other people might think it's bad. If other people don't approve, they don't want it to exist (or they want to change it), because their goal is fitting in.

When something gets criticized, people worry that many others may also think it's bad, like the critic does, so they commonly want to get rid of it. And if the criticism convinces the OP himself that his message was bad, then the OP will usually assume most other people will also be convinced.


Anonymous at 4:45 PM on June 20, 2020 | #16754

Want to discuss this? Join my forum.

(Due to multi-year, sustained harassment from David Deutsch and his fans, commenting here requires an account. Accounts are not publicly available. Discussion info.)