The Human Brain, from 340 B.C. to 2030 [Pic]


----------------

[Comic: The Human Brain, from 340 B.C. to 2030]

Presented without comment.

[Source: Manu Cornet @ Bonkers World]







11 Responses to The Human Brain, from 340 B.C. to 2030 [Pic]

  1. The sad thing is I witnessed a person in a corn maze asking Siri “How many legs does a spider have?” and then complaining about being stumped – they didn’t know which way to go, because Siri had no reception to figure it out. And we’re talking people in their late 30s, early 40s…

    • Not necessarily dumber. What requires more intelligence – reciting a long piece of poetry, or using a technological tool? The answer is a little different from what you’d expect. Recitation implies that oral history is the only means of recording things, and in 340 BCE that was plainly untrue – we have ostraka, pieces of pottery bearing the names of ‘ostracised’ politicians, from before the fall of Athens in 404 BCE, showing that a fair few citizens were literate. Being able to recite history is good only if you have a strong memory for it – it doesn’t require analysis.
      So, let’s look at the next technical development (shown here in the 1720s) – the number zero. Yep, it’s technology! Or rather, numeracy. Developed in India and brought to Europe by the Arabs rather than the Greeks, zero gave us the means to create complex mathematics that Greek mathematicians hadn’t developed. Remember, Euclid looked at geometry, not negative numbers. By this arbitrary stage in the timeline of human rationality, the world has developed technical achievements that allow ideas to be expressed rather than lost as each generation died out – the printing press. Lithographs and woodcuts show illustrations and artistic concepts which would otherwise never have been seen. Music has been developed as a written form so that tunes and melodies don’t die out – they can be re-sung.
      By 1990, of course, things have changed. The industrial revolution brings with it the age of steam, and then the age of electricity. More leisure time gives us more time to learn – and not only that, but our greatly extended lifespans, compared to the ancient Greeks and even the Victorians, give us more time to consider the world around us. Remembering a four-digit number is difficult compared with the way we’ve done things before – writing a cheque took practice, and in 1990 we’re more likely to have developed the habit of going to a bank to withdraw our cash. We’re not less intelligent – just slower to adapt to the progress of new technology. Remember, computing power has been doubling roughly every couple of years, so it’s now possible to have a phone with a faster processor than the guidance computer that flew the 1969 Moon landing. People are still as smart as they have always been – but we need to adapt our skills so we don’t become the obsolete ones.

  2. It’s not doing either. I’m not LAZIER because we made a tool that does work for us. You might as well complain about people being lazy because they wear glasses.

  3. Right… because comparing someone whose job it was to remember stories – and besides, (a) it was usually far from perfect, and (b) the stories were often specifically designed for memorising: learn the basic structure and a few epithets, and you could tell the story however you wanted – with a deliberately randomised, rarely-used number with no inner logic makes so much sense. Instead of, I don’t know, comparing them to a professional comedian? The guy who can go on stage and spend two hours saying things he has memorised?

    The same applies to the second set of comparisons: comparing Edmond Halley (I guess?) with Joe Shmoe is hardly valid. Why not compare him to, say, Stephen Hawking?

    You know, I could do comparisons like this too.

    These days, the average person is likely to be able to use a computer with a fair amount of skill. Two hundred years ago, no one knew what a computer even was. What a bunch of morons!

    I agree that, compared to the past, we do use external storage for our memory a bit more often – though, to be fair, this trend started in Mesopotamia, when someone got the bright idea to mark off his oxen on a piece of clay instead of trying to keep the numbers in his head. The only difference is that within the last century or so, the amount of information we can access and the amount we need to store on a day-to-day basis have both grown quite significantly (e.g. a jump from 27 to 28 units of information – surely there’s a proper name, but I can’t be bothered to search for it, tired as I am, sorry – in a decade is not a very visible increase, while a jump from 761 to 1,205 in a year will show; the numbers are of course completely random and not based on any sort of verifiable data).

  4. I think this comic also assumes everyone in the past was a scholar, philosopher or mathematician. Never mind that at one time it was mostly clergy who were the scholars, compared to all the “lusers” of today. I would go out on a limb and say most of the scholars, or even the literate people, of the past were clergy or upper-class citizens.

    Also, in all honesty, Apple users are no dumber or smarter than the average PC user. They’re just using a different brand of equipment.