[Source: Virtual Shackles]
Pfh. LOL'd at the comic, but any instructor who responds that way to that kind of "HOLY CRAP SOMEONE ACTUALLY UNDERSTANDS THIS STUFF INSTEAD OF JUST TAKING NOTES ABOUT IT!" epiphany needs to be smacked. He should be rewinding the class a bit and making an example of it, so the other brain-dead, fact-memorizing, note-taking, test-studying drones in the class can have an actual real-world example to understand. Otherwise it's just dead weight, like most of the crap being jotted down in class…
Do you notice the progression of pixelation in the comic? At first it's just his cup, then it's his desk and someone's shadow, then he is pixelated and everything else is normal.
In the first picture, his pencil is pixelated too.
But this made me laugh, 'cause that was exactly my reaction when I heard that in my first programming class too!
Oh the days when data was stored as bytes, and such arbitrary storage values were the case. *Le-sigh*
Huh? You know that's why, for example, in SQL Server your integers range from -2,147,483,648 to 2,147,483,647, right?
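That range falls straight out of two's-complement arithmetic: a 32-bit signed integer spends one bit on the sign, leaving 2^31 values on each side (with zero counted on the non-negative side, hence the off-by-one asymmetry). A quick sketch in Python:

```python
# 32-bit signed (two's complement) integer range:
# one bit for sign, 31 bits for magnitude; zero sits on the
# non-negative side, so the positive limit is one smaller.
BITS = 32
INT_MIN = -(2 ** (BITS - 1))      # -2,147,483,648
INT_MAX = 2 ** (BITS - 1) - 1     #  2,147,483,647

print(INT_MIN, INT_MAX)
```

The same formula gives you the limits for any signed width, e.g. -128..127 for an 8-bit byte.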
… Am I a bit odd in that I actually had the similar epiphany, but about the maximum amount of rows in old-style Excel spreadsheets?
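Same epiphany, same cause: if memory serves, the old binary .xls format stored the row index in 16 bits and the column index in 8 bits, so the famous 65,536-row, 256-column limits fall straight out of the bit widths:

```python
# Old-style Excel (.xls) sheet limits, assuming a 16-bit row
# index and an 8-bit column index in the file format:
rows = 2 ** 16    # 65,536 rows
cols = 2 ** 8     # 256 columns (A through IV)

print(rows, cols)
```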
I wonder what my avatar thinks of that?
This is not strictly correct. Eight bits make an octet. It is a common convention that a byte is identical to an octet, but there are some computer architectures where a byte could be any length you wanted. For example, the PDP-10 has instructions that operate on variable-length bytes that can be anywhere between 1 and 35 bits in length.
This is why all the CCITT standards talk of octets rather than bytes.