Today, I have a question for you. And I want you to be sure of your answer before continuing. Deal? So here’s the question:

Do you think programming languages are designed to be read by humans? Or computers?

Just in case you’ve never seen a programming language before, let me show you a quick program right here. In fact, if you’re reading this on a desktop computer, you can actually run this program yourself – just open up your browser’s developer tools, type this into the Console tab, and hit enter.

for (let i = 99; i > 0; i--) {
  console.log(`${i} bottles of beer on the wall,`)
  console.log(`${i} bottles of beeeeer`)
  console.log(`Take one down, pass it around, and...`)
  console.log(`${i-1} bottles of beer on the wall!`)
}

So what does this program do? It just prints out the lyrics for the song “99 bottles of beer on the wall”. All 396 lines of it. And that’s despite the fact that the program is just six lines long.
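You can check that arithmetic yourself: 99 verses, four printed lines per verse, makes 396 lines. Here’s a small variation of the same loop that collects the lines into an array instead of printing them, so we can count them:

```javascript
// Same loop as above, but pushing each line into an array
// so we can count the total instead of printing it.
const lines = [];
for (let i = 99; i > 0; i--) {
  lines.push(`${i} bottles of beer on the wall,`);
  lines.push(`${i} bottles of beeeeer`);
  lines.push(`Take one down, pass it around, and...`);
  lines.push(`${i - 1} bottles of beer on the wall!`);
}
console.log(lines.length); // 396
```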

More succinctly, this program automates the process of producing lyrics. It describes the lyric-production process in such a way that the computer is able to reliably follow the instructions, time after time after time.

But how is it that the computer can look at the above code, and actually execute it? After all, the computer only understands ones and zeros; pulses of high and low voltage inside electronic circuits.

In fact, in order for the computer to understand your code, it needs an interpreter. It needs the code to be translated from a string of characters, into a list of syntax tokens, into a tree of syntax nodes, into a list of instructions, and finally into their binary representation as ones and zeros.
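To make that first step concrete, here’s a toy illustration (not a real interpreter, just a one-line regular expression standing in for a tokenizer) that splits a fragment of our program into syntax tokens:

```javascript
// A toy sketch of the first translation step: turning a string of
// characters into a list of syntax tokens. Real tokenizers handle far
// more cases; this regex only recognizes names, numbers, and "=".
const source = "let i = 99";
const tokens = source.match(/[A-Za-z_]\w*|\d+|=/g);
console.log(tokens); // ["let", "i", "=", "99"]
```

A real interpreter would then arrange those tokens into a tree of syntax nodes, and walk that tree to produce instructions.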

Once upon a time, humans actually set the ones and zeros by hand. Then they got lazy, and invented programming languages, and compilers – programs that teach the computer how to translate those languages into ones and zeros.

That’s the funny thing: programming languages actually create more work for computers. They’re designed for humans, and unabashedly so.