Logs, moths and firewalls: the evolution of computing language

The language of computing comes from centuries of human work: from sailors, builders, physicians, and soldiers.

When early programmers started to describe machines that nobody had seen before, they borrowed familiar words from the physical world. This made technology easier to understand: code sounded less like mathematics and more like memory.

From the ocean to the console

Let’s begin our story among the waves of the sea.

In the seventeenth century, sailors used a piece of wood, a chip log, tied to a rope with knots spaced at regular intervals to measure a ship’s speed. When the log was thrown into the sea, the rope unrolled as the ship moved forward. One sailor watched the rope pass through his hands while another turned a sandglass that ran for about half a minute. The number of knots that slipped through during that time showed how fast the ship was moving, one knot for each nautical mile per hour. That’s why speed at sea is still measured in knots today.
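The sandglass arithmetic can be sketched in a few lines of Python. This is an illustration, not historical record: it assumes the modern nautical mile (1852 m) and the roughly half-minute glass described above.

```python
# How far apart must the knots be so that each knot passing through the
# sailor's hands during one glass-turn equals one nautical mile per hour?
NAUTICAL_MILE_M = 1852   # modern definition, in metres
GLASS_SECONDS = 30       # "about half a minute"

spacing_m = NAUTICAL_MILE_M * GLASS_SECONDS / 3600
print(f"knot spacing: {spacing_m:.2f} m")  # ≈ 15.43 m

# Counting five knots in one glass-turn therefore means a speed of
# five nautical miles per hour: five knots, by construction.
knots_counted = 5
speed_knots = knots_counted
```

The elegance of the method is that the unit conversion is built into the rope itself: the sailor never calculates, he only counts.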

They wrote those numbers in a logbook, a daily record of their journey. This habit of writing became a kind of protection against the sea’s unpredictability: a way to hold time still.

When engineers began to automate data recording in the 1960s, they reused the term log for the same reason. A log file was the ship’s diary reborn in the digital world. Behind every line of code we can spot a trace of navigation.

Architecture and the shape of software

As computers moved from laboratories into offices and houses, their vocabulary became architectural. We still open windows, build frameworks, and store files in folders. The desktop itself is a metaphor for the workplace, and a platform suggests a structure that supports what stands on top of it. These words were chosen intentionally.

In the early 1980s, designers at Xerox PARC and later Apple wanted to make computing visual and familiar. Lev Manovich (2001) explains that the window metaphor connected digital space with human perspective, just like the framed view of a room or a painting. It turned abstract data into something spatial, where users could “enter” the machine instead of merely typing commands.

Likewise, firewall came from the real world. In architecture, a firewall is a solid barrier that keeps fire from spreading between parts of a building. By the late 1980s, network engineers borrowed the same word to describe the software that stops harmful data from crossing between systems. The intent was identical: containment, safety, survival.

Living bodies and relationships

The digital world also adopted the language of biology. Computers have memory and feedback loops. They learn and communicate, terms that make machines sound alive. This metaphor came naturally, because humans understand new systems by comparing them to themselves.

The story of the bug shows how literal accidents become part of language. In 1947, engineers at Harvard University found a moth trapped inside the Mark II Aiken Relay Calculator. Grace Hopper, a member of the programming team, taped it into her logbook and wrote: "First actual case of bug being found". The joke survived and debugging became a permanent part of programming. Engineers had already used the word bug to describe mechanical faults since the days of Thomas Edison, but Hopper's team turned bug and debug into everyday terms in computing.

The IT lexicon is metaphorical par excellence.

In the 1970s, the engineers of UNIX at Bell Labs began describing their new operating system in strangely human terms: child and parent processes, orphans, and zombies. A program that generated another became a parent; one left without supervision became an orphan; a terminated process that refused to disappear lingered as a zombie. Even the command used to stop a process was called kill, an almost brutal verb that turned system administration into a miniature theatre of life and death.

These metaphors were improvised by programmers amused by the life their machines seemed to simulate.
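Yet the vocabulary names real mechanics. A minimal Python sketch (POSIX assumed) in which a parent spawns a child, "kills" it by sending a signal, and then "reaps" it so it does not linger as a zombie:

```python
import os
import signal
import subprocess
import sys

# The parent process spawns a child; every process knows its parent's PID.
child = subprocess.Popen(
    [sys.executable, "-c", "import time; time.sleep(60)"]
)
print("parent pid:", os.getpid(), "child pid:", child.pid)

# "kill" really means "send a signal"; SIGTERM politely asks the child
# to terminate.
child.send_signal(signal.SIGTERM)

# Waiting on the child "reaps" it. A dead child that is never waited on
# remains in the process table as a zombie.
child.wait()
print("child exit status:", child.returncode)
```

The drama is all in the naming: strip away parent, child, kill, and zombie, and what remains is bookkeeping in a process table.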

Earlier, at MIT’s Project MAC, researchers had already borrowed from philosophy and physics, calling their background services daemons, after Maxwell’s demon: an invisible agent regulating the unseen. By the 1980s, with the spread of networks and hacking culture, the lexicon expanded: worms, viruses, Trojan horses, ghosts, and phantoms populated the digital landscape. The gothic imagination and the mythic past returned, repurposed to describe systems that were no longer visible, physical, or easily controlled. Computers can be infected, files need to be quarantined, and it is good practice to sanitize input.
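The hygiene metaphor is literal in practice: before untrusted text is shown in a browser, it is escaped, or "sanitized", so that it cannot be executed as markup. A minimal sketch using Python's standard library:

```python
import html

# Untrusted input that, unescaped, a browser would execute as a script.
unsafe = '<script>alert("intruder")</script>'

# "Sanitizing" here means escaping the characters with special meaning
# in HTML, so the text is displayed rather than run.
safe = html.escape(unsafe)
print(safe)  # → &lt;script&gt;alert(&quot;intruder&quot;)&lt;/script&gt;
```

The infection vocabulary maps cleanly onto the procedure: suspicious matter is isolated, cleaned, and only then allowed back into the body of the page.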

Language of war and defence

The vocabulary of war entered computing with the birth of global networks. During the Cold War, ARPANET, the precursor of the Internet, grew out of US military research, and the distributed network designs that inspired it were conceived to keep communications running even after a nuclear strike. Because of that military origin, cybersecurity still speaks the language of defence: attack, shield, Trojan horse, payload. When we install firewalls, use encryption, or talk about secure zones, we echo military strategy.

These metaphors guide how we design systems: every login, every password, every wall is a reminder that the digital world has borders too. Even our habits online follow a pattern of fortification: protect, isolate, control.

Into the clouds…

The economy also shaped computing language. Words like server and client echo the logic of the marketplace: one party provides a service, the other requests it.

In the 1990s, data mining became popular, a way to describe extracting value from information. The term borrowed its image from industrial work, where miners searched for precious materials deep in the ground.

Then came the cloud. The name suggested something soft and weightless, a digital sky where data could float freely, untouched by gravity, but the image is deceptive. What we call the cloud rests firmly on the ground, in vast data centres filled with cables, servers, and cooling fans that hum like mechanical lungs. These hidden buildings burn electricity day and night, watched over by engineers who keep them alive. The metaphor makes technology sound effortless, but every byte stored and every video streamed requires real energy, real metal, and real human work.

Our language drifts upward, while the machines stay anchored to the earth.

…and back to runes

In 1996, three major companies, Intel, Ericsson, and Nokia, met to discuss how to create a common standard for a new kind of short-range radio technology. Their goal was to make different devices and industries work together more easily. During that meeting, Jim Kardach from Intel proposed the name Bluetooth, a tribute to Nordic culture, as a temporary code name. He explained:

“King Harald Bluetooth was famous for uniting Scandinavia just as we intended to unite the PC and cellular industries with a short-range wireless link.” — Bluetooth.com

The name was not meant to be permanent. The team considered two alternatives, RadioWire and PAN, but neither was cleared for use by the time of the official launch, so the code name stayed, spread quickly through the industry, and became impossible to replace. Its logo combines two runes from the Younger Futhark alphabet, Hagall (ᚼ) and Bjarkan (ᛒ), the initials of King Harald Bluetooth, merged into a single symbol. IT terminology is so rich in history and metaphor that it has inspired a literary genre of its own: code poetry, a form of writing that blends classical verse with programming syntax.

One example is this "coem", a tiny poem written in the syntax of code:

function fromWoodToWord(log) {
  const code = encode(log);
  return memory(code);
}

The function is imaginary, but its logic, and the message it carries, are real. It is both a technical process and a metaphor for how knowledge evolves. Language records movement, turns it into information, and finally preserves it as meaning.

Every time a programmer writes a line of code, every time a user opens a window or checks a log file, we are continuing an ancient conversation between the physical and the virtual.

Bibliography