The evolution of interfaces from their earliest iterations to the UX+UI we know today
The entire history of interfaces won’t fit into a single text, but maybe after this article you’ll be able to show off a little in front of your friends. To avoid quoting Wikipedia yet again, we’ll keep it simple: an interface is something through which two objects interact. These can be machines, apps, people, and devices. Even our arms and legs, eyes and ears could be considered interfaces: through them, we interact with the world around us. In this article, we’ll mostly focus on HMI, Human-Machine Interfaces (but we’ll mention the others too).
Before machinery
The industrial revolution and its consequences on interfaces
Interfaces in the sense we know them appeared during the industrial revolution. All kinds of machines were invented, and these machines needed to be operated. For that, we needed universal, understandable patterns. This was the next step of interface evolution: machines needed standardization so they could be mass-produced and so people could be taught to work them with relative ease.
The year 1804 saw the first coding, in the sense of giving a machine some sort of algorithm. The Jacquard loom became the first machine that could be programmed: an incredible breakthrough in terms of repeatable actions. The machine was programmed using punched cards.
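The idea can be sketched in modern terms. Below is a toy model (not a description of the actual loom mechanism): each card is written as a string where a hole lets the corresponding hook lift its warp thread for one pass of the shuttle, so a stack of cards encodes a repeatable weave pattern. All names here are hypothetical, purely for illustration.

```python
# Toy model of Jacquard-style punched-card control (illustrative only).
# Each card row is a string where 'o' marks a punched hole; a hole lets
# the matching hook raise its warp thread for that pass of the shuttle.

def weave(cards):
    """Return the lift pattern produced by a stack of cards:
    one list of booleans per card, True where a thread is raised."""
    rows = []
    for card in cards:
        rows.append([ch == "o" for ch in card])
    return rows

# A hypothetical 4-thread pattern: the stack of cards is the "program",
# and running it again reproduces the exact same cloth.
cards = [
    "oo..",
    ".oo.",
    "..oo",
    "o..o",
]

for row in weave(cards):
    print("".join("#" if lifted else "." for lifted in row))
```

The point of the model is the same as the loom’s: the behavior lives in the replaceable stack of cards, not in the machine, which is exactly what made the Jacquard loom programmable.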
Punched cards were the progenitors of programming, and soon enough all kinds of machines were using them, even player pianos. And in 1873 came the first iteration of the now-ubiquitous QWERTY keyboard layout.
Going digital
It would take quite some time to go through every stage of computing evolution, so let’s get to the fun part: the digital revolution. The 1980s marked a sweeping transformation from analog technology to digital. These are the human-machine interfaces that already existed by then:

- Gesture-based interfaces: the steering wheel, the joystick, and other analog ways to control planes, trains, and automobiles;
- The command line, where instructions are given through keyboard input (DOS, BIOS);
- The graphical user interface. A GUI is where functions are represented by graphical elements, the so-called WIMP: windows, icons, menus, pointers. The first GUI was developed at the Xerox Palo Alto Research Center (PARC) for the Xerox Alto computer, built in 1973. It was not a commercial product and was meant mostly for scientific research.
What’s next
Voice is used more and more in SILK interfaces (Speech, Image, Language, Knowledge). Maybe someday we’ll see neural interfaces that transmit data between neurons and machines through implants. It’s unlikely we’ll see the end of the GUI anytime soon, but technology keeps moving forward, and when a technology becomes advanced and available enough, it usually wipes its predecessor off the map.
What do you think we’ll see in ten years, twenty, a hundred? Will we still have smartphones? Will we even have screens at all?