Before the age of slick digital interfaces, a group of researchers at Xerox PARC used simple paper, scissors, and glue to prototype the future. This is the story of how low-fidelity, hands-on prototyping helped shape the intuitive graphical user interfaces we use every day, proving that deep user insight is more important than complex code.
In the early 1970s, the future of the office was being imagined in a low-slung building in Palo Alto, California. At Xerox's Palo Alto Research Center—PARC—the ambient noise wasn't the clatter of typewriters but the quiet hum of machines that were not yet personal computers, though they were getting close.

A group of researchers had been tasked with creating "the office of the future," and they had begun work on a word processor named Gypsy. It was meant to be intuitive: a system that showed you on the screen exactly what you would get on the printed page. They called this concept "What You See Is What You Get," or WYSIWYG. The problem was that the software for this revolutionary idea didn't exist yet, and the hardware, the Alto computer, was a precious, scarce resource.

So two of the lead designers, Tim Mott and Larry Tesler, faced a dilemma: how do you test a user interface that exists only in your mind? Their answer was not to write a single line of code. Instead, they turned to cardboard, paper, and scissors. They built a "computer" out of the most ordinary office supplies. They mocked up a keyboard and a screen on a large sheet of cardboard, and then, using small pieces of paper, they created menus, icons, and snippets of text that could be moved around to simulate the digital experience.
The tests were a kind of performance art. A secretary from Xerox, a woman with no computer experience, was invited to sit in front of the cardboard computer. Mott and Tesler gave her a simple task: retype a document. As she "typed" on the paper keyboard, another researcher, hidden behind the cardboard, acted as the "processor." He listened to her commands and then, with the meticulous slowness of a human CPU, slid new pieces of paper onto the "screen" to show the results of her actions. If she wanted to delete a word, he physically removed the piece of paper with that word on it and replaced it with a blank space.

What they discovered was profound. The secretary, unfamiliar with the logic of programming, expected the machine to understand her intentions. When she wanted to replace a word, she would first type the new word and *then* highlight the word to be deleted. To a programmer, this was backward. But to a novice, it was perfectly logical: you wouldn't remove the old word until you had its replacement ready.

In that moment, Mott and Tesler realized that the most intuitive interface was not the one that was most logically efficient for the machine, but the one that best mirrored the user's mental model. This insight led directly to the development of "cut, copy, and paste" and the modeless editing that defines how we interact with computers to this day. The code could be written later; the human logic had to be understood first. The future wasn't built from silicon and circuits, but from paper and empathy.