The long, connected history of great UX design

How human-centered design has driven tech and product innovation for decades

Todd Krieger, Senior Editor at Freshworks

Sep 07, 2023 · 6 min read

Software may be eating the world, but design is shaping it. While the technological advances behind self-driving cars (LIDAR!), smart doorbells (facial recognition!), and other gizmos and apps tend to get most of our attention, the design work behind great products, software, and services is what ensures that we humans can and want to use them. 

The terms of art that describe how people-friendly products are designed have changed a lot over the past century, but they all aim for the holy grail of great end-user experience. 

George Eastman’s motto at Kodak—“You press the button, we do the rest”—was certainly ahead of its time, as was his Brownie camera, introduced in 1900. The mind-blowing features (and ease of use) of today’s smartphone cameras sprang from the same ethos.

“The idea of human-centered design is to look at all aspects of somebody’s life and needs,” says Robert Fabricant, co-author of the book “User Friendly: How the Hidden Rules of Design Are Changing the Way We Live, Work, and Play.” That was as true at the turn of the 20th century, with its newly minted factories and assembly-line automobiles, as it is today, with software shaping so much of our daily experience.

In this timeline piece for The Works—using Fabricant’s book and additional research as our guide—we trace some of the key moments and trends in the history of innovative design. 

‘Harmonizing’ factory floors and assembly lines

In the early 1900s, industrial bosses often sent workers to the factory floor to hammer, weld, and rivet goods with little to no oversight—and sometimes even without proper tools or training. In his 1911 monograph, “The Principles of Scientific Management,” Frederick Taylor proposed scrapping this ham-fisted approach and instead “harmonizing human-machine interactions” in a way that would improve worker productivity and allow employees to share in the resulting prosperity.

Taylor argued that factory owners needed to refashion the machines, tools, and processes of industrial labor and start scientifically selecting and training workers for specific tasks. Not only did this provide an early template for organizational design and modern management principles, it set the stage for Henry Ford’s groundbreaking use of the moving assembly line and the division of labor when mass production of the Model T debuted in 1913.

Understanding human factors to improve functional design 

Piloting a plane is a taxing, resource-intensive activity even in the best of conditions. In a war zone, where situational awareness is at a premium, the slightest error at the controls can mean death. In the aftermath of World War II, researchers sought a better understanding of how pilots related to their complex surroundings, hoping to explain why perfectly good airplanes crashed so often.

In 1947, Paul Fitts, an experimental psychology researcher, suggested that hundreds of crashes previously blamed on pilot error were in fact caused by design flaws. As Fabricant relates in his book, one notorious flaw belonged to the B-17 Flying Fortress, the standard-bearer of U.S. bombing missions. The toggle switches for the plane’s landing gear and wing flaps looked identical and sat so close together on the cockpit console that battle-weary pilots returning from a sortie often flipped the wrong switch, retracting the landing gear and dropping the plane onto the runway.

The design fix: give each control a distinct shape (a small wheel for the landing gear, a flap-shaped wedge for the flaps) so the pilot could tell them apart by touch alone, without relying on sight. This slight modification lives on in every wing-flap and landing-gear control today, and its influence ripples through phone and auto design as well.

Making computers easier to use 

Until the 1980s, using a computer required a master’s degree in self-restraint. Telling the computer what to do meant typing tedious, frustrating commands into text-based command-line interfaces (CLIs). Enter the Apple Macintosh in 1984, the first mass-market PC to popularize the modest computer mouse and the graphical user interface (GUI).

Like so many innovations, this did not happen overnight. The GUI was built on work that had been introduced as far back as 1968 in “The Mother of All Demos” by Doug Engelbart at the Stanford Research Institute.

Engelbart had begun toying with the idea of a computer “mouse” as early as 1963. Yet it was Apple’s genius in presenting the GUI—with the icons, menus, and buttons we now take for granted—that altered the way people interact with machines. It made computers approachable, indispensable tools for business, science, and art.

Making the internet accessible to all 

Prior to 1993, finding information on the internet required academic search tools like Archie and Gopher. For government workers, academics, and scientists, those cumbersome tools were fine. But the launch that year of Mosaic, the first widely adopted graphical web browser, changed that. Hatched by researchers at the University of Illinois, it displayed images inline with text, building on the visual appeal established by GUIs. It also popularized the hyperlink—a throwback to Engelbart—clickable, underlined text that made it simple to jump between web pages.

Less than 18 months after it appeared, Mosaic became the “browser of choice” for more than 1 million users. Although discontinued in 1997, Mosaic laid the foundation for the browser or app you’re using to read this piece.

The dawn of modern UX

While working at Apple Computer in the early 1990s, cognitive psychologist Donald Norman began looking at a user’s entire experience of engaging with an Apple product. He didn’t limit his work to how users felt when sitting in front of a screen. He studied the user’s relationship with the product from the moment they stepped into a store, through buying a device, unpacking it at home, and powering it up, to finally getting the thing going.

For the fastidious Norman, even that wasn’t enough. To learn more, he helped set up a team called the User Experience Architect’s Office. It was at that moment (in 1993) that the now-ubiquitous term “user experience” (UX) was born. It has since become synonymous with the designers and design elements that make working with computers productive and pleasurable today.

Opening the world of touch 

The late 1990s witnessed the emergence of an unlikely handheld hero: the Palm Pilot personal digital assistant (PDA). It was a battery-powered electronic planner that featured an LCD screen, a plastic stylus, handwriting recognition, and one-button synchronization with PCs.

Other PDAs were already out at the time (Apple had infamously failed with its Newton device), but the Palm won over businesspeople and everyday consumers with its stylus and its ability to tap through apps, including contacts, notes, to-do lists, calendars, and games.

By the 2000s, newer devices unseated the Palm. These included RIM’s BlackBerry, whose physical QWERTY keyboard sparked a revolution in mobile texting and emailing. In 2007, Apple moved the keyboard onto the touchscreen of its new iPhone, setting the stage for how users interact with almost every type of digital consumer product today, from washing machines to car dashboards.

A new approach: ‘design thinking’ 

Pioneered by the landmark design firm IDEO, which was known for its work on the first Apple mouse, the concept of design thinking emerged in the early 2000s as a more holistic way to develop products.

“Design thinking is both a system—involving prototyping, critique, and iteration—and a set of processes,” says Nathan Shedroff, author, professor, and founder of the MBA program in design at California College of the Arts. “There are many ways to solve problems. There’s the scientific method, integrative systemic thinking, but design is different. Design thinking deconstructs the system, breaks it apart, and develops a new model from scratch.” 

One oft-cited example is IDEO’s development of the Swiffer, a deceptively simple product that changed the way many people clean their floors. “Where design thinking excels is in looking at the combination of things,” explains Shedroff. “IDEO looked at home cleaning, broke it apart, talked to users. They built back up this solution that has been wildly popular. And in being successful, it edged out market share from those other solutions.” 

What’s next 

The growth of generative AI and other forms of AI will no doubt reshape the rules of design, and with them how we live, work, and play, in the years to come. This next chapter may prove to be the most vital: how design influences AI will dictate how AI, in turn, shapes the world.

We want to hear from you! Please send us your feedback, and get informed about exciting updates from The Works. Drop us a line: theworks@freshworks.com.