
It’s a common joke among millennials and older that Gen Z and younger don’t know what the save icon is. I haven’t seen any data that validate this claim, but I have met a few people who have never seen a floppy disk. And why would they? It is a relic of a time without ubiquitous computer networks. Now, it is just a symbol, like the two vertical lines that everyone knows mean “pause”.
The UI designer who first used a floppy disk icon (possibly Susan Kare, formerly of Apple) was using an old human factors trick: using digital elements that mimic the appearance and function of real-world objects. This type of symbol, called a skeuomorph, is widely used in user interface designs. We press “buttons” that aren’t buttons on our phones and transfer “files” that aren’t files.
In many ways, our user interfaces are haunted by the ghosts of our past material culture. A button is a Roman-era technology for holding your pants up. It is also an Industrial Revolution-era technology for starting a machine. It is also an Information Age technology for making choices about application behavior. The concept of a button has served humanity well on the road from the classical era to the application era.
As we stand on the precipice of a new era – the AI era – I find myself asking, “has the concept of a button run its course?” It took decades for us to settle into the computer interface metaphors we use every day. We instantly recognize menu bars, windows, text areas and hundreds of other controls that would have baffled people just 50 years ago. Each one of these controls was a conscious choice by one person that got copied by other designers until they became standards. Will this process of borrowing ideas from the material culture of the past continue? Will some future user of an AI system leverage the idea of a computer without ever having seen one?
In the film “Star Trek IV: The Voyage Home”, the crew of the Enterprise travels back in time to the mid-1980s. Mr. Scott, the galaxy’s greatest starship engineer, finds himself in the awkward position of using a 1986-vintage Macintosh Plus. He simply starts talking to it. When this doesn’t work, Dr. McCoy helpfully hands him the mouse, which he holds up like a microphone, repeating “hello, computer.”
The audience was in on the joke. Since Star Trek debuted in the 1960s, we had watched enviously as Mr. Scott simply talked to his 23rd-century computer and got flawless results every time. When he figured out that he needed to use the Mac’s keyboard and muttered “how quaint”, we all felt a little satisfied. For a moment, we got to acknowledge that computers were hard to use, and that the user interface was one of the main reasons why.

Even in the 1960s, when computers were locked in sterile data centers, everybody already seemed to know that talking to a computer and being understood was how we wanted to interact with them. HAL had a voice interface in “2001: A Space Odyssey”. The robot in “Lost in Space” had a voice interface. Talking and listening is simply the first thing that comes to mind when people dream about computers they WANT to use.
Thanks to the hard work of dreamers who never gave up through many AI winters, conversational UI is almost here. As technologists working on AI-driven applications for users today, we have to solve the problems posed by 1960s science fiction. Mr. Scott disdained the keyboard, but when he returns to his starship, there are buttons everywhere! It seems that every sci-fi computer with a voice interface also has screens and buttons. Why is it that when humans dream of future computers, they have both voices and buttons? What does that say about how people want to interact with them? Shouldn’t I be able to fly the Enterprise just by telling the computer “go up into space”?
For this and many other reasons, I’m not sure it’s the end of the road for the button. Buttons seem to inhabit our imaginations. Maybe they are simply symbols of control, a reassurance that you can always switch the machine off.
As the CTO of Xficient, my teams are tackling human-AI interfaces in healthcare. The hypothetical questions of the 1960s are here for us to solve today. It is up to us and our industry colleagues to figure out whether the button has a future. I suspect it will take at least as long for us to settle on standard human-AI interfaces as it did for computer interfaces. Buttons are just a microcosm of the problem. How should you interact with AI to manipulate your data? Are files even relevant anymore? The questions are endless and the stakes are high.
When it comes to critical healthcare tasks, there needs to be a degree of trust between AI co-workers and healthcare professionals that the vast majority of AI applications can’t yet produce. This is partly due to the reliability of the models, but it is also due to the insufficiency of our current UI patterns and metaphors. For example, if an AI recommends that a procedure require pre-authorization under a healthcare plan, how does the human validate that the recommendation is consistent with medical policy? A patient’s timely access to care may depend on the outcome of that human-AI interaction. Our mission is to develop technology that maximally leverages what humans and machines are good at together.
The road is long, but I’m confident that we will get there and that Xficient will be a part of it. As we grapple with human-AI interfaces daily, we look forward to the day when the bewildering array of buttons, menus, and windows of today seem “quaint”.

About Scott Graves
I’m a technology leader interested in all aspects of building high-performance software development organizations. I’ve spent my whole career mastering the art of building and leading development teams. Now I want to share what I’ve learned and push the state of the art forward.
I believe that solving hard technology problems starts with people. Sometimes that means the people who make the technology and sometimes it’s people who use it. We are toolmakers, and tools are for making life easier for our fellow humans. It’s fundamentally an act of love and compassion. The rest is just details.