Start me up

The digital economy of 1995 hadn’t reached anywhere near today’s all-encompassing levels of cultural saturation, but that year still featured one of the splashiest consumer technology rollouts in history. This one wasn’t orchestrated by Apple though, as Steve Jobs had not yet returned to a company that was barely clinging to life.1

No, this product launch costing hundreds of millions of dollars and generating several multiples of that in return was pulled off by Microsoft, led by Bill Gates and crew.2 The company unleashed a major transformation of personal computing through an operating system with the utilitarian name of Windows 95.3 The event was a phenomenon, with shoppers lining up outside of computer stores like sneakerheads waiting for a fresh drop of Yeezys.4

An operating system was the unglamorous backbone of computing, making it a strange candidate for such intense fandom. But it had been several years since the last Windows iteration, and the new version marked a radical change in usability, flexing the potential of the graphical interface and introducing elements like the Start button that persist to this day.5

Users were eager to adopt the software. They spent many hours getting through the installation process and learning its advanced features. Corporate IT departments laboriously rolled out the package to employees’ machines, interrupting their work and requiring training on how to take advantage of the new functionality.

Microsoft would issue a few more Windows versions tagged to their release year, but none engendered the same level of enthusiasm. Eventually the idea of these big bang upgrades faded from the public consciousness, aided by one major factor: the internet.

Advances in connectivity finally obviated the rationale for massive, one-shot changes. Instead of waiting until the pent-up backlog of features was large enough to merit the disruption, software came to be updated regularly and seamlessly in the background. In the process, functionality would evolve almost imperceptibly.6 Under this continuous model, improvement and adaptation to current needs became an expected feature.

Unlike Windows, some products were born in this environment of continuous improvement. In 2004 an internal project at Google was released to the wider world. It was called Gmail, and it would quickly ascend to prominence at the expense of former leaders like Microsoft.7

Gmail introduced some radical innovations. It allowed for virtually unlimited storage, eliminating the effort of diligently sifting through messages to stay under thresholds. A search function meant that careful filing within virtual folders was no longer relevant—about time, given this concept was an electronic vestige from the physical world of desks and offices.

But this was all done without a singular rollout or major training. It had the advantage of being based on the internet from the start, with a gentler learning curve and no need to install software.

A comparison between Gmail’s original public beta and today’s version reveals radical differences, but there was no discrete moment when the shift happened.8 Each update caused minimal discomfort, and most users were able to quickly get past the initial unfamiliarity.9 In contrast to Windows, there are no large segments of the user base stuck on older versions of Gmail, unwilling or unable to make the transition.

Keep moving

Forcing big changes all at once can be more challenging than finding ways to integrate growth and development as a regular feature of life. A software upgrade may be a mild inconvenience, but there are other areas where the stakes are higher.

Take education, where the traditional model is extremely concentrated. The general pattern seems to be to first learn everything you need to know about the world broadly, and your profession specifically, during a period of 15 or 20 years, and then spend the next 50 or so drawing down that educational capital.

This is problematic when the state of the art is not something fixed that can be referred to indefinitely. For example, medicine is rife with outdated practices that fail to account for the latest discoveries. Protocols are followed or drugs are prescribed for years or even decades after the realization that they don’t work as expected or even cause harm. Innovations similarly take a long time to propagate, with research suggesting that 15 years elapse before even half of the American population benefits from certain important discoveries.

Healthcare practitioners have continuing education requirements as a safeguard against these issues, but they are clearly not enough, and inertia is a primary culprit. Hardening a worldview around whatever happens to be current during one’s 20s isn’t the optimal approach, and it makes the transition harder when it inevitably arrives. That’s not just true for physicians.

Much as the internet altered software development cycles, technology has also led to a proliferation of learning options. The resources of a university classroom are now available to non-students anywhere in the world. Those interested in brushing up on marketable skills can find tailored programs in major cities that are starting to disaggregate the front-loaded approach to education.

The model of software used to be one of stability punctuated by step changes to a new version. Then the internet happened, and we didn’t have to wait years to get massive upgrades all at once. Even auto manufacturers like Tesla have been able to move to a world of continuous improvement to their products, aided by networks that enable constant connection and feedback.

Slow and steady wins the race

Onward and upward

Discontinuous change also increases the chances that some will be left behind, as the gap with the status quo seems too great. This can compound over time, taking them further from the cutting edge and making the gap even harder to close.

Athletes who compete in cycling or track races understand this phenomenon intuitively. The pack will try to stay on pace with the lead competitor, because a breakaway rider can quickly become uncatchable even with an all-out chasing effort. Those who hang back too long waiting for the optimal moment can find themselves out of the race entirely.

Spreading growth out into manageable increments also makes it more likely to stick. The capacity to absorb new information is limited, and only so much change can happen at once. The parable of the tortoise and the hare codifies this principle in the form of folk wisdom: continuous, steady progress is preferable to bursts of activity and enthusiasm punctuated by torpor, and can lead to a winning result.

Move from a model where career or life transformations are one-off, to a framework where growth is integrated into what you do. Cultivate a low-level unease that keeps you alert, instead of a comfort that lulls you into falling further behind. Take the position that your education did not arbitrarily end with the conferral of a degree, and that learning is instead a nuanced, lifelong process.

Instead of stagnating until a major leap is inevitable, consider ways to change and learn consistently. Even if your role requires repetition or consistency, always look for what could be improved. If your answer is “nothing,” that’s a strong sign that you should move elsewhere—there is a slight possibility that you have achieved perfection, but it’s far more likely that complacency is taking over. The gap between where you are and the line of potential is growing, and soon it might feel insurmountable.

Consistent refinement can make you or your organization more effective, and help smooth the inevitable transitions. New uses of technology can help avoid the need for clunky or disruptive interventions.

How are you slowly making large changes?


References

Mashable looks back at the hype surrounding Windows 95’s launch.

Atul Gawande’s article on health care standardization in the New Yorker includes the statistic on how long it takes for innovations to propagate.

  1. Apple soon needed a cash infusion to stay in business, and in a Shakespearean turn of events its unlikely rescuer turned out to be Microsoft, which had no inkling that it was propping up the company that would eventually blow it out of the mobile and media markets.
  2. The spectacle of Gates and top lieutenants dancing on stage to the Rolling Stones’ “Start Me Up” at the launch event is… interesting. Let’s just say they danced like no one was watching, except a whole lot of people were watching.
  3. In addition to easy identification it has the advantage of immediate obsolescence. As soon as the calendar ticks over to the next year you know you have outdated goods and start thinking about a newer version. From a marketing perspective, this is a feature, not a bug.
  4. At the time, software was sold on things called “disks”, which had to be physically acquired from places called “stores”. And Yeezys are Kanye’s shoe collaboration with Adidas, but you already knew that.
  5. Apple aficionados claim, not without warrant, that Microsoft was merely aping features that the Macintosh long had. Apple’s own OS was named “System 7” and had launched a few years earlier, albeit with a tiny fraction of the hype surrounding Windows 95.
  6. Except when you get the reminder from your computer telling you a restart is necessary to install system updates, which you then snooze like an unwanted alarm.
  7. Which at one point offered the quadruply-branded Microsoft Windows Live Hotmail. I mean, just pick one horse and ride it.
  8. Fun fact: the original owner of the Gmail domain was a branded service featuring the cartoon cat Garfield. If he at least got some Google shares in exchange that should be enough to keep him in lasagna for the rest of his nine lives.
  9. Except for those of you still using those Yahoo accounts.