As the inventor of copy and paste dies, here are other computing innovations we take for granted

He's also credited by some as having coined the phrase "user-friendly."


Larry Tesler invented cut, copy and paste, and is credited by some with coining the phrase "user-friendly".


His career in the technology sector spanned 50 years, during which he witnessed many innovations that are now part of our daily lives.

In 1961, Larry Tesler went to study at Stanford University, which itself has been pivotal to the growth of Silicon Valley. It's where Bill Hewlett and Dave Packard met before founding the company that bears their name; Larry Page and Sergey Brin, the founders of Google, studied there too, as did Elon Musk.

Tesler worked at some of the biggest names in Silicon Valley: Apple, Xerox, and Yahoo. He also worked briefly at Amazon.

The pioneering computer scientist believed passionately that computers needed to be easy to use, and is credited by some as having coined the phrase "user-friendly".

In the 1970s, he developed the cut/copy and paste function that is now so widely used that it's hard to imagine not being able to Ctrl-X/Ctrl-C and Ctrl-V.

Here are some of the biggest innovations in computing the world has seen since Tesler first started at Stanford...

1. Mouse tales

One of the other big computing breakthroughs of the 1970s took place at the Xerox Palo Alto Research Center (PARC), where Tesler worked. It was the mouse. Although the initial concept for the mouse dates back to the work of Douglas Engelbart in the 1960s, the device was refined at Xerox, where the first ball-mouse was developed.

The mouse revolutionized the way people interact with computers, getting away from the purely text-driven approach and ushering in the era of the graphical user interface that we are all familiar with today.

2. You've got mail

Email was invented in the mid-1960s, too, and has become one of the most ubiquitous features of modern life. Some would say a little too ubiquitous.

Every minute of every day, 188 million emails are sent, and more than half of them are spam. In the early 1970s, when the @ symbol was first integrated into email addressing protocols, the only people with access to an email mailbox were users of the Advanced Research Projects Agency Network (ARPANET). That was the first wide-area network, connecting dozens of universities across the United States.

3. On the move

The chances are you're reading this on something other than a desktop. Everyone takes for granted the ability to take their computer with them, whether it's a laptop, a tablet or even their smartphone.

The very first vision for a mobile computer dates back to the 1970s, when Alan Kay, a researcher at Xerox PARC, had an idea for something he called the Dynabook. Apart from a cardboard mock-up, nothing came of it. But in 1981, the world was introduced to the Osborne 1 – the first portable computer. It had a 13cm screen that could only display 52 characters on each line of text. If you wanted one, it would have set you back $1,795.

It was basic by any modern standards, but the Osborne 1 sounded the starting pistol for the race to produce better mobile computers. By the end of the 1980s, several brands were producing their own, including Kyocera, Epson and Apple.

This was a period of innovation that saw the very first touchpad. It appeared on the Gavilan SC, which launched in 1983 and was the first computer to be marketed as a laptop.

The 1990s was the boom-decade for laptops. The chip-maker Intel designed the first processor specifically for mobile devices and many big-name computer makers started to produce laptops based on mass-produced components, such as screens, processors and circuit boards.

And then, a little over 10 years ago, the world was introduced to the ultimate in mobile computing devices – the smartphone as we know it now. There are currently more than 3 billion smartphones around the world, their user-friendliness being one of the key factors in their tremendous success.

4. A super-connected future

The next big wave in technology is already lapping at our ankles: fifth-generation (5G) mobile technology, which is predicted to generate around $3.6 trillion of economic output and create 22.3 million jobs by 2035.

It will play an important role in the growth of smart cities and the Fourth Industrial Revolution. It could even assist progress toward some of the United Nations' Sustainable Development Goals. And over the next five years, investments in 5G networks are likely to reach $1 trillion.

The UN's Sustainable Development Goal 12 calls for responsible consumption and production, to reduce waste and preserve resources. 5G is already helping cut waste in smart factories. Its role in managing smart cities, where sensors collect data on the daily hustle and bustle of city life, will also help reduce congestion and emissions by keeping traffic moving.

It also has the potential to revolutionize the provision of multiple vital services such as education and healthcare by connecting people to one another, and to devices that can gather important information. Clinicians will be able to assess a person's vital signs – heart rate, blood pressure, breathing and more – in real-time, no matter how far from the patient they happen to be.

Reprinted with permission of the World Economic Forum. Read the original article.
