7 Tech Advancements from the 70s That Changed the World


The 70s was the decade technology really became consumer technology. Inventions born in the 50s and 60s became products filling store shelves and catalogues in the 70s. The Consumer Electronics Show (CES) even moved to a twice-a-year format amid the boom of gadgets and consumer-facing electronics hitting the market.

From the cell phone to the personal computer itself, most of the technology we use today finds its common ancestor in a product sold in the 1970s. So let’s look back at the seven big tech advancements from the 1970s that completely changed the world:

1. The Floppy Disk


The floppy disk may be largely obsolete at this point, but its legacy is huge. This very article is being written in Microsoft Word, and the document has been saved by clicking that floppy disk symbol in the top left corner. We’re not using floppy disks anymore, but we haven’t forgotten about them.

The small square of plastic long pre-dates the likes of USB sticks, with its roots tracing back to the late 1960s, when IBM began tinkering with the idea.

The first commercially sold floppy disks hit the market in the early 70s, sold by IBM and Memorex, and were 8” in diameter. IBM initially labelled the product the Type 1 Diskette, but the term “floppy disk” caught on in the press and stuck for decades to come.

It was in the mid-to-late 70s that the floppy disk format started to shrink. Now-defunct computer maker Shugart Associates, led by Alan Shugart, who had previously worked on disk storage at IBM and Memorex, released a 5¼” version, and the price came down even though the disks could only hold about 90 to 100 KB of data.

By the late 1970s, Apple had released its own floppies boasting 256 KB of storage, and in 1978 Tandon released a double-sided floppy disk that could hold 360 KB. But it wasn’t until the ‘80s that we were introduced to the 3½” floppy disk we all know and love.

The ability to write data to an external disk was truly transformative. Moving data so easily, whether via a USB stick or an external hard drive, is something we take for granted now, but the opportunities presented by the floppy disk format were staggering, even if the amounts of data in question are minuscule by today’s standards.

Floppy disks’ heyday didn’t really come until the 80s and 90s, but the products developed and released in the 70s were vital cornerstones in computing.

You’d be forgiven for thinking that floppy disks have gone entirely the way of the dodo, but you wouldn’t be 100% correct. A 2014 edition of 60 Minutes found that the Air Force was still using 8” floppy disks from the 1970s to help operate ballistic missile systems. The website FloppyDisk.com continues to sell 5¼” and 3½” floppy disks to the few remaining faithful, and you can still find disks and external disk readers on Amazon. —Jonathan Keane

2. Portable Cassette Player


Philips Electronics, the Dutch corporation founded by Anton and Gerard Philips, invented the first cassette for audio storage in 1962. Cassettes came in two forms: tapes that already contained pre-recorded content, and “blank” cassettes that were fully recordable. When the cassette was developed, there were originally only three tapes in the world, made so Philips could get the format right, with the BASF PES-18 being the first tape for the compact cassette. A cassette uses a magnetic tape recording format: two miniature spools, held inside a protective plastic shell, pass the magnetically coated, polyester-type plastic film between them as the tape winds.

The cassette holds either two stereo pairs of tracks (four in total) or two monaural analog audio tracks. One stereo pair or one monophonic track is played or recorded while the tape moves in one direction, with the second pair running in the opposite direction. Initially, the compact cassette format offered fairly poor sound quality, but technology soon improved: advances in noise reduction, the ability to play stereo tapes, and new tape formulations assured higher-quality sound from compact cassettes.
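The track layout described above can be sketched as a small data structure. This is purely illustrative; the track numbering and field names below are assumptions for the sketch, not part of any cassette specification:

```python
# Illustrative model of the compact cassette's four-track stereo layout:
# side A plays one stereo pair while the tape moves forward, and side B
# plays the other pair while the tape moves the opposite way.
CASSETTE_TRACKS = {
    "A": {"left": 1, "right": 2, "direction": "forward"},
    "B": {"left": 4, "right": 3, "direction": "reverse"},
}

def active_tracks(side: str) -> tuple[int, int]:
    """Return the (left, right) stereo track pair heard on a given side."""
    layout = CASSETTE_TRACKS[side]
    return layout["left"], layout["right"]

print(active_tracks("A"))  # → (1, 2)
print(active_tracks("B"))  # → (4, 3)
```

Flipping the cassette over is what swaps both the active pair and the direction of tape travel, which is why a tape recorded on side A plays gibberish if read in the other direction.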

The first cassette players were simple, mono record-and-playback units that required a dynamic microphone. Stereo recorders eventually evolved into cassette decks, and Hi-Fi cassette decks often didn’t have built-in speakers; they were big, bulky pieces of equipment that didn’t sound all that good. Then Sony came along and fixed everything.

In the 1970s, Sony was basically the king of well-designed, miniature music products, and in 1979 it revolutionized the market with the release of the first portable music system. Though it wasn’t a huge engineering innovation (magnetic cassette technology had been around since 1963), the TPS-L2 Walkman cassette player was small, just slightly larger than an actual cassette tape and definitely smaller than an 8-track cartridge player. The self-contained portable music system came with lightweight headphones, played cassette tapes and was capable of Hi-Fi stereo sound, so, needless to say, everyone wanted a Walkman when it came out.

The biggest impact the portable cassette player had was how it changed the way we listen to music. Previously, people had in-home devices or 8-track players in their cars, but the Sony Walkman changed our listening habits by allowing people to carry recorded music with them and listen to it through lightweight headphones. Obviously, in 2016, almost no one carries a Walkman anymore (unless you happen to be a subway performer), but the Walkman was the first piece of portable music technology, and it eventually led to the smaller, more modern devices we know and love today.—Isabel Thottam

3. The All-In-One Personal Computer


The history of advances in computing goes back decades, but personal computers, or “microcomputers,” didn’t get their first boom until the 1970s. The original Apple I was birthed in Steve Wozniak’s garage, based on what he was learning and experimenting with at the Homebrew Computer Club. It was in this club of enthusiasts that the personal computer would move from hobby to product.

The Apple I sold about 200 units in 1976, but the first successfully mass-marketed personal computer was the Commodore PET, a computer adopted in both Canada and the United States. Commodore was just another calculator company in the early 1970s until its lead electrical engineer was shown a prototype of the Apple II by the two Steves of Apple.

Although Commodore turned down Steve Jobs’ offer to sell it the Apple II, it went on to beat Apple to release with the Commodore PET, an all-in-one computer that included the MOS Technology 6502 microprocessor, 4 KB of 8-bit RAM, a built-in monochrome monitor, and a sheet metal case. Though the PET was first and had some moderate market success, the Apple II (which came out later that year) was personal computing’s first big hit. The color graphics were a big draw, as was its more accessible setup. The Apple II sold for $1,298, which comes out to a shocking $5,069 when adjusted for inflation.
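As a rough check on that inflation figure, the adjustment is just a ratio of consumer price indices. The CPI values below are approximate assumptions chosen for illustration (not official statistics), so the result lands near, but not exactly on, the article’s $5,069:

```python
# Inflation adjustment as a ratio of price indices.
# These CPI values are rough assumptions for illustration only.
CPI_1977 = 60.6    # approximate U.S. CPI when the Apple II launched
CPI_2016 = 240.0   # approximate U.S. CPI at the article's time of writing

def adjust_for_inflation(price: float, cpi_then: float, cpi_now: float) -> float:
    """Scale a historical price by the ratio of the two index values."""
    return price * (cpi_now / cpi_then)

# $1,298 in 1977 dollars comes out to roughly $5,100 at these assumed indices,
# in the same ballpark as the figure quoted above.
print(round(adjust_for_inflation(1298, CPI_1977, CPI_2016)))
```

The exact answer depends on which index and which months you pick, which is why published inflation-adjusted figures vary by a few percent.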

The real innovation behind the first personal computers, though, was the idea of an all-in-one product that the average family might want to have in their home. However, that vision wouldn’t fully come to pass until the first user-friendly products, such as the IBM Personal Computer, the Macintosh, and the Windows platform, arrived in the 1980s. Regardless, there is arguably no technological advancement in the past fifty years as monumental as the invention of the personal computer.—Luke Larsen

4. The Cell Phone



The original cell phone, the Motorola DynaTAC 8000X, first hit the U.S. market in 1983. By today’s standards, the DynaTAC was an enormous beast. It cost an absurd $3,995 (roughly $10,000 today), took ten hours to charge and offered users just 30 minutes of call time. The final model accepted by the FCC was ten inches tall, not including the antenna, which added another four inches; it weighed 28 ounces (nearly 800 grams) and had a total of 19 buttons. It was a far cry from the svelte, aluminum-cased slates we carry around in our pockets today.

Many people called it a “brick phone” thanks to its resemblance to a standard, clay-fired brick and over the years it became something of a comedy prop, a way for us to measure how far we’ve come by looking at the apparent silliness of the first attempt.

But in 1973, when Motorola engineer, executive and “Father of the Cell Phone” Martin Cooper made the first truly mobile cellular phone call with the DynaTAC prototype while walking down Sixth Avenue in New York City, the moment was far from silly. It was as monumental as Steve Jobs on stage at Moscone West in 2007, when Apple reinvented the phone.

It wasn’t as flashy as the unveiling of the iPhone, nor as immediately world changing, but the Motorola team led by Cooper that engineered the original DynaTAC shifted the idea of mobile telephony in a drastic way. Before the DynaTAC, the idea of “mobile phones” wasn’t really all that mobile. The first big idea came in the late ‘60s from AT&T, which at the time was the household name in American telephones. AT&T’s plan, having developed technology that could pass phone calls between various towers as users passed by, was to put a phone in your car. It was an important step, but it wasn’t the right step to Cooper.

He believed that communication needed to be personal, that users should be able to move about the world but still be reachable at any given time. AT&T’s plan was mobile, but you were still tied to a machine that weighed well over a ton. If not a car phone, the next best option was the briefcase phone, a handset lugged around with its hefty case, making it a chore to carry. Motorola set out to change that in 1972 when it put all its efforts into building a handheld, portable cell phone. Within three months, Cooper and his team had a working prototype.

On April 3, 1973, Cooper made the first call, famously to his top competitor Joel Engel at AT&T, and the direction of mobile telephony changed forever. It took the industry time to develop: a full ten years passed between that first call and the DynaTAC hitting shelves. But once it did, cell phones rapidly grew more sophisticated, right up until January 9, 2007, the biggest date in the mobile phone industry. The unveiling of the iPhone gave rise to the idea of computers in our pockets, high-powered machines that could do things previously thought unbelievable. But it was the DynaTAC and the team at Motorola that gave rise to the idea of phones being personal, and phone numbers being tied to a specific person rather than a place.

In the current landscape that sees phones outnumber people on the Earth and the word “phone” synonymous with “smartphone”, we owe a lot to Jobs and Apple for reinventing the cell phone, but they owe just as much to the engineers at Motorola for inventing it in the first place.—Eric Walters

5. The VCR


Methods for recording television existed before the 70s, but they were expensive and technologically complex, and different solutions used incompatible tape formats. The technology in Videocassette Recorders (VCRs) was highly mechanical, and even in their later years, VCRs weren’t much smaller than a briefcase. It wasn’t until the mid-70s that Japanese manufacturers made VCRs reliable and affordable. Once they started entering people’s homes, they fundamentally changed the way people consumed both television and movies.

No longer did we need to make ourselves available for the prime time schedule of The Incredible Hulk. Now, it was within our power to record it and watch at our own convenience. Once they were cheap and simple enough for home use, and could record two or three episodes on a single tape, the industry exploded.

In the early 80s, the movie industry, led by the Motion Picture Association of America (MPAA), challenged the sale of VCRs in court. During the case, the VCR was likened to the Boston Strangler in how brutally it would kill the movie industry. In the end, the MPAA lost. If the story sounds familiar, that’s because the same thing happened to the music industry when the RIAA fought to shut down Napster. Today, the music industry makes billions of dollars from the sale of downloadable and streaming media.

The MPAA isn’t wrong to pursue illicit methods of sharing copyrighted material, but it was wrong about the VCR destroying the movie industry. Shortly after the case against consumer VCR sales, the movie industry figured out how to turn decades of back catalogue into highly profitable video sales.

Home video rentals further expanded home entertainment options to include cinema. Binge watching had a place in living rooms long before Netflix, as a half-dozen movies could be watched back-to-back over a weekend. Encyclopedic knowledge of films (both good and bad) could be had by anyone with a video rental membership. VCRs and video stores came to influence the self-referential pop culture of the 90s, as directors like Quentin Tarantino and Kevin Smith drew their filmmaking inspiration from watching hundreds of movies.

Even after gaining popularity, there were two incompatible tape formats: VHS and Betamax. The latter offered higher quality, but patent-owner Sony held firmly onto licensing rights for players, keeping them expensive. VHS became the dominant format in part because it could hold hours of content on a single tape; by the last days of VHS, it was possible to record six or more hours on a cassette.

Other formats like Laserdisc and VCD tried to conquer the industry with greater convenience or picture quality, but it wasn’t until the turn of the millennium that a true alternative took root. By 2000, Blockbuster Video had committed half of its shelf space to DVDs. A decade later, video downloads and streaming services from companies like Apple and Netflix shuttered Blockbuster Video’s doors, and videotape rentals became a thing of the past.

The last major Hollywood production released on VHS was David Cronenberg’s A History of Violence, in 2006. Even after a full decade of absence, we still see the influence of VCRs in our daily lives as we timeshift shows on DVR, stream movies, or scoff as the MPAA tries to shut down file-sharing sites violating copyright law.—Stephen Clark

6. The First “Real” Video Game


The very early history of video games arguably begins in 1947 with the invention of the cathode-ray tube amusement device, which simulated an artillery shell arcing towards a target on a cathode-ray tube (CRT) screen. The player controlled it by adjusting knobs to change the trajectory of the CRT beam spot on the display. Some definitions do not count this as the first video game, since it did not run on a computing device, but it is the first known interactive electronic game.

Then, in the 1950s, another video game-like device appeared: the Nimrod computer, built by Ferranti and designed to play an electronic version of the logic and strategy game Nim. The computer used a set of fixed lights that turned on and off, with a legend describing what was going on during the demo. Later, in 1952, A.S. Douglas created an electronic version of tic-tac-toe on the Electronic Delay Storage Automatic Calculator, which looked as boring as it sounds.

In 1958, William Higinbotham, an American physicist at Brookhaven National Laboratory, created Tennis for Two, which was played on an oscilloscope and is widely considered the first video game. Similar to the classic arcade game Pong (which would come along in the 1970s), Tennis for Two was very simple, consisting of a net, a ball that bounced over the net, and a large aluminum controller for each player. Higinbotham built the game by rigging together several analog computers and transistor circuits to feed images into the oscilloscope. The circuitry was simple, using resistors, capacitors and relays, plus transistors for fast switching when the ball was in play. The display was small, about five inches in diameter, so Higinbotham later improved it with a larger one.

Tennis for Two lasted only about two years before it was forgotten. From the 1970s through the 80s, a series of lawsuits argued that a “video game” is technically an apparatus displaying a game through the manipulation of video display signals on raster-based equipment, like a TV or computer monitor. By this definition, nothing prior to 1970 would count as a video game, but the earlier technology certainly advanced the ideas. In the 1960s, Ralph Baer, an engineer at Sanders Associates, created interactive TV games—a chase game and a tennis game—and modified a toy gun so it would detect spots of light on the TV screen. Baer and Sanders Associates received the first patent for a video game, which Magnavox licensed in the early 1970s.

Arcade games and video gaming started gaining mainstream popularity in the 70s once Nolan Bushnell and Ted Dabney (the future founders of Atari) attempted to create an arcade version of Spacewar, named Computer Space. The technology behind Computer Space was simple: there was no microprocessor or modern memory architecture—the machine was built from 74-series logic circuits, with diode arrays serving as memory. The game was too hard for people to play in bars, so it didn’t do very well. Later, an easier, more mainstream game called Pong was created, which is why many people assume it was the first video game.—Isabel Thottam

7. Digital Wristwatches


Today, digital timekeeping is everywhere. It’s built into displays on our cell phones, stoves, and cars. However, when the first Light Emitting Diode (LED) digital wristwatch was unveiled in 1970, almost everyone was still telling time on analog clock faces with hour and minute hands. The quick rise in popularity of digital wristwatches changed the way we told time.

The Hamilton Watch Company released the gold Pulsar watch with a red LED display. A prototype had been seen two years earlier in Stanley Kubrick’s 2001: A Space Odyssey, though you’d be forgiven for missing it amongst all the other “21st century” technology. It appeared again as spy gear in the James Bond flick Live and Let Die. By the mid-70s, however, Texas Instruments had started selling cheap digital watches for under $20, quickly eroding the premium status of digital watches. Now anyone could have this futuristic-looking technology on their wrist.

Through the 80s, expensive, high-quality watches made way for the flamboyant and high-tech. Liquid Crystal Display (LCD) technology was still in its infancy and difficult to read, but work continued on improving visibility, since LCDs used much less power than LEDs. Many LCD designs, like Casio’s basic F91 and Timex’s Ironman, have stood the test of time and are still available today. After acquiring the rights to the premium Pulsar line, Seiko became known for its premium digital watch selection.

Digital watches were always on our wrists and always with us, so many additional features began to appear. A lack of moving parts made digital wristwatches ideal for exercising, so timers and stopwatches were added. Many companies released calculator watches, with Casio offering the widest variety. Timex released a watch in 1994 that could store contacts, phone numbers, anniversaries and meetings, with data transferred via your computer monitor to a small optical sensor on the device.

Digital timekeeping wasn’t invented alongside LED and LCD wristwatches. Early digital clocks date back to the 19th century and used dials with cards or discs that flipped into view. However, the cheap production of digital wristwatches led to their popularity outstripping even that of analog, so the way we talked about time changed too. Where we used to say “quarter to eight,” we began saying “seven forty-five”; rounding time to “half past one” became the more precise “one thirty-two.” It’s a small difference in lexicon, but it can mean the difference between someone being just a tick of the clock face late, or showing up seven minutes after the meeting has started.
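The shift in phrasing described above is easy to sketch: analog speech rounds to the nearest quarter hour, while digital speech reads the minutes exactly. This is a simplified illustration (real speech has many more conventions, like “ten past” or “twenty to”), not a complete natural-language clock:

```python
# Contrast analog-style time phrasing (rounded to the nearest quarter
# hour) with digital-style phrasing (exact minutes).

def digital_phrase(hour: int, minute: int) -> str:
    """Exact reading, the way a digital watch trained us to speak."""
    return f"{hour}:{minute:02d}"

def analog_phrase(hour: int, minute: int) -> str:
    """Approximate reading, rounded to the nearest quarter hour."""
    quarter = round(minute / 15) % 4  # 0..3 quarter-hours past the hour
    if quarter == 0:
        # Rounded to the top of an hour: this hour if we rounded down,
        # the next hour if we rounded up.
        h = hour if minute < 30 else hour % 12 + 1
        return f"{h} o'clock"
    if quarter == 1:
        return f"quarter past {hour}"
    if quarter == 2:
        return f"half past {hour}"
    return f"quarter to {hour % 12 + 1}"

print(analog_phrase(7, 45))   # → quarter to 8
print(digital_phrase(7, 45))  # → 7:45
print(analog_phrase(1, 32))   # → half past 1
print(digital_phrase(1, 32))  # → 1:32
```

The seven-minute gap the text mentions is exactly the information the analog phrasing throws away: 1:32 and “half past one” name the same moment only to within a quarter hour.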

In spite of the 21st-century feel of digital timekeeping, wristwatches fell out of popularity. By 2000, pagers and mobile phones that synced with cell towers had become commonplace; since they always displayed the correct time, many people simply stopped wearing watches. In 2012, five years after the iPhone was released, the market was ripe for reinvention. The Pebble smartwatch combined a week-long battery with an e-paper display, a customizable interface, and seemingly infinite information pulled from a paired smartphone.

It became one of the first unmitigated successes on the young crowdfunding site Kickstarter. After just two hours, the Pebble reached its $100k funding goal; in six days it became the most successful Kickstarter campaign to date, and by the end of its 30-day run it had raised over $10 million. Three years later, the market is flooded with smartwatches running Android Wear, watchOS, or other custom operating systems, devices more powerful than the first smartphones.

Nearly all smartwatches still depend on a smartphone for connectivity, and most (except the Pebble) can’t even tell you the time after a day of use without a recharge. Perhaps simple digital wristwatches could see a resurgence in popularity.—Stephen Clark
