Six Tech Advancements from the 60s That Changed the World

The 1960s were a remarkable decade of change. The first generation of postwar American youth was discovering its cultural identity, the Vietnam War and the counterculture were reshaping the political landscape, and the Beatles went from A Hard Day’s Night to Abbey Road. But at the center of it all, technology was beginning to develop at an unprecedented pace, and the general public saw only the tip of the iceberg. These technologies laid the foundation for half a century of scientific innovation, much of which resulted in products we still enjoy today.

The iconic technological products of the 60s—the microwave oven, the color television, the living room cabinet record player—are a different topic of discussion. The following are technologies that were discovered or implemented in ways that led to the products of the future. These are the six technological advancements from the 1960s that changed the world forever:

1. DRAM (the stuff in your computer)


It’s the unsexy little things, really, that end up meaningfully changing the world. Copper wire, LEDs, optical fiber, tin—the discovery of each drastically increased the efficiency of some much larger tool or framework, opening the floodgates to the next advance and the one after that.

For hard technological infrastructure, specifically the various components that make up the computer, the Dynamic Random Access Memory (DRAM) chip was that unsexy little thing. Invented in 1968 by Robert H. Dennard, who received his doctorate at the Carnegie Institute of Technology in Pittsburgh, the chip vastly increased the memory capacity of computers at a lower price. That allowed a diverse set of more powerful technological products to be created for the market, which was a boon for mainstream technology consumers, who could, with the help of DRAM, own powerful personal tech on a more affordable budget. Some form of DRAM happily sits in the intestines of your favorite devices: gaming consoles, phones, computers, digital cameras, Roku sticks, and so on.

The memory chip also meant great gains for businesses, as it lowered the barriers to entry for companies and entire industries to embrace, integrate, and scale their operations on top of computing power. And that’s a big deal—increased access to computing power allowed companies to become more powerful, efficient, and effective than ever before.

The memory chip was also a vibrant argument in support of Moore’s Law, which (to simplify the technical definition down to bare basics) is the observation that the overall processing power of computers will double every two years. The Law pretty much remains intact to this day, though the technical specifics of what constitutes “processing power” have shifted—from internal electrical circuitry to the ephemeral thing we call the “cloud.”—Nicholas Quah
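
To see what that doubling observation implies, here is a minimal sketch in Python. The starting year matches the chip’s 1968 date mentioned above, but the transistor count is a hypothetical round number chosen purely for illustration, not any real device’s spec.

```python
# A minimal sketch of Moore's Law as stated above: processing power (here, a
# stand-in transistor count) doubles every two years. The starting count is a
# hypothetical round number, not a real chip specification.

START_YEAR = 1968          # the DRAM patent date mentioned in the article
START_TRANSISTORS = 1_000  # illustrative starting figure only

for year in range(START_YEAR, 1981, 2):
    doublings = (year - START_YEAR) // 2
    count = START_TRANSISTORS * 2 ** doublings
    print(f"{year}: ~{count:,} transistors")
```

Run it and the compounding is obvious: by 1980 the illustrative figure has grown 64-fold, which is the whole point of the observation.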

2. Telstar, the First Commercial Satellite


By 1962, some governments had already launched satellites, largely to carry out extraterrestrial experiments. That year, though, the first commercial satellite went into orbit, forever changing the face of communications.

Telstar 1 had its origins in a partnership between Bell Labs (which built it), NASA, AT&T, the UK’s General Post Office (back when it was handling Britain’s telecom services), and France Telecom. The satellite is less than three feet across, weighs about 170 pounds, and ran on solar power while in service.

This hunk of metal might be infinitesimal in the grand scheme of the universe, but it’s hard to overstate its importance to humanity. It was the first device our race used to relay phone calls, fax images, and television pictures, including the first transatlantic TV feed, through space.

That last one was an especially big deal. On July 11, just a day after launch, Telstar relayed non-public TV pictures. A couple of weeks later, the first live, public TV pictures beamed across the Atlantic, a snippet of a game between the Philadelphia Phillies and the Chicago Cubs.

The broadcast was supposed to open with John F. Kennedy giving a speech, a moment befitting a president, but the signal was up and running before he was ready, so the Phillies and Cubs got an unexpected moment in the spotlight. When JFK did appear, he fittingly spoke about the value of the American dollar, a subject causing some concern across the pond at the time.

But there was a catch. Since Telstar wasn’t in geosynchronous orbit (i.e., fixed above the same point on the spinning Earth), those transatlantic TV broadcasts lasted only about 20 minutes of each 2-hour-and-37-minute orbit. It would be three more years before a communications satellite reached geosynchronous orbit and could provide an uninterrupted signal.

Ironically—because Telstar was in part a product of the Space Race between the US and the Soviet Union—it was the Cold War that thumped the first nail in its space coffin. Just a day before Telstar launched, the US tested a high-altitude nuclear bomb which affected the part of the atmosphere that Telstar orbited. The radiation increase the bomb caused, along with other high-altitude tests (including a Soviet blast), damaged the satellite’s delicate transistors and knocked it offline that November. Scientists managed to kickstart Telstar in January, but there was soon another, catastrophic transistor failure.

Telstar survived just seven months in service, handling more than 400 total transmissions. It’s still up there, though, floating around as a little piece of history in the vast expanse of space.—Kris Holt

3. BASIC Programming Language


If you want to accelerate a revolution, you need to get as many people as humanly possible to actively participate in its machinations. And to do that, you need to figure out some sort of pathway, some sort of tool, that can help turn people into participants. Or, to switch metaphors, you can’t cultivate a language without speakers, and if you want speakers, you need a damn good dictionary.

In the initial public revolution of computing, the BASIC programming language was that damn good dictionary. Short for “Beginner’s All-Purpose Symbolic Instruction Code,” BASIC was created by two Dartmouth professors, John G. Kemeny and Thomas E. Kurtz, in 1964 as part of the Dartmouth Time-Sharing System project.

What it allowed the user to do was nothing short of revolutionary. Previously, “computing” involved incredibly expensive mainframe machines (the kind that fill entire rooms, with spinning reels that look like they belong on a film projector) as well as an extensive and aggravatingly slow process: producing the punch cards that contained the program, relying on several trained operators as middlemen to handle input and output, and waiting a sizable stretch to receive results. The Time-Sharing System rode a series of technological breakthroughs that allowed these processes to happen simultaneously, consolidating the overall “computing” process and marking a dramatic shift in hardware design.


BASIC was the language designed to let users interact with this new computing configuration. With it, people could communicate directly with the computer itself through a terminal with a keyboard and a teletype printer, which would later give way to display screens. It’s kind of crazy to think about, but the computers we know now—and the way we interact with them, through monitors and mice and keyboards—largely began with this particular innovation. (I mean, who knows? In an alternate timeline, computing could largely be auditory, and we would provide input by verbally going “beep beep boop boop.”)
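
Since the article contains no actual BASIC listings, here is a minimal sketch, written in Python purely as a stand-in, of the kind of direct, conversational loop a time-sharing terminal made possible: type a line, get an answer back immediately, with no punch cards or operators in between. The prompt text and the arithmetic-only evaluation are illustrative choices, not a reconstruction of the Dartmouth system.

```python
# A rough, modern stand-in for the interactive session BASIC offered at a
# time-sharing terminal: the user types a line and the machine answers at once.
# Python is used here only because the article includes no BASIC code.

def tiny_session() -> None:
    print("READY.")  # a greeting in the spirit of early BASIC prompts
    while True:
        line = input("> ").strip()
        if line.upper() in ("BYE", "QUIT"):
            print("GOODBYE.")
            break
        try:
            # Evaluate simple arithmetic, e.g. "2 + 2" or "(10 * 3) / 4".
            print(eval(line, {"__builtins__": {}}, {}))
        except Exception:
            print("?SYNTAX ERROR")  # the classic terse rebuke, for flavor

if __name__ == "__main__":
    tiny_session()
```

The point isn’t the arithmetic; it’s the shape of the interaction: the user and the machine trading lines in real time, which is exactly what punch-card batch processing could not offer.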

That the language was easy to teach, and that it was designed and fostered in an environment built on teaching, contributed greatly to the expansion of its user base. And this was key: the act and culture of computing were essentially democratized, allowing an exponentially growing number of people to work on the problems of computing and to explore new ways of integrating it into the wider world. This past season of Mad Men featured a small arc involving the introduction of a computer into its early-1970s world, gesturing toward a moment where computing met advertising. It is an interesting depiction of one of the early, prehistoric moments of computers “eating the world.”

These days, the closest equivalent to BASIC is probably the Ruby language and its companion Ruby on Rails framework. Ruby is an exceedingly easy programming language to get behind, and with its popularization in the current iteration of the tech industry, Ruby could well do for software what BASIC did for computing as a whole. Whatever happens, it’s still kinda great to think that, 50 years ago, ubiquitous computing started with this rather specific innovation. —Nicholas Quah

4. LEDs and Electroluminescent Panels


The world we live in now is constantly lit by the technology of electroluminescence. Whether it’s the billboard you drive by, the computer screen you’re reading this on, or the lights in your house, the pervasiveness of electroluminescent lighting is hard to deny. While the phenomenon of a material emitting light in response to an electrical current wasn’t discovered in the 1960s, the possibilities for electroluminescent panels came into full view during the decade, most notably with the invention of the light-emitting diode (LED).

It’s said that General Electric held patents for electroluminescent panels dating back to 1938, but it wasn’t until 1962 that the first visible-light LED was created. It was developed by Nick Holonyak Jr., often referred to as the “father of the LED,” while he was working as a scientist for General Electric. His was a red LED—the first whose light could be seen by the human eye. Just a year earlier, the first infrared LED had been discovered accidentally by Bob Biard and Gary Pittman, who stumbled upon it while attempting to make a laser diode for Texas Instruments.

Until 1968, though, both red and infrared LEDs remained incredibly expensive and thus fairly limited in use. None other than the controversial Monsanto Company was the first to manufacture red LEDs for the mass market, initially selling them to Hewlett-Packard (HP) for use in its products in 1968. They first appeared only in alphanumeric displays, handheld calculators, and indicator lights, as they weren’t very bright. But slowly they became brighter and cheaper, and they were integrated into more and more commercial products throughout the 70s and 80s. The LED is still incredibly common today, especially since the development of white LEDs in the late 1990s, which made the technology a viable replacement for common incandescent lighting.


But LEDs weren’t the only burgeoning source of electroluminescence in the 1960s. While they would eventually be the future of lighting, the more practical technology at the time was the electroluminescent panel. These panels were first shown off by Aron Vecht in 1968 and quickly put to use in watches, vehicles, night lights, and backlights. The technology was further developed to mass-market proportions by companies such as Sharp and Planar Systems and became the dominant means of lighting consumer products throughout the rest of the century.—Luke Larsen

5. Direct Distance Dialing

Image via Amador Ledger Dispatch

Today, boasting that your phone can effortlessly dial someone several states away is laughable. With smartphones capable of shooting high-definition video and streaming movies, using a phone as originally intended is passé. Even poking fun at the not-using-phones-as-phones trope is worn out—see this 2011 story by The New York Times.

Fifty years ago, though, boasting about a seamless long-distance call would have been impressive. The 60s marked the spread of direct dialing technology, allowing folks to reach a friend outside their local area without the help of an operator. Previously, it was second nature to pick up the receiver and chat with an operator before connecting with your friend in Albuquerque or Aberdeen. Sprawling switchboards manned by teams of operators were a fundamental part of the dialing process.

The first transcontinental direct dial phone call was achieved in 1951, connecting the mayors of Englewood, NJ and Alameda, CA by way of area codes. It was a historic moment, but didn’t facilitate the technology’s immediate spread—for many Americans, direct dialing wasn’t a household convenience until the 60s. Still, the breakthrough event spawned a series of short films excitedly introducing the new tech, like 1951’s The Nation at Your Fingertips. In the 11-minute black-and-white piece, a couple—having coffee in a kitchen that could easily double as a Leave It To Beaver set—phones their daughter for a long-distance chat, no operator required.

(And while we may lampoon the couple in the video, they tease their predecessors, too—the film takes a moment to mock an 1880s couple who can’t quite get the hang of this “newfangled contraption,” the telephone.)

Direct dialing may seem a trivial advancement when viewed from afar, but it majorly advanced the telephone’s ultimate mission: connecting people instantly, regardless of the space between them. And in the 60s, it was tech worth talking about—just one decade prior, many Americans relied on party lines, or telephone lines that were shared between households. In 2014, party lines seem an unthinkable inconvenience, capable of spoiling surprises and facilitating eavesdropping.

By the time the 1969 Apollo 11 mission rounded out the 60s as a watershed decade for tech, direct dialing was a common feature. That meant excited calls to friends and family to discuss the lunar landing didn’t need to start with “Hello, operator?”—Kevin Wazacki

6. The birth of the Internet (sort of)

Image via Computer History Museum

Because of the staggering cost of computing at the time, a lot of the innovation surrounding it came from government and military funding. The internet is no different. A part of DARPA (the Defense Advanced Research Projects Agency) began development on a new protocol called the Network Control Protocol and made its first connection on a fateful day in 1969. This early internet protocol, which is just a way of determining how computers talk to one another, led to the creation of ARPAnet. There is some dispute over exactly what need the military had for the technology, but regardless of the motive, it wanted a way to quickly and easily share data between its computers across the country.

The development was spearheaded by researchers at DARPA, most notably a scientist brought over from MIT named Lawrence Roberts. However, some of the biggest concepts behind how such a connection should function were developed independently years before. One of the most important was packet switching, which breaks content-neutral data into small blocks that can be transferred easily; it had been in development since the beginning of the decade.
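
To make the idea concrete, here is a minimal Python sketch of packet switching as described above: a message is chopped into small, numbered blocks that can travel independently and be reassembled in order at the far end. The payload size and the (offset, payload) pairing are invented for illustration; they are not ARPAnet’s or NCP’s actual packet format.

```python
import random

# Toy illustration of packet switching: break a message into small,
# content-neutral blocks, let them arrive in any order, then reassemble.
# The 8-byte payload and (offset, payload) tuples are illustrative only.

def packetize(message: bytes, size: int = 8) -> list[tuple[int, bytes]]:
    """Split a message into (offset, payload) packets of at most `size` bytes."""
    return [(i, message[i:i + size]) for i in range(0, len(message), size)]

def reassemble(packets: list[tuple[int, bytes]]) -> bytes:
    """Put packets back in order by offset and join their payloads."""
    return b"".join(payload for _, payload in sorted(packets))

message = b"A MESSAGE BOUND FOR A COMPUTER ACROSS THE COUNTRY"
packets = packetize(message)
random.shuffle(packets)  # simulate packets arriving out of order
assert reassemble(packets) == message
print(f"{len(packets)} packets delivered and reassembled intact")
```

The crucial property is that no single packet needs to know anything about the others or about the route it takes; the sequence information alone is enough to rebuild the original data at the destination.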

It all came together under ARPAnet when computers at the Stanford Research Institute and UCLA exchanged data. It was the first version of the internet by every definition of the word. By the time the 70s rolled around, the first emails were being sent, the first file transfer protocol (FTP) was set up, and even the first remote desktop connection was made. ARPAnet was eventually supplanted in the early 80s by a military-wide network and by the more modern internet protocol TCP (Transmission Control Protocol), which opened up new possibilities. The internet in its current form wouldn’t be available to the public until commercial ISPs showed up in the late 80s, but the groundwork was all laid in the 60s.—Luke Larsen

SOURCES
“1951: First Direct-Dial Transcontinental Telephone Call,” ATT.com, http://www.corp.att.com/attlabs/reputation/timeline/51trans.html

 