Six Tech Advancements from the 60s That Changed the World

3. BASIC Programming Language

If you want to accelerate a revolution, you need to get as many people as humanly possible to actively participate in it. And to do that, you need to figure out some sort of pathway, some sort of tool, that can help turn people into participants. Or, to switch metaphors: you can’t cultivate a language without speakers, and if you want speakers, you need a damn good dictionary.

In the initial public revolution of computing, the BASIC programming language was that damn good dictionary. Short for “Beginner’s All-Purpose Symbolic Instruction Code,” BASIC was created by two Dartmouth professors, John G. Kemeny and Thomas E. Kurtz, in 1964 as part of the Dartmouth Time-Sharing System project.

What it allowed the user to do was nothing short of revolutionary. Previously, “computing” involved incredibly expensive mainframe machines (the kind that fill up entire rooms and sport what look like film reels on a projector), as well as an extensive and aggravatingly slow workflow: punch cards had to be produced to hold the program, several trained operators acted as middlemen handling input and output, and there was a sizable wait to receive results. The Dartmouth Time-Sharing System rode on a series of technological breakthroughs that allowed these processes to happen simultaneously, consolidating the overall “computing” process and marking a dramatic shift in hardware design.

BASIC was the language designed to let users interact with this new computing configuration. With it, people could communicate directly with the computer itself through a terminal with a keyboard and a teletype printer (which would later give way to display screens). It’s kind of crazy to think about, but the computers we know now, and the way we interact with them through monitors and mice and keyboards, largely began with this particular innovation. (I mean, who knows? In an alternate timeline, computing could largely be auditory, and we would provide input by verbally going “beep beep boop boop.”)
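To get a sense of how direct that interaction was, here is a short program written in the style of early Dartmouth-era BASIC (an illustrative sketch of the period’s syntax, not a sample from the original Dartmouth manual). A user could type these numbered lines at a terminal and run them on the spot, with no punch cards or operators in between:

10 REM AVERAGE THREE NUMBERS
20 READ A, B, C
30 LET M = (A + B + C) / 3
40 PRINT "THE AVERAGE IS", M
50 DATA 4, 8, 15
60 END

Typing RUN would print the answer right away, and that immediate feedback loop is a big part of what made the language so approachable.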

That the language was easy to teach, and that it was designed and fostered in an environment built on teaching, contributed greatly to the expansion of its user base. And this was key: the act and culture of computing were essentially democratized, which allowed an exponentially growing number of people to work on the problems of computing and to explore new ways of integrating it into the wider world. This past season of Mad Men featured a small arc involving the introduction of a computer into the show’s late-1960s world, gesturing toward the moment when computing met advertising. It’s an interesting depiction of an early, prehistoric moment in the story of computers “eating the world.”

These days, the closest equivalent to BASIC that we have is probably the Ruby language and its companion web framework, Ruby on Rails. Ruby is an exceedingly easy programming language to get behind, and with its popularization in the current iteration of the tech industry, Ruby could well do for software what BASIC did for computing as a whole. Whatever happens, it’s still kinda great to think about the fact that 50 years ago, ubiquitous computing started with this rather specific innovation. —Nicholas Quah

4. LEDs and Electroluminescent Panels

The world we live in now is constantly lit by electroluminescent technology. Whether it’s the billboard you drive by, the computer screen you’re reading this on, or the lights in your house, the pervasiveness of light-emitting devices is hard to deny. While the phenomenon of a material emitting light in response to an electrical current wasn’t discovered in the 1960s, the possibilities of electroluminescence came into full view during the decade, most notably with the invention of the light-emitting diode (LED).

It’s said that General Electric held patents for electroluminescent panels dating back to 1938, but the first visible LED wasn’t created until 1962. It was developed by Nick Holonyak Jr., often referred to as the “father of the LED,” while he was working as a scientist for General Electric. His was a red LED, the first whose light could be seen by the human eye. Just a year earlier, however, the first infrared LED had been discovered by accident by Bob Biard and Gary Pittman, who stumbled upon it while attempting to make a laser diode for Texas Instruments.

Until 1968, however, both red and infrared LEDs remained incredibly expensive and thus fairly limited in use. None other than the controversial Monsanto Company was the first to manufacture red LEDs for the mass market, initially selling them to Hewlett-Packard (HP) for use in its products in 1968. Because they weren’t very bright, LEDs first appeared only in alphanumeric displays, handheld calculators, and indicator lights. But they slowly became brighter and cheaper, and were integrated into more and more commercial products throughout the ’70s and ’80s. The LED is still incredibly common today, especially since the arrival of white LEDs in the late 1990s made it a technology that could replace the common incandescent bulb.

But, as is true today, the LED wasn’t the only burgeoning source of electroluminescence in the 1960s. While LEDs would eventually be the future of light, the more practical technology at the time was the electroluminescent panel. These panels were first shown off by Aron Vecht in 1968 and quickly put to use in watches, vehicles, night lights, and backlights. The technology was later developed for the mass market by companies such as Sharp and Planar Systems, and it became one of the dominant means of lighting consumer products throughout the rest of the century.—Luke Larsen
