Monday, August 3, 2015

The '80s: The Decade That Made Technology (MLA research essay)

In the modern day, technology has become a crucial part of human life. If we look around, we can probably spot someone constantly staring at and tapping on a small piece of plastic, while someone else sits still with a thin cable plugged into both ears. Early human beings would find our generation bizarre. Nonetheless, we can agree that our everyday lives have become far more convenient thanks to the endless possibilities that current technology offers. When we make a video call, chat in a group message, read an e-book on a tablet, watch a how-to video, or Google the recipe for an apple pie, it may not occur to us that at one point people could not perform any of those tasks, and some of those words did not even make sense. National Geographic created a series called "The '80s: The Decade That Made Us," which claims that, just as its title suggests, the 1980s was the decade that heavily influenced life in the twenty-first century. Significantly, the 1980s was the decade that kicked off the journey of modern technology, as it gave us the first personal computer, the first cellphone, and the first hand-held video game.
Technology had been progressing long before the 1980s. For instance, the first concept of a computer was conceived in the 1830s, yet it took another century for the idea to be built successfully (Isaacson 35). Michael Swaine and Paul Freiberger, the authors of the book Fire in the Valley: The Birth and Death of the Personal Computer, state:
Computers in the 1960s were massive. Even the smallest “minicomputers,” the kind built by DEC, were refrigerator-sized. And computers were expensive. Only government agencies, universities, and big businesses could afford to own a computer. And they were obscure and sinister, typically operated by a white-coated “priesthood” of specially trained operators and programmers using this mysterious private language. In the 1960s computers were widely regarded as a dehumanizing tool of the bureaucracy, especially by the young (Swaine and Freiberger 32).
In addition to being colossal, computers could be operated by only the few people who understood these incomprehensible machines. Many people also did not believe they needed a machine to do human work, since people had worked just fine before computers existed. Some were even against computers, believing that this untrustworthy robot was taking over the human role in the workplace. Besides, early computers were expensive and fragile, so maintaining one could be troublesome. The public's understanding of computers was more like a myth, something abstract. Consequently, even though a computer's main function was to aid our work, owning one was still far from convenient.
Despite the machine's complexity, there were still computer enthusiasts willing to study it in the hope of transforming it into something regular people could use. Among them were Bill Gates and his friends, who were so fascinated by this amazing invention that they learned programming when they were only in seventh grade. Gates later formed a programming group with Paul Allen, his friend and schoolmate, when they were in eighth grade (Isaacson 316). Their journey as programmers led them to start a company called Microsoft and to begin writing software for various tech companies. In the late 1970s, IBM, which Isaacson considered "the world's greatest computer company," was looking for a way to compete in the emerging market for small personal computers. Noticing Microsoft's rising potential, IBM contacted Gates and teamed up with him to build a personal computer (Isaacson 357). When the machine was completed and launched in the early 1980s, the true journey of modern technology officially began.
It took decades to transform the gigantic analog machine into a portable digital personal computer, but it took only a year to change the direction the world of computing was heading. Swaine and Freiberger state, "When, on August 12, 1981, IBM announced its first personal computer, it radically and irrevocably changed the world for microcomputer makers, software developers, retailers, and the rapidly growing market of microcomputer buyers" (Swaine and Freiberger 708). The concept of owning a personal computer caught the attention of American consumers because users could personalize their computers based on their preferences; besides, many wanted one simply for the sake of feeling good. On top of that, as quoted in Walter Isaacson's book The Innovators, "All personal computers would be using the same standardized microprocessor. Hardware will in effect become a lot less interesting. The total job will be in the software" (qtd. in Isaacson 362). In the 1980s, thanks to the outstanding release of Gates' Microsoft operating system, the focus of computer development shifted from hardware to software. As a result, many job opportunities were created for developers to work on commercial or open source software to fulfill the requirements of businesses and individuals.
A few years later came the launch of Apple's Macintosh. "The first Macintosh purchasers were early adopters—technophiles willing to accept the inevitable quirks of new technologies for the thrill of being the first to use them" (Swaine and Freiberger 735). As with IBM personal computers, most users felt the need to own something new first. However, Apple's sales were not as strong as those of IBM and Microsoft's personal computer, as the market for the Macintosh slowed down just two years after its release in 1984 (Swaine and Freiberger 735). According to Isaacson, "The primary reason for Microsoft's success was that it was willing and eager to license its operating system to any hardware maker. Apple, by contrast, opted for an integrated approach. Its hardware came only with its software and vice versa" (Isaacson 369). In the 1980s, users were still experimenting with the newly released personal computers. When they got their hands on a PC, they were thrilled to have their own device and complete freedom to personalize it according to their unique desires. Apple, on the other hand, restricted consumers' options so much that many preferred the PC over the Macintosh.
Microsoft Windows and the Apple Macintosh started out as just two ordinary products of the 1980s, but their names remain among the world's most popular electronic brands. Isaacson suggests, "Even the personal computer, which was originally embraced as a tool for individual creativity, inevitably led to the rise of modems, online services, and eventually Facebook, Flickr, and Foursquare" (Isaacson 485). Thanks to Microsoft and Apple, we have seen technology flourish right in front of our eyes; they paved the way for other technology-based companies, and all of them are now improving different branches of technology. Likewise, today technology no longer revolves solely around computers. Different inventions and developments were made to ensure the convenience of life in the twenty-first century.
In the early 1980s, people were already familiar with the concept of making and receiving a telephone call. However, this could only be done at home or at a public pay phone. Guy Klemens, the author of the book The Cellphone: The History and Technology of the Gadget That Changed the World, describes the first mobile phone this way: "Too large to be carried in anything less than an automobile and too expensive for any but the elite, the car phone nonetheless was popular" (Klemens 45). At the time, the only way to have a cellphone was to have it installed in a car, and the process was immensely expensive. Nevertheless, more than 6,000 cars had a cellphone installed (Klemens 71). Communication was limited before the car phone was introduced, but people had managed to overcome this obstacle for thousands of years without any cellphone in their hands. One might therefore expect that nobody would care about the debut of the cellphone. However, when the first cellphone was released in the 1980s, everyone was smitten by this amazing invention.
According to Klemens, "If someone were to take the complete plan for a cellphone back in time to only 1970, there is little that anyone could have done with it" (Klemens 3). Technology in the 1970s was not advanced enough to make the cellphone possible. However, that was only a small factor. The main reason behind the cellphone's success was that the 1980s were exactly the right time to introduce and popularize this new invention. Back then, the cellphone was not just a device for transmitting and receiving voice; it was a way for American consumers to feel good. In the series The '80s: The Decade That Made Us, a woman said that she lived by the motto "It's not what you own, it's what people think you own" ("Shop 'Til You Drop"). When the cellphone was released to the market, some people were rich enough to have one installed in their cars; it was a way for them to feel good about themselves and show off a luxurious possession to the world. On the other hand, those who could not afford one found a way to feel just as good by carrying a fake plastic phone and pretending to be busy talking while driving.
Nowadays, a cellphone is no longer just a device for communication. After decades of development from the early car phone, the mobile phone has evolved into an affordable, high-performance device owned by hundreds of millions of people. Klemens explains, "Most of the digital IC in a cellphone a few years into the 2000's is devoted to non-cellular phone features. Graphics processors for the display, audio processors for music, and a more powerful general processor for running applications like games became standard" (Klemens 200). Today's cellular phones contain modern hardware with the combined capabilities of a computer, a camera, a music player, and a video game console. Essentially, all the technologies of the 1980s have been squeezed into one small hand-held device. Cellphones are more modern than ever, and while owning one is technically not a must, it is nearly impossible to live in the twenty-first century without one. This suggests that since the 1980s, cellphones have evolved to become part of human society.
The 1980s could have been a dull time, as America was still recovering from war. Adults were busy working or looking for jobs, while kids and teenagers went to school and looked for something new and exciting to kill their boredom. In 1972, one of the first video games, Pong, was released by a company called Atari. It was just a basic game, but it was enough to entertain everyone in the 1970s (Ryan 9). The game's popularity began to fade as players grew tired of spending hours moving a paddle up and down and wished there were more things they could do. Nonetheless, playing Pong remained everyone's favorite time-killing activity until 1980.
When the 1980s began, the appetite for video games skyrocketed as Atari and multiple Japanese companies released various video games in the United States (Ryan 12). The release of Nintendo's Donkey Kong made video games one of the most important forms of entertainment in the 1980s. In an interview, Billy Mitchell, described by the author as "a sixteen-year-old pinball wizard," stated, "Video games were something new and different and I don't like new and different. But they started getting more popular, everyone was standing around the Donkey Kong machine, and I wanted that attention" (qtd. in Ryan 33). Kids and teenagers were crazy about the concept of playing video games. Not only did they enjoy the experience of playing, they also wanted the attention that came with being a skilled player.
Several years later, the Game Boy, the first hand-held video game system, was introduced to the market. The first portable game console may not have been the best entertainment system, but everyone was dying to have one. According to Ryan, "Sure, people would complain the 'Game Boy' (as it was being called) couldn't be played in the dark. But their unspoken desire for a light, cheap, long-lasting product outweighed the backlight's pros" (Ryan 103). There were complaints about the lack of color and features; nevertheless, millions of these hand-held systems were sold, and even President Bush had one (Ryan 105). Apparently, in the 1980s everyone enjoyed playing video games, and everyone overlooked their flaws.
What made video games a success was the fact that they were more than a form of entertainment. Players were able to relate to the games and get lost in their own imagination. At the beginning of the 1980s, America had been through a tough time of war, and everyone was scared that another one was approaching. Hence, escaping reality for a short while could make a huge difference for players. Ryan explains, "Most every other game offered a way to destroy. Pac-Man offered a way to escape. Donkey Kong offered a way to rescue" (Ryan 35). Each video game motivated players differently. Players could feel a sense of achievement when their characters rescued someone in the game. They felt defeated when they were killed, yet they knew they had another chance to start again in the hope of succeeding. National Geographic also compared the game Tetris to the Berlin Wall ("Super Power"): players felt that they needed to get rid of the blocks, just as they wanted to tear down the wall that separated Germany.
As we can see today, video games still serve their purpose of entertaining geeks and everybody else. Ryan states:
Entertainment went from being something we saw in crowds to something we experienced as single players, a trend that is now shifting back to group interaction. The global quality of life is undeniably raised by all this dedication to a new form of play. Games—whether joystick, D-pad, or motion—are at their root enjoyable. They make the world a happier place. (Ryan 274)
Video games were created for all kinds of people to enjoy equally. Those who are uncomfortable with social interaction have complete freedom to play by themselves. In addition, one significant feature of game playing in the twenty-first century is online communities, which allow players around the world to compete with or against one another (Ryan 274). Video games come in different genres and can be played on different consoles and controllers, so players can decide when and what they want to play based on their preferences. Clearly, the experience of playing video games has completely transformed since the invention of the Game Boy.
It may seem that the 1980s enabled the success of computers, mobile phones, video games, and many other electronic devices. However, the real father that paved the way for all of them was a tiny sliver of silicon: the microchip. Microchips may look insignificant, but they were actually the most influential creation of the 1980s. Tom Simonite's article "Thinking in Silicon" gives a brief history of the modern technology we hold in our hands and shows why microchips were responsible for this success. Simonite considers Carver Mead, a professor at the California Institute of Technology, to be a father of modern computing, due to his early-1980s work designing microchips modeled on the human brain. "This triggered explosive growth in computation power: computers looked set to become mainstream, even ubiquitous" (Simonite 54). Ever since then, computers have become more and more powerful. In the same period, IBM came up with the PC and the Apple Macintosh was introduced. To this day, as microchips and modern hardware are constantly improved and updated, we continue to see PC and Mac compete to offer the best product on the market. In the same article, Simonite offers another example we can see today: "Google recently made headlines with software that can reliably recognize cats and human faces in video clips" (Simonite 53). Thanks to microchips, major companies are able to come up with amazing, unbelievable ideas that continue to surprise the world. It was no coincidence that Microsoft and Apple became successful right after the invention of the microchip; neither company could have built its first personal computer without this tiny invention. Without it, we would be living in a much less advanced world today.
Overall, in terms of technology, I agree with National Geographic's claim that the 1980s was the decade that made us. The proof is right in front of our eyes, as we can vividly see the role of technology in our communication, entertainment, education, and every other aspect of human life. The real beginning of technology may have come well before the 1980s, but this decade was when technology truly began to be used to its full potential to serve human wants and needs. In the 1980s, innovators were coming up with small, crazy ideas; yet those ideas have drastically transformed the world we live in today.

Works Cited
Isaacson, Walter. The Innovators. New York: Simon & Schuster, 2014. Print.
Klemens, Guy. The Cellphone: The History and Technology of the Gadget That Changed the World. Jefferson, N.C.: McFarland & Company, 2010. Print.
Ryan, Jeff. Super Mario: How Nintendo Conquered America. New York: Portfolio Penguin, 2011. Print.
“Shop 'Til You Drop.” The '80s: The Decade That Made Us. National Geographic. 15 Apr. 2015. Narr. Rob Lowe. Television.
Simonite, Tom. “Thinking in Silicon.” Technology Review 117.1 (2014): 52-58. EBSCOhost. Web. 12 Jul. 2015.
“Super Power.” The '80s: The Decade That Made Us. National Geographic. 16 Apr. 2015. Narr. Rob Lowe. Television.
Swaine, Michael, and Paul Freiberger. Fire in the Valley: The Birth and Death of the Personal Computer. Pragmatic Bookshelf, 2014. Ebook.

