
Technology, the Double-Edged Sword

Technology is an amazing thing, isn't it?  Look at the evolution of the computer.  Below is a history of that evolution.  It is not all-inclusive, but it does document some very important moments, starting in 1801 and ending in 2017.  If you aren't a huge reader, you can jump past the history lesson to the remainder of my blog below, but it is very interesting and I recommend you read through it.

In France in 1801, Joseph Marie Jacquard invented a loom that used punched wooden cards to automatically weave fabric designs.  Early computers would use similar punch cards.

In 1822, an English mathematician named Charles Babbage conceived of a steam-driven calculating machine that would be able to compute tables of numbers.  The project was funded by the English government and was a failure.  More than a century later, however, the world's first computers were actually built.

In 1890, Herman Hollerith designed a punch card system to calculate the 1890 census, accomplishing the task in just three years and saving the government $5 Million (the equivalent of roughly $132 Million today).  Hollerith established the company that would eventually become IBM.

In 1936, Alan Turing presented the notion of a universal machine, later called the Turing Machine, capable of computing anything that is computable.  The central concept of the modern computer was based on his ideas.

In 1937, J.V. Atanasoff, a professor of physics and mathematics at Iowa State University, attempted to build the first computer without gears, cams, belts or shafts.

In 1939, Hewlett-Packard was founded by David Packard and Bill Hewlett in a Palo Alto, California, garage, according to the Computer History Museum.

In 1941, Atanasoff and his graduate student, Clifford Berry, designed a computer that could solve 29 equations simultaneously.  This marked the first time a computer was able to store information in its main memory.

In 1943-1944, two University of Pennsylvania professors, John Mauchly and J. Presper Eckert, built the Electronic Numerical Integrator and Calculator (ENIAC).  It is considered the grandfather of digital computers.  It filled a 20-foot by 40-foot room and had 18,000 vacuum tubes.

In 1946, Mauchly and Eckert left the University of Pennsylvania and received funding from the Census Bureau to build the UNIVAC, the first commercial computer for business and government applications.

In 1947, William Shockley, John Bardeen and Walter Brattain of Bell Laboratories invented the transistor.  They discovered how to make an electric switch with solid materials and no need for a vacuum.

In 1953, Grace Hopper developed an early computer language that eventually evolved into COBOL.  Thomas J. Watson Jr., son of IBM CEO Thomas J. Watson Sr., conceived the IBM 701 EDPM to help the United Nations keep tabs on Korea during the war.

In 1954, the FORTRAN programming language, an acronym for FORmula TRANslation, was developed by a team of programmers at IBM led by John Backus, according to the University of Michigan.

In 1958, Jack Kilby and Robert Noyce unveiled the integrated circuit, known as the computer chip.  Kilby was awarded the Nobel Prize in Physics in 2000 for his work.

In 1964, Douglas Engelbart showed a prototype of the modern computer, with a mouse and a graphical user interface (GUI).  That marked the evolution of the computer from a specialized machine for scientists and mathematicians to technology that is more accessible to the general public.

In 1969, a group of developers at Bell Labs produced UNIX, an operating system that addressed compatibility issues.  Written in the C programming language, UNIX was portable across multiple platforms and became the operating system of choice among mainframes at large companies and government entities.  Due to the slow nature of the system, it never quite gained traction among home PC users.

In 1970, the newly formed Intel unveiled the Intel 1103, the first Dynamic Random Access Memory (DRAM) chip.

In 1971, Alan Shugart led a team of IBM engineers who invented the "Floppy Disk," which allowed data to be shared among computers.

In 1973, Robert Metcalfe, a member of the research staff for Xerox, developed Ethernet for connecting multiple computers and other hardware.

In 1974-1977, a number of personal computers hit the market, including the Scelbi, the Mark-8, the Altair, the IBM 5100, Radio Shack's TRS-80 (known as the "Trash 80"), and the Commodore PET.

In 1975, the January issue of Popular Electronics magazine featured the Altair 8800, described as the "World's first minicomputer kit to rival commercial models."  Two computer geeks, Paul Allen and Bill Gates, offered to write software for the Altair using the new BASIC language.  On April 4th, after the success of this first endeavor, the two childhood friends formed their own software company, Microsoft.

In 1976, Steve Jobs and Steve Wozniak started Apple Computer on April Fool's Day and rolled out the Apple I, the first computer with a single circuit board, according to Stanford University.

In 1977, Radio Shack's initial production run of the TRS-80 was just 3,000 units.  It sold like crazy.  For the first time, non-geeks could write programs and make a computer do what they wished.

In 1977, Jobs and Wozniak incorporated Apple and presented the Apple II at the first West Coast Computer Faire.  It offered color graphics and incorporated an audio cassette drive for storage.

In 1978, accountants rejoiced at the introduction of VisiCalc, the first computerized spreadsheet program.

In 1979, word processing became a reality as MicroPro International released WordStar.  "The defining change was to add margins and word wrap," said creator Rob Barnaby in an email to Mike Petrie in 2000.  "Additional changes included getting rid of command mode and adding a print function.  I was the technical brains - I figured out how to do it, and did it, and documented it."

In 1981, the first IBM personal computer, code-named "Acorn," was introduced.  It used Microsoft's MS-DOS operating system.  It had an Intel chip, two floppy disk drives and an optional color monitor.  Sears & Roebuck and Computerland sold the machines, marking the first time a computer was available through outside distributors.  It also popularized the term PC.

In 1983, Apple's Lisa was the first personal computer with a GUI.  It also featured a drop-down menu and icons.  It flopped but eventually evolved into the Macintosh.  The Gavilan SC was the first portable computer with the familiar flip form factor and the first to be marketed as a "laptop."

In 1985, Microsoft announced Windows, according to Encyclopedia Britannica.  This was the company's response to Apple's GUI.  Commodore unveiled the Amiga 1000, which featured advanced audio and video capabilities.

In 1985, the first dot-com domain name was registered on March 15th, years before the World Wide Web would mark the formal beginning of Internet History.  The Symbolics Computer Company, a small Massachusetts computer manufacturer, registered Symbolics.com.  More than two years later, only 100 dot-coms had been registered.

In 1986, Compaq brought the Deskpro 386 to market.  Its 32-bit architecture provided speed comparable to mainframes.

In 1990, Tim Berners-Lee, a researcher at CERN, the high-energy physics laboratory in Geneva, developed HyperText Markup Language (HTML), which gave rise to the World Wide Web.

In 1993, the Pentium Microprocessor advanced the use of graphics and music on PCs.

In 1994, PCs became gaming machines as "Command & Conquer," "Alone in the Dark 2," "Theme Park," "Magic Carpet," "Descent" and "Little Big Adventure" were among games to hit the market. 

In 1996, Sergey Brin and Larry Page developed the Google Search Engine at Stanford University.

In 1997, Microsoft invested $150 Million in Apple, which was struggling at the time, ending Apple's court case against Microsoft in which it alleged that Microsoft copied the "look and feel" of its operating system.

In 1999, the term Wi-Fi became part of the computing language and users began connecting to the internet without wires.

In 2001, Apple unveiled the Mac OS X operating system, which provided protected memory architecture and pre-emptive multi-tasking, among other benefits.  Not to be outdone, Microsoft rolled out Windows XP, which had a significantly redesigned GUI.

In 2003, the first 64-bit processor, AMD's Athlon 64, became available to the consumer market.

In 2004, Mozilla's Firefox 1.0 challenged Microsoft's Internet Explorer, which was the dominant Web browser.  Facebook, a social networking site, launched.

In 2005, YouTube, a video sharing service, was founded.  Google acquired Android, a Linux-based mobile phone operating system.

In 2006, Apple introduced the MacBook Pro, its first Intel-based, dual-core mobile computer, as well as an Intel-based iMac.  Nintendo's Wii game console hit the market.

In 2007, the iPhone brought many computer functions to the smartphone.

In 2009, Microsoft launched Windows 7, which offered the ability to pin applications to the taskbar and advances in touch and handwriting recognition, among other features.

In 2010, Apple unveiled the iPad, which changed the way consumers viewed media and jumpstarted the dormant tablet computer segment.

In 2011, Google released the Chromebook, a laptop that ran on the Google Chrome OS.

In 2012, Facebook reached 1 Billion Users on October 4th.

In 2015, Apple released the Apple Watch.  Microsoft released Windows 10.

In 2016, the first reprogrammable quantum computer was created.  "Until now, there hasn't been any quantum-computing platform that had the capability to program new algorithms into their system.  They're usually each tailored to attack a particular algorithm," said study lead author Shantanu Debnath, a quantum physicist and optical engineer at the University of Maryland, College Park.

In 2017, the Defense Advanced Research Projects Agency (DARPA) is developing a new "Molecular Informatics" program that uses molecules as computers.  "Chemistry offers a rich set of properties that we may be able to harness for rapid, scalable information storage and processing," Anne Fischer, program manager in DARPA's Defense Sciences Office, said in a statement.  "Millions of molecules exist, and each molecule has a unique three-dimensional atomic structure as well as variables such as shape, size, or even color.  The richness provides a vast design space for exploring novel and multi-value ways to encode and process data beyond the 0s and 1s of current logic-based digital architectures."

That was a lot of good information, and it really shows you a framework for how the evolution of technology works.  Things don't just get produced out of nowhere.  Ideas flow for decades at times; researchers and developers learn from each other, try and try again, and eventually new technology is born, usually after millions, if not billions, of dollars are spent on research and development.

The reason I wanted to go through the history of computers is first to show how long it took to get from point A (the first real conceived idea) to point B (the computer) to point C (the constant improvement of the computer).  In the history above, that journey started in 1801 and carried through to 2017.  That is a very long time for the evolution of one general piece of technology.  The interesting and scary thing I want to talk about next, however, is how the evolution of the PC and the World Wide Web has led to much more accelerated technological advancement in recent years.

Look at the cell phone you are holding right now.  That phone, regardless of the brand, is far more powerful than the IBM System/360 Model 75 mainframe computers that cost $3.5 Million back in the day and made up the entire computing platform NASA used to get our astronauts to the moon.

The Apollo Guidance Computer on board the Apollo 11 Command Module had 64 kilobytes of memory and operated at 0.043 MHz.  In comparison, an iPhone 5s (which most of us look at as the "obsolete" iPhone these days) easily fits into anyone's pocket and has a CPU running at speeds of up to 1.3 GHz, enough to execute millions of calculations per second.  The iPhone's 1 GB of RAM is more than enough to hold the 6 megabytes of code that NASA developed in 1969 to monitor the status of its spacecraft and astronauts.
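To put those numbers in perspective, here is a rough back-of-the-envelope comparison.  This is only a sketch using the approximate figures quoted above (64 KB of memory and 0.043 MHz for the Apollo Guidance Computer, 1 GB of RAM and 1.3 GHz for the iPhone 5s); exact specifications vary by source.

```python
# Rough back-of-the-envelope comparison of the Apollo Guidance Computer (AGC)
# and an iPhone 5s, using the approximate figures quoted above.

AGC_CLOCK_HZ = 0.043e6          # ~0.043 MHz clock
IPHONE_CLOCK_HZ = 1.3e9         # ~1.3 GHz clock

AGC_MEMORY_BYTES = 64 * 1024        # ~64 KB of memory
IPHONE_MEMORY_BYTES = 1024 ** 3     # ~1 GB of RAM

clock_ratio = IPHONE_CLOCK_HZ / AGC_CLOCK_HZ            # roughly 30,000x
memory_ratio = IPHONE_MEMORY_BYTES / AGC_MEMORY_BYTES   # roughly 16,000x

print(f"Clock speed: about {clock_ratio:,.0f} times faster")
print(f"Memory: about {memory_ratio:,.0f} times larger")
```

Even before you account for multiple cores and far faster memory, the phone in your pocket works out to tens of thousands of times the clock speed, with tens of thousands of times the memory, of the computer that helped land people on the moon.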

Every year the cell phone becomes more and more of an all-inclusive platform that lets you do everything you want on one little device, and that happened much faster than the evolution of the computer.  In fact, it happened because of the evolution of the computer.  Newer technologies are evolving much faster than technologies did before the technological age, and that is causing a boom of tech companies to come out of the woodwork.

Tech is great, right?  Well, yes and no.  Technology tends to make everything we do easier.  In factories, one robot can do what three humans used to do, and do it much faster and more accurately.  Remember when you used to go to the library (for those who grew up prior to the internet), sit and rummage through research books to find sources for your school projects, and carry a pocket full of dimes to make copies of the source pages?  Now you have Google and other search engines that can compute more in a second than you could in your entire life.  Information is at your fingertips.  These things are great, but they come with a cost.

The mechanization of assembly and production plants around the world threatens up to 800 Million jobs worldwide by 2030.  Research positions that used to exist within companies around the globe are no longer needed thanks to the technological advances of the World Wide Web.  We will soon see no more compact discs being produced because no one buys them anymore.  That's more jobs lost.  Remember the days when every city had multiple shops that did computer repair?  Those are becoming a thing of the past as well, with computers becoming more of a disposable item than a luxury.  One breaks, you throw it out and get a new one.

Technology threatens jobs around the globe, and at some point in the future, everything is going to come to a head.  Unemployment rates will skyrocket as technology replaces jobs.  At some point, and this could be far into the future but possibly sooner than some people have considered, countries will limit the number of children families can have, like China has in the past.  Otherwise, we will end up overpopulating ourselves into oblivion.

Evolution is constant and fast, and it adapts to the flavor of the day.  Look at cell phones.  People used to have bag phones, then large cellular phones, and then over the years, companies made phones smaller and smaller until they ended up very compact.  Then technology improved further, and companies realized that people aren't just using phones for calling and texting.  They want to use phones to cruise the web, watch videos, work on business applications, etc.  So what happened?  Those small phones started to grow, and now the bigger the screen, the better.

I am interested in your thoughts on the evolution of technology.  How do you think the world is going to be affected overall by technological advances over the next few decades?  Are we going to have a worldwide job crisis?  Is technology ever going to level out and stop evolving?

It's interesting to look back and see how slowly some things evolved and how quickly others have evolved and are still evolving.  You buy a television this year for Christmas, and by June of next year your television is old news.  In thinking about this blog, though, I started by looking backward at the evolution of technology, but I find it more frightening to look forward at the future of technology.

Dunk