The Path to Modern Computers
Welcome back. In this video,
we'll be learning how huge devices
like the Analytical Engine grew, I mean shrunk, into
the computing devices that we use today.
Computing has been developing steadily since the invention of the Analytical Engine, but it didn't make a huge leap forward until World War II.
Back then, research into computing was super expensive.
Electronic components were large and you
needed lots of them to compute anything of value.
This also meant that computers took up a ton of space and
many efforts were underfunded and unable to make headway.
But when the war broke out,
governments started pouring money and
resources into computing research.
They wanted to help develop technologies that would
give them advantages over other countries.
Lots of efforts were spun up and
advancements were made in fields like cryptography.
Cryptography is the art of writing and solving codes.
During the war, computers were used to process
secret messages from enemies
faster than a human could ever hope to do.
Today the role cryptography plays in
secure communication is
a critical part of computer security,
which we'll learn more about in a later course.
For now, let's look at how computers started to
make a dramatic impact on society.
After the war, companies like IBM, Hewlett Packard,
and others were advancing
their technologies into the academic,
business, and government realms.
Lots of technological advancements in computing were made in the 20th century, thanks to direct interest from governments, scientists, and companies left over from World War II.
These organizations invented new methods to store data in computers, which fueled the growth of computational power.
Consider this: until the 1950s, punch cards were a popular way to store data. Operators would have decks of ordered punch cards that were used for data processing. If they dropped the deck by accident and the cards got out of order, it was almost impossible to get them sorted again. There were obviously some limitations to punch cards.
But thanks to new technological innovations like magnetic tape, people began to store more data on more reliable media. Magnetic tape stored data by magnetizing it onto a spool of tape. This left stacks and stacks of punch cards to collect dust while magnetic tape began to revolutionize the industry.
I wasn't joking when I said
early computers took up a lot of space.
They had huge machines to read data and racks of vacuum tubes that helped move that data. Vacuum tubes controlled the electricity voltage in all sorts of electronic equipment, like televisions and radios. But these vacuum tubes were bulky and broke all the time.
Imagine what the work of
an IT support specialist was
like in those early days of computing.
The job description might have included crawling around
inside huge machines filled
with dust and creepy crawly things,
or replacing vacuum tubes and swapping out those punch cards.
In those days, doing
some debugging might've taken on a more literal meaning.
Well-known computer scientist Admiral Grace Hopper had
a favorite story involving
some engineers working on the Harvard Mark II computer.
They were trying to figure out the source of a problem in a relay.
After doing some investigating,
they discovered the source of their trouble was a moth,
a literal bug in the computer.
The ENIAC was one of the earliest forms of general-purpose computers. It was a wall-to-wall convolution of massive electronic components and wires, with 17,000 vacuum tubes, and it took up about 1,800 square feet of floor space.
Imagine if you had to work with that scale of equipment today. I wouldn't want to share an office with 1,800 square feet of machinery.
Eventually, the industry started using
transistors to control electricity voltages.
Transistors are now a fundamental component of all electronic devices.
Transistors perform
almost the same functions as vacuum tubes,
but they are more compact and more efficient.
You can easily have billions of
transistors in a small computer chip today.
Throughout the decades,
more and more advancements were made.
The very first compiler was
invented by Admiral Grace Hopper.
Compilers made it possible to translate human-readable instructions, written in a programming language, into machine code.
The big takeaway is that this advancement was
a huge milestone in
computing that led to where we are today.
Now, learning programming languages is
accessible for almost anyone anywhere.
We no longer have to learn how to write
machine code in ones and zeros.
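To give a rough sense of what that translation looks like, here's a minimal sketch in C. The assembly shown in the comments is an illustrative assumption, not the exact output of any particular compiler, since real machine code varies by compiler and CPU.

```c
#include <stdio.h>

int main(void) {
    /* One line of human-readable source code. */
    int sum = 2 + 3;

    /* A compiler translates lines like the one above into
       machine instructions. On an x86 CPU that might look
       roughly like:
           mov eax, 2
           add eax, 3
       which is ultimately stored as the ones and zeros the
       processor executes. (Illustrative only; actual output
       depends on the compiler and target machine.) */
    printf("sum = %d\n", sum);
    return 0;
}
```

The point is that we write the readable version on top and let the compiler produce the binary version for us.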
Eventually, the industry saw the first hard disk drives and microprocessors. Programming languages then became the predominant way for engineers to develop computer software.
Computers were getting smaller and smaller,
thanks to advancements in electronic components.
Instead of filling up entire rooms like ENIAC,
they were getting small enough to fit on tabletops.
The Xerox Alto was the first computer
that resembled the computers we're familiar with now.
It was also the first computer to implement a graphical user interface that used icons, a mouse, and windows.
Some of you may remember that the sheer size and cost of
historical computers made it almost
impossible for an average family to own one.
Instead, they were usually found in
military and university research facilities.
When companies like Xerox started building machines at a relatively affordable price and in a smaller form factor, the consumer age of computing began. Then, in the 1970s, a young engineer named Steve Wozniak invented the Apple I, a single-board computer meant for hobbyists. He and his friend Steve Jobs went on to create a company called Apple Computer. Their follow-up to the Apple I, the Apple II, was ready for the average consumer to use.
The Apple II was a phenomenal success, selling for nearly two decades and giving a new generation of people access to personal computers. For the first time, computers became affordable for the middle class and helped bring computing technology into both the home and office.
In the 1980s, IBM introduced its personal computer.
It was released with a primitive version of an operating system called MS-DOS, or Microsoft Disk Operating System. Side note: modern operating systems don't just have text anymore; they have beautiful icons, words, and images like what we see on our smartphones.
It's incredible how far we've come from
the first operating system to
the operating systems we use today.
Back to IBM's PC: it was widely adopted and made more accessible to consumers thanks to a partnership with Microsoft. Microsoft, founded by Bill Gates, eventually created Microsoft Windows.
For decades, it was the preferred
operating system in the workplace and
dominated the computing industry because it could run on any compatible hardware.
With more computers in the workplace,
the dependence on IT rose,
and so did the demand for skilled workers who could support that technology.
Not only were personal computers
entering the household for the first time,
but a new type of computing was emerging: video games.
During the 1970s and 80s, coin-operated entertainment machines called arcade games became more and more popular. A company called Atari developed one of the first coin-operated arcade games, Pong, in 1972.
Pong was such a sensation that people were standing in
lines at bars and
rec centers for hours at a time to play.
Entertainment computers like Pong
launched the video game era.
Eventually, Atari went on to launch the Video Computer System, which helped bring personal video game consoles into the home.
Video games have contributed to the evolution of computers in a very real way; tell that to the next person who dismisses them as a toy. Video games showed people that computers didn't always have to be all work and no play. They were a great source of entertainment too.
This was an important milestone
for the computing industry,
since at that time,
computers were primarily used
in the workplace or at research institutions.
With huge players in the market like Apple Macintosh and Microsoft Windows taking over the operating system space, a programmer by the name of Richard Stallman started developing a free Unix-like operating system.
Unix was an operating system developed
by Ken Thompson and Dennis Ritchie,
but it wasn't cheap and it wasn't available to everyone.
Stallman created an OS that he called GNU.
It was meant to be free to use with
similar functionality to Unix.
Unlike Windows or Macintosh, GNU wasn't owned by a single company; its code was open source, which meant that anyone could modify and share it.
GNU didn't evolve into a full operating system, but it set a foundation for the formation of one of the largest open-source operating systems, Linux, which was created by Linus Torvalds.
We'll get into the technical details of Linux later in this course, but just know that it's a major player in today's operating systems. As an IT support specialist, it's very likely that you'll work with open-source software. You might already be using some, like the internet browser Mozilla Firefox.
By the early 90s, computers started getting even smaller. Then a real game changer made its way onto the scene: PDAs, or personal digital assistants, which allowed computing to go mobile. These mobile devices included portable media players, word processors, email clients, internet browsers, and more, all in one handy handheld device.
In the late 1990s,
Nokia introduced a PDA with mobile phone functionality. This ignited an industry of pocketable computers, or as we know them today, smartphones.
In mere decades, we went from
having computers that weighed tons
and took up entire rooms to having
powerful computers that fit in our pockets.
It's almost unbelievable, and it's just the beginning.
If you're stepping into the IT industry,
it's essential that you understand how to
support the growing needs of this ever-changing technology. Computer support 50 years ago consisted of changing vacuum tubes and stacking punch cards,
things that no longer exist in today's IT world.
As computers evolved in both complexity and prevalence, so did the knowledge required to support and maintain them.
In 10 years, IT support could require working through virtual reality lenses; you never know. Who knows what the future holds? But right now, it's an exciting time to be at the forefront of this industry.
Now that we've run down where computers came
from and how they've evolved over the decades,
let's get a better grasp on how computers actually work.