From Abacus to Analytical Engine
When you hear the word computer,
maybe you think of something like
a beefy gaming desktop with flashing lights.
Or maybe you think of a slim and sleek laptop.
These fancy devices aren't what people
had in mind when computers were first created.
To put it simply, a computer is a device that stores
and processes data by performing calculations.
Before we had actual computer devices,
the term computer was used to
refer to someone who actually did the calculations.
You're probably thinking that's crazy talk.
My computer lets me check social media,
browse the Internet, design graphics.
How can it possibly just perform calculations?
Well, friends, in this course,
we'll be learning how
computer calculations are baked into applications,
social media, games, and
all the other things that you use every day.
But to kick things off,
we'll learn about the journey computers took from
the earliest known forms of
computing to the devices that you know and love today.
In the world of technology, and,
if I'm getting really philosophical, in life,
it is important to know where we've been in
order to understand where we are and where we're going.
Historical context can help you
understand why things work the way they do today.
Have you ever wondered why the alphabet
isn't laid out in order on your keyboard?
The keyboard layout that most of the world
uses today is the QWERTY layout,
distinguished by the Q-W-E-R-T-Y
keys in the top row of the keyboard.
The most common letters that you type aren't found on
the home row where your fingers sit the most.
But why? There are
many stories that claim to answer this question.
Some say it was developed to slow down typists so
they wouldn't jam old mechanical typewriters.
Others claim it was meant to resolve
a problem for telegraph operators.
One thing is for sure: the keyboard layout that
millions of people use
today isn't the most effective one.
Different keyboard layouts have even
been created to try and make typing more efficient.
Now that we're starting to live in
a mobile-centric world with our smartphones,
the landscape for keyboards may change completely.
My typing fingers are crossed.
In the technology industry,
having a little context can go a long way toward
making sense of the concepts you will encounter.
By the end of this lesson,
you'll be able to identify some of
the major advances
in the early history of computers.
Do you know what an abacus is?
It looks like a wooden toy that a child would play with.
But it's actually one of the earliest known computers.
It was invented in 500 BC to count large numbers.
While we have calculators like
the old reliable TI-89 or the ones in our computers,
the abacus is actually still used today.
Over the centuries, humans
built more advanced counting tools,
but they still required a human to
manually perform the calculations.
The first major step forward was the invention of
the mechanical calculator in
the 17th century by Blaise Pascal.
This device used a series of gears and levers to
perform calculations for the user automatically.
While it was limited to addition, subtraction,
multiplication, and division of pretty small numbers,
it paved the way for more complex machines.
The fundamental operations of
the mechanical calculator were later
applied to the textile industry.
Before we had streamlined manufacturing,
looms were used to weave yarn into fabric.
If you wanted to design patterns on your fabric,
that took an incredible amount of manual work.
In the 1800s, a man by the name of
Joseph Jacquard invented a programmable loom.
These looms took a sequence of cards with holes in them.
When the loom encountered a hole,
it would hook the thread underneath it.
If it didn't encounter a hole,
the hook wouldn't thread anything.
Eventually this spun up a design pattern on the fabric.
These cards were known as punch cards.
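To picture that hole-or-no-hole logic, here's a minimal sketch of our own in Python; the card representation and function name are purely illustrative, not anything from the lesson itself.

```python
# Hypothetical sketch: model one row of a punch card as booleans,
# where True means a hole and False means no hole.

def weave_row(card_row):
    """Return what the loom's hooks do for one card row."""
    pattern = []
    for has_hole in card_row:
        if has_hole:
            pattern.append("thread")  # hole found: hook the thread underneath
        else:
            pattern.append("skip")    # no hole: the hook threads nothing
    return pattern

# A short example card; the sequence of holes encodes the design.
card = [True, False, True, True, False]
print(weave_row(card))  # ['thread', 'skip', 'thread', 'thread', 'skip']
```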
While Mr. Jacquard reinvented the textile industry,
he probably didn't realize that his invention would
shape the world of computing and the world itself today.
Pretty epic, Mr. Jacquard, pretty epic.
Let's fast forward a few decades and meet
a man by the name of Charles Babbage.
Babbage was a gifted engineer who
developed a series of machines that are now
known as the greatest breakthrough
on our way to the modern computer.
He built what was called a difference engine.
It was a very sophisticated version of some
of the mechanical calculators we were just talking about.
It could perform fairly
complicated mathematical operations,
but not much else.
Babbage's follow-up to the difference engine
was a machine he called the analytical engine.
He was inspired by Jacquard's
use of punch cards to automatically perform
calculations instead of manually entering them by hand.
Babbage used punch cards in
his analytical engine to allow
people to pre-define a series
of calculations they wanted it to perform.
As impressive as this achievement was,
the analytical engine was still
just a very advanced mechanical calculator.
It took the powerful insights of a mathematician named
Ada Lovelace to realize
the true potential of the analytical engine.
She was the first person to recognize that the machine
could be used for more than pure calculations.
She developed the first algorithm for the engine.
It was the very first example of computer programming.
An algorithm is just a series of
steps that solves a specific problem.
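To make "a series of steps" concrete, here's a small example of our own (not Lovelace's actual program) written in Python: an algorithm that finds the largest number in a list by following the same steps every time.

```python
# A simple illustrative algorithm (our example, not Lovelace's program):
# find the largest number in a list by checking each item in turn.

def find_largest(numbers):
    """Walk through the list, remembering the biggest value seen so far."""
    largest = numbers[0]          # step 1: start with the first number
    for number in numbers[1:]:    # step 2: look at each remaining number
        if number > largest:      # step 3: if it's bigger, remember it
            largest = number
    return largest                # step 4: the remembered value is the answer

print(find_largest([3, 41, 12, 9, 74, 15]))  # prints 74
```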
Because of Lovelace's discovery
that algorithms could be
programmed into the analytical engine,
it became the very first general-purpose
computing machine in history.
A great example that women have had some of
the most valuable minds in technology since the 1800s.