Introduction to IT Support (a history lesson)

Navigating Coursera


How to Pass the Class

You can review videos, readings, discussion forums, in-video questions, and practice quizzes in the program for free. However, to access graded assignments and be eligible to receive your official Google IT Support certificate, you must:


  • Pass all graded assignments in the five courses at the minimum passing level, or above. Each graded assignment in a course is part of a cumulative grade for that course. The passing score for each course is 80%. 

Getting and Giving Help

You can get/give help in the following ways:

  1. Coursera Learner Support: Use the Learner Help Center to find information on specific technical issues, such as error messages, difficulty submitting assignments, or problems with video playback. If you can’t find an answer in the documentation, you can also report your problem to Coursera support by clicking the Contact Us! link at the bottom of Help Center articles. If you're having trouble accessing any of the course content, please reach out to Coursera support.

  2. Qwiklabs Support: Please use the Qwiklabs support request form to log any issues related to accessing and using Qwiklabs. A member of the Qwiklabs team will work with you to help resolve them.

  3. Course Content Issues: You can also flag problems in course materials. When you rate course materials, the instructor will see your ratings and feedback; other learners won’t. To rate course materials:

  • Open the course material you want to rate. You can only rate videos, readings, and quizzes.

  • If the item was interesting or helped you learn, click the thumbs-up icon.

  • If the item was unhelpful or confusing, click the thumbs-down icon.

Participate in program surveys


During this certificate program, you will be asked to complete a few short surveys. These are part of a research study being conducted to understand how effectively the certificate meets your career training needs. Keep reading for a summary of what each survey will cover.

Your survey participation is optional but extremely helpful in making this program as effective as possible. All data is kept confidential and is aggregated for review in accordance with Coursera’s privacy policy. Your name is separated from your data when it is stored.

There are no right or wrong answers. Your responses or personal data:

Won’t affect your program experience, scores, or ability to receive a certificate or job offer.

Won’t be shared outside of our research team, unless you give permission to share your contact information with hiring partners.

Thanks for your consideration and time!

Entry survey

First, you will have an opportunity to answer a brief survey to help researchers understand why you enrolled in this certificate program. If you don’t fill out the survey now, you will receive an invitation to fill it out after completing your first video or activity.

 

The survey asks about your experiences leading up to this program and the goals you hope to achieve. This is critical information to ensure your needs as a learner are met and that this program will continue to be offered in the future.

Individual course feedback

After you complete the last graded assignment within an individual course, you might be asked to complete a survey. This survey will revisit questions from the previous survey and ask what you have learned up to that point in the program. Again, filling out this information is voluntary but extremely beneficial to the program and future learners.

Certificate completion survey

After you complete the last graded assignment in the final (fifth) course of the certificate program, you will be asked to complete a survey that revisits some earlier questions and asks what you have learned throughout the duration of the program. This survey also asks whether you would like to share your contact information with prospective employers. Filling out the survey and sharing your contact information with prospective employers is completely optional and will not affect your course experience, scores, or ability to receive a certificate or job offer in any way.

How to Use Discussion Forums


Upvoting Posts

When you enter the discussion forum for your course, you will see an Upvote button under each post. We encourage you to upvote posts you find thoughtful, interesting, or helpful. This is the best way to ensure that quality posts will be seen by other learners in the course. Upvoting will also increase the likelihood that important questions get addressed and answered.

Report Abuse

Coursera's Code of Conduct prohibits:

  • Bullying or threatening other users

  • Posting spam or promotional content

  • Posting mature content

  • Posting assignment solutions (or other violations of the Honor Code)

Please report any posts that infringe upon copyright or are abusive, offensive, or that otherwise violate Coursera’s Honor Code by using the Report this option found under the menu arrow to the right of each post.

Following

If you find a particular thread interesting, click the Follow button under the original post of that thread page. When you follow a post, you will receive an email notification anytime a new post is made.

Improving Your Posts

Course discussion forums are your chance to interact with thousands of like-minded individuals around the world. Getting their attention is one way to do well in this course. In any social interaction, certain rules of etiquette are expected and contribute to more enjoyable and productive communication. The following are tips for interacting in this course via the forums, adapted from guidelines originally compiled by AHA! and Chuq Von Rospach & Gene Spafford:

  1. Stay on topic in existing forums and threads. Off-topic posts make it hard for other learners to find information they need. Post in the most appropriate forum for your topic, and do not post the same thing in multiple forums.

  2. Use the filters at the top of the forum page (Latest, Top, and Unanswered) to find active, interesting content.

  3. Upvote posts that are helpful and interesting.

  4. Be civil. If you disagree, explain your position with respect and refrain from any and all personal attacks.

  5. Make sure you’re understood, even by non-native English speakers. Try to write full sentences, and avoid text-message abbreviations or slang. Be careful when you use humor and sarcasm as these messages are easy to misinterpret.

  6. If asking a question, provide as much information as possible, what you’ve already considered, what you’ve already read, etc.

  7. Cite appropriate references when using someone else’s ideas, thoughts, or words.

  8. Do not use a forum to promote your product, service, or business.

  9. Conclude posts by inviting other learners to extend the discussion. For example, you could say “I would love to understand what others think.”

  10. Do not post personal information about other posters in the forum.

  11. Report spammers.

 

For more details, refer to Coursera's Forum Code of Conduct.

These tips and tools for interacting in this course via the forums were adapted from guidelines originally compiled by the University of Illinois.

 

Get to Know Your Fellow October Completers


Overview

Working well with your fellow learners is an important part of an online course. So, at the beginning of this course, we’d like you to take time to "break the ice" and get to know each other. You may already know some of your fellow learners or have just met them. Establishing personal interaction with other learners will make your online learning experience much more enjoyable and engaging. So, we encourage you to participate in this activity, though it’s optional.

Meet and Greet

Tell everyone your story! We encourage you to share a brief introduction about yourself to your fellow learners. Read some of your fellow learners' postings. Pick at least 2 postings that are most interesting to you and add your friendly comments.

You can go to the Meet Your Fellow October Completers Discussion Prompt and add your introduction story there.

Suggested Topics

  • Hopes and goals. Why did you decide to enroll in the IT Support Professional Certificate? What are your expectations of this course? Are you excited to learn about IT? What do you hope to put into place in your life the day this program is over?

  • Tips and strategies. How do you plan to complete the certificate by October? When/how often are you going to set aside time to learn each week? Is there a place in your home or neighborhood where you won’t get distracted?

  • Buddy up. Share with us any other information that might help others in the cohort find you when searching the forums. What common interests might you share with your classmates? Find an “accountability buddy” or “buddies” so you can help keep each other on track!

Updating Your Profile

Optionally, please consider updating your profile, which can also be accessed by clicking the Profile link in the menu that appears when you click on your name at the top-right corner of this screen. When people find you in the forums, they can click on your name to view your complete profile and get to know you more.

From Abacus to Analytical Engine

When you hear the word computer,
maybe you think of something like
a beefy gaming desktop with flashing lights.
Or maybe you think of a slim and sleek laptop.
These fancy devices aren't what people
had in mind when computers were first created.
To put it simply, a computer is a device that stores
and processes data by performing calculations.
Before we had actual computer devices,
the term computer was used to
refer to someone who actually did the calculation.
You're probably thinking that's crazy talk.
My computer lets me check social media,
browse the Internet, design graphics.
How can it possibly just perform calculations?
Well, friends, in this course,
we'll be learning how
computer calculations are baked into applications,
social media, games, and all
the things that you use every day.
But to kick things off,
we'll learn about the journey computers took from
the earliest known forms of
computing into the devices that you know and love today.
In the world of technology, and if I'm
getting really philosophical, in life,
it is important to know where we've been in
order to understand where we are and where we're going.
Historical context can help you
understand why things work the way they do today.
Have you ever wondered why the alphabet
isn't laid out in order on your keyboard?
The keyboard layout that most of the world
uses today is called QWERTY,
distinguished by the Q-W-E-R-T-Y
keys in the top row of the keyboard.
The most common letters that you type aren't found on
the home row where your fingers sit the most.
But why? There are
many stories that claim to answer this question.
Some say it was developed to slow down typists so
they wouldn't jam old mechanical typewriters.
Others claim it was meant to resolve
a problem for telegraph operators.
One thing is for sure: the keyboard layout that
millions of people use
today isn't the most effective one.
Different keyboard layouts have even
been created to try and make typing more efficient.
Now that we're starting to live in
a mobile centric world with our smartphones,
the landscape for keyboards may change completely.
My typing fingers are crossed.
In the technology industry,
having a little context can go a long way toward
making sense of the concepts you will encounter.
By the end of this lesson,
you'll be able to identify some of
the major advances
in the early history of computers.
Do you know what an abacus is?
It looks like a wooden toy that a child would play with.
But it's actually one of the earliest known computers.
It was invented in 500 BC to count large numbers.
While we have calculators like
the old reliable TI-89 or the ones in our computers,
the abacus is actually still used today.
Over the centuries, humans
built more advanced counting tools,
but they still required a human to
manually perform the calculations.
The first major step forward was the invention of
the mechanical calculator in
the 17th century by Blaise Pascal.
This device used a series of gears and levers to
perform calculations for the user automatically.
While it was limited to addition, subtraction,
multiplication, and division of pretty small numbers,
it paved the way for more complex machines.
The fundamental operations of
the mechanical calculator were later
applied to the textile industry.
Before we had streamlined manufacturing,
looms were used to weave yarn into fabric.
If you wanted to design patterns on your fabric,
that took an incredible amount of manual work.
In the 1800s, a man by the name of
Joseph Jacquard invented a programmable loom.
These looms took a sequence of cards with holes in them.
When the loom encountered a hole,
it would hook the thread underneath it.
If it didn't encounter a hole,
the hook wouldn't thread anything.
Eventually, this wove a design pattern into the fabric.
These cards were known as punch cards.
While Mr. Jacquard reinvented the textile industry,
he probably didn't realize that his invention would
shape the world of computing and the world itself today.
Pretty epic, Mr. Jacquard, pretty epic.
Let's fast forward a few decades and meet
a man by the name of Charles Babbage.
Babbage was a gifted engineer who
developed a series of machines that are now
known as the greatest breakthrough
on our way to the modern computer.
He built what was called a difference engine.
It was a very sophisticated version of some
of the mechanical calculators we were just talking about.
It could perform fairly
complicated mathematical operations,
but not much else.
Babbage's follow-up to the difference engine
was a machine he called the analytical engine.
He was inspired by Jacquard's
use of punch cards to automatically perform
calculations instead of manually entering them by hand.
Babbage used punch cards in
his Analytical Engine to allow
people to predefine a series
of calculations they wanted it to perform.
As impressive as this achievement was,
the analytical engine was still
just a very advanced mechanical calculator.
It took the powerful insights of a mathematician named
Ada Lovelace to realize
the true potential of the analytical engine.
She was the first person to recognize that the machine
could be used for more than pure calculations.
She developed the first algorithm for the engine.
It was the very first example of computer programming.
An algorithm is just a series of
steps that solves a specific problem.
Because of Lovelace's discovery,
the algorithms could be
programmed into the analytical engine.
It became the very first general
purpose computing machine in history.
It's a great example that women have had some of
the most valuable minds in technology since the 1800s.

The Path to Modern Computers

Welcome back. In this video,
we'll be learning how huge devices
like the analytical engine grew,
I mean shrunk into
the computing devices that we use today.
The development of computing has been
steadily growing since the invention of
the analytical engine but didn't make
a huge leap forward until World War II.
Back then, research into computing was super expensive.
Electronic components were large and you
needed lots of them to compute anything of value.
This also meant that computers took up a ton of space and
many efforts were underfunded and unable to make headway.
But when the war broke out,
governments started pouring money and
resources into computing research.
They wanted to help develop technologies that would
give them advantages over other countries.
Lots of efforts were spun up and
advancements were made in fields like cryptography.
Cryptography is the art of writing and solving codes.
During the war, computers were used to process
secret messages from enemies
faster than a human could ever hope to do.
Today the role cryptography plays in
secure communication is
a critical part of computer security,
which we'll learn more about in a later course.
For now, let's look at how computers started to
make a dramatic impact on society.
After the war, companies like IBM, Hewlett Packard,
and others were advancing
their technologies into the academic,
business, and government realms.
Lots of technological advancements
in computing were made in the 20th century,
thanks to direct interest from governments,
scientists, and companies left over from World War II.
These organizations invented new methods
to store data in computers,
which fueled the growth of computational power.
Consider this: until the 1950s,
punch cards were a popular way to store data.
Operators would have decks of
ordered punch cards that were used for data processing.
If they dropped a deck by accident
and the cards got out of order,
it was almost impossible to get them sorted again.
There were obviously some limitations to punch cards.
But thanks to new technological innovations
like magnetic tape and its counterparts,
people began to store more data on more reliable media.
A magnetic tape worked by magnetizing data onto a tape.
This left stacks and stacks
of punchcards to collect dust,
while the new magnetic tape counterparts
began to revolutionize the industry.
I wasn't joking when I said
early computers took up a lot of space.
They had huge machines to read data and
racks of vacuum tubes that helped move that data.
Vacuum tubes controlled the electricity voltages in
all sorts of electronic equipment, like televisions and radios.
But these specific vacuum tubes
were bulky and broke all the time.
Imagine what the work of
an IT support specialist was
like in those early days of computing.
The job description might have included crawling around
inside huge machines filled
with dust and creepy crawly things,
or replacing vacuum tubes
and swapping out those punchcards.
In those days, doing
some debugging might've taken on a more literal meaning.
Well-known computer scientist Admiral Grace Hopper had
a favorite story involving
some engineers working on the Harvard Mark II computer.
They were trying to figure out
the source of the problems in a relay.
After doing some investigating,
they discovered the source of their trouble was a moth,
a literal bug in the computer.
The ENIAC was one of
the earliest forms of general-purpose computers.
It was a wall-to-wall convolution
of massive electronic components and wires.
It used 17,000 vacuum tubes and took up
about 1,800 square feet of floor space.
Imagine if you had to work with
that scale of equipment today.
I wouldn't want to share an office
with 1,800 square feet of machinery.
Eventually, the industry started using
transistors to control electricity voltages.
This is now a fundamental component
of all electronic devices.
Transistors perform
almost the same functions as vacuum tubes,
but they are more compact and more efficient.
You can easily have billions of
transistors in a small computer chip today.
Throughout the decades,
more and more advancements were made.
The very first compiler was
invented by Admiral Grace Hopper.
Compilers made it possible to translate
human language via
a programming language into machine code.
The big takeaway is that this advancement was
a huge milestone in
computing that led to where we are today.
Now, learning programming languages is
accessible for almost anyone anywhere.
We no longer have to learn how to write
machine code in ones and zeros.
Eventually, the industry gave way to
the first hard disk drives and microprocessors.
Then programming languages started becoming
the predominant way for
engineers to develop computer software.
Computers were getting smaller and smaller,
thanks to advancements in electronic components.
Instead of filling up entire rooms like ENIAC,
they were getting small enough to fit on tabletops.
The Xerox Alto was the first computer
that resembled the computers we're familiar with now.
It was also the first computer to implement
a graphical user interface that
used icons, a mouse, and windows.
Some of you may remember that the sheer size and cost of
historical computers made it almost
impossible for an average family to own one.
Instead, they were usually found in
military and university research facilities.
When companies like Xerox started building machines at
a relatively affordable price
and in a smaller form factor,
the consumer age of computing began.
Then, in the 1970s,
a young engineer named
Steve Wozniak invented the Apple I,
a single-board computer meant for hobbyists.
With his friend Steve Jobs,
they created a company called Apple Computer.
Their follow-up to the Apple I,
the Apple II, was ready for the average consumer to use.
The Apple II was a phenomenal success,
selling for nearly two decades and giving
a new generation of people access to personal computers.
For the first time, computers
became affordable for the middle-class
and helped bring computing technology
into both the home and office.
In the 1980s, IBM introduced its personal computer.
It was released with a primitive version
of an operating system called
MS-DOS, or Microsoft Disk Operating System.
Side-note, modern operating systems
don't just have text anymore,
they have beautiful icons, words,
and images like what we see on our smartphones.
It's incredible how far we've come from
the first operating system to
the operating systems we use today.
Back to IBM's PC:
it was widely adopted and made
more accessible to consumers,
thanks to a partnership with Microsoft.
Microsoft, founded by Bill Gates,
eventually created Microsoft Windows.
For decades, it was the preferred
operating system in the workplace and
dominated the computing industry because
it could be run on any compatible hardware.
With more computers in the workplace,
the dependence on IT rose,
and so did the demand for
skilled workers who could support that technology.
Not only were personal computers
entering the household for the first time,
but a new type of computing was emerging, video games.
During the 1970s and 80s,
coin-operated entertainment machines called
arcades became more and more popular.
A company called Atari developed one of the first coin
operated arcade games in 1972 called Pong.
Pong was such a sensation that people were standing in
lines at bars and
rec centers for hours at a time to play.
Entertainment computers like Pong
launched the video game era.
Eventually Atari went on to
launch the Video Computer System,
which helped bring personal video game consoles into the home.
Video games have contributed to
the evolution of computers in a very real way,
tell that to the next person who dismisses them as a toy.
Video games showed people that computers didn't
always have to be all work and no play.
They were a great source of entertainment too.
This was an important milestone
for the computing industry,
since at that time,
computers were primarily used
in the workplace or at research institutions.
With huge players in the market like Apple
Macintosh and Microsoft Windows
taking over the operating system space,
a programmer by the name of Richard Stallman started
developing a free Unix-like operating system.
Unix was an operating system developed
by Ken Thompson and Dennis Ritchie,
but it wasn't cheap and it wasn't available to everyone.
Stallman created an OS that he called GNU.
It was meant to be free to use with
similar functionality to Unix.
Unlike Windows or Macintosh,
GNU wasn't owned by a single company,
its code was open source,
which meant that anyone could modify and share it.
GNU didn't evolve into a full operating system,
but it set a foundation for the formation of one
of the largest open-source operating systems,
Linux, which was created by Linus Torvalds.
We'll get into the technical details of
Linux later in this course,
but just know that it's a major player
in today's operating systems.
As an IT support specialist,
it is very likely that you'll
work with open-source software.
You might already be using some, like
the Internet browser Mozilla Firefox.
By the early 90s, computers started getting even
smaller. Then a real game changer
made its way onto the scene:
PDAs, or personal digital assistants,
which allowed computing to go mobile.
These mobile devices included
portable media players, word processors,
email clients, Internet browsers,
and more, all in one handy handheld device.
In the late 1990s,
Nokia introduced a PDA with mobile phone functionality.
This ignited an industry of
pocketable computers, or as
we know them today, smartphones.
In mere decades, we went from
having computers that weigh tons
and took up entire rooms to having
powerful computers that fit in our pockets.
It's almost unbelievable, and it's just the beginning.
If you're stepping into the IT industry,
it's essential that you understand how to
support the growing need
of this ever-changing technology.
Computer support 50 years ago consisted of
changing vacuum tubes and stacking punchcards,
things that no longer exist in today's IT world.
As computers evolved in
both complexity and prevalence,
so did the knowledge required to support and maintain them.
In 10 years,
IT support could require working through
virtual reality lenses; you never know.
Who knows what the future holds,
but right now it is
an exciting time to be at the forefront of this industry.
Now that we've run down where computers came
from and how they've evolved over the decades,
let's get a better grasp on how computers actually work.

Pioneers in Computing and IT

Computer technology has come a long way since the first computer was invented. Along the way, many people from diverse backgrounds contributed inventions and innovations that helped us get to where we are today with modern computers. Without these individuals, information technology would not be where it is today. 

Early Computer Pioneers

Ada Lovelace

Ada Lovelace was born in 1815 to Anna Milbanke and the poet Lord Byron. Her mother, Anna Milbanke, educated her to excel in mathematics. When Lovelace was still young, she was shown the Difference Engine (a mechanical calculator developed by Charles Babbage), and in 1843 she published a set of notes containing the first computer algorithm, written for Babbage's follow-up machine, the Analytical Engine. Lovelace posited at the time that computers would eventually be used outside of mathematics for things like composing music, and made predictions about how technology would influence society. 

Alan Turing

Alan Turing was born in 1912. While completing his degrees, he developed the concept of the Turing machine. Turing proved that there were some yes/no mathematical questions that could never be solved computationally which defined computation and its limitations. These findings would go on to become one of the seeds of computer science and his conceptual Turing machine (so named by his Doctoral advisor) is considered a predecessor of modern computer programs. During the Second World War, Turing developed the Turing-Welchman Bombe which was used to decipher Nazi codes and intercept Nazi messages. After the war, Turing's Imitation Game (now known as the Turing test) was created as a means to evaluate the abilities of artificial intelligence. 

Margaret Hamilton

Margaret Hamilton was born in 1936. While working in the meteorology department at the Massachusetts Institute of Technology, she developed software for predicting weather. Later Hamilton would go on to work on the software that was used in the NASA Apollo command and lunar modules. With her experience writing software, she wanted to ensure that this skill would get its due respect and coined the term “software engineering.” Culminating her experience working on the Apollo missions and moon landings, Hamilton formalized what she learned into a theory that would later become the Universal System Language. 

Admiral Grace Hopper

Grace Hopper was born in 1906. During the Second World War, she joined the US Navy Reserve after taking a leave from her role as a mathematics professor at Vassar College. In the Navy, she was assigned to the Bureau of Ships Computation Project at Harvard University, where she worked on the programming team for the Mark I computer. After the war and her time at Harvard, she began working on more powerful computers and recommended that a programming language be developed that used English words rather than symbols. This concept would eventually become FLOW-MATIC, the first programming language to use English words, which also necessitated the invention of the first compiler (a program that translates source code into machine code). Notably, she is also credited with first using the term “computer bug” after a real bug (a moth) flew into a computer she was working on. Later in her career, she was one of the designers of COBOL, a programming language that is still in use today. 

NASA and the Human Computers 

The following women all worked on various NASA projects. Some were even hired as human computers. They were tasked with completing complex calculations by hand for all sorts of situations, from wartime thrust-to-weight ratios to Apollo orbit trajectories. They all went on to have impressive careers in mathematics and computer science. 

Annie Easley developed the energy analytics code used to analyze power technology, including the battery technology used in Centaur rockets and early hybrid vehicles.

Katherine Johnson was a physicist, mathematician, and space scientist who provided the calculation for important missions like the first orbit of the Earth and the Apollo 11 moon landing. 

Dorothy Vaughan was a mathematician who would eventually become the first African American supervisor at NACA (the National Advisory Committee for Aeronautics, which would later become NASA) and a FORTRAN expert programmer working on the Scout Launch Vehicle Program (a family of rockets that placed small satellites into orbit). 

Mary Jackson was NASA’s first Black female engineer. She worked on wind tunnel and flight experiments and would go on to earn NASA’s most senior engineering title. 

Melba Roy Mouton was a Head Mathematician at NASA working on Project Echo, the first experiment in passive satellite communication. At NASA, she wrote programs that calculated locations and trajectories of aircraft. 

Evelyn Boyd Granville worked on multiple projects in the Apollo and Mercury programs for NASA. She worked on computer techniques related to concepts like celestial mechanics and trajectory computation. 

Innovators in Modern Technology

Hedy Lamarr

Hedy Lamarr was born in 1914. A movie actress during the golden age of Hollywood, she was also a self-taught inventor. During the Second World War, she read about radio-controlled torpedoes which could potentially be jammed by enemy forces. She and a composer friend proposed and patented an idea for a frequency-hopping radio signal that used existing player piano technology. The principles of this work would eventually be used in familiar technologies like Wi-Fi, Bluetooth, and GPS. 

Guillermo Gonzalez Camarena

Guillermo Gonzalez Camarena was born in 1917. An electrical engineer, in 1940 he patented an adapter that let monochrome cameras use colors. This technology was one of the earliest forms of color television. Camarena’s system would eventually be used by NASA for the Voyager mission and made color images of Jupiter possible.

Gerald (Jerry) Lawson 

Jerry Lawson was born in 1940. Working as a semiconductor engineer for the Fairchild company, he worked on a team that developed the Fairchild Channel F, a color video game console that was designed to use interchangeable game cartridges. Previously, most game systems had built-in programming. He would later be dubbed the “father of the video game cartridge” for this work. 

Mark E. Dean

Mark Dean was born in 1957. An inventor and computer scientist, he is the chief engineer of the IBM team that released the IBM personal computer. He holds three of the nine patents for the PC. He and his team also created the first gigahertz computer chip and he also helped develop the color PC monitor. Along with Dennis Moeller, he developed the Industry Standard Architecture (ISA) bus which was a precursor to modern bus structures like PCI and PCI express. 

Clarence “Skip” Ellis

Clarence Ellis was born in 1943. He was a computer scientist and professor who was a pioneer in Computer-Supported Cooperative Work (CSCW) and groupware. In fact, while working at Xerox PARC, he and his team developed a groupware system called OfficeTalk. For the first time, this system allowed for collaboration from a distance using Ethernet. He also focused on icon-based graphical user interfaces (GUIs), which have become prevalent in modern computing. 

Gladys West

Gladys West was born in 1930. A mathematician, she was hired to work for the US Navy to more accurately model the shape of the Earth. She used algorithms to account for all sorts of variations in the shape of the Earth and her model would eventually be used as the basis for the Global Positioning System (GPS). 

These individuals are a few notable examples, but this is by no means a complete list!

 

Character Encoding

By the end of this video,
you'll learn how we can represent the words, numbers,
emojis, and more we see on our screens from
only these 256 possible values.
It's all thanks to character encoding.
Character encoding is used to assign
our binary values to
characters so that we as humans can read them.
We definitely wouldn't want to see
all the text in our emails and
webpages rendered in complex sequences of zeros and ones.
This is where character encodings come in handy.
You can think of character encoding as a dictionary.
It's a way for your computers to look up
which human character should be
represented by a given binary value.
The oldest character encoding standard used is ASCII.
It represents the English alphabet,
digits, and punctuation marks.
The first character in the ASCII to
binary table, a lowercase a,
maps to 01100001 in binary.
This is done for all the characters
you can find in the English alphabet,
as well as numbers and some special symbols.
The great thing with ASCII was that it only needed
128 values out of our possible 256.
It lasted for a very long time,
but eventually, it wasn't enough.
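As a quick aside (not part of the lesson itself), you can check the lowercase "a" example with a few lines of Python, using the built-in ord, chr, and format functions:

```python
# Character encoding maps binary values to human-readable characters.
# ord() gives the numeric code a character maps to; chr() goes the other way.
code = ord('a')
print(code)                 # 97, the ASCII value of lowercase 'a'
print(format(code, '08b'))  # 01100001, the binary pattern from the lesson
print(chr(0b01100001))      # a, decoding the binary value back to a character
```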
Other character encoding standards were
created to represent different languages,
different amounts of characters, and more.
Eventually, they would require
more than the 256 values we were allowed to have.
Then came UTF-8,
the most prevalent encoding standard used today.
Along with having the same ASCII table,
it also lets us use a variable number of bytes.
What do I mean by that? Think of any emoji.
It's not possible to make emojis with
a single byte since we can
only store one character in a byte.
Instead, UTF-8 allows us to
store a character in more than one byte,
which means endless emoji fun.
UTF-8 is built off the Unicode Standard.
We won't go into much detail,
but the Unicode Standard helps us
represent character encoding in a consistent manner.
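As an optional illustration (again in Python, and not something covered in the video), you can see UTF-8's variable number of bytes by encoding a few different characters:

```python
# UTF-8 stores ASCII characters in a single byte, but can use
# several bytes for characters outside that range, such as emoji.
print(len('a'.encode('utf-8')))   # 1 byte, same as ASCII
print(len('é'.encode('utf-8')))   # 2 bytes
print(len('😀'.encode('utf-8')))  # 4 bytes
```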
Now that we've been able to represent letters, numbers,
punctuation marks, and even emojis,
how do we represent color?
Well, there are all kinds of color models.
For now, let's stick to a basic one
that's used in a lot of computers,
RGB or red, green, and blue model.
Just like the actual colors,
if you mix a combination of any of these,
you'll be able to get the full range of colors.
In computer terms,
we use three values for the RGB model.
Each value represents a shade of the color,
and that then changes the color
of the pixel you see on your screen.
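As a small sketch (the tuple layout and the to_hex helper here are just illustrations, not anything specific to the course), an RGB color and its three shades can be written in Python like this:

```python
# Each RGB shade is one byte, so each value ranges from 0 to 255.
red = (255, 0, 0)
white = (255, 255, 255)  # full red, green, and blue light mix to white

def to_hex(rgb):
    # A common notation for colors: two hexadecimal digits per byte.
    return '#{:02x}{:02x}{:02x}'.format(*rgb)

print(to_hex(red))    # #ff0000
print(to_hex(white))  # #ffffff
```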
With just eight zeros and ones,
we're able to represent everything
that you see on your computer from
a simple letter a to the very video
that you're watching right now. Very cool.