HISTORY OF THE COMPUTER
The evolution of the technologies that have brought about modern computing.
Looking back, we can appreciate how fast technology is evolving and the people who have brought us to this point. Many inventions have taken several centuries to develop into their modern forms, and modern inventions are rarely the product of a single inventor's efforts.

The computer is no different: the bits and pieces of the computer, both hardware and software, have come together over many centuries, with many people and groups each adding a small contribution.
The abacus was one of the first machines humans ever created for counting and calculating.
Fast forward to 1642, and the abacus evolves into the first mechanical adding machine, built by mathematician and scientist Blaise Pascal.

The first mechanical calculator, the Pascaline, is also where we see the first signs of technophobia emerging, with mathematicians fearing the loss of their jobs due to progress.
Gottfried Leibniz, who later improved on Pascal's design, was the first to lay down the concepts of binary arithmetic, the way all technology communicates nowadays, and he even envisioned a machine that used binary arithmetic. From birth, we are taught how to do arithmetic in base 10, and for most people that's all they're concerned with: the numbers 0 to 9.
There is an infinite number of ways to represent information: octal is base 8, hexadecimal is base 16 and is often used to represent colors, base 256 is used for encoding, and the list goes on. Binary is base 2, represented by the digits 0 and 1, and it is essential for modern computing because electronic hardware has only two reliable states, on and off.
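To make the idea of number bases concrete, here is a minimal sketch in Python (a modern illustration of the concept, not anything from the era) showing the same value written out in binary, octal, decimal, and hexadecimal:

    # The same quantity written in four different bases.
    n = 2021

    print(bin(n))  # 0b11111100101 -> base 2, digits 0-1
    print(oct(n))  # 0o3745        -> base 8, digits 0-7
    print(n)       # 2021          -> base 10, digits 0-9
    print(hex(n))  # 0x7e5         -> base 16, digits 0-9 and a-f

    # Going the other way, int() can parse a string in any base from 2 to 36.
    assert int("11111100101", 2) == 2021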
Progressing to the 1800s, we arrive at Charles Babbage.

Babbage is known as the father of the computer, with the design of his
mechanical calculating engines.
In 1820, Babbage noticed that many computations consisted of
operations that were regularly repeated and theorized that these operations
could be done automatically.
This led to his first design, the difference engine: it would have a fixed instruction set, be fully automatic through the use of steam power, and print its results into a table.
In 1830, Babbage stopped work on his difference engine to pursue his second idea, the analytical engine.
Elaborating on the difference engine, this machine would be able to execute operations in non-numeric orders through the addition of conditional control, store memory, and read instructions from punch cards, essentially making it a programmable mechanical computer. Around the same time, in the 1830s and 1840s, Faraday was carrying out his foundational work on electromagnetism.

Due to a lack of funding, his designs never became reality, but if they had, they would have sped up the invention of the computer by nearly 100 years.
Ada Lovelace is considered the world's first programmer; she came up with an algorithm, designed to work with Babbage's machine, that would calculate Bernoulli numbers.
She also outlined many fundamentals of programming such as data analysis, looping, and memory addressing. Ten years before the turn of the century, with inspiration from Babbage, the American inventor Herman Hollerith designed one of the first successful electromechanical machines, referred to as the census tabulator.

This machine would read U.S. census data from punched cards, up to 65 at a time, and tally up the results.
Where the holes in a card are placed determines the input, based on which electrical connections are completed. To input data onto a punched card, you could use a keypunch machine, aka the first iteration of a keyboard.
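As a toy illustration of that principle (a made-up one-hole-per-column encoding for clarity, not the real Hollerith card code), you can think of each card column as a set of row positions that either complete a circuit or don't:

    # Toy model of one punched-card column (NOT the real Hollerith encoding):
    # rows 0-9 either have a hole (circuit completed) or no hole (circuit open).
    def read_column(punched_rows):
        """Return the digit a column encodes: the single row where a hole was punched."""
        holes = sorted(punched_rows)
        if len(holes) != 1:
            raise ValueError("this toy encoding expects exactly one hole per column")
        return holes[0]

    # A three-column card with holes at rows 1, 8, and 9 reads as the number 189.
    card = [{1}, {8}, {9}]
    print("".join(str(read_column(col)) for col in card))  # prints 189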
The 1800s were a period where the theory of computing began to evolve and machines started to be used for calculations, but the 1900s are where we begin to see the pieces of this nearly 5,000-year puzzle coming together, especially between 1930 and 1950.

In 1936, Alan Turing published his landmark paper on computable numbers, describing a universal machine that could carry out any computation; the concept of the modern computer is largely based on Turing's ideas. Also starting in 1936, German engineer Konrad Zuse invented the world's first programmable computer.
This device read instructions from punched tape and was the first computer to use boolean logic and binary to make decisions through relays.
Boolean logic is simply logic that results in either a true or false output, or, when corresponding to binary, a one or a zero.
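To see how directly boolean logic maps onto binary, here is a tiny sketch in modern Python (just an illustration of the concept, obviously not something Zuse wrote):

    # Boolean logic: every expression collapses to True or False,
    # which correspond to the binary digits 1 and 0.
    a, b = True, False

    print(a and b)   # False (AND: true only if both inputs are true)
    print(a or b)    # True  (OR: true if at least one input is true)
    print(not a)     # False (NOT: inverts its input)

    # The correspondence with binary is literal: True behaves as 1, False as 0.
    print(int(a), int(b))  # 1 0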
Zuse would later use punched cards to encode information in binary,
essentially making them the first data storage and memory devices.
In 1942, with the computer Z4, Zuse also released the world's first
commercial computer. For these reasons, many consider Zuse the inventor of the
modern-day computer.

In 1937, Howard Aiken, with his colleagues at Harvard and in collaboration with IBM, began work on the Harvard Mark 1 Calculating Machine, a programmable calculator inspired by Babbage's analytical engine.
This machine was composed of nearly 1 million parts, had over 500 miles of wiring, and weighed nearly 5 tons! The Mark 1 had 60 sets of 24 switches for manual data entry and could store 72 numbers, each 23 decimal digits.
It could do 3 additions or subtractions in a second, a multiplication
took 6 seconds, a division took 15.3 seconds, and a logarithm or trig function
took about 1 minute.
The next major technology, the vacuum tube, was fully digital and, unlike the relays used in previous computers, less power-hungry, faster, and more reliable.
Beginning in 1937 and finishing in 1942, the first digital computer was built by John Atanasoff and his graduate student Clifford Berry; the computer was dubbed the ABC.
Unlike the relay-based computers built by Zuse, the ABC was purely electronic - it used vacuum tubes and included binary math and boolean logic to solve up to 29 equations at a time.
In 1943, the Colossus was built in collaboration with Alan Turing to assist in breaking German codes, not to be confused with Turing's bombe, which actually solved Enigma.
This computer was fully digital, but unlike the ABC, it was also fully programmable, making it the first fully programmable digital computer.

In 1946, construction of the Electronic Numerical Integrator and Computer, aka the ENIAC, was completed. Composed of nearly 18,000 vacuum tubes and large enough to fill an entire room, the ENIAC is considered the first successful high-speed electronic digital computer.

It was, however, like Aiken's Mark 1, a pain to rewire every time the instruction set had to be changed.
The ENIAC essentially took the concepts from Atanasoff's ABC and elaborated on them on a much larger scale. Meanwhile, while the ENIAC was still under construction, in 1945 mathematician John von Neumann contributed a new understanding of how computers should be organized and built, further elaborating on Turing's theories and bringing clarity to the ideas of computer memory and addressing.

The Electronic Discrete Variable Automatic Computer, aka the EDVAC, was completed in 1950 and was one of the first stored-program computers, able to execute over 1,000 instructions per second. Von Neumann is also credited with being the father of computer virology for his design of a self-reproducing computer program.
This machine had the stored-program concept as its major feature, and that, in fact, is the thing that makes the modern computer revolution possible!
At this point, you can see that computing had officially evolved
into its own field:

From electromechanical relays that took milliseconds to switch, to digital vacuum tubes that took only microseconds. From binary as a way to encode information on punched cards, to binary being used with boolean logic and represented by physical technologies like relays and vacuum tubes, to binary finally being used to store instructions and programs.
From the abacus as a way to count, to Pascal's mechanical calculator, the theories of Leibniz, Alan Turing, and John von Neumann, the vision of Babbage and the intellect of Lovelace, George Boole's contribution of boolean logic, the progression from a programmable calculator to a stored-program, fully digital computer, and countless other inventions, individuals, and groups.

Each step was a further accumulation of knowledge - while the title of inventor of the computer may be given to an individual or group, it was really a joint contribution over 5,000 years, and more so between 1800 and 1950.

Vacuum tubes were a huge improvement over relays, but they still didn't make economic sense on a large scale.
Of the ENIAC's 18,000 tubes, roughly 50 would burn out per day, and an around-the-clock team of technicians was needed to replace them.
Vacuum tubes were also the reason why computers took up the space of entire rooms, weighed multiple tons, and consumed enough energy to power a small town!
In 1947, the first transistor was invented at Bell Labs, and by 1954 the first transistorized digital computer, the TRADIC, had been built. It was composed of 800 transistors, took up 0.085 cubic meters of space compared to the 28 the ENIAC took up, drew only 100 watts of power, and could perform 1 million operations per second.

Also during this era, we begin to see major introductions on both the
hardware and software aspects of computing.
On the hardware side, the first memory device, the random-access magnetic core store, was introduced in 1951 by Jay Forrester - in other words, the beginnings of what is now known as RAM.
For more about RAM: https://techcj-tech.blogspot.com/2021/08/what-is-computer-ram.html
The first hard drive was introduced by IBM in 1957; it weighed one ton and could store five megabytes.
Assembly was the first programming language to be introduced, in 1949, but it really started taking off in this era of computing.
Assembly was a way to communicate with the machine in pseudo-English instead of machine language, aka binary. The first widely used high-level programming language was Fortran, invented by John Backus at IBM in 1954. Assembly is a low-level language and Fortran is a high-level language.
In low-level languages, while you aren't writing instructions in machine code, a very deep understanding of computer architecture and its instructions is still required to write the desired program, which means only a limited number of people have the skills, and the process is very error-prone.
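To get a feel for the gap between the two styles, here is a rough sketch in modern Python (not actual assembly or Fortran) comparing a one-line high-level expression with the same computation spelled out step by step, the way a low-level programmer has to think:

    data = [4, 8, 15, 16, 23, 42]

    # High-level: say *what* you want; the language handles the details.
    total_high = sum(data)

    # Low-level flavour: spell out *how*, step by step, much as assembly forces
    # you to think in terms of registers, loads, adds, compares, and jumps.
    accumulator = 0                 # like clearing a register
    index = 0                       # like a counter/address register
    while index < len(data):        # compare + conditional jump
        accumulator += data[index]  # load the value and add it to the accumulator
        index += 1                  # increment the counter
    total_low = accumulator

    assert total_high == total_low == 108
    print(total_high)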
In 1958, this all changed with Jack Kilby of Texas Instruments and
his invention of the integrated circuit. The integrated circuit was a way to
pack many transistors onto a single chip, instead of individually wiring
transistors.
Packing transistors together this way also significantly reduced the power and heat consumption of computers once again and made them far more economically feasible to design.

Integrated circuits sparked a hardware revolution and, beyond computers, assisted in the development of various other electronic devices through miniaturization, such as the mouse, invented by Douglas Engelbart in 1964; he also demonstrated the first graphical user interface (GUI).
Computer speed, performance, memory, and storage also began
to iteratively increase as ICs could pack more transistors into smaller surface
areas. This is demonstrated by the invention of the floppy disk in 1971 by IBM
and DRAM by Intel in the same year.
Alongside hardware, further advances in software were made as well, with an explosion of programming languages and the introduction of some of the most common languages today:
BASIC in 1964 and C in 1971.

In 1965, Gordon Moore, one of the founders of Intel, made one of the greatest predictions in human history: computing power would double every two years at low cost, and computers would eventually be so small that they could be embedded into homes, cars, and what he referred to as personal portable communications equipment, aka mobile phones.
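A quick back-of-the-envelope sketch (my own arithmetic, not Moore's) shows just how dramatic "doubling every two years" is:

    # Doubling every two years means capacity grows by a factor of 2 ** (years / 2).
    start_year = 1965
    for year in (1975, 1985, 2005, 2025):
        factor = 2 ** ((year - start_year) / 2)
        print(f"{year}: ~{factor:,.0f}x the 1965 capacity")

    # 1975: ~32x, 1985: ~1,024x, 2005: ~1,048,576x, 2025: ~1,073,741,824x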
The End


