
History of computer science timeline - History of computing



Hi, thanks for tuning in to Singularity Prosperity.

This video is the first in a multi-part series discussing computing. In it, we will look at the evolution of computing, more specifically, the evolution of the technologies that have ushered in the modern computing age. The purpose of this video is to appreciate how quickly this technology has evolved and the people who have brought us to this point. Many inventions have taken several centuries to develop into their modern forms, and modern inventions are rarely the product of a single inventor's efforts.

The computer is no different: the bits and pieces of the computer, both the hardware and the software, have come together over many centuries, with many people and groups each adding a small contribution. Our story starts as early as 3000 BC with the Chinese abacus. How does this relate to computing? The abacus was one of the first machines humans created to be used for counting and calculating. Fast forward to 1642, and we get the first mechanical adding machine, built by the mathematician and scientist Blaise Pascal. This first mechanical calculator, the Pascaline, is also where we see the first signs of technophobia emerge, with mathematicians fearing the loss of their jobs to progress.

Also in the 1600s, from the 1660s to the early 1700s, we meet Gottfried Leibniz, a pioneer in many fields, best known for his contributions to mathematics and considered by many to be the first computer scientist. Inspired by Pascal, he created his own calculating machine, capable of performing all four arithmetic operations. He was also the first to establish the concepts of binary arithmetic, which is how all technology communicates today, and he even envisioned a machine that used binary arithmetic. From birth we are taught to do base 10 arithmetic, and for most people that is all they ever care about: the numbers 0 to 9. However, there are infinitely many ways to represent information, such as octal as base 8, hexadecimal as base 16, used to represent colors, and base 256, used for encoding; the list goes on.
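To make the idea of number bases a bit more concrete, here is a minimal Python sketch, with an arbitrarily chosen value, showing the same number written in binary, octal, decimal, and hexadecimal:

```python
# The same quantity expressed in several number bases.
value = 201

print(bin(value))   # binary (base 2):       0b11001001
print(oct(value))   # octal (base 8):        0o311
print(value)        # decimal (base 10):     201
print(hex(value))   # hexadecimal (base 16): 0xc9

# Converting back from a string written in a given base:
assert int("11001001", 2) == 201
assert int("311", 8) == 201
assert int("c9", 16) == 201
```

The quantity never changes; only the notation used to write it down does.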

Binary is base 2, represented by the digits 0 and 1. We will explore later in this video why binary is essential for modern computing. Returning to the topic, fast forward to the 19th century and we meet Charles Babbage. Babbage is known as the father of the computer for the design of his mechanical calculating engines. In 1820, Babbage noted that many calculations consisted of regularly repeated operations, and he theorized that these operations could be performed automatically. This led to his first design, the Difference Engine: it would have a fixed instruction set, be fully automatic through the use of steam power, and print its results in a table. In 1830, Babbage stopped working on his Difference Engine to pursue his second idea, the Analytical Engine. An evolution of the Difference Engine, this machine would be capable of executing operations in non-numerical order by adding conditional control, storing memory, and reading instructions from punched cards.

This essentially made it a programmable mechanical computer. Unfortunately, due to a lack of funding, his designs were never realized, but if they had been, they would have accelerated the invention of the computer by almost 100 years. Also worth mentioning is Ada Lovelace, who worked closely with Babbage. She is considered the world's first programmer, and she devised an algorithm for calculating Bernoulli numbers that was designed to work with Babbage's machine.

She also described many fundamentals of programming, such as data analysis, looping, and memory management. Ten years before the turn of the century, with inspiration from Babbage, the American inventor Herman Hollerith designed one of the first electromechanical machines, called the census tabulator. This machine would read US census data from punched cards, up to 65 at a time, and count the results. Hollerith's tabulator became so successful that he went on to found his own firm to market the device; this company eventually became IBM. To briefly explain how punched cards work: once a card is fed into the machine, electrical connections are attempted across it, and depending on where the holes in the card are, your input is determined by which connections are completed.
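As a rough illustration of that mechanism, here is a toy Python sketch of a punched card; the card size and the encoding are made up for this example and are not the actual Hollerith census card format:

```python
# A toy model of a Hollerith-style punched card. The card size and the
# encoding (a hole in row r of a column means the digit r) are made up
# for this illustration, not the actual 1890 census card layout.
ROWS, COLS = 10, 8

def punch(card, row, col):
    """Punch a hole at the given position on the card."""
    card[row][col] = True

def read_column(card, col):
    """Return the digit encoded in a column: the row where a 'connection' completes."""
    for row in range(ROWS):
        if card[row][col]:
            return row
    return None  # no hole punched in this column

card = [[False] * COLS for _ in range(ROWS)]
punch(card, 3, 0)   # encode the digit 3 in column 0
punch(card, 7, 1)   # encode the digit 7 in column 1

print(read_column(card, 0), read_column(card, 1))  # -> 3 7
```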

To enter data onto a punched card, you could use a key punch machine, also known as the first iteration of a keyboard! The 19th century was a period when the theory of computation began to evolve and machines began to be used for calculation, but the 20th century is where we begin to see the pieces of this nearly 5,000-year-old puzzle coming together, especially between 1930 and 1950. In 1936, Alan Turing proposed the concept of a universal machine, later called the Turing machine, capable of computing anything that is computable.

Up to this point, machines could only perform the specific tasks their hardware was designed for. The concept of the modern computer is largely based on Turing's ideas. Also starting in 1936, the German engineer Konrad Zuse built the world's first programmable computer. This device read its instructions from punched tape and was the first computer to use Boolean and binary logic to make decisions, using relays. For reference, Boolean logic is simply logic that results in either a true or a false output, or, when mapped to binary, a one or a zero. We'll dive into Boolean logic in more depth later in this video. Later, Zuse would use punched cards to encode information in binary, essentially making them the first memory and data storage devices. In 1942, with the Z4, Zuse also launched the world's first commercial computer.
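As a quick illustration of the Boolean logic just mentioned, here is a small Python sketch showing the basic operations and their one-and-zero equivalents:

```python
# Basic Boolean operations and their binary (0/1) equivalents.
a, b = True, False

print(a and b)   # AND -> False
print(a or b)    # OR  -> True
print(not a)     # NOT -> False

# The same logic written with 1s and 0s, the way a relay or a vacuum
# tube represents it electrically (on/off):
x, y = 1, 0
print(x & y)     # AND -> 0
print(x | y)     # OR  -> 1
print(1 - x)     # NOT -> 0
```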

For these reasons, many consider Zuse the inventor of the modern computer. In 1937, Howard Aiken, with his Harvard colleagues and in collaboration with IBM, began work on the Harvard Mark 1 calculating machine, a programmable calculator inspired by Babbage's Analytical Engine. This machine consisted of nearly 1 million parts, had over 500 miles of wiring, and weighed nearly 5 tons. The Mark 1 had 60 sets of 24 switches for manual data entry and could store 72 numbers, each with 23 decimal digits. It could do 3 additions or subtractions in one second; a multiplication took 6 seconds, a division took 15.3 seconds, and a logarithm or trigonometric function took about 1 minute. As a fun side note, one of the Mark 1's lead programmers, Grace Hopper, discovered the first computer bug: a dead moth jammed in one of the machine's relays.

Hopper is also credited with popularizing the term "debugging"! The era of the vacuum tube marks the beginning of modern computing. It was the first fully digital technology, and unlike the relays used in previous computers, vacuum tubes consumed less power and were faster and more reliable. Beginning in 1937 and finishing in 1942, John Atanasoff and his graduate student Clifford Berry built the first digital computer, nicknamed the ABC. Unlike previous computers such as Zuse's, the ABC was purely digital: it used vacuum tubes and relied on binary math and Boolean logic to solve up to 29 equations at a time.

In 1943, Colossus was built in collaboration with Alan Turing to help crack German cryptographic codes, not to be confused with the Turing Bombe, which actually broke Enigma. This computer was also fully digital, but unlike the ABC it was programmable, making it the first fully programmable digital computer. In 1946, construction of the Electronic Numerical Integrator and Computer, also known as ENIAC, was completed. Comprised of nearly 18,000 vacuum tubes and large enough to fill an entire room, ENIAC is considered the first successful high-speed electronic digital computer.

It was somewhat programmable, but like Aiken's Mark 1, it was a nuisance to rewire every time the instruction set had to be changed. ENIAC essentially took Atanasoff's ABC concepts and developed them on a much larger scale. While ENIAC was under construction, in 1945, the mathematician John von Neumann contributed a new understanding of how computers should be organized and built, elaborating on Turing's theories and bringing clarity to the ideas of computer memory and addressing. He described conditional addressing, or subroutines, something Babbage had envisioned for his Analytical Engine nearly 100 years earlier. He also introduced the idea that the instructions, or program, running on a computer could be modified in the same way as the data and encoded in binary. Von Neumann assisted in the design of ENIAC's successor, the Electronic Discrete Variable Automatic Computer, also known as EDVAC.
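To make the stored-program idea concrete, here is a minimal, purely illustrative Python sketch; the "instruction set" is invented for this example and does not correspond to EDVAC or any real machine. The key point is that the program and its data live in the same memory, and instructions are just numbers the machine fetches, decodes, and executes:

```python
# A toy stored-program machine: program and data share one memory, and
# the instructions are themselves just numbers stored in that memory.
memory = [
    # program: (opcode, operand) pairs flattened into plain numbers
    1, 8,    # LOAD  the value at address 8 into the accumulator
    2, 9,    # ADD   the value at address 9 to the accumulator
    3, 10,   # STORE the accumulator into address 10
    0, 0,    # HALT
    # data:
    40,      # address 8
    2,       # address 9
    0,       # address 10 (result goes here)
]

pc, acc = 0, 0                      # program counter and accumulator
while True:
    opcode, operand = memory[pc], memory[pc + 1]
    pc += 2
    if opcode == 0:                 # HALT
        break
    elif opcode == 1:               # LOAD
        acc = memory[operand]
    elif opcode == 2:               # ADD
        acc += memory[operand]
    elif opcode == 3:               # STORE
        memory[operand] = acc

print(memory[10])                   # -> 42
```

Because the program is just numbers in memory, it can be read, copied, or changed exactly like the data it operates on, which is the heart of von Neumann's insight.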

EDVAC was completed in 1950 and was the first stored-program computer, capable of executing more than 1,000 instructions per second. Von Neumann is also credited as the father of computer virology for his design of a self-replicating computer program. EDVAC could do essentially the things a modern computer does, albeit in a somewhat primitive form.

This machine had the stored-program concept as its main feature, and that, in fact, is what makes the modern computer revolution possible! At this point, you can see that computing has officially evolved into its own field: from mechanical machines, to electromechanical relays that took milliseconds, to digital vacuum tubes that took only microseconds. From binary as a way of encoding information on punched cards, to binary used with Boolean logic and represented by physical technologies such as relays and vacuum tubes, to binary finally being used to store instructions and programs.

From the abacus as a way of counting, to Pascal's mechanical calculator, the theories of Leibniz, Alan Turing, and John von Neumann, the vision of Babbage and the intellect of Lovelace, George Boole's contribution of Boolean logic, the progressive inventions from a programmable calculator to a fully digital stored-program computer, and countless other inventions, individuals, and groups. Each step is a greater accumulation of knowledge: while the title of inventor of the computer can be awarded to an individual or group, it was really a joint contribution spanning 5,000 years, and especially the years between 1800 and 1950.

Vacuum tubes were a great improvement over relays, but they still didn't make economic sense at a large scale. For example, of ENIAC's 18,000 tubes, approximately 50 would burn out per day, and a team of technicians was needed 24 hours a day to replace them. Vacuum tubes were also the reason computers took up entire rooms, weighed several tons, and consumed enough energy to power a small town. In 1947, the first transistor was invented at Bell Labs, and in 1954 the first transistorized digital computer, also known as TRADIC, was built. It was composed of 800 transistors and occupied only 0.085 cubic meters, compared to the 28 cubic meters ENIAC occupied; it drew only 100 watts of power and could perform 1 million operations per second. Also during this era, we began to see important introductions on both the hardware and software sides of computing.

On the hardware side, the first memory device, the random-access magnetic-core store, was introduced in 1951 by Jay Forrester; in other words, the beginnings of what is known as RAM today. The first hard drive was introduced by IBM in 1957: it weighed one ton, could hold five megabytes, and cost roughly $27,000 a month in today's money. The software side is where many of the major innovations and advancements started to come, because the hardware and architecture of computers were starting to become more standardized, rather than everyone working on different variations of a computing machine. Assembly was the first programming language, introduced in 1949, but it really started to take off in this computing era. Assembly was a way of communicating with the machine in pseudo-English rather than in machine language, also known as binary. The first widely used true programming language was Fortran, invented by John Backus at IBM in 1954.

Assembly is a low-level language and Fortran is a high-level language. With low-level languages, while you are not writing instructions in raw machine code, a deep understanding of the computer's architecture and instruction set is still required to write a desired program, which means only a limited number of people have the skills, and the process is very error prone. Also, from the early to mid-1950s, converting code into machine code was still an expensive and time-consuming process. This all changed with Grace Hopper and her development of the first computer compiler. Hopper, if you remember from earlier, also encountered the first computer "bug".
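To give a rough sense of what a compiler does, here is a toy Python sketch, not Hopper's compiler or any real one, that translates a simple arithmetic expression (written as postfix tokens to keep the parsing trivial) into assembly-like instructions for an imaginary stack machine, and then runs them:

```python
# A toy "compiler": turn high-level arithmetic into machine-like
# instructions for an imaginary stack machine, then execute them.

def compile_expression(tokens):
    """Compile postfix tokens like ['2', '3', '+'] into instructions."""
    instructions = []
    for token in tokens:
        if token.isdigit():
            instructions.append(("PUSH", int(token)))
        elif token == "+":
            instructions.append(("ADD", None))
        elif token == "*":
            instructions.append(("MUL", None))
    return instructions

def run(instructions):
    """Execute the compiled instructions on a simple stack machine."""
    stack = []
    for op, arg in instructions:
        if op == "PUSH":
            stack.append(arg)
        elif op == "ADD":
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == "MUL":
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
    return stack.pop()

program = compile_expression(["2", "3", "+", "4", "*"])  # (2 + 3) * 4
print(program)       # the low-level instructions nobody has to write by hand
print(run(program))  # -> 20
```

The programmer writes the expression; the machine-level instructions are generated automatically, which is exactly the drudgery the compiler removed.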

The compiler made computer programming more affordable and almost instantaneous, replacing the time-consuming process of writing assembly code and then manually converting it to machine code. As a side note, Hopper also helped with the invention of another early programming language, COBOL. This era marks the beginning of the modern computing age and where the exponential trend in computing performance really began. While transistors were an important improvement over vacuum tubes, they still had to be soldered together individually.

As a result, as computers became more complex, the connections between transistors became more numerous and complicated, increasing the probability of faulty wiring. In 1958, all of this changed with Jack Kilby of Texas Instruments and his invention of the integrated circuit. The integrated circuit was a way of packing many transistors onto a single chip, rather than wiring transistors together individually.

Packing all of the transistors onto one chip also significantly reduced the power consumption and heat output of computers once again, and made them significantly more economical to design and buy. Integrated circuits led to the hardware revolution and, beyond computers, aided the development of several other electronic devices thanks to miniaturization, such as the mouse, invented by Douglas Engelbart in 1964; as a side note, Engelbart also demonstrated the first graphical user interface.

Computer speed, performance, memory, and storage also began to increase steadily as ICs were able to pack more transistors into smaller surface areas. This is demonstrated by the invention of the floppy disk by IBM in 1971 and, in the same year, DRAM by Intel, to list a few. Along with hardware, further advances were made in software as well, with an explosion of programming languages and the introduction of some of the most common languages today: BASIC in 1964 and C in 1971.
