Components of Computing Devices
Information system paper
This week we are going to talk about some of the primary components of computing devices.
1
Week 3 – Components of Computing Devices – Binary Numbers
2
Before we take a look under the hood in the computer, I would like to talk to you a little about data representation in the computer.
http://www.w3schools.com/tags/att_meta_charset.asp
ASCII (the first character encoding): defined 128 different standard alphanumeric characters that could be used on the internet. It supports the numbers 0-9, the English letters A-Z, and some special characters like ! $ + - ( ) @ < > .
ANSI (Windows-1252), the original Windows character set: supported 256 different character codes.
ISO-8859-1, the default character set for HTML 4: it also supported 256 different character codes.
UTF-8 (Unicode), the default character set for HTML 5: supports almost all of the characters and symbols in the world.
Week 3 – Components of Computing Devices – Meta Character Sets
If you have studied HTML, you will have noticed this required HTML 5 directive: <meta charset="utf-8">. HTML 5 validators will issue an error if this statement is missing from the program. Here is what is going on with the statement. UTF-8 is the default character set for HTML 5. Character sets are encoding standards. I would like to talk with you for a few minutes about what that means.
When the first computers were designed, designers realized that there would have to be a highly standardized mechanism for determining what a computer thinks when you press a certain key on the keyboard, or a certain character is read into memory. If this mechanism didn’t exist, computers wouldn’t be able to share data. So, one of the very earliest standards created for computers was the ASCII character coding mechanism. ASCII stands for “American Standard Code for Information Interchange”. ASCII is a 7-bit code, which means that it can define 128 different alphanumeric characters. Other character sets that have been created are listed above, including UTF-8.
In order to understand character codes, you need to understand character representation in a computer. So, let’s look at a few things.
3
10
Week 3 – Components of Computing Devices – Binary Numbers
What number are you looking at here? If you said, “ten”, you would be correct. However, you would also be correct if you said “two” or if you said “sixteen”. So, what are we talking about?
4
0,1,2,3,4,5,6,7,8,9
Week 3 – Components of Computing Devices – Binary Numbers
I am talking about the base that the number is represented in. We tend to think in Base 10, so the number on the slide looks most like a “ten” to us. But “10” isn’t equal to “ten” just because it looks like “10”. Each digit in the number on this slide has a meaning, and the meaning relates to two things: the base of the number, and the position of the digit. If we are thinking about Base 10, this number is equal to “ten” because there is a “1” in the 10’s place in the number.
To figure out the value of any decimal (base 10) number, you just add up the digits, taking into account the digit’s place in the number string. So, the number “10” is equal to 0 x 10 to the zeroth power + 1 x 10 to the first power. Any number to the 0th power is equal to 1, so this number is equal to 0 x 1 + 1 x 10, or 10. Before we move on to the next slide, note that in Base 10 notation, there are 10 digits we can use before we have to move to the next place.
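To make the add-up-the-digits idea concrete, here is a minimal sketch in Python (the function name value_of is my own, invented just for this illustration):

    # A sketch of the positional-notation algorithm described above.
    def value_of(digits, base):
        total = 0
        for position, digit in enumerate(reversed(digits)):
            total += int(digit, base) * base ** position
        return total

    print(value_of("10", 10))   # 0 x 1 + 1 x 10 = 10

Python's built-in int("10", 10) performs the same conversion.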
5
0,1
Week 3 – Components of Computing Devices – Binary Numbers
If we are working in base 2 (binary) notation, our number becomes equal to: 0 x 2 to the 0th power + 1 x 2 to the 1st power, or 0 x 1 + 1 x 2, or 2! The same algorithm comes up with a totally different value because we are thinking in a different base. Note that in binary notation, we only have 2 digits to work with before we have to move to the next place.
6
0,1,2,3,4,5,6,7,8,9,A,B,C,D,E,F
Week 3 – Components of Computing Devices – Binary Numbers
Now that you are refreshed on notation, I am sure you realize that in base 16 (hexadecimal) notation, our number is equal to: 0 x 16 to the 0th power + 1 x 16 to the 1st power, or 0 x 1 + 1 x 16, or 16! Since we only have ten digits available in our Arabic numbering system, how can we represent a value like 15 with a single symbol? The answer is “F”. We add six alphas from our alphabet to represent the missing six numerals: “A” equals 10, “B” equals 11, and so forth, up to “F”, which equals 15. You have most likely seen the results of this when looking at addresses in your computer; now you will understand that those addresses are expressed in hex!
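Here is a quick, illustrative check (using Python's built-in int(), which accepts a base argument) showing how the same two characters "10" take on three different values, and how the six added alphas map to the numbers 10 through 15:

    # The same string "10" interpreted in three different bases.
    print(int("10", 2))    # binary:      0 x 1 + 1 x 2  = 2
    print(int("10", 10))   # decimal:     0 x 1 + 1 x 10 = 10
    print(int("10", 16))   # hexadecimal: 0 x 1 + 1 x 16 = 16

    # The six extra hex "digits" and their decimal values.
    for letter in "ABCDEF":
        print(letter, "=", int(letter, 16))   # A = 10, B = 11, ... F = 15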
7
Week 3 – Components of Computing Devices – Binary Numbers
So, what does this have to do with us? Here we are again, back to vacuum tubes.
8
Week 3 – Components of Computing Devices – Binary Numbers
ENIAC contained 18,000-20,000 vacuum tubes. Numbers in ENIAC were stored in decimal, not binary, notation, in an attempt to minimize the number of circuits required. This involved using a series of circuits called “ring counters”. The approach didn’t really work out, however, and virtually all digital computers since then have been binary in design. The vacuum tube is a great way to demonstrate the idea behind binary representation, because the vacuum tube is essentially a binary device.
9
Week 3 – Components of Computing Devices – Binary Numbers
Let’s build an extremely crude binary device with a standard keyboard. Let the absence of power represent a “0” and the presence of power represent a “1”. Pressing a key on the keyboard will switch power to some tubes and not others. Using a 7 bit byte, what is the biggest number I can represent with binary notation using this mechanism?
10
Week 3 – Components of Computing Devices – Binary Numbers
Let’s switch to an 8 bit byte. Now what is the biggest number I can represent?
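For both questions, here is a short sketch of the arithmetic (assuming every bit is used for the value, with no sign bit): the largest value you can hold in n bits is 2 to the nth power minus 1, which is the value of a string of n ones.

    # Largest value representable in n bits is 2**n - 1 (all bits set to 1).
    print(2**7 - 1)             # 7-bit byte:  127
    print(2**8 - 1)             # 8-bit byte:  255
    print(int("1111111", 2))    # 127, spelled out as seven 1s
    print(int("11111111", 2))   # 255, spelled out as eight 1s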
11
Week 3 – Components of Computing Devices – Binary Numbers
Now I have two questions for you. Given the states of the eight bits in this byte, what number is represented? What key did I press on the keyboard?
12
Week 3 – Components of Computing Devices – ASCII Table
The answer to the second question lies in this slide. This is an ASCII chart. It shows the assigned ASCII codes for the characters on a normal keyboard (upper case alphas, lower case alphas, numbers and symbols) in Binary, Octal, Decimal and Hex. Note that this chart is an old one for the seven bit byte. Although this representation will accommodate the normal characters on a standard English keyboard, it is obvious that there are not enough codes available for the character sets of other languages, industries and disciplines. However, the chart does let you determine the key I pressed on the keyboard: a capital A.
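You can confirm both answers with Python's built-in ord(), format(), and chr() functions; this is just an illustrative check, not part of the chart itself:

    # Capital "A" is ASCII code 65, which is 01000001 as an 8-bit pattern.
    print(ord("A"))                  # 65
    print(format(ord("A"), "08b"))   # 01000001
    print(chr(0b01000001))           # A (going the other direction)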
13
Week 3 – Components of Computing Devices – Unicode Table
This partial chart is a table for the encoding mechanism called Unicode. Unicode was designed to address the limitations of ASCII. UTF-8 is a variable-length encoding of Unicode: it uses from one to four 8-bit bytes per character, and can therefore accommodate 1,112,064 code points, close to all the characters that exist.
This slide shows the Unicode chart. Note that the characters on the Unicode chart with values between 0 and 127 are identical to the characters on the ASCII chart.
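As a small sketch of the variable-length behavior of UTF-8 (the particular characters below are my own examples, not from the slide): characters in the old ASCII range take one byte and keep their ASCII codes, while other characters take two, three, or four bytes.

    # UTF-8 uses 1 to 4 bytes per character; ASCII characters keep their old codes.
    for ch in ["A", "é", "€", "😀"]:
        encoded = ch.encode("utf-8")
        print(ch, len(encoded), "byte(s):", encoded.hex())
    # "A" encodes to the single byte 0x41 (decimal 65), the same as its ASCII code.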
14
Week 3 – Components of Computing Devices
Abstract Architecture of all Computers
All computing devices have a similar set of components, although in phones or tablets some of them might be combined into a single component.
Moving on to Components, all computing devices, regardless of their shape, size or purpose, share certain required elements.
There needs to be a certain amount of memory. This memory has to be sufficient to hold the data that are to be processed and the most basic of the operations necessary to do the processing. A computer’s memory is normally transient (volatile) and only retains its contents until the computer is turned off.
A Central Processing Unit, which houses the special structures necessary for logical and arithmetic processing, is required.
Because Primary Memory is transient, there needs to be some mechanism for what we call “Persistent Storage” – a way to store data that persists even when the computer is turned off. Over the years, the Persistent Storage requirement has been filled with punched cards, magnetic tape, floppy disks, compact disks, internal and removable external hard disks, memory sticks, thumb drives, and so forth.
Input device handlers, for the mouse, keyboard, touchpad, and so forth, are required as well.
Output device handlers, for the display screen, printer, and so forth, are required.
There can be additional components such as cameras, microphones, speakers, and so forth.
15
Week 3 – Components of Computing Devices
https://fossbytes.com/what-is-a-motherboard-what-are-the-components-of-a-motherboard/
The motherboard is the platform in the computer where the components are laid out and connected together. My experience has been that learning something in our industry is greatly assisted by two things: having a need to know whatever it is, and learning the acronyms associated with it. Here I am presenting three different labeled images of a motherboard that I was able to find. It is fun to compare them and see which elements they all have in common.
16
Week 3 – Components of Computing Devices
http://4.bp.blogspot.com/-kHCk15xhQIA/UBzG7HoMvHI/AAAAAAAAAFc/eUUIeOfEdMk/s1600/motherboard-labelled5.jpg
17
Week 3 – Components of Computing Devices
https://thecustomizewindows.com/2011/03/components-of-a-motherboard/
18
Week 3 – Components of Computing Devices – Terminology
On this slide and the next one, I have compiled a couple dozen terms, with their definitions, that relate to things you will find on the motherboard. I have found that reading the glossary helps you understand the images of the motherboard, and studying the images helps you understand the acronyms.
19
Week 3 – Components of Computing Devices – Terminology
20
Week 3 – Components of Computing Devices – John Von Neumann
This is a name you have probably heard. John von Neumann was a child prodigy, born János Neumann in 1903 to wealthy parents in Budapest, Hungary. Von Neumann was a genius mathematician who by 1929 had published thirty-two major papers. His published work, which is prodigious, spans the fields of mathematics, physics, quantum mechanics, economics and computer sciences.
In 1929 he was invited to lecture at Princeton University in Princeton, New Jersey. In 1933 he was offered a lifetime professorship on the faculty of the Institute for Advanced Study, where he remained as a mathematics professor until his death. The Institute for Advanced Study (IAS) is an independent, postdoctoral research center based in Princeton, New Jersey, where von Neumann joined such luminaries as Albert Einstein and Kurt Gödel. In 1937 he became a naturalized citizen and Anglicized his first name to John, adding the “von” to it from his father’s title. He worked extensively as a consultant to the US government’s defense projects, including the Manhattan Project, and is credited with the equilibrium strategy of mutual assured destruction.
21
Week 3 – Components of Computing Devices – John Von Neumann
Just a few of the many accomplishments that Von Neumann is credited with in the field of Computing are:
Inventing a sorting algorithm called the merge sort algorithm,
Describing a new architecture for EDVAC, the sequel to ENIAC,
Working on game theory and the philosophy of artificial intelligence with Alan Turing,
Contributing to the development of the Monte Carlo method (which allowed solutions to complicated problems to be approximated using random numbers),
Developing a format for making pseudorandom numbers, the middle-square method, and
Doing pioneering work in the field of cellular automata.
22
Week 3 – Components of Computing Devices – Von Neumann Architecture
In 1945, while consulting for the Moore School of Electrical Engineering at the University of Pennsylvania, von Neumann wrote a paper titled “First Draft of a Report on the EDVAC”, which described the design of a stored-program machine. EDVAC was being planned while ENIAC was being completed, and in his paper Von Neumann proposed a different plan for EDVAC. The plan described a computer architecture in which the data and the program are both stored in the computer’s memory in the same address space, as opposed to the earliest computers, which were “programmed” using a separate memory device such as a paper tape or wired board.
This architecture, called The Von Neumann Architecture, has endured as a basic plan for computers ever since. It is the basis for most modern computer designs.
23
Week 3 – Components of Computing Devices
Programming ENIAC required flipping switches and rewiring some of its parts: although the machine itself computed very quickly, it could often take a quarter hour to reprogram ENIAC for a computation that took only thirty seconds to run. In contrast, EDVAC read in some kind of input that encoded its operations, stored them in memory, and executed from there. Eckert had proposed the idea almost a year earlier, but von Neumann seized upon it and became its champion.
24
Week 3 – Components of Computing Devices – Von Neumann Architecture
This is another picture of the Von Neumann architecture, with some more details filled in.
25
Week 3 – Components of Computing Devices – Von Neumann Architecture
The control unit of the Central Processing Unit regulates and integrates the operations of the computer. It selects and retrieves instructions from the main memory in proper sequence and interprets them so as to activate the other functional elements of the system at the appropriate moment to perform their respective operations. All input data are transferred via the main memory to the arithmetic-logic unit for processing. Processing involves the four basic arithmetic functions (addition, subtraction, multiplication, and division) and certain logic operations, such as comparing data and selecting the desired problem-solving procedure, or a viable alternative, based on predetermined decision criteria.
26
Week 3 – Components of Computing Devices – Von Neumann Architecture
The Arithmetic Logic Unit (ALU) allows the computer to add and subtract, and to perform basic logical operations such as AND and OR.
27
Week 3 – Von Neumann Architecture
Central Processing Unit – Arithmetic Logic Unit
AND
OR
XOR
NOT
NAND
NOR
XNOR
Operations like this in the ALU utilize what we call logic gates. These logic gates work by taking two inputs (one input for the ‘NOT’ gate) and producing an output. If we consider the ‘AND’ gate, the output will be true, or ‘1’ (a high voltage), if input #1 and input #2 are both true, and the output will be false, or ‘0’ (a low voltage), if one or both inputs are false. Likewise, if we consider the ‘OR’ gate, the output will be true if input #1 or input #2 is true. The ‘XOR’ gate output will be true if exactly one input is true, and false if both inputs are true or both are false; this is an implementation of the exclusive ‘OR’ logic operation. The ‘NOT’ gate outputs the opposite of its input; so if the input is true, the ‘NOT’ gate’s output will be false. The ‘NAND’, ‘NOR’, and ‘XNOR’ gates are implementations of the ‘AND’, ‘OR’, and ‘XOR’ gates respectively with a ‘NOT’ gate applied to the output; so a ‘NAND’ gate returns the opposite of what an ‘AND’ gate returns. All of this should sound very familiar to you; remember truth tables?
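Here is a minimal sketch of those gates and their truth tables in Python (the gate functions are hypothetical helpers written only for illustration; real gates are transistor circuits, not software):

    # Two-input logic gates expressed over the values 0 and 1.
    def AND(a, b):  return a & b
    def OR(a, b):   return a | b
    def XOR(a, b):  return a ^ b
    def NOT(a):     return 1 - a
    def NAND(a, b): return NOT(AND(a, b))
    def NOR(a, b):  return NOT(OR(a, b))
    def XNOR(a, b): return NOT(XOR(a, b))

    # Print the truth table; columns are a, b, AND, OR, XOR, NAND, NOR, XNOR.
    for a in (0, 1):
        for b in (0, 1):
            print(a, b, AND(a, b), OR(a, b), XOR(a, b),
                  NAND(a, b), NOR(a, b), XNOR(a, b))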
28
Week 3 – Von Neumann Architecture
Central Processing Unit – Arithmetic Logic Unit
4 bit AND
These logic functions are by themselves an important part of a CPU’s functionality, but performing logic operations on two inputs is only so useful. By combining these gates together we can have devices with more inputs. For example, you can combine three ‘AND’ gates. These three ‘AND’ gates will produce an output that is true only when all four inputs are true. In essence, this is a 4 bit ‘AND’ gate. You can extrapolate from this and form an 8 bit ‘AND’ gate by combining two 4 bit ‘AND’s and one 2 bit ‘AND’.
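A sketch of that cascade in Python (the function names AND and AND4 are my own, used only to mirror the wiring described above):

    # Three 2-input AND gates combined into a 4-input ("4 bit") AND.
    def AND(a, b):
        return a & b

    def AND4(a, b, c, d):
        return AND(AND(a, b), AND(c, d))   # 1 only when all four inputs are 1

    print(AND4(1, 1, 1, 1))   # 1
    print(AND4(1, 1, 0, 1))   # 0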
29
Week 3 – Von Neumann Architecture
Central Processing Unit – Arithmetic Logic Unit
Half Adder
This circuit is called a half-adder, and for it the inputs are not true or false but ‘1’ or ‘0’. The output of this adder is the sum of the inputs with a carry bit. If the inputs are ‘1’ and ‘1’, we are adding 1 plus 1. The output labeled ‘SUM’ is just an ‘XOR’ of the inputs, which will be ‘0’. The output labeled ‘CARRY’ is an ‘AND’ of the inputs, which of course will be ‘1’. The addition answer therefore is 10, which is the binary sum of ‘1’ and ‘1’. If the inputs are ‘1’ and ‘0’, the ‘SUM’ will be ‘1’ and the ‘CARRY’ will be ‘0’, giving an answer of 01, or just 1.
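A minimal half-adder sketch, assuming exactly the XOR/AND behavior described above (the function name is my own):

    # Half adder: SUM is the XOR of the inputs, CARRY is the AND of the inputs.
    def half_adder(a, b):
        return a ^ b, a & b            # returns (sum, carry)

    print(half_adder(1, 1))   # (0, 1) -> carry 1, sum 0: binary 10, decimal 2
    print(half_adder(1, 0))   # (1, 0) -> carry 0, sum 1: binary 01, decimal 1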
30
Week 3 – Von Neumann Architecture
Central Processing Unit – Arithmetic Logic Unit
Full Adder
The full-adder is two half-adders with one additional ‘OR’ gate. To use a full-adder to add two binary numbers of arbitrary size, you begin with the rightmost bit, called the least significant bit (LSB), of each number, with a carry-in bit of ‘0’. You then add the two bits, record the sum, and use the carry-out bit as the carry-in bit when adding the next two bits, moving toward the most significant bits (MSB). By repeating this process you can add two binary numbers of any length. This process is known as a ripple carry.
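Here is an illustrative sketch of a full adder built from two half adders plus an OR gate, and a ripple-carry loop that adds two equal-length bit lists (written most significant bit first); the helper names are my own:

    # Full adder: two half adders plus an OR gate for the carry out.
    def half_adder(a, b):
        return a ^ b, a & b                    # (sum, carry)

    def full_adder(a, b, carry_in):
        s1, c1 = half_adder(a, b)
        s2, c2 = half_adder(s1, carry_in)
        return s2, c1 | c2                     # (sum bit, carry out)

    def ripple_add(x_bits, y_bits):
        result, carry = [], 0
        for a, b in zip(reversed(x_bits), reversed(y_bits)):   # start at the LSB
            s, carry = full_adder(a, b, carry)
            result.insert(0, s)
        return [carry] + result if carry else result

    print(ripple_add([1, 0, 1], [0, 1, 1]))    # 101 + 011 = 1000 (5 + 3 = 8)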
31
Week 3 – Von Neumann Architecture
Central Processing Unit – Registers
Registers hold instructions and other data. Registers supply operands to the ALU and store the results of operations.
A register may hold an instruction, a storage address, or any kind of data (such as a bit sequence or individual characters). Some instructions specify registers as part of the instruction. For example, an instruction may specify that the contents of two defined registers be added together and then placed in a specified register.
A register must be large enough to hold an instruction – for example, in a 64-bit computer, a register must be 64 bits in length. In some computer designs, there are smaller registers – for example, half-registers – for shorter instructions. Depending on the processor design and language rules, registers may be numbered or have arbitrary names.
A processor typically contains multiple index registers, also known as address registers or registers of modification. The effective address of any entity in a computer includes the base, index, and relative addresses, all of which are stored in the index register.
A shift register is another type of register. Bits enter the shift register at one end and emerge from the other end. Flip-flops, also known as bistable gates, store and process the data.
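As a rough software sketch of the shift-register idea (real shift registers are built from flip-flops, not Python lists; the helper name shift_in is my own):

    # An 8-bit shift register modeled as a fixed-length queue of bits.
    from collections import deque

    register = deque([0] * 8, maxlen=8)    # eight stored bits, all initially 0

    def shift_in(bit):
        """Push a new bit in at one end; the bit at the far end falls out."""
        out = register[-1]
        register.appendleft(bit)
        return out

    for bit in [1, 0, 1, 1]:
        shift_in(bit)
    print(list(register))   # [1, 1, 0, 1, 0, 0, 0, 0]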
32
Week 3 – Components of Computing Devices – Gordon Moore
Let me introduce you to an interesting prophet of our industry: Gordon Moore was the cofounder, with Robert Noyce, of Intel Corporation.
He is famous for Moore’s law, which predicted: “The number of transistors per silicon chip doubles each year.” Another way of expressing this prediction would be to say that computing would dramatically increase in power, and decrease in relative cost, at an exponential pace.
In 1975, as the rate of growth began to slow, Moore revised his time frame to two years.
In actuality, over roughly 40 years from 1961, the number of transistors has doubled approximately every 18 months, but this does not really detract from Moore’s prescience in understanding the explosive nature of the growth in power that would occur.
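As a rough back-of-the-envelope check of what that observed rate implies (my own arithmetic, not a figure from the slides), a doubling every 18 months over 40 years works out to roughly 27 doublings, or a growth factor on the order of one hundred million:

    # Doublings in 40 years at one doubling every 1.5 years.
    years = 40
    doublings = years / 1.5            # about 26.7 doublings
    growth = 2 ** doublings            # roughly 1.1e8, about one hundred million
    print(round(doublings, 1), format(growth, ".2e"))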
33
Week 3 – Components of Computing Devices – Gordon Moore
Looking at this chart, you can see the exponential growth, but actually tracking this growth turns out to be complicated.
34
Week 3 – Processor History
On the next two slides you can see the most comprehensive list of processors I could find. It is hard to figure out exactly when the doubling occurs from this chart, because it isn’t sorted in a convenient order and because other factors, such as multiple-core processors, enter into the transistor count. But the one thing that is unmistakable from the chart is the reality of massive increases in computing power since the 1970s!
35
Week 3 – Processor History
36
Week 3 – Processor History
I have read a lot of articles expressing the opinion that Moore’s law will soon cease to operate, although Intel doesn’t seem to agree with this opinion. Several reasons have been proffered for these opinions, such as:
Clock speeds are slowing down due to heat.
Design of the chips is becoming increasingly expensive.
Manufacturing facilities for chips are becoming prohibitively expensive.
Eventually, transistors will become so narrow that it is unlikely they will operate reliably.
Power usage increases with increased packing of transistors onto a chip.
As the transistors are packed more tightly, dissipating the energy that they use becomes harder and harder.
37
Week 3 – Components of Computing Devices – Transistors and Semiconductors
Transistors are made by combining semiconductors in various ways. Semiconductors, most commonly created with silicon, are used everywhere in electronics, and are still indispensable to our industry. Useful semiconductor material is made by a process called “doping”, which means adding small amounts of different types of impurities to silicon. Based on the type of impurity added, the silicon exhibits different degrees of conductivity. Putting this doped silicon together in various ways results in electrical devices that perform various switching activities. I am assigning a good article for this week that explains how all of this happens. I hope you enjoy the reading, and have a great week!
38
References
http://www.socialmediaexaminer.com/copyright-fair-use-and-how-it-works-for-online-images/
https://www.lib.umn.edu/copyright/using-images-teaching
https://www.stinkyinkshop.co.uk/articles/ultimate-guide-to-images
https://www.rivaliq.com/blog/guide-copyright-fair-use-laws-online-images/
http://www.livescience.com/20718-computer-history.html
https://www.bing.com/images/search?view=detailV2&ccid=pWYCdzN0&id=42706CE5581F344A55E6C417C990D17E90E39A0A&thid=OIP.pWYCdzN0EtYRwHQPRBPWHgD6D6&q=MONTE+CARLO+METHOD+IMAGE&simid=608041558264581732&selectedindex=229&mode=overlay&first=1
https://www.bing.com/images/search?view=detailV2&ccid=WtLmoPgg&id=F02BD58490874177050F9E09D51D69457598BFD2&thid=OIP.WtLmoPggphdTPR3hSnZyAAEsDY&q=john+von+neumann+images&simid=608036335597587774&selectedIndex=34&ajaxhist=0
https://www.britannica.com/biography/John-von-Neumann
http://www.teach-ict.com/as_as_computing/ocr/H447/F453/3_3_3/vonn_neuman/miniweb/pg3.htm
http://www.teach-ict.com/as_as_computing/ocr/H447/F453/3_3_3/vonn_neuman/miniweb/pg2.htm
http://alexandrebrunet.com/images/Ecrits/trois_architectures_d_ordin.jpg
https://www.lifewire.com/tour-inside-a-desktop-pc-2624588
http://computer.howstuffworks.com/microprocessor1.htm
http://www.extremetech.com/wp-content/uploads/2016/02/CPU-Progression1.png
39
References
https://www.britannica.com/technology/central-processing-unit
http://web.cse.ohio-state.edu/~teodorescu.1/download/teaching/cse675.au08/Cse675.02.F.ALUDesign_part1.pdf
http://techgenix.com/Arithmetic-Logic-Unit/
http://www.computernostalgia.net/staticOutline.htm
http://www.explainthatstuff.com/
http://www.explainthatstuff.com/howtransistorswork.html
http://www.explainthatstuff.com/integratedcircuits.html
http://whatis.techtarget.com/definition/processor
https://fossbytes.com/whats-inside-my-computer-different-components-of-a-computer/
https://fossbytes.com/what-is-a-motherboard-what-are-the-components-of-a-motherboard/
http://4.bp.blogspot.com/-kHCk15xhQIA/UBzG7HoMvHI/AAAAAAAAAFc/eUUIeOfEdMk/s1600/motherboard-labelled5.jpg
https://www.britannica.com/biography/Gordon-Moore
https://arstechnica.com/information-technology/2016/02/moores-law-really-is-dead-this-time/
http://wagnercg.com/Portals/0/FunStuff/AHistoryofMicroprocessorTransistorCount.pdf
http://dicingusa.com/wp-content/uploads/2015/10/semiconductor-wafer.jpg
https://thecustomizewindows.com/2011/03/components-of-a-motherboard/
https://thecustomizewindows.com/2012/01/northbridge-and-southbridge-of-a-motherboard/
http://www.mpoweruk.com/images/transistorpnp.gif
http://processmaterials.iceland.creativearc.com/assets/images/products/Application_-_Semiconductor.jpg
whatis.techtarget.com
https://www.techopedia.com
40
References
https://www.bing.com/images/search?view=detailV2&ccid=CSKlSB%2FB&id=8508C95040A79E6A9E1D96430333131EB054574F&thid=OIP.CSKlSB_BEF5VTgJlBLf-6gEsCv&q=architectural+diagram+of+a+simple+idealized+computer&simid=608020731811858717&selectedindex=60&mode=overlay&first=1
https://www.bing.com/images/search?view=detailV2&ccid=kkaPnjTf&id=87A627B7C3E43237F06638639FD71D58CBA742A5&thid=OIP.kkaPnjTfIBrvLcN3SYSWcAEsDh&q=architectural+diagram+of+a+simple+idealized+computer&simid=608014267905476225&selectedindex=72&mode=overlay&first=1
https://images.search.yahoo.com/search/images?p=semiconductor+images&fr=mcafee&imgurl=http%3A%2F%2Fblogs-images.forbes.com%2Fjimhandy%2Ffiles%2F2011%2F10%2F2x-nm-wafer.jpg#id=13&iurl=http%3A%2F%2Fblogs-images.forbes.com%2Fjimhandy%2Ffiles%2F2011%2F10%2F2x-nm-wafer.jpg&action=click
http://primatics.com/wp-content/uploads/2014/07/Semiconductor.jpg
41
References
42