This post is part of a series I hope to publish regularly (oops) called "1 Times Daily." The idea is simple: almost anyone can learn about computer science. I'm going to try to boil CS down to its core and present it in a way that nearly everyone can understand. I'll do my best to write for a general audience, and we'll see if anyone bothers to read!
Day 6: Computers synchronize their actions with clocks.
The only reason for time is so that everything doesn't happen at once.
Most of what we've discussed in this series so far can be summarized as follows:
Electricity -> On/Off -> 1s and 0s (binary numbers) -> Numbers -> Representations of anything
Electricity -> On/Off -> True/False (binary logic) -> Logic -> Manipulations of representations
That is to say, computers use the fact that electricity is either "on" or "off" to represent both binary numbers and logical values, which can be combined to do some really cool things. We've dug pretty deep into some of that, but hopefully that only inspires us to wonder how computers actually do even cooler things.
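To see both readings of on/off side by side, here's a two-line Python sketch (the values are arbitrary, just for illustration):

    print(0b101)           # the on/off pattern 101 read as a binary number: prints 5
    print(True and False)  # on/off read as binary logic: prints False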
Fortunately for us here in the information age, computers can do substantially more with electricity than power lines can. Take our adder from yesterday - it's pretty cool (if you're into that sort of thing!), but it's got a couple of problems:
1) How do I know when the output of a circuit is ready?
2) Where does the output go at the "end" of the circuit?
Today we're going to talk about how computers deal with #1 from this list. We'll save #2 for the next post.
As we've discussed them so far, the signals in a computer's circuits flow as quickly as the physics of the hardware allows. We might think that this is the best possible setup (who wouldn't want a result as fast as possible?), but actually it causes problems. The issue is clearest when we want to perform operations in a sequence.
For example, if I want to add three numbers A + B + C, the plan might be:
1) Add A + B, get a result D
2) Add D + C to get the final result
(As a motivational side note, this "plan" actually describes a program we need to write.)
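In fact, written as a tiny Python program (with arbitrary example values standing in for A, B, and C), the plan looks almost the same:

    # Step 1: add A + B, and call the result D
    a, b, c = 2, 3, 4   # example values, chosen arbitrarily
    d = a + b
    # Step 2: add D + C to get the final result
    result = d + c
    print(result)       # prints 9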
Computing the first result takes some time, depending on how many gates are needed, how many digits are in the numbers, and other physical factors of the hardware. I need to wait for the first result before I can perform the next step, but I have no idea how long to wait. In fact, the whole idea of "waiting" is foreign to computers as we've discussed them so far - we have no notion of time. Thankfully, that's about to change.
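For a rough sense of scale, here's a quick back-of-the-envelope calculation in Python (both numbers are made up purely for illustration):

    gate_delay_seconds = 1e-9   # suppose each gate takes about 1 nanosecond
    gates_in_longest_path = 30  # suppose the adder's longest chain is 30 gates
    # The output can't be trusted until the slowest path has settled:
    print(gate_delay_seconds * gates_in_longest_path)  # 3e-08, i.e. ~30 nanoseconds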
If you've ever been a part of a marching band, you know that one of the most important factors is "the beat" (actually, this is pretty important for all music). If someone can't find the beat, their odds of running into someone else while marching are much higher. Marching bands solve the problem by stepping at the exact same time with approximately the same distance in every step.
Similarly, computers introduce a "beat" in the form of a clock. Clocks are described by how many times they tick every second, also called the clock's frequency. The unit of measure is called the "hertz" (Hz). When you buy a computer, you'll see advertisements describing the frequency of its main logical circuit, called the central processing unit (CPU). Nowadays these values are reported in gigahertz (GHz) - billions of ticks per second. Clocks are typically built around a small piece of material such as a quartz crystal. Mind-blowingly, these minerals produce a consistent voltage on their surface when they're bent or squeezed (an effect called piezoelectricity). That voltage occurs in a consistent, oscillating (repeating) pattern, and computers use features of that pattern to define the ticks of the clock. The signal can be multiplied and otherwise manipulated to produce whatever frequency a circuit needs.
If we were to look at the voltage pattern produced by a well-behaved clock, it would vary between 1 ("on"/"high") and 0 ("off"/"low"), something like this:
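    1:   ___     ___     ___
    0: _|   |___|   |___|   |___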
In terms of vocabulary, this is a square wave pattern. The time it takes for a clock to complete a full cycle of "up" and "down" is known as the period of the clock. The number of periods per second is another way of thinking about the frequency of the clock mentioned above. The edge of a clock pattern is the point at which the wave transitions from high -> low (falling edge) or low -> high (rising edge), meaning that successive rising edges (and successive falling edges) occur one period apart from each other. If we can adapt our circuits to detect and respond to clock edges, we can add a notion of time, which helps us know when it's possible to trust results.
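Since the period is just one second divided by the frequency, we can work out how short a tick really is. Here's a quick Python sketch (the 3 GHz figure is just an illustrative, typical CPU speed):

    frequency_hz = 3_000_000_000  # a 3 GHz clock: 3 billion ticks per second
    period_seconds = 1 / frequency_hz
    print(period_seconds)         # ~3.33e-10, about a third of a nanosecond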
With this support from a clock, we can then tell our second operation in the A + B + C problem above to wait one tick of the clock before obtaining the value of D to compute D + C. A computer needs a bit more hardware to manage its sense of time, but at least now we have something to work with.
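Real hardware enforces this timing with dedicated circuitry rather than software, but as a loose Python sketch of the idea, imagine each step of our plan being released by a tick of the clock:

    # A toy simulation: one operation per clock tick, so step 2
    # never reads D before step 1 has finished producing it.
    a, b, c = 2, 3, 4      # example inputs, chosen arbitrarily
    for tick in range(2):  # two ticks, one per operation
        if tick == 0:
            d = a + b      # tick 0: compute D = A + B
        else:
            result = d + c # tick 1: D can now be trusted
    print(result)          # prints 9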
Thanks for reading!