Non-Digital Computers
This is the last installment of my many-part series on computers. Last time we used the notion of a Turing machine to define what a computer is. We discovered something surprising: not all computers need to be digital, or even electronic! A computer can be mechanical, made of dominoes, or even just a rules system in a card game. To give you a flavor of how inclusive the definition of a computer really is, I’ll now give you a whirlwind tour of some notable examples of non-digital computers. The Antikythera Mechanism In April of 1900,
Computer Related / Education / logic / etc.
What Is A Computer, Really?
Look at the picture above. Believe it or not, that person is operating an extremely sophisticated mechanical calculator, capable of generating tables that evaluate functions called “polynomials.” Although a graphing calculator can do that, a pocket calculator certainly can’t. The device above is a purpose-built mechanical computer! This article is the next installment of my series on computing. In the previous parts, we learned about Boolean logic, the language computers think in. We then learned how to implement this logic electronically and, using our newfound understanding of electronics, how to make computer memory so that computers can record results
Computer Related / logic / Mathematics / etc.
The Turing Machine
This is the sixth part in my multi-part series on computing. In the previous parts, we learned about Boolean logic, the language computers think in. We then learned how to implement this logic electronically. And finally, we learned how to make computer memory, so that computers can record results of calculations. Now, before we conclude the series, we’re going to take a quick detour into computational theory and the Turing machine. Alan Turing’s Machine of the Mind In 1936, mathematician, WWII codebreaker, and all-around awesome guy Alan Turing wanted to investigate a problem in formal logic. Specifically, he
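The Turing machine the excerpt introduces is simple enough to sketch in a few lines of Python. This is a minimal illustration, not code from the post itself; the state names and the example machine (which appends a 1 to a string of 1s) are invented for the demonstration:

```python
# A minimal Turing machine simulator: the transition table maps
# (state, symbol) -> (symbol to write, head move, next state).
def run_turing_machine(tape, rules, state="start", accept="halt"):
    tape = dict(enumerate(tape))  # sparse tape; blank cells read as "_"
    head = 0
    while state != accept:
        symbol = tape.get(head, "_")
        write, move, state = rules[(state, symbol)]
        tape[head] = write
        head += 1 if move == "R" else -1
    return "".join(tape[i] for i in sorted(tape)).strip("_")

# Hypothetical example machine: append a 1 to a string of 1s.
rules = {
    ("start", "1"): ("1", "R", "start"),  # scan right over the 1s
    ("start", "_"): ("1", "R", "halt"),   # write a 1 at the blank, then halt
}
print(run_turing_machine("111", rules))  # -> 1111
```

Despite its tiny size, this table-plus-tape scheme is the whole model: everything a real computer does reduces to reading a symbol, writing a symbol, and moving the head.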
Computer Related / Science And Math
A Parallel Computing Primer
So, Jonah is moving and he asked me to write a guest post. Jonah’s recent articles about computing prompted me to write about distributed computing. The question I will answer is: how do you go from computing with a sequential program to computing on many-core machines (a.k.a. parallel computation)? Parallel Computation First of all, what is parallel computation? In a nutshell, parallel computation is the science that lets you use many processors to compute faster. You certainly would want to do this if you worked on the stock market, where the faster you are at calculating
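The sequential-versus-parallel split the excerpt describes can be shown in a few lines using Python’s standard `multiprocessing` module. This is a minimal sketch, not from the post; the toy workload (`sum_of_squares`) is invented for illustration:

```python
from multiprocessing import Pool

# A toy CPU-bound task: sum of squares below n.
def sum_of_squares(n):
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    jobs = [10_000, 20_000, 30_000, 40_000]
    # Sequential: one core works through the jobs in order.
    sequential = [sum_of_squares(n) for n in jobs]
    # Parallel: a pool of worker processes splits the jobs across cores.
    with Pool(processes=4) as pool:
        parallel = pool.map(sum_of_squares, jobs)
    assert sequential == parallel  # same answers, computed concurrently
```

The answers are identical either way; what changes is that four independent processors can each chew on one job at the same time, which is exactly the speedup the stock-market example is after.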
Uncategorized
Moving (again)
For (hopefully) the last time in the next three years, I’m moving! It’s only one city over, but I want to try and keep up a semblance of work productivity while I pack up and hop. So for the next two weeks or so, the blog will be on hiatus. Sorry all! I will try to put up some fun content sporadically. And hopefully a guest post.
Electronics / Physics / Science And Math
Flip-Flops and the Art of Computer Memory
It’s a poor sort of memory that only works backwards. ~The White Queen to Alice (Lewis Carroll, Through the Looking Glass) This is the fifth part in my multi-part series on how computers work. Computers are thinking machines, and the first four parts of my series have been on how we teach computers to think. But all of this logic, electronic or otherwise, is useless unless our computers can remember what they did. After logicking something out, a computer needs to remember the result of all that logicking! In this post, I describe how to use the logic gates
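The memory circuit the excerpt is building toward can be simulated in plain Python: a set-reset (SR) latch made of two cross-coupled NOR gates, whose feedback loop is what lets the circuit “remember” one bit. This is a minimal sketch of that standard circuit, not code from the post itself:

```python
# One NOR gate: output is 1 only when both inputs are 0.
def nor(a, b):
    return int(not (a or b))

# An SR latch: two NOR gates feeding each other's outputs back in.
def sr_latch(s, r, q=0, qbar=1):
    # Iterate the feedback loop a few times until the outputs settle.
    for _ in range(4):
        q, qbar = nor(r, qbar), nor(s, q)
    return q, qbar

q, _ = sr_latch(s=1, r=0)                   # set: Q becomes 1
print(q)                                    # -> 1
q, _ = sr_latch(s=0, r=0, q=q, qbar=1 - q)  # hold: Q is remembered
print(q)                                    # -> 1
q, _ = sr_latch(s=0, r=1, q=q, qbar=1 - q)  # reset: Q becomes 0
print(q)                                    # -> 0
```

The hold case is the punchline: with both inputs off, the latch keeps reporting whatever was last stored, which is precisely the “remembering” a computer needs after all that logicking.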
Computer Related / Electronics / logic / etc.
The Boolean Circuit and Electronic Logic, Part 2
If the presence of electricity can be made visible in any part of the circuit, I see no reason why intelligence may not be transmitted instantaneously by electricity. ~Samuel Morse This is the fourth part in my multi-part series on how computers work. Computers are thinking machines, but they can’t do this on their own. We need to teach them how to think. And for this, we need a language of logic. In the first part of the series, I introduced this language of logic, Boolean algebra. In the second part, I described how to formulate complex logical statements
Computer Related / Condensed Matter / History / etc.
The Boolean Circuit and Electronic Logic, Part 1
Living in a vacuum sucks. ~Adrienne E. Gusoff This is the third part in my multi-part series on how computers work. Computers are thinking machines, but they can’t do this on their own. We need to teach them how to think. And for this, we need a language of logic. In the first part of the series, I introduced this language of logic, Boolean algebra. In the second part, I described how to formulate complex logical statements using Boolean algebra. Now, in part three, I lay the groundwork for how we can implement simple Boolean logic using electronics. In
logic / Mathematics / Science And Math
George Boole and the Language of Logic, Part 2
Anything that thinks logically can be fooled by something else that thinks at least as logically as it does. ~Douglas Adams This is the second post in a multi-part series explaining how computers work. A computer is a thinking machine, a device which applies logic to any problem we ask it to. However, computers don’t know how to do this automatically. We have to teach them. And to teach them, we need a language of logic. Last time, we introduced one such language of logic, Boolean algebra. This time, we learn how to make composite statements in Boole’s system.
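Composite statements in Boole’s system are just AND, OR, and NOT stacked together, and Python’s boolean operators let you check identities about them exhaustively. As a small taste (an illustration, not from the post), here are De Morgan’s laws verified over every combination of truth values:

```python
# De Morgan's laws, two classic composite-statement identities:
#   NOT (p AND q) == (NOT p) OR  (NOT q)
#   NOT (p OR  q) == (NOT p) AND (NOT q)
for p in (False, True):
    for q in (False, True):
        assert (not (p and q)) == ((not p) or (not q))
        assert (not (p or q)) == ((not p) and (not q))
print("De Morgan's laws hold for all inputs")
```

Because each variable takes only two values, a composite statement can always be checked by brute force like this; that exhaustiveness is what makes Boolean algebra so mechanizable.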
abstract algebra / logic / Mathematics / etc.
George Boole and the Language of Logic, Part 1
Logic takes care of itself; all we have to do is to look and see how it does it. ~Ludwig Wittgenstein Contrariwise, if it was so, it might be; and if it were so, it would be; but as it isn’t, it ain’t. That’s logic. ~Lewis Carroll This is the first post in a multi-part series explaining how computers work. At its heart, a computer is a logical-thinking machine. It’s very good at starting with several assumptions and deducing a conclusion from those assumptions. Of course, a computer can’t do any of that on its own. We need to