Saturday, May 24, 2008

Boolean Logic

Today we discuss Boolean logic. It may be the first in a series on different kinds of logic, or it may not. We'll see.

A concept that should be intimately familiar to any programmer is Boolean logic. True or False, 1 or 0, Yes or No: these are all different representations of a Boolean variable. Invented by George Boole in the 1800s, Boolean logic forms the basis of modern computing, used in everything from transistor-level circuits to bitmasks and search queries. Wikipedia has a good introduction to the theory for anyone who needs a refresher. Rather than re-teach it, I am going to draw your attention to a few of the more interesting features of Boolean logic.

First off, everything can be defined from only one basic operation: alternative denial (aka NAND, ↑). Alternatively, the same can be done with NOR, albeit with different arrangements of gates. For example, NOT can be created by splitting the input into both sides of a NAND gate: ¬A = A ↑ A. For this reason, and many others, NAND is often used as a building block of transistor circuits.
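To see that universality in action, here's a minimal Python sketch (just simulating gates with booleans; the helper names are my own) that builds NOT, AND, and OR out of a single nand function and checks them exhaustively:

def nand(a, b):
    # Alternative denial: false only when both inputs are true.
    return not (a and b)

def not_(a):
    return nand(a, a)                      # split the input into both sides

def and_(a, b):
    return nand(nand(a, b), nand(a, b))    # NAND, then invert the result

def or_(a, b):
    return nand(nand(a, a), nand(b, b))    # invert both inputs, then NAND

# Exhaustive check against Python's built-in operators.
for a in (False, True):
    for b in (False, True):
        assert not_(a) == (not a)
        assert and_(a, b) == (a and b)
        assert or_(a, b) == (a or b)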
NAND is a bit awkward for general use, however, so for higher-level Boolean logic we usually deal in three main operations: negation (NOT, or ¬), conjunction (AND, or ∧), and disjunction (OR, or ∨). There are others, but these are good enough for now. Take a look at a few examples:

0 = A ∧ ¬A
1 = ¬(A ∧ ¬A)
A ∨ B = ¬(¬A ∧ ¬B)   (De Morgan's Law)
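All three are easy to confirm by brute force; a few lines of Python looping over every input combination will do it:

from itertools import product

for A, B in product((False, True), repeat=2):
    assert (A and not A) == False                   # 0 = A ∧ ¬A
    assert (not (A and not A)) == True              # 1 = ¬(A ∧ ¬A)
    assert (A or B) == (not ((not A) and (not B)))  # De Morgan's Law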
Even just using these three operators, there are multiple ways to express any given function. For example, which is the 'best' way to represent XOR (the ⊕ operator)?

A ⊕ B = (¬A ∧ B) ∨ (A ∧ ¬B)
A ⊕ B = (A ∨ B) ∧ ¬(A ∧ B)

The first is easier for most people to understand ("A is true and B is false, or the other way around"), while the second has fewer logical operations. If we wanted to implement XOR in software, that might matter; in hardware it would usually be implemented as a custom unit such as the CMOS 4070, or failing that, four NAND gates:
N := A ↑ B
A ⊕ B = (A ↑ N) ↑ (N ↑ B)
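If you don't trust the algebra, a quick Python check (again, just a simulation sketch) confirms that all three formulations, the two operator forms plus the four-NAND construction, agree with XOR on every input:

from itertools import product

def nand(a, b):
    return not (a and b)

for A, B in product((False, True), repeat=2):
    xor1 = ((not A) and B) or (A and (not B))  # sum-of-products form
    xor2 = (A or B) and not (A and B)          # fewer operations
    N = nand(A, B)                             # shared intermediate signal
    xor3 = nand(nand(A, N), nand(N, B))        # four NAND gates in total
    assert xor1 == xor2 == xor3 == (A != B)    # != acts as XOR on booleans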
Another interesting feature of hardware-level Boolean logic is propagation delay. We tend to think of logical operations as instantaneous, but in hardware they take a measurable time to switch. The propagation delay is the total time it takes for a change in the inputs to be reflected as a change in the outputs of a function. This gives us a tradeoff between time and transistor count: it can be better to use more transistors to implement a function if they are structured in parallel, shortening the longest path a signal has to travel. The same tradeoff sometimes shows up in software: if your computer can run multiple instructions in parallel, it is sometimes possible to use more instructions to perform a function in fewer clock cycles. The compiler usually handles this kind of micro-optimization, however.
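To make the tradeoff concrete, here's a toy model (my own illustration, with delay simplified to one unit per 2-input gate): AND-ing n inputs as a chain and as a balanced tree takes the same number of gates, but the tree's longest path, and hence its propagation delay, is logarithmic rather than linear. Real designs such as carry-lookahead adders go further and spend extra gates to flatten the circuit.

from math import ceil, log2

def chain_depth(n):
    # AND the inputs one after another: n - 1 gates, n - 1 gate delays.
    return n - 1

def tree_depth(n):
    # AND the inputs as a balanced binary tree: still n - 1 gates,
    # but only ceil(log2(n)) gate delays on the longest path.
    return ceil(log2(n))

for n in (2, 4, 8, 64):
    print(n, "inputs:", chain_depth(n), "vs", tree_depth(n), "gate delays")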
One other interesting tidbit: take a look at Karnaugh maps for simplifying expressions and finding race hazards. Essentially, you draw out the truth table for an expression and then cover the '1's with as few overlapping rectangles as possible, optionally adding extra rectangles to prevent race hazards.

Lastly, I'll leave you with a puzzle: invert three inputs using only two NOT gates. To be specific, you have a function with three inputs, A, B, C, and you want three outputs, X, Y, Z, such that:

X = ¬A
Y = ¬B
Z = ¬C

However, you only have two NOT gates (although you can use as many AND or OR gates as you'd like). Note: to solve this in written form (rather than by drawing a circuit diagram), you may find that you need temporary variables, such as 'N := ¬A ∧ B'. Bonus points if you can generalize it to an algorithm for inverting n inputs using ⌊n/2⌋ + 1 NOT gates.
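If you want to check a solution without drawing the circuit, a brute-force harness like the one below will do it. The candidate function is a deliberately empty placeholder (no spoilers); fill it in with your own construction and count your NOTs honestly:

from itertools import product

def candidate(A, B, C):
    # Your solution goes here. Use as many 'and'/'or' operations as you
    # like, but at most two 'not's, and return the tuple (¬A, ¬B, ¬C).
    raise NotImplementedError

# Exhaustive check over all eight input combinations.
for A, B, C in product((False, True), repeat=3):
    assert candidate(A, B, C) == (not A, not B, not C)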