Boolean Algebra

Boolean algebra is the branch of algebra in which the values of the variables and constants have exactly two values: "true" and "false", usually denoted 1 and 0 respectively. This overview is loosely based on the Wikipedia article on Boolean Algebra (https://en.wikipedia.org/wiki/Boolean_algebra).

The basic operators in Boolean algebra are "and", "or", and "not". The "and" of two variables is true only when both variables are true. The "or" of two variables is true whenever either or both variables are true. The "not" of a variable changes a true to false and a false to true. Just as algebra has basic rules for simplifying and evaluating expressions, so does Boolean algebra.

Why is Boolean Algebra Important?

Boolean algebra is important to programmers, computer scientists, and the general population.

  • For programmers, Boolean expressions are used for conditionals and loops. For example, the following snippet of code sums the even numbers that are not also multiples of 3, stopping once the sum reaches at least 100:
s = 0                                   # running sum of qualifying numbers
x = 1                                   # candidate number
while (s < 100):
    if (x % 2 == 0) and (x % 3 != 0):   # even and not a multiple of 3
        s = s + x
    x = x + 1

Both the loop condition s < 100 and the test (x % 2 == 0) and (x % 3 != 0) are Boolean expressions.

  • For computer scientists, Boolean algebra is the basis for digital circuits that make up a computer's hardware. The Digital Electronics category concerns a graphical representation of a circuit. That circuit is typically easiest to understand and evaluate by converting it to its Boolean algebra representation.
  • The general population uses Boolean algebra, probably without knowing that they are doing so, when they enter search terms in Internet search engines. For example, the search expression "red sox -yankees" is the Boolean expression that will return web pages that contain the words "red" and "sox", as long as they do not contain the word "yankees". The search expression "jaguar speed -car" returns pages about the speed of the jaguar animal, not the Jaguar car.
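Translated into code, such a search query is just a Boolean expression over the words of a page. The sketch below is purely illustrative; the function name and the set-of-words representation are assumptions, not any search engine's actual API:

def matches(page_words):
    # "red sox -yankees": the page must contain "red" and "sox" and must not contain "yankees".
    return ("red" in page_words) and ("sox" in page_words) and ("yankees" not in page_words)

print(matches({"red", "sox", "schedule"}))   # True
print(matches({"red", "sox", "yankees"}))    # False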


Operations

Basic operations

The basic operations of Boolean algebra are AND, OR, and NOT:

  • AND (conjunction) will be denoted as x ∧ y, x AND y, x & y, xy, or x · y
  • OR (disjunction) will be denoted as x ∨ y, x OR y, or x + y
  • NOT (negation) will be denoted as ¬x, NOT x, or x̄ (a bar over the x)

The values of AND, OR, and NOT for all possible inputs are shown in the following truth tables:


x   y   xy   x + y
0   0   0    0
1   0   0    1
0   1   0    1
1   1   1    1

x   ¬x
0   1
1   0
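These tables can be reproduced directly in Python, whose and, or, and not operators behave this way on the values 0 and 1. A minimal sketch:

# Tabulate AND and OR for all four input combinations, and NOT for both inputs.
for x in (0, 1):
    for y in (0, 1):
        print(x, y, int(x and y), int(x or y))

for x in (0, 1):
    print(x, int(not x))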



Secondary operations

The three Boolean operations described above are referred to as basic, meaning that they can be taken as a basis for other Boolean operations that can be built up from them by composition, the manner in which operations are combined or compounded. Operations composed from the basic operations include the following examples:

x ⊕ y = (x ∨ y) ∧ ¬(x ∧ y)
x ≡ y = ¬(x ⊕ y)

These definitions give rise to the following truth table, giving the values of these operations for all four possible inputs.

x   y   x ⊕ y   x ≡ y
0   0   0       1
1   0   1       0
0   1   1       0
1   1   0       1


The first operation, x ⊕ y, is called exclusive or (often abbreviated as XOR) to distinguish it from disjunction, the inclusive kind of "or". It excludes the possibility of both x and y being true.

The second operation, the complement of exclusive or, is equivalence or Boolean equality: x ≡ y is true just when x and y have the same value. Hence x ⊕ y, as its complement, can be understood as x ≠ y, being true just when x and y are different. Equivalence's counterpart in arithmetic mod 2 is x + y + 1.
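Both secondary operations can be composed from the basic ones exactly as defined above. A minimal Python sketch that reproduces the truth table:

def xor(x, y):
    # x ⊕ y = (x ∨ y) ∧ ¬(x ∧ y)
    return int((x or y) and not (x and y))

def equiv(x, y):
    # x ≡ y = ¬(x ⊕ y)
    return int(not xor(x, y))

for x in (0, 1):
    for y in (0, 1):
        print(x, y, xor(x, y), equiv(x, y))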


Laws

A law of Boolean algebra is an identity such as x ∨ (y ∨ z) = (x ∨ y) ∨ z between two Boolean terms, where a Boolean term is defined as an expression built up from variables and the constants 0 and 1 using the operations ∧, ∨, and ¬. The concept can be extended to terms involving other Boolean operations such as ⊕, →, and ≡, but such extensions are unnecessary for the purposes to which the laws are put. Such purposes include the definition of a Boolean algebra as any model of the Boolean laws, and as a means for deriving new laws from old, as in the derivation of x ∨ (y ∧ z) = x ∨ (z ∧ y) from y ∧ z = z ∧ y.

Monotone laws

Boolean algebra satisfies many of the same laws as ordinary algebra when one matches up ∨ with addition and ∧ with multiplication. In particular the following laws are common to both kinds of algebra:

Associativity of ∨:           x ∨ (y ∨ z) = (x ∨ y) ∨ z
Associativity of ∧:           x ∧ (y ∧ z) = (x ∧ y) ∧ z
Commutativity of ∨:           x ∨ y = y ∨ x
Commutativity of ∧:           x ∧ y = y ∧ x
Distributivity of ∧ over ∨:   x ∧ (y ∨ z) = (x ∧ y) ∨ (x ∧ z)
Identity for ∨:               x ∨ 0 = x
Identity for ∧:               x ∧ 1 = x
Annihilator for ∧:            x ∧ 0 = 0

The following laws hold in Boolean Algebra, but not in ordinary algebra:

Annihilator for ∨:            x ∨ 1 = 1
Idempotence of ∨:             x ∨ x = x
Idempotence of ∧:             x ∧ x = x
Absorption 1:                 x ∧ (x ∨ y) = x
Absorption 2:                 x ∨ (x ∧ y) = x
Distributivity of ∨ over ∧:   x ∨ (y ∧ z) = (x ∨ y) ∧ (x ∨ z)

Taking x = 2 in the third law above shows that it is not an ordinary algebra law, since 2×2 = 4. The remaining five laws can be falsified in ordinary algebra by taking all variables to be 1, for example in Absorption Law 1 the left hand side would be 1(1+1) = 2 while the right hand side would be 1, and so on.
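Because every variable ranges over only 0 and 1, each of these Boolean-only laws can be confirmed by brute force. A minimal Python sketch:

from itertools import product

# Check the Boolean-only laws for every assignment of 0/1 to x, y, z.
for x, y, z in product((0, 1), repeat=3):
    assert (x or 1) == 1                                 # Annihilator for OR
    assert (x or x) == x                                 # Idempotence of OR
    assert (x and x) == x                                # Idempotence of AND
    assert (x and (x or y)) == x                         # Absorption 1
    assert (x or (x and y)) == x                         # Absorption 2
    assert (x or (y and z)) == ((x or y) and (x or z))   # Distributivity of OR over AND
print("All six laws hold for Boolean values.")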

All of the laws treated so far have been for conjunction and disjunction. These operations have the property that changing either argument either leaves the output unchanged or the output changes in the same way as the input. Equivalently, changing any variable from 0 to 1 never results in the output changing from 1 to 0. Operations with this property are said to be monotone. Thus the axioms so far have all been for monotonic Boolean logic. Nonmonotonicity enters via complement ¬ as follows.

Nonmonotone laws

The complement operation is defined by the following two laws.

Complementation 1:   x ∧ ¬x = 0
Complementation 2:   x ∨ ¬x = 1

All properties of negation including the laws below follow from the above two laws alone.

In both ordinary and Boolean algebra, negation works by exchanging pairs of elements, whence in both algebras it satisfies the double negation law (also called involution law)

Double negation:   ¬(¬x) = x

But whereas ordinary algebra satisfies the two laws

(−x)(−y) = xy
(−x) + (−y) = −(x + y)

Boolean algebra satisfies De Morgan's laws:

De Morgan 1:   ¬x ∧ ¬y = ¬(x ∨ y)
De Morgan 2:   ¬x ∨ ¬y = ¬(x ∧ y)
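The nonmonotone laws can be checked by brute force in the same way, using Python's not as the complement. A minimal sketch:

from itertools import product

# Check complementation, double negation, and De Morgan's laws over 0 and 1.
for x, y in product((0, 1), repeat=2):
    assert int(x and not x) == 0                           # Complementation 1
    assert int(x or not x) == 1                            # Complementation 2
    assert int(not (not x)) == x                           # Double negation
    assert int((not x) and (not y)) == int(not (x or y))   # De Morgan 1
    assert int((not x) or (not y)) == int(not (x and y))   # De Morgan 2
print("Complementation, double negation, and De Morgan's laws verified.")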