George Boole, the Holy Trinity, and the Birth of the Computer

Since it first emerged in the early days of Christianity, the concept of the Holy Trinity— Father, Son and Holy Spirit as three components of a single Godhead— has been a head-scratcher for non-Christians and an unending source of conflict within the Church. The concept was already rattling around among the earliest Church Fathers in the second century, when the cult of Jesus was largely below everyone’s radar. But as Christianity became the official religion of the Romans in the fourth century and the bishops of major metropolitan areas became powerful imperial functionaries, internal debates about the precise relationship between the components of the Trinity intensified. The disputes filtered down to the bishops’ flocks and became the irresolvable wedge issues of their era, ultimately driving a stream of schisms within the Church. After Islam erupted out of the Arabian peninsula in the seventh century, Muslim caliphs found themselves ruling over largely Christian populations and, though they were mostly tolerant, were dismissive of the concept of the Trinity and appalled by the dogmatic debates that came with it. “Do not say ‘three,’” warns an Arabic inscription in the late seventh-century Dome of the Rock in Jerusalem. “It is better for you.”

A Stumbling Block, and an Historic Breakthrough

I personally find the details of the different arguments about the Trinity hard to sit through. But I’ve long found it fascinating that in one very convoluted way, the concept itself had a profound pragmatic impact on the modern world. A deeply religious nineteenth-century Christian mathematician found that the very idea of a Trinity disturbed him in a way he couldn’t get out of his head, and when he finally resolved what was bothering him, he had discovered— without realizing it— the basis for digital computing and the information age. The mathematician was an Englishman named George Boole and his insight became the basis for Boolean logic.

Boole’s life was in many ways a series of professional disappointments. As a young man and a new father, he longed to study mathematics at Cambridge, but by the time he got around to applying he was considered too old to be a student, and he made ends meet for his family by tutoring. He found it frustrating to explain and re-explain basic mathematics to young students when his real desire was to explore the boundaries of the field, but one can argue that being forced to revisit the foundations of mathematics over and over may have prepared him to challenge those same foundations later.

In addition to his passion for mathematics, Boole loved to spend his free time in “versification”, which was the term for translating classic poems from the original Greek and Latin into English in a way that retained their original meaning and spirit. That hobby provided hours of mental calisthenics involving the abstractions of semantics and language.

Boole, like many devout “low church” English people of the time, had trouble reconciling traditional Christian thinking with the new scientific skepticism of his era. He simply could not countenance the idea that God came in a package of three components, however the components and the relationship between them were defined, but he had difficulty putting his finger on why the idea of the Trinity bothered him so much. The problem seemed to lie in an intuitive unease that Boole was hardly the first to sense: it wasn’t that “three” was demonstrably the incorrect number of divine components; the problem was that “three” was a number in the first place— a representation of quantity. When thinking about a universal, omnipresent and omniscient God as imagined among the Abrahamic faiths, a far more attractive value was “one.” But even that “one” was not strictly speaking a quantity, as in one God among many possible gods; it was more akin to what people would unthinkingly call a universal truth. There was a huge difference between “a” god and “the” God, and the difference could be thought of as the difference between “one” as a number and “one” as a statement of existence. (Boole was attracted to the Hebrew association of God with Unity, and even for a time considered converting to Judaism. Ultimately, he became — wait for it! — a Unitarian.)

Boole thought deeply about this as a theological issue, but the difficulty took on its ultimate importance when he applied the same scrutiny to mathematics. It occurred to Boole that all mathematical calculation up to his time was in reality concerned with only one kind of calculation: the calculation of quantities. Specifically, the common algebra he taught his students, the kind of math concerned with statements like 2x + 7y = 16, was ultimately concerned with the numerical values x and y might have, or in other words the quantities they represented. But Boole realized that other algebras were possible— that there could be, for example, an algebra concerned with logic rather than quantity, one that could be used to calculate truth or falsity rather than numerical values.

Boole’s new algebra allowed only the values 1 and 0, and even those only on its own terms. The 1 in Boolean algebra was not a quantity (as in the difference between 5 and 4) but a subtle abstraction that pertained only to logic. The 1’s and 0’s in Boolean algebra didn’t have to represent anything in particular: one and zero could represent true or false, all or nothing, on or off, existence or non-existence, or any other pair of concepts whose values toggled only between two absolute states. Boolean algebra followed many of the same rules as the algebra of quantities. In both, for example, 1 – 1 = 0. But there was one important difference: in Boolean algebra 1 + 1 = 1, reflecting the fact that, for example, “two true statements taken together are true” and “everything combined with everything is everything.”
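The rules above can be sketched in a few lines of code. This is a minimal illustration, not Boole’s own notation— the function names (`b_and`, `b_or`, `b_not`) are my own labels for his “multiplication,” “addition,” and complement:

```python
def b_and(x, y):
    """Boolean 'multiplication': 1 only when both inputs are 1."""
    return x * y

def b_or(x, y):
    """Boolean 'addition': saturates at 1, so 1 + 1 = 1."""
    return min(x + y, 1)

def b_not(x):
    """Complement: 1 - x, so 1 - 1 = 0, just as in ordinary algebra."""
    return 1 - x

# "Two true statements taken together are true":
assert b_or(1, 1) == 1
# "Everything combined with everything is everything" holds the same way,
# while subtraction behaves exactly as in the algebra of quantities:
assert b_not(1) == 0
```

The saturating addition is the whole departure from ordinary algebra: every other rule carries over unchanged.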

The Laws of Thought?

Boole was struck by the computational capabilities of his new type of algebra. He was astounded to discover that it could be applied to the arduous task of solving complex Aristotelian logic problems. He was not the first person to mistake logic for thinking, and for many years he believed he had discovered the way the human mind works; he died disappointed that he wasn’t recognized for that. But what he had actually discovered is how formal logic works. He had developed a practical way to represent and solve complex logical problems mathematically. In the late nineteenth and early twentieth century, Boolean logic found a practical commercial application when it was widely used to double-check the implications of complex insurance contracts.

[see excerpt: How Boolean algebra solved Aristotelian problems of logic]

But Boolean logic found its true technological home in the mid-twentieth century, when the American computer pioneer Claude Shannon demonstrated in his Master’s thesis that binary electrical circuits (those that are represented as being only “on” or “off”, “closed” or “open”) behave according to the laws of Boolean algebra. Shannon showed that complex logic could be represented in binary circuitry, which became the basis for digital computers.
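Shannon’s observation can be illustrated with a small sketch. The correspondence (my own simplified rendering, not Shannon’s notation) is that switches wired in series behave as AND, switches in parallel behave as OR, and a relay that opens when energized behaves as NOT— and composing these “circuits” yields arbitrarily complex logic:

```python
def series(a, b):
    """Two switches in series: current flows only if both are closed."""
    return a and b

def parallel(a, b):
    """Two switches in parallel: current flows if either is closed."""
    return a or b

def inverter(a):
    """A relay that opens its contact when energized: logical NOT."""
    return not a

def xor(a, b):
    """Exclusive-or built purely from the three primitives above."""
    return parallel(series(a, inverter(b)), series(inverter(a), b))

assert xor(True, False) is True
assert xor(True, True) is False
```

The point is not the particular gates but the composition: once on/off circuits obey Boolean algebra, any formula in that algebra can be wired up directly.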

Charles Babbage

During his lifetime Boole had no inkling that his algebra would become the basis for a revolution on the scale that it did, but one of his contemporaries may have. While researching Boole in the British Library many years ago, I came across a hand-written letter he sent to his contemporary Charles Babbage, who was working on what is generally considered a quirky early antecedent to the programmable computer. Though there is no evidence Babbage realized how electronic circuitry could be used in computation, he was in general cursed with being able to imagine far beyond what he was actually able to accomplish with monstrously complex geared machinery. Babbage, who had devoured Boole’s books on algebra, was anxious to follow up with him. But Boole in his note was writing back to politely put him off— Babbage at the time was developing a reputation for being a bit of a crank, and there was also a vague whiff of scandal about the inventor: he had spent thousands of pounds of government funds on mechanical calculating machines in a project few people could understand. The two pioneers never came closer to collaborating.
