• 1689

    Leibniz: Binary notation

    Leibniz invented the modern binary number system in 1689 as a way to convert verbal logic statements into mathematical ones, using only zeros and ones. He described the system in a 1703 article, “Explication de l’Arithmétique Binaire” (“Explanation of Binary Arithmetic”).
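
    As a modern illustration of the idea (not Leibniz’s own notation), the decimal number 13 is written 1101 in binary, since 13 = 8 + 4 + 0 + 1. A minimal Python sketch of the conversion by repeated division by two:

        def to_binary(n: int) -> str:
            """Return the base-2 digits of a non-negative integer."""
            if n == 0:
                return "0"
            digits = []
            while n > 0:
                digits.append(str(n % 2))  # the remainder is the next binary digit
                n //= 2
            return "".join(reversed(digits))

        print(to_binary(13))  # prints "1101"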

  • 1847

    George Boole: Boolean algebra

    In mathematics and mathematical logic, Boolean algebra is the branch of algebra in which the values of the variables are the truth values true and false, usually denoted 1 and 0 respectively. Unlike elementary algebra, where the values of the variables are numbers and the principal operations are addition and multiplication, the main operations of Boolean algebra are conjunction (and), disjunction (or), and negation (not). It is thus a formalism for describing logical relations in the same way that elementary algebra describes numeric relations.

    Boolean algebra was introduced by George Boole in his first book The Mathematical Analysis of Logic (1847), and set forth more fully in his An Investigation of the Laws of Thought (1854). Boolean algebra has been fundamental in the development of digital electronics, and is provided for in all modern programming languages.

    Source: Wikipedia
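
    As an illustrative sketch in one such modern language (Python here), the three operations map directly onto the built-in Boolean operators, and their behaviour can be tabulated:

        # Print the truth table for conjunction, disjunction, and negation.
        for a in (False, True):
            for b in (False, True):
                print(f"a={a!s:5} b={b!s:5} | "
                      f"and: {a and b!s:5} | "
                      f"or: {a or b!s:5} | "
                      f"not a: {not a}")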

  • 1889

    Herman Hollerith’s data processing machine

    Herman Hollerith was a German-American statistician, inventor, and businessman who developed an electromechanical tabulating machine for punched cards to assist in summarizing information and, later, in accounting. It was the first data processing machine.

  • 1936

    Alan Turing: On Computable Numbers

    Alan Turing published a paper, “On Computable Numbers, with an Application to the Entscheidungsproblem,” which became a foundation of computer science.

  • 1937

    Claude Shannon: A Symbolic Analysis of Relay and Switching Circuits

    A Symbolic Analysis of Relay and Switching Circuits is the title of a master’s thesis written by computer science pioneer Claude E. Shannon while attending the Massachusetts Institute of Technology (MIT) in 1937. In his thesis, Shannon, a dual degree graduate of the University of Michigan, proved that Boolean algebra could be used to simplify the arrangement of the relays that were the building blocks of the electromechanical automatic telephone exchanges of the day. Shannon went on to prove that it should also be possible to use arrangements of relays to solve Boolean algebra problems.

    The utilization of the binary properties of electrical switches to perform logic functions is the basic concept that underlies all electronic digital computer designs. Shannon’s thesis became the foundation of practical digital circuit design when it became widely known among the electrical engineering community during and after World War II. At the time, the methods employed to design logic circuits were ad hoc in nature and lacked the theoretical discipline that Shannon’s paper supplied to later projects.

    Source: Wikipedia
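
    As a rough modern illustration of the underlying idea (not Shannon’s own notation), two switches wired in series conduct only when both are closed, which behaves like and, while two switches in parallel conduct when either is closed, which behaves like or. A minimal Python sketch, with True standing for a closed switch or a conducting circuit:

        def series(s1: bool, s2: bool) -> bool:
            """Two switches in series conduct only if both are closed (AND)."""
            return s1 and s2

        def parallel(s1: bool, s2: bool) -> bool:
            """Two switches in parallel conduct if either is closed (OR)."""
            return s1 or s2

        def normally_closed(s: bool) -> bool:
            """A normally-closed relay contact inverts its control input (NOT)."""
            return not s

        print(series(True, False))    # False: one open switch breaks the path
        print(parallel(True, False))  # True: one closed branch is enough
        print(normally_closed(True))  # False: energizing the relay opens the contact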

  • 1948

    Claude Shannon: A Mathematical Theory of Communication

    “A Mathematical Theory of Communication” is an article by Claude E. Shannon published in the Bell System Technical Journal in 1948. It was renamed “The Mathematical Theory of Communication” in the book of the same name, a small but significant change of title that reflected the generality of the work.

    The article was the founding work of the field of information theory. It laid out the basic elements of communication:

    • An information source that produces a message
    • A transmitter that operates on the message to create a signal which can be sent through a channel
    • A channel, which is the medium over which the signal, carrying the information that composes the message, is sent
    • A receiver, which transforms the signal back into the message intended for delivery
    • A destination, which can be a person or a machine, for whom or which the message is intended

    It also developed the concepts of information entropy and redundancy, and introduced the term bit (which Shannon credited to John Tukey) as a unit of information.

    Source: Wikipedia
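
    The entropy concept has a compact form: for a source that emits symbols with probabilities p_i, the average information per symbol is H = -sum_i p_i * log2(p_i), measured in bits. A minimal Python sketch (an illustration of the formula, not Shannon’s original derivation):

        from math import log2

        def entropy(probs):
            """Shannon entropy, in bits, of a discrete probability distribution."""
            return -sum(p * log2(p) for p in probs if p > 0)

        print(entropy([0.5, 0.5]))  # fair coin: 1.0 bit per symbol
        print(entropy([0.9, 0.1]))  # biased coin: about 0.47 bits per symbol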
