Answer the question at the end by quoting:

Hackett was born in Brooklyn, New York, to Anna (née Geller) and Philip Hacker, an upholsterer and part-time inventor. He grew up at 54th Street and 14th Avenue in Borough Park, Brooklyn, across from Public School 103 (now a yeshiva). He graduated from New Utrecht High School in 1942.
Hackett starred as the title character on NBC-TV's Stanley, a 1956-57 situation comedy which ran for 19 weeks on Monday evenings at 8:30 pm EST. The half-hour series also featured a young Carol Burnett and the voice of Paul Lynde. The Max Liebman-produced program aired live before a studio audience and was one of the last sitcoms from New York to do so. Stanley revolved around the adventures of the titular character (Hackett) as the operator of a newsstand in a posh New York City hotel. On September 30, 1960, he appeared as himself in an episode of NBC's short-lived crime drama Dan Raven, starring Skip Homeier, set on the Sunset Strip of West Hollywood.

After starring on Broadway in I Had a Ball, Hackett appeared opposite Robert Preston in the film adaptation of The Music Man (1962). In It's a Mad, Mad, Mad, Mad World (1963), Hackett was paired with Mickey Rooney, with whom he had also recently made Everything's Ducky (1961), in which they played two sailors who smuggle a talking duck aboard a Navy ship. Children became familiar with him as lovable hippie auto mechanic Tennessee Steinmetz in Disney's The Love Bug (1969).

He appeared many times on the game show Hollywood Squares in the late 1960s. In one episode, Hackett was asked which was the country with the highest ratio of doctors to populace; he answered Israel, or in his words, "The country with the most Jews." Despite the audience roaring with laughter (and Hackett's own belief that the actual answer was Sweden), the answer turned out to be correct. Hackett's regular guest shots on Jack Paar's Tonight Show in the early 1960s were rewarded with a coveted appearance on Paar's final Tonight program on March 29, 1962.

Did he mainly do comedy shows?

He appeared many times on the game show Hollywood Squares in the late 1960s.



Answer the question at the end by quoting:

Claude Elwood Shannon (April 30, 1916 - February 24, 2001) was an American mathematician, electrical engineer, and cryptographer known as "the father of information theory". Shannon is noted for having founded information theory with a landmark paper, "A Mathematical Theory of Communication", which he published in 1948. He is perhaps equally well known for founding digital circuit design theory in 1937, when, as a 21-year-old master's degree student at the Massachusetts Institute of Technology (MIT), he wrote his thesis demonstrating that electrical applications of Boolean algebra could construct any logical, numerical relationship. Shannon contributed to the field of cryptanalysis for national defense during World War II, including his fundamental work on codebreaking and secure telecommunications.
In 1948, the promised memorandum appeared as "A Mathematical Theory of Communication," an article in two parts in the July and October issues of the Bell System Technical Journal. This work focuses on the problem of how best to encode the information a sender wants to transmit. In this fundamental work, he used tools in probability theory, developed by Norbert Wiener, which were in their nascent stages of being applied to communication theory at that time. Shannon developed information entropy as a measure of the uncertainty in a message while essentially inventing the field of information theory.

The book, co-authored with Warren Weaver, The Mathematical Theory of Communication, reprints Shannon's 1948 article and Weaver's popularization of it, which is accessible to the non-specialist. Warren Weaver pointed out that the word information in communication theory is not related to what you do say, but to what you could say. That is, information is a measure of one's freedom of choice when one selects a message. Shannon's concepts were also popularized, subject to his own proofreading, in John Robinson Pierce's Symbols, Signals, and Noise.

Information theory's fundamental contribution to natural language processing and computational linguistics was further established in 1951, in his article "Prediction and Entropy of Printed English", showing upper and lower bounds of entropy on the statistics of English, giving a statistical foundation to language analysis. In addition, he showed that treating whitespace as the 27th letter of the alphabet actually lowers uncertainty in written language, providing a clear quantifiable link between cultural practice and probabilistic cognition.

Another notable paper published in 1949 is "Communication Theory of Secrecy Systems", a declassified version of his wartime work on the mathematical theory of cryptography, in which he proved that all theoretically unbreakable ciphers must have the same requirements as the one-time pad.
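The whitespace observation above can be illustrated with a minimal, zeroth-order sketch: computing the empirical per-symbol entropy of a text with and without spaces. This is only a frequency-based toy (Shannon's actual bounds used n-gram statistics and human prediction experiments), and the helper name `entropy_bits` and the sample string are illustrative assumptions, not from the source.

```python
from math import log2
from collections import Counter

def entropy_bits(text):
    """Empirical Shannon entropy, in bits per symbol:
    H = -sum(p * log2(p)) over the symbol frequencies of `text`."""
    counts = Counter(text)
    n = len(text)
    return -sum((c / n) * log2(c / n) for c in counts.values())

# Toy sample; repeating it does not change the symbol distribution.
sample = "the quick brown fox jumps over the lazy dog " * 50

h_27 = entropy_bits(sample)                   # space treated as a 27th symbol
h_26 = entropy_bits(sample.replace(" ", ""))  # letters only

# Because the space is frequent and thus cheap to encode, the
# 27-symbol per-symbol entropy comes out lower than the 26-letter one.
print(h_27 < h_26)
```

For real English text the first-order figures behave the same way: roughly 4.1 bits per letter without the space, slightly less with it, which is the "lowers uncertainty" effect in miniature.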
He is also credited with the introduction of sampling theory, which is concerned with representing a continuous-time signal from a (uniform) discrete set of samples. This theory was essential in enabling telecommunications to move from analog to digital transmission systems in the 1960s and later. He returned to MIT to hold an endowed chair in 1956.

What information theory did Shannon propose?
In 1948, the promised memorandum appeared as "A Mathematical Theory of Communication,"