Marvin Minsky, a pioneering explorer of artificial intelligence who combined a scientist’s thirst for knowledge with a philosopher’s quest for truth, and whose work helped inspire the creation of the personal computer and the Internet, died on Sunday night in Boston. He was 88.
His family said the cause was a cerebral hemorrhage.
Well before the advent of the microprocessor and the supercomputer, Professor Minsky, a revered computer science educator at M.I.T., laid the foundation for the field of artificial intelligence by demonstrating the possibilities of imparting common-sense reasoning to computers.
“Marvin was one of the very few people in computing whose visions and perspectives liberated the computer from being a glorified adding machine to start to realize its destiny as one of the most powerful amplifiers for human endeavors in history,” said Alan Kay, a computer scientist and a friend and colleague of Professor Minsky’s.
Fascinated since his undergraduate days at Harvard by the mysteries of human intelligence and thinking, Professor Minsky saw no difference between the thinking processes of humans and those of machines. Beginning in the early 1950s, he worked on computational ideas to characterize human psychological processes and produced theories on how to endow machines with intelligence.
Professor Minsky, in 1959, co-founded the M.I.T. Artificial Intelligence Project (later the Artificial Intelligence Laboratory) with his colleague John McCarthy, who is credited with coining the term “artificial intelligence.”
Beyond its artificial intelligence charter, however, the lab would have a profound impact on the modern computing industry, helping to impassion a culture of computer and software design. It planted the seed for the idea that digital information should be shared freely, a notion that would shape the so-called open-source software movement, and it was a part of the original ARPAnet, the forerunner to the Internet.
Professor Minsky’s scientific accomplishments spanned a variety of disciplines. He designed and built some of the first visual scanners and mechanical hands with tactile sensors, advances that influenced modern robotics. In 1951 he built the first randomly wired neural network learning machine, which he called Snarc. And in 1956, while at Harvard, he invented and built the first confocal scanning microscope, an optical instrument with superior resolution and image quality still in wide use in the biological sciences.
His own intellect was wide-ranging and his interests were eclectic. While earning a degree in mathematics at Harvard he also studied music, and as an accomplished pianist he would later delight in sitting down at a piano and improvising complex baroque fugues.
Professor Minsky was lavished with many honors, notably, in 1969, the Turing Award, computer science’s highest prize.
He went on to collaborate, in the early ’70s, with Seymour Papert, the renowned educator and computer scientist, on a theory they called “The Society of Mind,” which combined insights from developmental child psychology and artificial intelligence research.
Professor Minsky’s book “The Society of Mind,” a seminal work published in the mid-1980s, proposed “that intelligence is not the product of any singular mechanism but comes from the managed interaction of a diverse variety of resourceful agents,” as he wrote on his website.
Underlying that hypothesis was his and Professor Papert’s belief that there is no real difference between humans and machines. Humans, they maintained, are actually machines of a kind whose brains are made up of many semiautonomous but unintelligent “agents.” And different tasks, they said, “require fundamentally different mechanisms.”
Their theory revolutionized thinking about how the brain works and how people learn.
“Marvin was one of the people who defined what computing and computing research is all about,” Dr. Kay said. “There were four or five supremely talented characters from back then who were early and comprehensive and put their personality and stamp on the field, and Marvin was among them.”
Marvin Lee Minsky was born on Aug. 9, 1927, in New York City, the precocious son of Dr. Henry Minsky, an eye surgeon who was chief of ophthalmology at Mount Sinai Hospital, and Fannie Reiser, a social activist and Zionist.
Fascinated by electronics and science, the young Mr. Minsky attended the Ethical Culture School in Manhattan, a progressive private school from which J. Robert Oppenheimer, who oversaw the creation of the first atomic bomb, had graduated. (Mr. Minsky later attended the affiliated Fieldston School in Riverdale.) He went on to attend the Bronx High School of Science and later Phillips Academy in Andover, Mass.
After a stint in the Navy during World War II, he studied mathematics at Harvard and received a Ph.D. in math from Princeton, where he met John McCarthy, a fellow graduate student.
Intellectually restless throughout his life, Professor Minsky sought to move on from mathematics once he had earned his doctorate. After ruling out genetics as interesting but not profound, and physics as mildly enticing, he chose to focus on intelligence itself.
“The problem of intelligence seemed hopelessly profound,” he told The New Yorker magazine when it profiled him in 1981. “I can’t remember considering anything else worth doing.”
To further those studies he reunited with Professor McCarthy, who had been awarded a fellowship to M.I.T. in 1956. Professor Minsky, who was then at Harvard, arrived at M.I.T. in 1958, joining the staff at its Lincoln Laboratory. A year later, he and Professor McCarthy founded M.I.T.’s AI Project, later to be known as the AI Lab. (Professor McCarthy left for Stanford in 1962.)
Professor Minsky’s courses at M.I.T. — he insisted on holding them in the evenings — became a magnet for several generations of graduate students, many of whom went on to become computer science superstars themselves.
Among them were Ray Kurzweil, the inventor and futurist; Gerald Sussman, a prominent A.I. researcher and professor of electrical engineering at M.I.T.; and Patrick Winston, who went on to run the AI Lab after Professor Minsky stepped aside.
Another of his students, Danny Hillis, an inventor and entrepreneur, co-founded Thinking Machines, a supercomputer maker, in the 1980s.
Mr. Hillis said he had been so taken by Professor Minsky’s intellect and charisma that he found a way to insinuate himself into the AI Lab and get a job there. He ended up living in the Minsky family basement in Brookline, Mass.
“Marvin taught me how to think,” Mr. Hillis said in an interview. “He had a style and a playful curiosity that was a huge influence on me. He always challenged you to question the status quo. He loved it when you argued with him.”
Professor Minsky’s prominence extended well beyond M.I.T. While preparing to make the 1968 science-fiction epic “2001: A Space Odyssey,” the director Stanley Kubrick visited him seeking to learn about the state of computer graphics and whether Professor Minsky believed it would be plausible for computers to be able to speak articulately by 2001.
Professor Minsky is survived by his wife, Gloria Rudisch, a physician; two daughters, Margaret and Juliana Minsky; a son, Henry; a sister, Ruth Amster; and four grandchildren.
“In some ways, he treated his children like his students,” Mr. Hillis recalled. “They called him Marvin, and he challenged them and engaged them just as he did with his students.”
In 1989, Professor Minsky joined M.I.T.’s fledgling Media Lab. “He was an icon who attracted the best people,” said Nicholas Negroponte, the Media Lab’s founder and former director.
For Dr. Kay, Professor Minsky’s legacy was his insatiable curiosity. “He used to say, ‘You don’t really understand something if you only understand it one way,’” Dr. Kay said. “He never thought he had anything completely done.”