DENDRAL (for Dendritic Algorithm) was a computer program devised by geneticist Joshua Lederberg, Stanford computer science department chairman Edward A. Feigenbaum, and chemistry professor Carl Djerassi to elucidate the molecular structure of unknown organic compounds drawn from known classes of compounds, such as the alkaloids and the steroids. 1 Before the toxicological and pharmacological properties of a compound can be assessed, its molecular structure – the configuration of its atoms – must be determined. Using as input the fragmentation pattern of ions produced by bombarding molecules with electrons in a mass spectrometer, DENDRAL made successive inferences about the type and arrangement of atoms in order to identify the compound from among hundreds or thousands of candidates.
By observing structural constraints within molecules that made certain combinations of atoms implausible, generating and testing hypotheses about the identity of the compound, and ruling out candidates that did not fit within the structural constraints, DENDRAL traced branches of a tree chart containing all possible configurations of atoms until it reached the configuration that matched the instrument data most closely. Hence its name, from "dendron," the Greek word for tree. Joshua Lederberg himself worked out the basic notational algorithm, grounded in graph theory, for representing three-dimensional molecular structures in a form computers could manipulate. 2
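The generate-prune-test cycle described above can be sketched in miniature. Everything here is illustrative: the toy "atoms," the adjacency constraint, and the peak-matching score are invented stand-ins, not DENDRAL's actual chemical representation or structure generator.

```python
# Hypothetical miniature of DENDRAL-style "generate and test". The toy
# atoms, the adjacency constraint, and the peak-matching score are all
# invented stand-ins for DENDRAL's real chemical representation.
from itertools import permutations

ATOMS = ["C", "O", "O", "H"]  # a toy bag of atoms to arrange

def violates_constraints(arrangement):
    # Toy structural constraint: two oxygens may not sit next to
    # each other in the candidate chain.
    return any(a == "O" and b == "O"
               for a, b in zip(arrangement, arrangement[1:]))

def plausible_candidates():
    # Generate every distinct linear arrangement, then prune the
    # ones that violate the structural constraint.
    return [c for c in set(permutations(ATOMS))
            if not violates_constraints(c)]

def best_match(candidates, observed_peaks, predict):
    # Test: keep the candidate whose predicted fragment set best
    # overlaps the peaks observed in the (toy) mass spectrum.
    return max(candidates,
               key=lambda c: len(predict(c) & observed_peaks))
```

The pruning step is what keeps the tree of candidates tractable: implausible configurations are discarded before they are ever scored against the spectrum.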
In its practical utilization, DENDRAL was designed to relieve chemists of a task that was demanding, repetitive, and time-consuming: surveying a large number of molecular structures to find those that corresponded to instrument data. Once fully operational, the program performed this task with greater speed than an expert spectrometrist, and with comparable accuracy.
The greatest significance of DENDRAL, however, lay in its theoretical and scientific contribution to the development of knowledge-based computer systems. It was the ambition of DENDRAL's creators to transfer the principles of artificial intelligence from the realm of chess and other strictly controlled settings in which they had been formulated during the 1950s, to real-world problems facing biomedical researchers and physicians. They wanted to show that computers could become experts within a concrete knowledge domain, such as mass spectrometry, where they could solve problems, explain their own conclusions, and interact with human users.
Joshua Lederberg and his colleagues believed that artificial intelligence – the use of computers for manipulating symbols, for instance the combination of words in an "if-then" inference, rather than for purely numerical calculation – could assimilate the rules of inductive reasoning and empirical judgment that guide scientists and physicians in their work, rules for which mathematical representations did not exist. Bruce Buchanan and others in Stanford's computer science department distilled these rules, which they called "heuristics," from extended interviews with Joshua Lederberg and other experts in their respective fields, and translated them into the formal code of symbolic computation.
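The "if-then" inference described above can be sketched as a tiny forward-chaining production system: each heuristic is a symbolic rule over a set of facts. The rule contents, fact names, and chaining loop below are invented for illustration and are not DENDRAL's or MYCIN's actual rule language.

```python
# Hypothetical sketch of heuristics encoded as symbolic "if-then"
# production rules with forward chaining. The rule contents and fact
# names are invented, not DENDRAL's or MYCIN's actual rule base.
RULES = [
    # (condition over the current fact set, conclusion to assert)
    (lambda f: "peak_at_44" in f, "fragment_CO2_likely"),
    (lambda f: "fragment_CO2_likely" in f and "peak_at_28" in f,
     "carboxyl_group_suspected"),
]

def forward_chain(facts):
    """Fire rules repeatedly until no new conclusion is produced."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for condition, conclusion in RULES:
            if condition(facts) and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts
```

Because conclusions are themselves symbolic facts, one rule's output can trigger another rule, which is what lets a system of this kind chain expert judgments together and explain the path from evidence to conclusion.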
DENDRAL ran on a computer system called ACME (Advanced Computer for Medical Research), installed at Stanford Medical School in 1965 for use by resident researchers through time-sharing, with Joshua Lederberg as principal investigator. Initially, the system performed real-time, standard numerical analysis of clinical and biomedical research data. DENDRAL was the first artificial intelligence application hosted by ACME. It was succeeded in 1973 by SUMEX-AIM (Stanford University Medical Experimental Computer – Artificial Intelligence in Medicine), a national computer resource for artificial intelligence applications in biomedicine. Users at universities and hospitals across the country were connected to SUMEX via the ARPANET, a predecessor of the Internet developed by the Pentagon in the 1960s.
By 1980, SUMEX hosted nineteen projects, including DENDRAL and its spin-offs, CONGEN and Meta-DENDRAL, programs that generated not just hypotheses for the interpretation of instrument data but the inductive rules by which hypotheses were constructed. Other SUMEX projects included MYCIN, a program to diagnose infectious diseases and manage medication schedules for treating them, and MOLGEN, a program under Joshua Lederberg's own supervision that aided in the planning of laboratory experiments in genetics. Among remote users of the system, researchers at the University of Pittsburgh created INTERNIST, a program that diagnosed multiple internal diseases in the same patient to assist physicians in rural health clinics and other isolated locations without access to advanced diagnostic equipment. Psychiatrists at UCLA's Neuropsychiatric Institute simulated the thought processes of paranoid patients with a program called PARRY in order to test explanations for the causes of paranoia and to train psychiatrists in its diagnosis.
1
The 1998 University of Wisconsin "Oral History Interview" (page 42, paragraph 211)
clearly conveys the level of understanding that Joshua Lederberg had of
computers and the theory of computation (automata theory). His words tell the
story better than anyone else could, and with his own authority.
"The discussion turns to computers. JL notes that his first introduction to computers
was in 1941, when there was a card-sequence-controlled calculator installed in the
American Institute Science Laboratory at 310 5th Avenue, in the shadow of the Empire
State Building. This laboratory was the forerunner of what later became the Westinghouse
Science Talent Search, but in 1941 it was the program that provided facilities to high
school students who wanted to do bona fide research at a time when high school labs were
less equipped for that than they are at present. By examination, JL won what might
be called a scholarship permitting him to work at this laboratory. While his own project
was in cyto-chemistry — the chemical identification of cellular constituents by specific
staining reactions under the microscope — there were some other students who were
starting to experiment with these various machines. These were not very elaborate computers.
They were relay-driven and involved punch cards. Basically the only memory they had was
the intermediate cards, so if you wanted to calculate a square root, for example, you could
put in a number, program it to do that, and probably burn up several cards to get the
results. But it was the first intellectual robot [sic]
JL had ever seen. He was quite intrigued by its analogy to living organisms, and from
that moment on he followed the development of computers, though mostly
from afar and through the press." [sic]
Joshua Lederberg's account suggests that he never formally studied the theory
of computation (automata theory): the theory that forms the basis of computer
science. Although it has long been the dream of computer scientists to create a theory
of AI (artificial intelligence) and to construct machines with the
capabilities of living, thinking organisms, little of this
dream had been realized at the time of writing (2011). Artificial intelligence (AI) has
focused upon such aspects as automatic theorem provers (mostly using conjunctive
normal form or disjunctive normal form) and tree searching (depth-first, breadth-first,
probabilistic), areas of study that are fairly limited in scope.
Indeed, AI research activity has waxed and waned over the decades, with periods
of reduced funding and interest sometimes called "AI winters."
Famous researchers often have work done by other people (undergraduates and
graduate students), and the famous researchers' names are then placed
upon the resulting research papers. These researchers are given the title of PI
(Principal Investigator) even when they lack the qualifications
needed for the research. This applies to computer science as well as to
chemistry. Joshua Lederberg was famous for his work in microbial genetics, but not in
molecular genetics. No known research publications in recognized scholarly
journals exist for Joshua Lederberg in the field of organic chemistry (or
any branch of chemistry).
2
Graph theory has been used for several decades to study the atoms and
molecules encountered in both inorganic and organic chemistry.
Unfortunately, this approach to chemistry has limits: graph
theory cannot fully model or replace the other theoretical frameworks
used to describe atoms and molecules (for example, statistical
mechanics, thermodynamics, and the thermodynamics of irreversible processes),
let alone account for the anomalies found in chemical reactions. If graph
theory could be applied so successfully, then all other theories (statistical
mechanics and thermodynamics) would become superfluous. Attempts to
use graph theory in bioinformatics (DNA, RNA, proteomics) do exist,
but such approaches have not yet become very popular.
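As a minimal illustration of the graph-theoretic view of a molecule discussed above, atoms can be modeled as labeled vertices and bonds as edges; an atom's valence then corresponds to its vertex degree. The adjacency-list encoding of ethanol below is a sketch of connectivity only, and, as noted above, it captures none of the physics described by statistical mechanics or thermodynamics.

```python
# Minimal graph-theoretic encoding of a molecule: atoms as labeled
# vertices, bonds as edges in an adjacency list. Ethanol (C2H5OH) is
# shown; vertex names like "C1" are arbitrary labels for this sketch.
ethanol = {
    "C1": ["C2", "H1", "H2", "H3"],
    "C2": ["C1", "O", "H4", "H5"],
    "O":  ["C2", "H6"],
    "H1": ["C1"], "H2": ["C1"], "H3": ["C1"],
    "H4": ["C2"], "H5": ["C2"], "H6": ["O"],
}

def degree(graph, vertex):
    # An atom's bond count is just the degree of its vertex.
    return len(graph[vertex])
```

Note that such a graph records which atoms are bonded, but nothing about bond energies, conformation, or reaction dynamics, which is precisely the limitation described above.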
See: M. Simon, "Emergent Computation: Emphasizing Bioinformatics,"
Springer, 2005, especially chapter 8, pages 308-311.