It would be no exaggeration to say that computation changed the world in the twentieth century. Implicated in nearly all contemporary technology, today's computers empower so-called "intelligent systems" to negotiate the world of human reasoners, even driving cars.
This work and its legacy are the focus of the volume under review. A reader of this volume will acquire a broad acquaintance with the history of the theory of computation in the twentieth century, and with ways in which this theory will continue to develop in the twenty-first century.
The content of these results was revolutionary for the foundations of mathematics, but their proofs are more directly relevant to the theory of computation, since they called for a precise characterization of effective procedure. Gödel himself was unsure that recursiveness provided such a means.
Turing identified a class of conditions that he took to capture idealized human computation; these conditions could be taken to define a class of machines now called "Turing machines". The class of functions computable by a Turing machine was soon shown to be extensionally equivalent to the class of recursive functions. Other models of computation were offered around the same time, such as Alonzo Church's lambda calculus, and these classes of functions were also shown to be extensionally equivalent to the class of Turing computable functions.
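Turing's model is simple enough to simulate in a few lines. The following Python sketch is purely illustrative; the simulator, the state names, and the bit-flipping transition table are inventions for this example, not anything drawn from the volume:

```python
# A minimal one-tape Turing machine simulator (illustrative sketch only).
# The sample machine below flips every bit on its tape and halts at the
# first blank cell.

def run_turing_machine(transitions, tape, state="start", blank="_"):
    """Run a one-tape Turing machine until it reaches the 'halt' state."""
    tape = dict(enumerate(tape))  # sparse tape: position -> symbol
    head = 0
    while state != "halt":
        symbol = tape.get(head, blank)
        state, write, move = transitions[(state, symbol)]
        tape[head] = write
        head += 1 if move == "R" else -1
    return "".join(tape[i] for i in sorted(tape)).strip(blank)

# Transition table: (state, read symbol) -> (new state, write symbol, move)
flip_bits = {
    ("start", "0"): ("start", "1", "R"),
    ("start", "1"): ("start", "0", "R"),
    ("start", "_"): ("halt", "_", "R"),
}

print(run_turing_machine(flip_bits, "1011"))  # -> 0100
```

Anything computable at all, on the Church-Turing analysis, can be carried out by some such table of states and transitions, however much larger.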
That these classes of functions capture the informal notion of computability has been asserted in what is known as the Church-Turing thesis, which states that a function is computable in an informal sense if and only if it is Turing computable or computable by one of the many other extensionally equivalent models of computation.
Many have wondered whether this thesis is mathematically provable, or whether it is too imprecise to admit of mathematical proof. This question is the subject of two of this volume's articles, to which we now turn. Saul Kripke's article contends that the Church-Turing thesis is provable, arguing in a way he says resembles arguments given by Turing and Church.
In particular, Kripke wants to prove that every intuitively computable function is recursive. He claims that computations are specific forms of mathematical deductions, since they are sets of instructions whose output is supposed to follow deductively from those instructions. Suppose, Kripke says, that the steps of a given deduction are fully expressible in first-order logic (he calls this supposition "Hilbert's thesis"). It follows, Kripke contends, that the inferences of the formalized deduction are recursive, and thus that the computation so expressed is recursive, as he wanted to show.
The cogency of this argument comes down, of course, to whether one accepts Hilbert's thesis. One might hold that an informal deduction cannot be fully expressed in a first-order way, since informal notions are too rich for their meanings to be settled by any single finite expression. Such a view seems to have been part of Brouwer's belief that mathematical thought is essentially unformalizable.
One might instead maintain the need for higher-order logic to formalize adequately the concepts and inferences of branches of mathematics that implicate the infinite, such as real analysis. Kripke holds that even if his thesis is only understood as a reduction of Church's thesis to Hilbert's thesis, he has amplified the Church-Turing thesis in a substantive way. Stewart Shapiro's article makes the case, in contrast to Kripke's, that the Church-Turing thesis cannot be proved. Extending the case made in his article on the same subject, Shapiro articulates a view of Friedrich Waismann that our pre-theoretical mathematical concepts are generally not determined in all possible ways, so that there are typically many ways to sharpen concepts further, rather than just one "correct" way as the Platonist would have it.
He illustrates the "open texture" of mathematical concepts by drawing on Lakatos' famous dialogue on the concept of polyhedra, in which a series of proposed definitions of polyhedra are confronted with confounding cases that suggest a series of revisions to these proposals. The question is whether these sharpenings were somehow "tacit" in our original pre-theoretic concept, or whether these revisions are instead replacing the original concept with something new.
The advocate of open texture holds that the original concept was not precise enough to fix any particular revision as being right. Shapiro continues by arguing that the notion of informal computability at issue in the Church-Turing thesis is subject to open texture in this way.
He recounts the sharpenings of computability during the twentieth century, noting that there were other options along the way, other idealizations of computation, that could have been chosen, such as computability in polynomial time or space.
That the notion chosen seemed "right" hinges upon choices that the advocate of open texture stresses are not determined by the pre-theoretic concept. We could come across procedures in the future that we want to count as computations but that we do not so count right now: Dorit Aharonov and Umesh Vazirani discuss one such possibility, quantum computation, in their chapter. To the extent that today's concept of computability is settled, as the widespread acceptance of the Church-Turing thesis suggests, Shapiro urges us to see that settling as a sharpening, the result of human decisions, rather than the discovery of the "true" contours of the concept of computability.
A brief word is in order concerning the opposing positions of Kripke's and Shapiro's articles. Kripke's alleged proof of the Church-Turing thesis hinges on what he calls "Hilbert's thesis", that every computation can be fully expressed in a first-order way. If pre-theoretic computation is subject to open texture, then no particular expression of it fully captures its content, and hence no first-order expression does so.
Thus the open texture of computability would undermine the cogency of Kripke's proof by contradicting Hilbert's thesis. Indeed, as Kripke recognizes, Hilbert's thesis will be a locus of disagreement with his proof.
A point more relevant to Shapiro's argument is what to make of all the proved equivalences between different models of computation: anything computable by a Turing machine is computable by a partial recursive function, is computable in the lambda calculus, and so on. One might hold that these equivalences are evidence that we have found the "real" concept of computability, because they indicate the "inevitability" of their analyses. No matter how we try to sharpen our concept of computability, we keep finding the same class of computations, extensionally speaking.
Thus the original concept was precise enough to fix this class of computations, and this is evidence that these equivalent sharpenings are "right".
The normativity at play here between concept and analysis demands clarification, but such clarification is needed to spell out open texture as well, since open texture is at root a negative view that no single analysis of a mathematical concept can be, or alternately can be shown to be, adequate for capturing the full content of that concept.
Questions of computability have often been linked to questions about the nature of the human mind, since one may wonder whether the mind is a computational machine. In a previously published paper included in this volume, Putnam argues that if the mind (more precisely, the linguistic and scientific faculties of the mind, in Chomsky's terms) were representable by a Turing machine, then we could never know by mathematical or empirical means which Turing machine it is.
Turing, by contrast, identified mechanized reasoning with "discipline", which human reasoners exceed in exercising what he called "initiative", deployed when discipline fails in tackling problems. As Sieg notes, Turing suggested that new theories resulting from such "initiative" could themselves be mechanized once settled.
The resulting conception of the dynamics of mathematical theory change is also the focus of Copeland and Shagrir's article. Copeland and Shagrir emphasize what they call Turing's "multi-machine theory of mind", in which human minds are Turing machines at each stage of development, but which machine they are changes during a lifetime rather than remaining fixed from birth to death. These changes occur when minds break the "discipline" of one mechanization and learn new processes, resulting in a new machine.
While computational models of mind are not in vogue at present, Turing's view seems worthy of interest to contemporary investigators in cognitive science and the philosophy of mind. Solomon Feferman's article is an engrossing discussion of computation over the real numbers, a key component of contemporary science and engineering in their use of numerical methods for simulations, modeling, optimization, and statistical analysis. Feferman concentrates on two proposed models of such computation: the BSS model of Blum, Shub, and Smale, and the bit-computation model of Braverman and Cook. Both are claimed by their creators to be well-suited as foundations for scientific computation and its numerical methods.
Feferman notes that these two models of computation are incompatible, in that each classifies functions over the reals as computable that the other does not. His focal question is whether they can both be reasonable models of computation on the reals.
To answer this, Feferman observes that one needs the right perspective, given by thinking of computation over an arbitrary structure, since the two competing models identify the structure of the reals differently: algebraically and topologically, respectively.
He then explicates three theories of computation over an arbitrary structure: the first due to Harvey Friedman; the second to John Tucker and Jeffery Zucker; and the third to himself, adapting work of Richard Platek and Yiannis Moschovakis. He affirms the adequacy of each of these three theories as suitable for implementing the particular models of both BSS and of Braverman and Cook, thus answering positively his focal question.
Thus one who desires to implement scientific computations may adopt either of these two models of computation, and the choice between them must be made on other grounds.
Feferman notes that, in practice, users of scientific computations simply use approximations to the reals and real-valued functions, using what computer scientists call "floating-point arithmetic" (think, for instance, of Runge-Kutta methods for solving ordinary differential equations). The BSS and Braverman and Cook models, by contrast, do not take the reals as approximations, but rather as continuous structures. They are thus arguably more precise from a foundational point of view than the methods used by most practicing numerical analysts today.
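To make the practice concrete, here is a classic fourth-order Runge-Kutta integration carried out in ordinary floating-point arithmetic. This is a textbook sketch on an assumed model problem (dy/dt = -y, whose exact solution is exp(-t)); none of it comes from Feferman's article:

```python
import math

# One step of the classical fourth-order Runge-Kutta (RK4) method for
# the initial value problem y' = f(t, y), using ordinary Python floats
# (i.e., floating-point approximations to the reals).

def rk4(f, y, t, h):
    """Advance y by one RK4 step of size h."""
    k1 = f(t, y)
    k2 = f(t + h / 2, y + h * k1 / 2)
    k3 = f(t + h / 2, y + h * k2 / 2)
    k4 = f(t + h, y + h * k3)
    return y + (h / 6) * (k1 + 2 * k2 + 2 * k3 + k4)

f = lambda t, y: -y          # model problem: y' = -y, y(0) = 1
y, t, h = 1.0, 0.0, 0.01
while t < 1.0 - 1e-12:       # integrate from t = 0 to t = 1
    y = rk4(f, y, t, h)
    t += h

# The floating-point result tracks the exact value exp(-1) very closely.
print(abs(y - math.exp(-1.0)))
```

The approximation error here is tiny, which is precisely why working numerical analysts are content with floating-point reals rather than the foundationally exact continuous structures of the BSS or Braverman-Cook models.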
The developers of the BSS and Braverman and Cook models are concerned with the pragmatics of computation, but on grounds of computational complexity rather than the pragmatics of implementation in daily work.
Feferman notes that there is another approach to computing over the reals, introduced by Errett Bishop's constructive analysis. Bishop works with approximations to the reals, and is thus closer to normal practice in scientific computation than the other models of computation considered here. Feferman recounts recent work by logicians in computability theory interpreting Bishop's mathematics, and notes that further analysis of Bishop's constructive mathematics in this direction would be worthwhile, since there are indications that such work could allow information about the computational complexity of Bishop's model to be mined.
This would bring together the practical concerns of the scientific computation community and the community of complexity theorists in theoretical computer science. Carl Posy's article also addresses computability over the reals and its applicability to empirical science, but has a rather different focus than Feferman's. Posy is motivated by an argument by Putnam that mathematical physics requires non-constructive methods. Posy's reconstruction of Putnam's argument identifies an assumption that the constructive and the computable coincide, and this article is an interrogation of this assumption.
The focal question is whether the constructive and the computable coincide. To investigate this question, Posy compares and contrasts Hilbert's and Brouwer's analyses of the constructive, as figureheads of the two central schools of contemporary constructivity (Bishop's analysis is also addressed, but set aside as insufficiently precise for the investigation here). He argues that for Hilbert and the Russian school of constructive analysis descending from Markov, recursive functions adequately represent finitely intuitive thinking, so that for Hilbert, the constructive and the computable coincide.
He then argues that for Brouwer and the intuitionist school of constructivity that follows him, the constructive is not necessarily recursive, since Brouwer sees indeterminacy as intrinsic to the activity of the free-creating subject at the root of his analysis. Thus Hilbert and Brouwer give opposing answers to the article's focal question. Posy then argues cogently that this clash is rooted in Hilbert's and Brouwer's differing adaptations of Kantian epistemology to the mathematics of the early twentieth century.
Robert Soare's article also illustrates how issues in the theory of computation are relevant to computation in practice.
He draws attention to oracle machines, a type of Turing machine that can receive and use facts about membership in a particular set, called its oracle. Machines with non-computable oracles thus exceed the computational power of ordinary Turing machines, and we thus talk about computability relative to an oracle.
For instance, an oracle machine whose oracle is the halting set can determine, for any ordinary Turing machine, whether it halts (though each oracle machine has its own halting problem and halting set, generating a hierarchy of halting sets).
Soare notes that oracle machines can be thought of as online computers, able to draw on information external to the machine itself, like the World Wide Web; whereas ordinary Turing machines are always offline, so to speak.
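The shape of relativized computation can be caricatured in a few lines of Python. The sketch below is a toy of my own devising: a "machine" receives its oracle as a membership-testing function and queries it as it runs. A genuinely non-computable oracle such as the halting set could not, of course, be implemented, so a decidable stand-in is used purely to show the structure:

```python
# Toy illustration of an oracle machine (invented for this review, not
# from the volume): the machine is parameterized by an oracle, i.e. a
# function answering membership questions about some set. Replacing the
# oracle changes what the machine can "compute relative to" it.

def machine_with_oracle(n, oracle):
    """Count how many numbers below n the oracle's set contains."""
    return sum(1 for i in range(n) if oracle(i))

# A decidable stand-in oracle: the set of even numbers. A real halting
# set oracle would have this same interface but no possible implementation.
evens = lambda i: i % 2 == 0

print(machine_with_oracle(10, evens))  # -> 5
```

The "online" analogy Soare draws is visible in the interface: the machine's own code is fixed, but its behavior depends on answers fetched from an external information source.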
Soare thus contends that computability theory is relevant to computer science today. Mathematical logicians and philosophers during the twentieth century focused largely on computability in an idealization where practical constraints did not apply. A computation that takes as many seconds to finish as there are elementary particles in the universe is as computable as one that takes five seconds, from the point of view of the analyses of Turing, Church, and so on.
But feasibility matters very much for computations in our daily lives. An airplane autopilot that took a century to compute the plane's descent would be of little use. Naturally, computer science has been more concerned with questions of feasible computability than with computability tout court, as computers have come to fill so many key roles in our lives today.
Scott Aaronson's fascinating article argues that philosophers ought to follow computer scientists by reflecting on computational complexity. In brief, complexity theory studies the amount of time needed to solve a problem computationally relative to the size of that problem; for example, how long does it take to factor an integer into its prime components, as a function of the number of digits of that integer?
Complexity theorists have settled on two main complexity classes. Feasible problems are those whose solution time grows at a polynomial rate relative to the size N of the problem, in that the time has an upper bound computed by a polynomial function of N. By contrast, an infeasible problem's solution time grows at an exponential rate relative to N; that is, this time has a lower bound computed by an exponential function of N. As Aaronson observes, there is no presently known efficient algorithm for factoring integers into primes, and so the problem of prime factorization is currently judged infeasible, but that does not imply that there is no efficient way to factor integers.
Thus membership in these complexity classes is subject to revision; hence the considerable interest in the related question of whether P, the class of problems solvable by a polynomial-time algorithm, is extensionally equivalent to NP, the class of problems for which a solution can be checked (though not necessarily found) by a polynomial-time algorithm.
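The asymmetry behind NP can be made concrete with elementary trial-division factoring (a standard textbook algorithm, used here purely as an illustration; real factoring efforts use far more sophisticated methods). Finding the factors takes work that grows rapidly with the number's size, while checking a proposed factorization is a single run of multiplication:

```python
# Trial-division factorization: an elementary, deliberately naive
# algorithm. The search for factors takes roughly sqrt(n) divisions,
# which is exponential in the number of digits of n; verifying a
# proposed factorization is just one multiplication.

def trial_factor(n):
    """Return the prime factorization of n (n >= 2) by trial division."""
    factors, d = [], 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)  # whatever remains is prime
    return factors

n = 2 ** 5 * 3 * 101
fs = trial_factor(n)
print(fs)  # -> [2, 2, 2, 2, 2, 3, 101]

# Verification is cheap: multiply the factors back together.
product = 1
for f in fs:
    product *= f
assert product == n
```

This easy-to-check, hard-to-find pattern is exactly what membership in NP (but not known membership in P) encodes.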
Aaronson shows the relevance of computational complexity to many areas of philosophy. His discussions are overflowing with ideas; hundreds of graduate students could mine the article for distinct, highly worthwhile philosophical thesis topics. I'll present just one example to give a flavor of Aaronson's insights. In addition to metaphysical issues like the intentionality of consciousness, the possibility of artificial intelligence hinges on practical questions: the Turing test singles out human dialogue as an empirically verifiable activity that an intelligent machine would need to be able to carry out.
Aaronson notes that humans require relatively little empirical information to judge other humans to be conscious, and that for any given dialogue this information could be codified in a "lookup table" listing all possible histories of the dialogue and the participant actions following each such history.
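To see why such a lookup table is a practical absurdity, one can do the back-of-the-envelope count; the arithmetic below is my own illustration, not Aaronson's figures:

```python
# Size of a dialogue lookup table: with an alphabet of k symbols and
# histories up to length n, the table needs one entry per possible
# history, i.e. sum of k**i for i = 0..n. (Illustrative arithmetic only.)

def table_entries(k, n):
    """Number of histories of length <= n over a k-symbol alphabet."""
    return sum(k ** i for i in range(n + 1))

# A tiny sanity check: over 2 symbols, histories of length <= 3
# are "", 0, 1, 00, 01, 10, 11, plus the eight of length 3 -> 15 total.
print(table_entries(2, 3))  # -> 15

# Even a toy setting explodes: 26 letters, histories of only 40 characters
# already demand a table with dozens of digits' worth of entries.
print(len(str(table_entries(26, 40))))
```

The table exists as a finite mathematical object, but its size swamps any physically realizable machine, which is the kind of complexity-theoretic point Aaronson presses against lookup-table objections to machine intelligence.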
Edited by B. Jack Copeland, Carl J. Posy, and Oron Shagrir. Computer scientists, mathematicians, and philosophers discuss the conceptual foundations of the notion of computability as well as recent theoretical developments.