## Flexible Computation

Flexible computation refers to procedures that allow a graceful tradeoff to be made between the quality of results and allocations of costly resources, such as time, memory, or information. Systems employing flexible computation gain the ability to adapt the quality of their response to dynamic changes in requirements for precision, and to uncertainty or variation in the cost of computational commodities.

Some of the work on flexible computation has explored foundations of flexibility and value in basic algorithms such as sorting procedures. Here is a figure displaying data generated by the Protos/Algo system as it probed the value of partial results generated by alternative sorting algorithms. A partial sort is depicted as a set of points in a two-dimensional representation, where one axis represents the keys of records and the other the locations of the records. A diagonal line represents a completely sorted file. In this figure, the incremental refinement of partial results is demonstrated for Selection sort (top) and Shellsort (below).

We can probe the value of partial results of computation by developing a utility model that captures alternate dimensions of value in the partial result. In the case of flexible sorting algorithms, we consider dimensions of value that users find in partially sorted files. Utility models are typically functions of the user and the context of application of the computation. This figure highlights several dimensions of value in a partial sort that can be combined with a multiattribute utility model to generate an overall value of a partial result. Here is a graph of the utility of partial results with computation for Shellsort and Selection sort, generated by applying a specific utility model to the results of these sorting procedures. The utility model considers the contribution of several attributes of value to the overall value of the result.

Here is another depiction of the value of partial results with flexible computation, now exploring the traveling salesperson problem (TSP), an NP-hard task. We show the performance over time of a two-opt approximation. In the context of a loss function representing the cost of waiting for an increasingly better tour, we compute a net value, the curve appearing in the middle of the graph. This work was done with Adrian Klein.
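The idea of scoring partial sorts can be sketched in a few lines. The snippet below snapshots the array after each pass of Selection sort and Shellsort and scores each snapshot with a simple illustrative utility (the fraction of adjacent pairs already in order); this metric and the pass-level granularity are assumptions for illustration, not the multiattribute model used in Protos/Algo.

```python
import random

def selection_sort_passes(a):
    """Yield a snapshot of the array after each outer pass of selection sort."""
    a = list(a)
    for i in range(len(a) - 1):
        m = min(range(i, len(a)), key=a.__getitem__)  # index of smallest remaining key
        a[i], a[m] = a[m], a[i]
        yield list(a)

def shellsort_passes(a):
    """Yield a snapshot of the array after each gap pass of Shellsort."""
    a = list(a)
    gap = len(a) // 2
    while gap > 0:
        for i in range(gap, len(a)):
            tmp, j = a[i], i
            while j >= gap and a[j - gap] > tmp:
                a[j] = a[j - gap]
                j -= gap
            a[j] = tmp
        yield list(a)
        gap //= 2

def order_utility(a):
    """Illustrative utility: fraction of adjacent pairs already in order."""
    pairs = len(a) - 1
    return sum(a[i] <= a[i + 1] for i in range(pairs)) / pairs

random.seed(0)
data = random.sample(range(100), 100)  # a random permutation to sort
sel = [order_utility(s) for s in selection_sort_passes(data)]
shell = [order_utility(s) for s in shellsort_passes(data)]
```

Plotting `sel` and `shell` against pass number gives curves analogous to the figure: under a utility model that rewards global near-sortedness, Shellsort's early gap passes spread value across the whole file, while Selection sort perfects only a growing prefix.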
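The net-value curve for the TSP example can likewise be sketched. The code below runs an anytime two-opt improver and records, after each improving swap, a quality measure (fractional improvement over the initial tour) and a net value obtained by subtracting a loss that grows linearly with the number of improvement steps; the linear loss and its rate are illustrative assumptions standing in for the loss function in the figure.

```python
import math
import random

def tour_length(pts, tour):
    """Total length of a closed tour over the given points."""
    return sum(math.dist(pts[tour[i]], pts[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def two_opt_trace(pts, time_cost=0.02):
    """Anytime two-opt: return the final tour and a trace of
    (quality, net value) after each improving swap."""
    tour = list(range(len(pts)))
    best = tour_length(pts, tour)
    start = best
    trace = [(0.0, 0.0)]
    improved, step = True, 0
    while improved:
        improved = False
        for i in range(1, len(tour) - 1):
            for j in range(i + 1, len(tour)):
                # Reverse the segment tour[i..j] and keep it if shorter.
                cand = tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]
                d = tour_length(pts, cand)
                if d < best - 1e-9:
                    tour, best, improved = cand, d, True
                    step += 1
                    quality = (start - best) / start
                    trace.append((quality, quality - time_cost * step))
        if step > 200:  # cap the sketch's run length
            break
    return tour, trace

random.seed(1)
pts = [(random.random(), random.random()) for _ in range(30)]
tour, trace = two_opt_trace(pts)
```

The quality curve rises monotonically while the net-value curve eventually bends downward as the waiting cost overtakes the shrinking improvements, which is the tradeoff the middle curve in the graph captures.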
Moving to notions of rationality under limited resources, we can consider flexibility in inference for decision making. The Protos system was built primarily to explore the control of decision-theoretic inference in time-critical contexts. The system continually computes an approximation of the expected value of computation (EVC) and decides whether to continue to compute or to act in the world. Here is some output from Protos after the system tackled a time-pressured medical decision. The upper left-hand corner displays the tightening of bounds over a probability needed to solve a decision problem. The larger graph displays several pieces of information about the state of the problem at the moment the system decided to act in the world rather than continue to refine its result.

In related research, we developed and analyzed flexible strategies for performing theorem proving under limited or varying resources. In this work, we perform a Bayesian analysis of the truth of a propositional claim, based on information about the size of the theorem-proving search space and progress toward a goal during theorem proving. Beyond computing the current probability of truth and falsity of a propositional claim, we can compute the expected value of continuing to perform theorem-proving computation. Here is a graph of the dynamic probability of finding an open path with the matrix method of theorem proving. We use a generative model of this probability to make decisions about the value of continuing to compute in a time-critical setting. See the paper (in the references below) on flexible computation in theorem proving for details (postscript is available online).
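The compute-versus-act control loop described above can be sketched schematically. In the toy version below, successively tighter bounds on a decision-critical probability arrive from some refinement process; the loop acts as soon as the bounds settle the decision, or stops early when a deliberately crude myopic EVC estimate (here, simply the probability mass still in doubt) drops below the cost of one more refinement step. The decision names, the halving-style bounds, and the EVC surrogate are all illustrative assumptions, not the actual Protos machinery.

```python
def refine_and_act(threshold, step_cost, refine):
    """Anytime control loop in the spirit of Protos.

    threshold: probability above which acting one way (say, "treat") is best.
    step_cost: cost of one more refinement of the bounds.
    refine:    iterable of successively tighter (lo, hi) probability bounds.
    Returns (decision, number of refinement steps consumed).
    """
    lo, hi = 0.0, 1.0
    steps = 0
    for lo, hi in refine:
        steps += 1
        if lo >= threshold:
            return "treat", steps      # bounds already determine the decision
        if hi <= threshold:
            return "withhold", steps
        evc = hi - lo                  # crude surrogate: value still in doubt
        if evc < step_cost:
            break                      # further computation not worth its cost
    mid = (lo + hi) / 2                # act on the best point estimate
    return ("treat" if mid >= threshold else "withhold"), steps

# Hypothetical bound sequence for a time-pressured decision:
decision, steps = refine_and_act(
    threshold=0.4, step_cost=0.7,
    refine=[(0.1, 0.9), (0.2, 0.8), (0.45, 0.7)])
```

With a high step cost the loop halts after two refinements and commits on the point estimate; with a low step cost it keeps tightening until the lower bound crosses the threshold.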
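The flavor of the Bayesian analysis of truth during theorem proving can be conveyed with a one-parameter toy model. Assume, purely for illustration, that if the claim is provable a proof is equally likely to sit anywhere in the search space; then failing to find one after exploring a fraction of the space shifts belief toward unprovability. This uniform-placement assumption is a stand-in for the generative model in the paper.

```python
def posterior_truth(prior, frac_explored):
    """Posterior probability that a claim is provable, given no proof has
    been found after exploring `frac_explored` of the search space.

    Illustrative likelihoods: if provable, a proof is uniformly placed, so
    P(no proof yet | provable) = 1 - frac_explored; if unprovable, finding
    no proof is certain.
    """
    like_true = 1.0 - frac_explored
    like_false = 1.0
    num = like_true * prior
    return num / (num + like_false * (1.0 - prior))
```

Evaluating this as search proceeds yields a falling curve of the probability of truth, and comparing the expected gain from the next increment of search against its time cost gives a stopping criterion analogous to the EVC test above.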