[pypy-svn] r29016 - pypy/dist/pypy/doc

auc at codespeak.net
Tue Jun 20 18:12:21 CEST 2006


Author: auc
Date: Tue Jun 20 18:12:20 2006
New Revision: 29016

Modified:
   pypy/dist/pypy/doc/howto-logicobjspace-0.9.txt
Log:
more fixes

Modified: pypy/dist/pypy/doc/howto-logicobjspace-0.9.txt
==============================================================================
--- pypy/dist/pypy/doc/howto-logicobjspace-0.9.txt	(original)
+++ pypy/dist/pypy/doc/howto-logicobjspace-0.9.txt	Tue Jun 20 18:12:20 2006
@@ -7,21 +7,21 @@
 
 This document gives some information about the content and usage of an
 extension of PyPy known as the Logic Objectspace (LO). The LO, when
-finished, will provide additional builtins that will allow to write :
+finished, will provide additional builtins that will allow one to write:
 
 * concurrent programs based on coroutines scheduled by dataflow logic
   variables,
 
-* concurrent logic programs semantically equivalent to "pure Prolog",
+* concurrent logic programs,
 
 * concurrent constraint problems,
 
 * new search "engines" to help solve logic/constraint programs.
 
-The 0.9 preview comes without logic programming ; the constraint
-solver is only lightly tested, is not equipped with some specialized
-but important propagators for linear relations on numeric variables,
-and *might* support concurrency -- but that would be an accident; the
+The 0.9 preview comes without logic programming; the constraint solver
+is only lightly tested, is not equipped with some specialized but
+important propagators for linear relations on numeric variables, and
+*might* support concurrency -- but that would be an accident; the
 dataflow scheduling of coroutines is known to fail in at least one
 basic and important case.
 
@@ -44,22 +44,22 @@
 ------------------------
 
 Logic variables are similar to Python variables in the following
-sense:: they map names to values in a defined scope. But unlike normal
-Python variables, they have two states:: free and bound. A bound logic
-variable is indistinguishable from a normal Python value which it
+sense: they map names to values in a defined scope. But unlike normal
+Python variables, they have two states: free and bound. A bound logic
+variable is indistinguishable from a normal Python value, which it
 wraps. A free variable can only be bound once (it is also said to be a
 single-assignment variable). It is good practice to denote these
 variables with an initial capital letter, so as to avoid confusion
 with normal variables.
 
 The following snippet creates a new logic variable and asserts its
-state :
+state::
 
   X = newvar()
   assert is_free(X)
   assert not is_bound(X)
 
-Logic variables can be bound thusly :
+Logic variables can be bound thusly::
 
   bind(X, 42)
   assert X / 2 == 21
@@ -73,18 +73,18 @@
 The bind operator is low-level. The more general operation that binds
 a logic variable is known as "unification". Unify is an operator that
 takes two arbitrary data structures and tries to assert their
-equivalentness, much in the sense of the == operator, but with one
+equality, much in the sense of the == operator, but with one
 important twist: unify mutates the state of the involved logic
 variables.
 
-Unifying structures devoid of logic variables, like :
+Unifying structures devoid of logic variables, like::
 
   unify([1, 2], [1, 2])
   unify(42, 43)
 
 is equivalent to an assertion about their equality, the difference
 being that a FailureException will be raised instead of an
-AssertionError, would the assertion be violated :
+AssertionError, should the assertion be violated::
 
   assert [1, 2] == [1, 2]   
   assert 42 == 43           
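
The unification behaviour described here can be modelled in a few lines of plain Python. This is an illustrative sketch only, not the LO implementation: the Var class and this unify function are stand-ins for the objectspace builtins named above.

```python
class FailureException(Exception):
    """Raised when unification fails, as described above."""

class Var:
    """Toy single-assignment variable: free until bound once."""
    _FREE = object()
    def __init__(self):
        self.value = Var._FREE
    def is_free(self):
        return self.value is Var._FREE

def unify(a, b):
    # Dereference bound variables down to the values they wrap.
    if isinstance(a, Var) and not a.is_free():
        return unify(a.value, b)
    if isinstance(b, Var) and not b.is_free():
        return unify(a, b.value)
    if isinstance(a, Var):
        a.value = b                      # bind the free variable
    elif isinstance(b, Var):
        b.value = a
    elif isinstance(a, list) and isinstance(b, list) and len(a) == len(b):
        for x, y in zip(a, b):           # unify structures element-wise
            unify(x, y)
    elif a != b:
        raise FailureException((a, b))   # the assertion is violated

X = Var()
unify([1, X], [1, 2])                    # binds X to 2
assert X.value == 2
```

Note how unifying a structure containing a free variable both checks the already-bound parts and binds the free ones, which is exactly the "important twist" over plain equality.
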
@@ -123,7 +123,7 @@
 The operators table
 -------------------
 
-Logic variables support the following operators (with their arity) :
+Logic variables support the following operators (with their arity):
 
 Predicates:: 
  is_free/1   # applies to anything
@@ -148,11 +148,11 @@
 language. With respect to behaviour under concurrency conditions,
 logic variables come with two operators:
 
-* wait:: this suspends the current thread until the variable is bound,
-  it returns the value otherwise (impl. note : in the logic
+* wait: this suspends the current thread until the variable is bound,
+  then returns its value (impl. note: in the logic
   objectspace, all operators make an implicit wait on their arguments)
 
-* wait_needed:: this suspends the current thread until the variable
+* wait_needed: this suspends the current thread until the variable
  has received a wait message. It has to be used explicitly,
   typically by a producer thread that wants to produce data only when
   needed.
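
Under the semantics just stated, wait and wait_needed can be modelled with standard threads. This is a sketch only; the LazyVar class and its methods are invented here to mirror the operators above, they are not the LO builtins.

```python
import threading

class LazyVar:
    """Toy dataflow variable supporting wait and wait_needed."""
    def __init__(self):
        self._needed = threading.Event()
        self._bound = threading.Event()
        self._value = None
    def wait(self):
        self._needed.set()          # signal producers that the value is needed
        self._bound.wait()          # suspend until the variable is bound
        return self._value
    def wait_needed(self):
        self._needed.wait()         # suspend until some consumer waits on us
    def bind(self, value):
        self._value = value
        self._bound.set()

V = LazyVar()

def producer():
    V.wait_needed()                 # produce only when demanded
    V.bind('data')

threading.Thread(target=producer).start()
assert V.wait() == 'data'
```
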
@@ -162,8 +162,9 @@
 
 Wait and wait_needed allow one to write efficient lazy-evaluation code.
 
-Using the "uthread" builtin, here is how to implement a
-producer-consummer scheme::
+Using the "uthread" builtin (which spawns a coroutine and applies its
+remaining arguments to its first argument), here is how to implement a
+producer/consumer scheme::
 
   def generate(n, limit):
       if n < limit:
@@ -187,8 +188,8 @@
 
 Note that this eagerly generates all elements before the first of them
 is consumed. Wait_needed helps us write a lazy version of the
-generator. But the consummer will be responsible of the proper
-termination, and thus must be adapted too::
+generator. But the consumer will be responsible for the termination,
+and thus must be adapted too::
 
   def lgenerate(n, L):
       """lazy version of generate"""
@@ -245,11 +246,14 @@
 This program currently blocks on the first call to sleep (we don't
 even get a chance to reach the first unify operation).
 
+Finally, it must be noted that the bind/unify and wait pair of
+operations is quite similar to the send and synchronous receive
+primitives used for inter-process communication.
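
The analogy can be made concrete with the standard library (a sketch; the names are ours, not LO operators): binding a variable corresponds to a send, waiting on it to a synchronous, blocking receive.

```python
import queue
import threading

# A single-assignment variable behaves much like a one-message channel:
# bind acts as a send, wait as a synchronous (blocking) receive.
channel = queue.Queue(maxsize=1)

def consumer(results):
    results.append(channel.get())   # blocks until a value is "bound"

results = []
t = threading.Thread(target=consumer, args=(results,))
t.start()
channel.put(42)                     # the "bind": unblocks the consumer
t.join()
assert results == [42]
```

One difference worth noting: a queue message is consumed by the receive, while a bound logic variable can be read any number of times.
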
 
 Constraint Programming
 ======================
 
-The LO comes stuffed with a flexible, extensible constraint solver
+The LO comes with a flexible, extensible constraint solver
 engine. While regular search strategies such as depth-first or
 breadth-first search are provided, you can write better, specialized
 strategies (an example would be best-search). We describe below how
@@ -261,15 +265,15 @@
 +++++++++++++++++++++++++++++++++++++++++++++
 
 A constraint satisfaction problem is defined by a triple (X, D, C)
-where X is a set of finite domains variables, D the set of domains
-assocated to the variables in X, and C the set of constraints, or
+where X is a set of finite domain variables, D the set of domains
+associated with the variables in X, and C the set of constraints, or
 relations, that bind together the variables of X.
 
 Note that the constraint variables are NOT logic variables. Not yet
 anyway.
 
 So we basically need a way to declare variables, their domains and
-relations ; and something to hold these together. The later is what we
+relations; and something to hold these together. The latter is what we
 call a "computation space". The notion of computation space is broad
 enough to encompass constraint and logic programming, but we use it
 here only as a box that holds the elements of our constraint
@@ -284,9 +288,8 @@
 This snippet defines a couple of variables and their domains. Note
 that we didn't keep a reference to the created variables. We can query
 the space to get these back if needed, and then complete the
-definition of our problem.
+definition of our problem. Our problem, continued::
 
-      # ... continued ...
       x = cs.find_var('x')
       y = cs.find_var('y')
       cs.tell(make_expression([x,y], 'len(x) == y'))
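
The tell/make_expression step amounts to registering a predicate over some of the variables. A plain-Python sketch of the idea (the real make_expression builds a constraint object for the space; this stand-in merely compiles the textual relation and tests candidate values):

```python
def make_expression(var_names, formula):
    """Toy analogue: compile the relation once, test candidate values."""
    code = compile(formula, '<constraint>', 'eval')
    def check(*values):
        # Evaluate the relation with candidate values for the variables.
        return eval(code, {}, dict(zip(var_names, values)))
    return check

c = make_expression(['x', 'y'], 'len(x) == y')
assert c('abc', 3) is True
assert c('abc', 2) is False
```
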
@@ -295,7 +298,8 @@
       
 We must be careful to return the set of variables whose candidate
 values we are interested in. The rest should be sufficiently
-self-describing ... Now to print solutions out of this, we must::
+self-describing... Now to get and print solutions out of this, we
+must::
 
   import solver
   cs = newspace()
@@ -306,9 +310,9 @@
 
 The builtin solve function returns a generator. You will note with
 pleasure how slow the search can be on a solver running on a Python
-interpreter written in Python and running on top of cpython... It is
-expected that the compiled version of PyPy + LO will provide decent
-performance.
+interpreter written in Python, the latter running on top of
+CPython... It is expected that the compiled version of PyPy + LO will
+provide decent performance.
 
 Operators for CSP specification
 +++++++++++++++++++++++++++++++
@@ -338,9 +342,9 @@
 Extending the search engine
 +++++++++++++++++++++++++++
 
-Here we show how the primitives allow you to write, in pure Python, a
-very basic solver that will search depth-first and return the first
-found solution.
+Here we show how the additional builtin primitives allow you to write,
+in pure Python, a very basic solver that will search depth-first and
+return the first found solution.
 
 As we've seen, a CSP is encapsulated into a so-called "computation
 space". The space object has additional methods that allow the solver
@@ -372,15 +376,16 @@
 is removed from the variable domains. This phase is called "constraint
 propagation". It is crucial because it prunes as much as possible of
 the search space. Then, the call to ask returns a positive integer
-value, which means that all (possibly concurrent) computations of the
-space are terminated.
+value, which means that all (possibly concurrent) computations
+happening inside the space have terminated.
 
 At this point, either:
 
-* the space is failed (status == 0), which means that there is no set
-  of values of the finite domains that can satisfy the constraints,
+* the space is failed (status == 0), which means that there is no
+  combination of values of the finite domains that can satisfy the
+  constraints,
 
-* one solution has been found (status == 1):: there is exactly one
+* one solution has been found (status == 1): there is exactly one
   valuation of the variables that satisfies the constraints,
 
 * several branches of the search space can be taken (status represents
@@ -394,31 +399,32 @@
 general-purpose solver.
 
 In line 8, we take a clone of the space; nothing is shared between
-space and newspace. Having taken a clone, we now have two identical
-versions of the space that we received. This will allow us to explore
-the two alternatives. This step is done, line 9 and 13, with the call
-to commit, each time with a different integer value representing the
-branch to be taken. The rest should be sufficiently self-describing.
+space and newspace (the clone). Having taken a clone, we now have two
+identical versions of the space that we got as a parameter. This will
+allow us to explore the two alternatives. This step is done at lines
+9 and 13, with a call to commit, each time with a different integer
+value representing the branch to be taken. The rest should be
+sufficiently self-describing.
 
-This shows the two important space methods used by a search engine::
+This shows the two important space methods used by a search engine:
 ask, which waits for the stability of the space and informs the solver
 of its status, and commit, which tells a space which road to take in
-case of a fork. 
+case of a fork.
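
The ask/clone/commit protocol just described can be sketched as a recursive depth-first search. The solver listing referred to by line numbers is elided from this diff, so the following is a hedged reconstruction of the same idea; the Space class below is a toy stand-in walking a prebuilt tree, whereas a real computation space performs constraint propagation.

```python
class Space:
    """Toy computation space: status 0 = failed, 1 = one solution,
    n > 1 = n branches to explore via commit. Choice nodes in this
    toy always carry at least two branches."""
    def __init__(self, node):
        self.node = node
    def ask(self):
        kind, payload = self.node
        if kind == 'fail':
            return 0
        if kind == 'solution':
            return 1
        return len(payload)              # number of open branches
    def clone(self):
        return Space(self.node)          # nothing shared with the clone
    def commit(self, branch):
        self.node = self.node[1][branch - 1]   # take branch i (1-based)
    def merge(self):
        return self.node[1]              # extract the solution

def first_solution(space):
    """Depth-first search for the first solution, as described above."""
    status = space.ask()                 # wait for stability, get status
    if status == 0:
        return None                      # failed space: dead end
    if status == 1:
        return space.merge()             # exactly one valuation left
    for branch in range(1, status + 1):
        child = space.clone()            # explore each alternative
        child.commit(branch)             # tell it which road to take
        found = first_solution(child)
        if found is not None:
            return found
    return None

tree = ('choice', [('fail', None),
                   ('choice', [('fail', None), ('solution', 42)])])
assert first_solution(Space(tree)) == 42
```
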
 
-Now, earlier, we talked of a "distributor":: it is a program running
-in a computation space. It could be anything, and in fact, in the
-final version of the LO, it will be any Python program, augmented with
-calls to non-deterministic choice points. Each time a program embedded
-in a computation space reaches such a point, it blocks until some Deus
-ex machina make the choice for him. Only a solver can be responsible
-for the actual choice (that is the reason for the name
-"non-deterministic":: the decision does not belong to the embedded
-program, only to the solver that drives it).
+Now, earlier, we talked of a "distributor": it is a program running in
+a computation space. It could be anything, and in fact, in the final
+version of the LO, it will be any Python program, augmented with calls
+to non-deterministic choice points. Each time a program embedded in a
+computation space reaches such a point, it blocks until some Deus ex
+machina makes the choice for it. Only a solver can be responsible for
+the actual choice (that is the reason for the name
+"non-deterministic": the decision does not belong to the embedded
+program, only to the solver that drives it).
 
 In the case of a CSP, the distributor is a simple piece of code, which
-works only after the propagation phase. The standard way of CSP
-solving is typically using binary search, so there is a default,
-binary distributor set up in any new space.
+works only after the propagation phase has reached a fixpoint. The
+standard way of CSP solving uses binary search, so there
+is a default, binary distributor set up in any new space.
 
 Here are two examples of distribution strategies:
 
@@ -428,7 +434,7 @@
 * take a variable with a small domain, and keep only one value in the
   domain (in other words, we "instantiate" the variable).
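
The second strategy (often called "first fail" in the constraint literature) can be sketched over plain dictionaries of candidate sets; these are not LO finite-domain objects, and the value pick is made deterministic only for the demonstration.

```python
def distribute_first_fail(domains):
    """Instantiate the unbound variable with the smallest domain.
    Returns (variable, chosen_value), or None if nothing is open."""
    open_vars = [v for v, d in domains.items() if len(d) > 1]
    if not open_vars:
        return None                      # space is already determined
    var = min(open_vars, key=lambda v: len(domains[v]))
    value = min(domains[var])            # deterministic pick for the demo
    domains[var] = {value}               # "instantiate" the variable
    return var, value

domains = {'x': {1, 2, 3}, 'y': {4, 5}, 'z': {7}}
assert distribute_first_fail(domains) == ('y', 4)
assert domains['y'] == {4}
```
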
 
-There are a great many ways to distribute ... Some of them perform
+There are a great many ways to distribute... Some of them perform
 better, depending on the characteristics of the problem to be
 solved. But there is no absolutely better distribution strategy. Note
 that the second strategy given as an example above is what is used (and
@@ -444,3 +450,12 @@
 For distributor writers::
  choose
 
+Where to look
+-------------
+
+See also:
+
+* pypy-dist/pypy/objspace/constraint/applevel for the existing solver
+  and some sample problems (I'm quite sure that the conference
+  scheduling problem is up to date wrt the current API).
+


