[Scipy-svn] r5133 - in trunk/scipy: cluster linalg optimize
scipy-svn at scipy.org
Sun Nov 16 08:34:06 EST 2008
Author: ptvirtan
Date: 2008-11-16 07:33:51 -0600 (Sun, 16 Nov 2008)
New Revision: 5133
Modified:
trunk/scipy/cluster/hierarchy.py
trunk/scipy/linalg/basic.py
trunk/scipy/linalg/info.py
trunk/scipy/optimize/info.py
Log:
Fix some docstrings that were breaking Sphinx LaTeX generation
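The fixes below all follow the same reST pattern: a directive or literal block needs a double colon, and a directive body needs a blank line before it. A minimal illustration of the pattern (not taken from the commit itself):

```rst
Broken -- single colon and no blank line, so Sphinx emits the formula as plain text:

    .. math:
        \frac{a}{b}

Fixed -- double colon plus a blank line before the directive body:

    .. math::

        \frac{a}{b}
```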
Modified: trunk/scipy/cluster/hierarchy.py
===================================================================
--- trunk/scipy/cluster/hierarchy.py 2008-11-16 13:12:04 UTC (rev 5132)
+++ trunk/scipy/cluster/hierarchy.py 2008-11-16 13:33:51 UTC (rev 5133)
@@ -966,7 +966,9 @@
deviation of the link heights, respectively; ``R[i,2]`` is
the number of links included in the calculation; and
``R[i,3]`` is the inconsistency coefficient,
- .. math:
+
+ .. math::
+
\frac{\mathtt{Z[i,2]}-\mathtt{R[i,0]}}
{R[i,1]}.
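The corrected docstring defines the inconsistency coefficient as ``(Z[i,2] - R[i,0]) / R[i,1]``, i.e. how many standard deviations a link's height sits above the mean height of the links considered. A minimal sketch of that formula on made-up values (the function name and numbers here are illustrative, not part of the scipy API):

```python
def inconsistency_coefficient(link_height, mean_height, std_height):
    """Inconsistency coefficient per the hierarchy.py docstring:
    (Z[i,2] - R[i,0]) / R[i,1], where Z[i,2] is the link height and
    R[i,0], R[i,1] are the mean and std of the included link heights."""
    return (link_height - mean_height) / std_height

# A link of height 2.5 among links with mean 2.0 and std 0.5
# is exactly one standard deviation above the mean.
print(inconsistency_coefficient(2.5, 2.0, 0.5))  # 1.0
```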
Modified: trunk/scipy/linalg/basic.py
===================================================================
--- trunk/scipy/linalg/basic.py 2008-11-16 13:12:04 UTC (rev 5132)
+++ trunk/scipy/linalg/basic.py 2008-11-16 13:33:51 UTC (rev 5133)
@@ -400,14 +400,14 @@
ord norm for matrices norm for vectors
===== ============================ ==========================
None Frobenius norm 2-norm
- 'fro' Frobenius norm -
+ 'fro' Frobenius norm --
inf max(sum(abs(x), axis=1)) max(abs(x))
-inf min(sum(abs(x), axis=1)) min(abs(x))
1 max(sum(abs(x), axis=0)) as below
-1 min(sum(abs(x), axis=0)) as below
2 2-norm (largest sing. value) as below
-2 smallest singular value as below
- other - sum(abs(x)**ord)**(1./ord)
+ other -- sum(abs(x)**ord)**(1./ord)
===== ============================ ==========================
Returns
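The table above documents the ``ord`` parameter of ``scipy.linalg.norm``. A quick sketch of a few rows, using ``numpy.linalg.norm`` on the assumption that it follows the same conventions for these cases:

```python
import numpy as np

A = np.array([[1.0, -2.0], [3.0, 4.0]])
v = np.array([3.0, -4.0])

# ord=None for a matrix: Frobenius norm, sqrt of the sum of squared entries
print(np.linalg.norm(A))          # sqrt(1 + 4 + 9 + 16) = sqrt(30)
# ord=1 for a matrix: max(sum(abs(x), axis=0)), the largest column sum
print(np.linalg.norm(A, 1))       # max(1+3, 2+4) = 6.0
# ord=inf for a matrix: max(sum(abs(x), axis=1)), the largest row sum
print(np.linalg.norm(A, np.inf))  # max(1+2, 3+4) = 7.0
# ord=None for a vector: the 2-norm
print(np.linalg.norm(v))          # sqrt(9 + 16) = 5.0
```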
Modified: trunk/scipy/linalg/info.py
===================================================================
--- trunk/scipy/linalg/info.py 2008-11-16 13:12:04 UTC (rev 5132)
+++ trunk/scipy/linalg/info.py 2008-11-16 13:33:51 UTC (rev 5133)
@@ -2,7 +2,7 @@
Linear algebra routines
=======================
- Linear Algebra Basics:
+Linear Algebra Basics::
inv --- Find the inverse of a square matrix
solve --- Solve a linear system of equations
@@ -14,7 +14,7 @@
pinv --- Pseudo-inverse (Moore-Penrose) using lstsq
pinv2 --- Pseudo-inverse using svd
- Eigenvalues and Decompositions:
+Eigenvalues and Decompositions::
eig --- Find the eigenvalues and vectors of a square matrix
eigvals --- Find the eigenvalues of a square matrix
@@ -36,7 +36,7 @@
rsf2csf --- Real to complex schur form
hessenberg --- Hessenberg form of a matrix
- matrix Functions:
+matrix Functions::
expm --- matrix exponential using Pade approx.
expm2 --- matrix exponential using Eigenvalue decomp.
@@ -52,7 +52,7 @@
sqrtm --- matrix square root
funm --- Evaluating an arbitrary matrix function.
- Iterative linear systems solutions
+Iterative linear systems solutions::
cg --- Conjugate gradient (symmetric systems only)
cgs --- Conjugate gradient squared
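As a small usage sketch for the basics listed above (``solve``, ``inv``): solving a linear system directly is generally preferred over forming the inverse. The example below uses the ``numpy.linalg`` equivalents on the assumption that the basic call signature matches ``scipy.linalg``:

```python
import numpy as np

# Solve A @ x = b:  3x + y = 9,  x + 2y = 8
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])
b = np.array([9.0, 8.0])

# Preferred: solve the system directly rather than computing inv(A) @ b,
# which is slower and less numerically stable.
x = np.linalg.solve(A, b)
print(x)  # [2. 3.]
```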
Modified: trunk/scipy/optimize/info.py
===================================================================
--- trunk/scipy/optimize/info.py 2008-11-16 13:12:04 UTC (rev 5132)
+++ trunk/scipy/optimize/info.py 2008-11-16 13:33:51 UTC (rev 5133)
@@ -2,7 +2,7 @@
Optimization Tools
==================
- A collection of general-purpose optimization routines.
+A collection of general-purpose optimization routines.::
fmin -- Nelder-Mead Simplex algorithm
(uses only function calls)
@@ -17,9 +17,8 @@
leastsq -- Minimize the sum of squares of M equations in
N unknowns given a starting estimate.
+Constrained Optimizers (multivariate)::
- Constrained Optimizers (multivariate)
-
fmin_l_bfgs_b -- Zhu, Byrd, and Nocedal's L-BFGS-B constrained optimizer
(if you use this please quote their papers -- see help)
@@ -28,28 +27,24 @@
fmin_cobyla -- Constrained Optimization BY Linear Approximation
+Global Optimizers::
- Global Optimizers
-
anneal -- Simulated Annealing
brute -- Brute force searching optimizer
+Scalar function minimizers::
- Scalar function minimizers
-
fminbound -- Bounded minimization of a scalar function.
brent -- 1-D function minimization using Brent method.
golden -- 1-D function minimization using Golden Section method
bracket -- Bracket a minimum (given two starting points)
+Also a collection of general-purpose root-finding routines::
- Also a collection of general-purpose root-finding routines.
-
fsolve -- Non-linear multi-variable equation solver.
+Scalar function solvers::
- Scalar function solvers
-
brentq -- quadratic interpolation Brent method
brenth -- Brent method (modified by Harris with hyperbolic
extrapolation)
@@ -59,7 +54,7 @@
fixed_point -- Single-variable fixed-point solver.
- A collection of general-purpose nonlinear multidimensional solvers.
+A collection of general-purpose nonlinear multidimensional solvers::
broyden1 -- Broyden's first method - is a quasi-Newton-Raphson
method for updating an approximate Jacobian and then
@@ -83,7 +78,7 @@
anderson2 -- the Anderson method, the same as anderson, but
formulated differently
- Utility Functions
+Utility Functions::
line_search -- Return a step that satisfies the strong Wolfe conditions.
check_grad -- Check the supplied derivative using finite difference
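The scalar solvers listed above (``brentq``, ``brenth``, etc.) all find a root of ``f`` inside a bracketing interval where ``f`` changes sign. A self-contained bisection sketch of that bracketing idea, much simpler than Brent's method but illustrating the same contract:

```python
def bisect(f, a, b, tol=1e-12):
    """Find a root of f in [a, b] by bisection.
    Like the bracketing solvers above, requires f(a) and f(b)
    to have opposite signs."""
    fa, fb = f(a), f(b)
    if fa * fb > 0:
        raise ValueError("f(a) and f(b) must have opposite signs")
    while b - a > tol:
        m = 0.5 * (a + b)
        fm = f(m)
        if fa * fm <= 0:      # root lies in [a, m]
            b, fb = m, fm
        else:                 # root lies in [m, b]
            a, fa = m, fm
    return 0.5 * (a + b)

# Root of x^2 - 2 on [0, 2] is sqrt(2)
print(bisect(lambda x: x * x - 2.0, 0.0, 2.0))  # ~1.4142135624
```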