A comprehensive Clojure library for numerical optimization, root-finding, interpolation, …

### Root-Finding

- **`roots.continuous`** — Univariate root-finding with multiple algorithms: Brent-Dekker, modified Newton-Raphson, Muller, plus Hipparchus methods (bisection, Brent, Illinois, Pegasus, Ridders, secant, Regula-Falsi). Includes quadratic equation solver.
- **`roots.integer`** — Bisection algorithm for strictly increasing discrete functions. Returns the minimum integer with function value ≥ 0.
- **`roots.plateau`** — Root-finding for monotonic functions that return plateau values. Supports univariate and multivariate cases.
- **`roots.polynomial`** — Polynomial root-finding using Laguerre's method via Hipparchus.

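The discrete bisection in `roots.integer` can be sketched as follows. This is a language-neutral Python sketch, not the library's Clojure API; `min_int_root` and its signature are illustrative names only.

```python
def min_int_root(f, lo, hi):
    """Bisection for a strictly increasing integer-valued domain.

    Returns the smallest integer n in [lo, hi] with f(n) >= 0,
    or None when even f(hi) < 0 (no such n exists in range).
    """
    if f(hi) < 0:
        return None
    while lo < hi:
        mid = (lo + hi) // 2
        if f(mid) >= 0:
            hi = mid          # mid qualifies; the answer is mid or to its left
        else:
            lo = mid + 1      # everything up to mid is still negative
    return lo

# Smallest n with n^2 - 50 >= 0 is 8 (7^2 = 49, 8^2 = 64)
print(min_int_root(lambda n: n * n - 50, 0, 100))  # → 8
```

Because the function is strictly increasing, each probe discards half the remaining range, so the search costs O(log(hi − lo)) evaluations.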
### Optimization

- **`optimize.continuous-univariate`** — Brent optimizer for 1D minimization/maximization over bounded intervals.
- **`optimize.integer-univariate`** — Integer optimizer using exponential search or ternary search. Handles unimodal functions over integer domains.
- **`optimize.linear`** — Two-phase Simplex Method. Minimizes/maximizes linear objectives subject to linear constraints (equality, ≤, ≥).
- **`optimize.mixed-integer`** — Mixed Integer Programming (MIP) using OjAlgo. Extends linear programming to support integer and binary variable constraints.
- **`optimize.quadratic`** — Minimizes (1/2)x^T P x + q^T x subject to equality and inequality constraints using OjAlgo.
- **`optimize.nlp-constrained`** — Constrained nonlinear optimization using COBYLA. Minimizes objectives subject to nonlinear inequality constraints (≥ 0).
- **`optimize.nlp-unbounded`** — Unbounded nonlinear optimization with multiple pure-Clojure algorithms:
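The ternary search mentioned under `optimize.integer-univariate` above can be sketched in Python (illustrative names, not the library's API): each step discards the third of the interval that cannot contain the optimum of a unimodal function.

```python
def argmax_unimodal_int(f, lo, hi):
    """Ternary search over integers for a unimodal f on [lo, hi].

    Repeatedly drops the third of the interval that cannot hold
    the maximum, then scans the last few candidates directly.
    """
    while hi - lo > 2:
        m1 = lo + (hi - lo) // 3
        m2 = hi - (hi - lo) // 3
        if f(m1) < f(m2):
            lo = m1 + 1   # maximum cannot be at or left of m1
        else:
            hi = m2 - 1   # maximum cannot be at or right of m2
    return max(range(lo, hi + 1), key=f)

# -(n - 37)^2 is unimodal with its peak at n = 37
print(argmax_unimodal_int(lambda n: -(n - 37) ** 2, 0, 1000))  # → 37
```

Minimization is the same sketch with the comparison flipped (or by negating `f`).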
Solvers for systems of equations without an objective function to optimize:

- **`systems.linear`** — Iterative solvers for symmetric linear systems (A × y = b): SYMMLQ and Conjugate Gradient methods. For small/dense systems, use direct least squares from `provisdom.math.linear-algebra` instead.
- **`systems.nonlinear`** — Finds variable values that satisfy nonlinear constraints:
  - Nonlinear Least Squares (Levenberg-Marquardt, Gauss-Newton)
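The Conjugate Gradient iteration behind `systems.linear` can be sketched in pure Python for a small symmetric positive-definite system (illustration only; a real implementation would use proper matrix types and preconditioning):

```python
def conjugate_gradient(A, b, tol=1e-10, max_iter=1000):
    """Conjugate Gradient for a symmetric positive-definite A x = b.

    A is a list of row lists, b a list. Starts from x = 0 and
    minimizes the residual along A-conjugate search directions.
    """
    n = len(b)
    x = [0.0] * n
    r = list(b)                      # residual b - A x (x starts at 0)
    p = list(r)                      # initial search direction
    rs = sum(ri * ri for ri in r)
    for _ in range(max_iter):
        Ap = [sum(A[i][j] * p[j] for j in range(n)) for i in range(n)]
        alpha = rs / sum(p[i] * Ap[i] for i in range(n))
        x = [x[i] + alpha * p[i] for i in range(n)]
        r = [r[i] - alpha * Ap[i] for i in range(n)]
        rs_new = sum(ri * ri for ri in r)
        if rs_new < tol:
            break
        p = [r[i] + (rs_new / rs) * p[i] for i in range(n)]
        rs = rs_new
    return x

# [[4, 1], [1, 3]] is symmetric positive-definite; the solution is [1/11, 7/11]
x = conjugate_gradient([[4.0, 1.0], [1.0, 3.0]], [1.0, 2.0])
```

In exact arithmetic CG converges in at most n iterations; in practice it is used when n is large and A is sparse, which is why the direct least-squares route is recommended for small/dense systems.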
- *Linear basis* (closed-form): custom basis functions for univariate and multivariate data

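A closed-form linear-basis fit solves the normal equations for arbitrary user-supplied basis functions. A minimal Python sketch (`fit_linear_basis` is a hypothetical name, not the library's API):

```python
def fit_linear_basis(basis, xs, ys):
    """Closed-form least squares for y ≈ Σ c_k · basis_k(x).

    Builds the normal equations (BᵀB) c = Bᵀy and solves them by
    Gaussian elimination with partial pivoting (small systems only).
    """
    B = [[f(x) for f in basis] for x in xs]
    k = len(basis)
    M = [[sum(B[i][a] * B[i][b] for i in range(len(xs))) for b in range(k)]
         for a in range(k)]
    v = [sum(B[i][a] * ys[i] for i in range(len(xs))) for a in range(k)]
    for col in range(k):
        piv = max(range(col, k), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        v[col], v[piv] = v[piv], v[col]
        for r in range(col + 1, k):
            m = M[r][col] / M[col][col]
            for c in range(col, k):
                M[r][c] -= m * M[col][c]
            v[r] -= m * v[col]
    c = [0.0] * k
    for r in range(k - 1, -1, -1):
        c[r] = (v[r] - sum(M[r][j] * c[j] for j in range(r + 1, k))) / M[r][r]
    return c

# Recover y = 2 + 3x exactly with the basis {1, x}
coeffs = fit_linear_basis([lambda x: 1.0, lambda x: x], [0, 1, 2, 3], [2, 5, 8, 11])
```

Any basis works the same way, e.g. `{1, x, x**2}` for a quadratic fit, because the model stays linear in the coefficients.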
#### Interpolation vs Curve Fitting

Interpolation is organized by dimensionality:

**Note**: LOESS is in the interpolation namespace (same API) but doesn't pass through points—it performs local regression for smoothing noisy data.
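The local-regression idea behind LOESS can be sketched for one query point (illustrative Python only; the full algorithm also adds robustness iterations): the nearest fraction of the data gets tricube weights and a weighted straight-line fit, so the smoothed curve need not pass through any data point.

```python
def loess_point(xs, ys, x0, frac=0.5):
    """Smoothed value at x0 via locally weighted linear regression.

    Uses the nearest `frac` of the data, tricube-weighted by
    distance from x0, then evaluates the weighted line at x0.
    """
    n = max(2, int(frac * len(xs)))
    nearest = sorted(range(len(xs)), key=lambda i: abs(xs[i] - x0))[:n]
    d = max(abs(xs[i] - x0) for i in nearest) or 1.0
    w = {i: (1 - (abs(xs[i] - x0) / d) ** 3) ** 3 for i in nearest}
    sw = sum(w.values())
    mx = sum(w[i] * xs[i] for i in nearest) / sw
    my = sum(w[i] * ys[i] for i in nearest) / sw
    sxx = sum(w[i] * (xs[i] - mx) ** 2 for i in nearest)
    sxy = sum(w[i] * (xs[i] - mx) * (ys[i] - my) for i in nearest)
    slope = sxy / sxx if sxx else 0.0
    return my + slope * (x0 - mx)
```

On noise-free linear data the local fit reproduces the line exactly; on noisy data it returns a smoothed value rather than interpolating through the points.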

#### Curve Fitting vs Regression

| Aspect | Curve Fitting | Regression |
|--------|---------------|------------|
| **Goal** | Find parameters of a known functional form | Model statistical relationships between X and y |
| **Functional form** | Arbitrary: `a·e^(-((x-b)/c)²)`, `a·cos(ωx+φ)` | Linear in parameters: `y = Xβ` (possibly transformed) |
| **Optimization** | Nonlinear least squares (Levenberg-Marquardt) | Closed-form (OLS/Ridge) or IRLS (GLMs) |
| **Assumptions** | Just minimize residuals | Error distribution, link function |
| **Output** | Parameters of the specific function | Coefficients + diagnostics (R², MSE, precision) |

**Curve fitting** answers: "Given this formula, what parameters fit best?"
**Regression** answers: "What's the statistical relationship between predictors X and response y?"
Logistic and beta regression aren't just curve fitting—they model the *distribution* of y given X (Bernoulli, Beta) with appropriate link functions, using Iteratively Reweighted Least Squares (IRLS).
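The IRLS idea can be sketched for two-parameter logistic regression in Python (an illustration of the algorithm, not the library's API): each pass computes the weights p(1 − p) and solves the weighted least-squares normal equations, here by hand for the 2×2 case.

```python
import math

def irls_logistic(xs, ys, iters=25):
    """Fit P(y=1 | x) = 1 / (1 + e^-(b0 + b1*x)) by IRLS.

    Each step solves XᵀWX · delta = Xᵀ(y - p) with W = diag(p(1-p)),
    i.e. a Newton step on the Bernoulli log-likelihood.
    """
    b0 = b1 = 0.0
    for _ in range(iters):
        g0 = g1 = 0.0          # gradient  Xᵀ(y - p)
        h00 = h01 = h11 = 0.0  # Hessian   XᵀWX
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))
            w = p * (1.0 - p)
            g0 += y - p
            g1 += (y - p) * x
            h00 += w
            h01 += w * x
            h11 += w * x * x
        det = h00 * h11 - h01 * h01
        b0 += (h11 * g0 - h01 * g1) / det   # 2x2 solve by Cramer's rule
        b1 += (h00 * g1 - h01 * g0) / det
    return b0, b1
```

Ridge regularization (as the library's description mentions) would add λ to the Hessian diagonal, which also stabilizes IRLS on nearly separable data.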
### Regression

All regression methods are in the `provisdom.solvers.regression` namespace hierarchy:

- **`regression.ordinary`** — Ordinary least squares with optional regularization:
  - *OLS* — Standard least squares via QR decomposition
  - *Ridge (L2)* — Closed-form solution with λI penalty
  - *LASSO (L1)* — Coordinate descent for sparse solutions
  - *Elastic Net* — Combined L1+L2 via coordinate descent
- **`regression.logistic`** — Binary logistic regression using IRLS with optional ridge regularization.
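The coordinate descent used for LASSO reduces to soft-thresholding one coefficient at a time. A minimal Python sketch of the update (illustrative, not the library's API):

```python
def lasso_coordinate_descent(X, y, lam, iters=200):
    """LASSO via cyclic coordinate descent with soft-thresholding.

    Minimizes (1/2)·Σ(y - Xb)² + lam·Σ|b_j|; a large enough lam
    drives coefficients exactly to zero (sparsity).
    """
    n, k = len(X), len(X[0])
    b = [0.0] * k
    for _ in range(iters):
        for j in range(k):
            # partial residual excluding feature j
            r = [y[i] - sum(X[i][m] * b[m] for m in range(k) if m != j)
                 for i in range(n)]
            rho = sum(X[i][j] * r[i] for i in range(n))
            z = sum(X[i][j] ** 2 for i in range(n))
            # soft-threshold: the exact minimizer for this coordinate
            if rho > lam:
                b[j] = (rho - lam) / z
            elif rho < -lam:
                b[j] = (rho + lam) / z
            else:
                b[j] = 0.0
    return b
```

With `lam = 0` this converges to the OLS solution; increasing `lam` shrinks coefficients and zeroes out weak ones, which is what makes LASSO useful for feature selection.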