An implementation of the Non-Negative Least-Squares (NNLS) algorithm by Lawson and Hanson. It was written mostly for learning purposes and will be improved over time.
Currently, its results match scipy's in basic tests.
You are welcome to report issues and contribute to this project.
$ npm i nnls
import { nnls } from 'nnls';
const { x, d } = nnls(X, y); // result and dual vectors
The function also returns the dual vector d.
You can get execution information by passing the info option:
import { nnls } from 'nnls';
import { Matrix } from 'ml-matrix'; //npm i ml-matrix
const X = new Matrix([
[1, 0],
[2, 0],
[3, 0],
[0, 1],
]);
const Y = Matrix.columnVector([1, 2, 3, 4]);
const solution = Matrix.columnVector([1, 4]); // expected solution, for reference
const result = nnls(X, Y, { info: true });
console.log(result); // logs the solution, the dual vector, and the execution info
/*
{
  x: Matrix {
    [
      1.000000
      4
    ]
    rows: 2
    columns: 1
  },
  d: Matrix {
    [
      -3.6e-15
      0
    ]
    rows: 2
    columns: 1
  },
  info: {
    rse: [ 5.477225575051661, 4, 1.0175362097255202e-15 ],
    iterations: 3
  }
}
*/
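Judging from the values above, rse appears to hold the residual norm $\lVert Y - Xx \rVert$ after each iteration (it starts at $\lVert Y \rVert = \sqrt{30} \approx 5.48$ and ends near zero); that reading is an assumption rather than documented behaviour. Continuing the example, it can serve as a rough convergence check:

// Continuing the example above. Treating the last rse entry as the final
// residual norm is an assumption based on the printed values.
const { info } = nnls(X, Y, { info: true });
const finalResidual = info.rse[info.rse.length - 1];
if (finalResidual > 1e-8) {
  console.warn(`stopped after ${info.iterations} iterations, residual ${finalResidual}`);
}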
Like other implementations (for example scipy.optimize.nnls), it is limited to a single vector $y$, or, as it is called in the literature, a single right-hand side (RHS).
As a minor addition to other implementations, you can pass { interceptAtZero: false }; the fit is then not forced through the origin, and the result is consistent with $f(0)=C$ rather than $f(0)=0$.
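A minimal sketch of the difference, assuming the option behaves as described above (the data below is made up for illustration, and the exact layout of the returned coefficient vector when an intercept is included is an assumption, not something specified here):

import { nnls } from 'nnls';
import { Matrix } from 'ml-matrix';

// Data generated from f(x) = 2x + 1, so the true intercept is 1, not 0.
const X = new Matrix([[1], [2], [3], [4]]);
const y = Matrix.columnVector([3, 5, 7, 9]);

// Default: the fit is forced through the origin, so the slope must absorb the offset.
const origin = nnls(X, y);

// With a free intercept the fit can satisfy f(0) = C with C > 0.
const withIntercept = nnls(X, y, { interceptAtZero: false });

console.log(origin.x.to1DArray(), withIntercept.x.to1DArray());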
For multiple RHS, you can take a look at the Fast-Combinatorial Non-Negative Least-Squares algorithm.
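If you only need a handful of right-hand sides with this package, a simple (though not particularly fast) workaround is to call nnls once per column of $Y$; a minimal sketch using the ml-matrix API shown above:

import { nnls } from 'nnls';
import { Matrix } from 'ml-matrix';

// Solve one single-RHS problem per column of Y (a plain loop, not the
// fast-combinatorial approach mentioned above).
function nnlsMultipleRHS(X, Y) {
  const solutions = [];
  for (let j = 0; j < Y.columns; j++) {
    const { x } = nnls(X, Y.getColumnVector(j));
    solutions.push(x.to1DArray());
  }
  // Row j currently holds the solution for column j of Y; transpose so that
  // column j of the result corresponds to column j of Y.
  return new Matrix(solutions).transpose();
}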