newton-raphson-method
Find zeros of a function using Newton's Method
Introduction
This is a fork of scijs/newton-raphson-method that uses big.js instances instead of plain JavaScript numbers.
The Newton-Raphson method uses the tangent of a curve to iteratively approximate a zero of a function, f(x). This yields the update:

x[n+1] = x[n] - f(x[n]) / f'(x[n])
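As a minimal sketch of this update rule (using plain JavaScript numbers rather than the big.js instances this package operates on; the function name and signature here are illustrative, not part of the package API):

```javascript
// Sketch of the Newton-Raphson iteration: repeatedly apply
// x[n+1] = x[n] - f(x[n]) / f'(x[n]) until successive guesses agree
// to within a relative tolerance, or the iteration budget runs out.
function newtonSketch(f, fp, x0, tolerance = 1e-7, maxIterations = 20) {
  let x = x0;
  for (let i = 0; i < maxIterations; i++) {
    const next = x - f(x) / fp(x);
    if (Math.abs(next - x) <= tolerance * Math.abs(next)) return next;
    x = next;
  }
  return false; // failed to converge
}

const f = (x) => (x - 1) * (x + 2);
const fp = (x) => 2 * x + 1;
console.log(newtonSketch(f, fp, 2)); // ≈ 1
```

The convergence check mirrors the tolerance criterion documented in the API section below.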
Example
Consider the zero of (x + 2) * (x - 1) at x = 1:
const { newtonRaphson } = require('@fvictorio/newton-raphson-method');
function f (x) { return x.minus(1).mul(x.plus(2)); }
function fp (x) { return x.minus(1).plus(x).plus(2); }
// Using the derivative:
newtonRaphson(f, 2, fp)
// => 1.0000000000000000 (6 iterations)
// Using a numerical derivative:
newtonRaphson(f, 2)
// => 1.0000000000000000 (6 iterations)
Installation
$ npm install @fvictorio/newton-raphson-method
API
newtonRaphson(f, x0[, fp, options])
Given a real-valued function of one variable, iteratively improves and returns a guess of a zero.
Parameters:
- f: The numerical function of one variable of which to compute the zero.
- x0: The initial guess of the zero. Can be a number or a big.js instance.
- fp (optional): The first derivative of f. If not provided, it is computed numerically using a fourth-order central difference with step size h.
- options (optional): An object permitting the following options:
  - tolerance (default: 1e-7): The tolerance by which convergence is measured. Convergence is met if |x[n+1] - x[n]| <= tolerance * |x[n+1]|.
  - maxIterations (default: 20): Maximum permitted iterations.
  - h (default: 1e-4): Step size for numerical differentiation.
  - verbose (default: false): Output additional information about guesses, convergence, and failure.
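The fourth-order central difference used for the numerical derivative can be sketched as follows (a plain-number illustration of the standard formula, not the package's internal big.js implementation):

```javascript
// Fourth-order central difference approximation of f'(x) with step size h.
// The truncation error is O(h^4); h = 1e-4 matches the documented default.
function centralDifference4(f, x, h = 1e-4) {
  return (f(x - 2 * h) - 8 * f(x - h) + 8 * f(x + h) - f(x + 2 * h)) / (12 * h);
}

// Derivative of f(x) = (x - 1) * (x + 2) at x = 2 is 2x + 1 = 5.
console.log(centralDifference4((x) => (x - 1) * (x + 2), 2)); // ≈ 5
```

Because the scheme is exact for polynomials up to degree four (apart from floating-point rounding), it recovers the derivative of this quadratic essentially exactly.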
Returns: If convergence is achieved, returns a big.js instance with an approximation of the zero. If the algorithm fails, returns false.
See Also
- modified-newton-raphson: A simple modification of Newton-Raphson that may exhibit improved convergence.
- newton-raphson: A similar and lovely implementation that differs (only?) in requiring a first derivative.
License
© 2016 Scijs Authors. MIT License.
Authors
Ricky Reusser