Commit cb3ebe5

test
1 parent 8fa4a21 commit cb3ebe5

3 files changed (+40, −31 lines)


README.md

Lines changed: 32 additions & 29 deletions
@@ -1,107 +1,110 @@
+
# Table of Contents
-![GitHub](https://img.shields.io/github/license/jishnurajendran/Numerical-analysis?style=for-the-badge) ![GitHub forks](https://img.shields.io/github/forks/jishnurajendran/Numerical-analysis?style=for-the-badge) ![GitHub Repo stars](https://img.shields.io/github/stars/jishnurajendran/Numerical-analysis?style=for-the-badge) ![GitHub watchers](https://img.shields.io/github/watchers/jishnurajendran/Numerical-analysis?style=for-the-badge)

-1. [Root Finding Methods](#org770afdb)
-    1. [Newton’s method](#org013f1b7)
-    2. [Fixed point method](#orgde6567b)
-    3. [Secant method](#org4ebbe87)
-2. [Interpolation techniques](#org9e2a72e)
-    1. [Hermite Interpolation](#orgd63ca7f)
-    2. [Lagrange Interpolation](#org1e8da43)
-    3. [Newton’s Interpolation](#orgd4f58aa)
-3. [Integration methods](#org8e7e5c8)
-    1. [Euler Method](#org351da1a)
-    2. [Newton–Cotes Method](#org75020aa)
-    3. [Predictor–Corrector Method](#orgcf8f14e)
-    4. [Trapezoidal method](#orgf561a2c)
+1. [Root Finding Methods](#org97f8dc1)
+    1. [Newton’s method](#org4ec5a5a)
+    2. [Fixed point method](#orgd92eb51)
+    3. [Secant method](#org5e86b54)
+2. [Interpolation techniques](#org7879a30)
+    1. [Hermite Interpolation](#org01982a3)
+    2. [Lagrange Interpolation](#org1020c9c)
+    3. [Newton’s Interpolation](#orgd08b2ee)
+3. [Integration methods](#orgf7b000b)
+    1. [Euler Method](#orge64619c)
+    2. [Newton–Cotes Method](#orgb51f88e)
+    3. [Predictor–Corrector Method](#org2f8adfb)
+    4. [Trapezoidal method](#org4dbe660)
+
+\![GitHub](<https://img.shields.io/github/license/jishnurajendran/Numerical-analysis?style=for-the-badge>) \![GitHub forks](<https://img.shields.io/github/forks/jishnurajendran/Numerical-analysis?style=for-the-badge>) \![GitHub Repo stars](<https://img.shields.io/github/stars/jishnurajendran/Numerical-analysis?style=for-the-badge>) \![GitHub watchers](<https://img.shields.io/github/watchers/jishnurajendran/Numerical-analysis?style=for-the-badge>)

+:TOC: :include all


-<a id="org770afdb"></a>
+<a id="org97f8dc1"></a>

# Root Finding Methods


-<a id="org013f1b7"></a>
+<a id="org4ec5a5a"></a>

## [Newton’s method](https://en.wikipedia.org/wiki/Newton%27s_method)

Newton’s method (also known as the Newton–Raphson method) is a method for finding successively better approximations to the roots (or zeroes) of a real-valued function. The process is repeated as $$ x_{n+1}=x_{n}-{\frac {f(x_{n})}{f'(x_{n})}} $$
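To make the update rule concrete, here is a minimal Python sketch (illustrative only; the function name and tolerances are assumptions, not code from this repository):

```python
def newton(f, fprime, x0, tol=1e-10, max_iter=50):
    """Iterate x_{n+1} = x_n - f(x_n)/f'(x_n) until the step is below tol."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / fprime(x)
        x -= step
        if abs(step) < tol:
            return x
    raise RuntimeError("Newton iteration did not converge")

# Root of x^2 - 2 starting near 1.5 converges to sqrt(2)
root = newton(lambda x: x * x - 2, lambda x: 2 * x, 1.5)
```

Convergence is quadratic near a simple root, but the iteration can diverge if f'(x) is near zero or the starting point is poor.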


-<a id="orgde6567b"></a>
+<a id="orgd92eb51"></a>

## [Fixed point method](https://en.wikipedia.org/wiki/Fixed-point_iteration)

Fixed-point iteration is a method of computing fixed points of iterated functions. More specifically, given a function f defined on the real numbers with real values and given a point x0 in the domain of f, the fixed-point iteration is
$$ x_{n+1}=f(x_{n}),\,n=0,1,2,\dots$$
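A minimal sketch of this iteration in Python (the helper name and stopping rule are illustrative assumptions, not code from this repository):

```python
import math

def fixed_point(g, x0, tol=1e-10, max_iter=200):
    """Iterate x_{n+1} = g(x_n) until successive iterates agree to tol."""
    x = x0
    for _ in range(max_iter):
        x_next = g(x)
        if abs(x_next - x) < tol:
            return x_next
        x = x_next
    raise RuntimeError("fixed-point iteration did not converge")

# x = cos(x) has a unique fixed point near 0.739 (the Dottie number)
p = fixed_point(math.cos, 1.0)
```

The iteration converges when |g'(x)| < 1 near the fixed point, as is the case for cos here.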


-<a id="org4ebbe87"></a>
+<a id="org5e86b54"></a>

## [Secant method](https://en.wikipedia.org/wiki/Secant_method)

The secant method is a root-finding algorithm that uses a succession of roots of secant lines to better approximate a root of a function f. The secant method can be thought of as a finite-difference approximation of Newton’s method.
$$ x_{n}=x_{n-1}-f(x_{n-1}){\frac {x_{n-1}-x_{n-2}}{f(x_{n-1})-f(x_{n-2})}}={\frac {x_{n-2}f(x_{n-1})-x_{n-1}f(x_{n-2})}{f(x_{n-1})-f(x_{n-2})}}. $$
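The recurrence above can be sketched as follows (illustrative Python, not code from this repository; the bracketing starting points are an assumption of the example):

```python
def secant(f, x0, x1, tol=1e-12, max_iter=60):
    """Secant update: replace f'(x) in Newton's rule with a difference quotient."""
    f0, f1 = f(x0), f(x1)
    for _ in range(max_iter):
        x2 = x1 - f1 * (x1 - x0) / (f1 - f0)
        if abs(x2 - x1) < tol:
            return x2
        x0, f0 = x1, f1
        x1, f1 = x2, f(x2)
    raise RuntimeError("secant iteration did not converge")

# Real root of x^3 - x - 2 lies between 1 and 2
root = secant(lambda x: x ** 3 - x - 2, 1.0, 2.0)
```

Unlike Newton's method, no derivative is needed, at the cost of a slightly lower (superlinear) convergence order.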


-<a id="org9e2a72e"></a>
+<a id="org7879a30"></a>

# Interpolation techniques


-<a id="orgd63ca7f"></a>
+<a id="org01982a3"></a>

## Hermite Interpolation

Hermite Interpolation is a method of interpolating data points as a polynomial function. The generated Hermite interpolating polynomial is closely related to the Newton polynomial, in that both are derived from the calculation of divided differences.
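A sketch of the divided-difference construction on doubled nodes (standard textbook formulation; the function name is a hypothetical helper, not code from this repository):

```python
def hermite_poly(xs, ys, dys):
    """Build the Hermite interpolant from points, values, and derivatives.

    Uses divided differences on doubled nodes; returns a callable that
    evaluates the Newton-form polynomial."""
    n = len(xs)
    z = [x for x in xs for _ in (0, 1)]          # each node repeated twice
    q = [[0.0] * (2 * n) for _ in range(2 * n)]  # divided-difference table
    for i in range(n):
        q[2 * i][0] = q[2 * i + 1][0] = ys[i]
        q[2 * i + 1][1] = dys[i]                 # derivative fills the repeated node
        if i > 0:
            q[2 * i][1] = (q[2 * i][0] - q[2 * i - 1][0]) / (z[2 * i] - z[2 * i - 1])
    for j in range(2, 2 * n):
        for i in range(j, 2 * n):
            q[i][j] = (q[i][j - 1] - q[i - 1][j - 1]) / (z[i] - z[i - j])

    def p(t):
        result, prod = q[0][0], 1.0
        for k in range(1, 2 * n):
            prod *= t - z[k - 1]
            result += q[k][k] * prod
        return result

    return p

# Matching f(x) = x^3 and f'(x) = 3x^2 at x = 0 and x = 1 reproduces the cubic
p = hermite_poly([0.0, 1.0], [0.0, 1.0], [0.0, 3.0])
```

With n points and derivatives, the interpolant matches both value and slope at every node.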


-<a id="org1e8da43"></a>
+<a id="org1020c9c"></a>

## Lagrange Interpolation

Lagrange polynomials are used for polynomial interpolation: for a given set of points with distinct x-values, the Lagrange polynomial is the polynomial of lowest degree that passes through all of them. See [Wikipedia](https://en.wikipedia.org/wiki/Lagrange_polynomial).
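A direct evaluation of the Lagrange form (illustrative Python, not code from this repository):

```python
def lagrange(xs, ys, t):
    """Evaluate the Lagrange interpolating polynomial through (xs, ys) at t."""
    total = 0.0
    for i, (xi, yi) in enumerate(zip(xs, ys)):
        term = yi
        for j, xj in enumerate(xs):
            if j != i:
                term *= (t - xj) / (xi - xj)  # basis polynomial l_i(t)
        total += term
    return total

# Three points on y = x^2 reproduce the parabola exactly
val = lagrange([0.0, 1.0, 2.0], [0.0, 1.0, 4.0], 1.5)
```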


-<a id="orgd4f58aa"></a>
+<a id="orgd08b2ee"></a>

## Newton’s Interpolation

Newton’s divided differences is an algorithm historically used for computing tables of logarithms and trigonometric functions. Divided differences is a recursive division process. The method can be used to calculate the coefficients of the interpolation polynomial in Newton form.
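The recursive table and the Newton-form evaluation can be sketched as follows (hypothetical helper names, not code from this repository):

```python
def divided_differences(xs, ys):
    """Return Newton-form coefficients [f[x0], f[x0,x1], f[x0,x1,x2], ...]."""
    coeffs = list(ys)
    n = len(xs)
    for j in range(1, n):
        # update in place, back to front, so lower-order entries survive
        for i in range(n - 1, j - 1, -1):
            coeffs[i] = (coeffs[i] - coeffs[i - 1]) / (xs[i] - xs[i - j])
    return coeffs

def newton_eval(xs, coeffs, t):
    """Horner-style evaluation of the Newton-form polynomial at t."""
    result = coeffs[-1]
    for k in range(len(coeffs) - 2, -1, -1):
        result = result * (t - xs[k]) + coeffs[k]
    return result

xs = [0.0, 1.0, 2.0]
coeffs = divided_differences(xs, [0.0, 1.0, 4.0])  # data from y = x^2
val = newton_eval(xs, coeffs, 1.5)
```

A practical advantage over the Lagrange form is that adding a new data point only appends one coefficient instead of rebuilding everything.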


-<a id="org8e7e5c8"></a>
+<a id="orgf7b000b"></a>

# Integration methods


-<a id="org351da1a"></a>
+<a id="orge64619c"></a>

## Euler Method

Euler method (also called forward Euler method) is a first-order numerical procedure for solving ordinary differential equations (ODEs) with a given initial value. It is the most basic explicit method for numerical integration of ordinary differential equations and is the simplest Runge–Kutta method.
$$ y_{n+1} = y_{n} + h f(t_{n} , y_{n}) $$
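The step formula above translates directly into code (a minimal sketch, not from this repository; step size and step count are assumptions of the example):

```python
def euler(f, t0, y0, h, n_steps):
    """Forward Euler: repeatedly apply y_{n+1} = y_n + h * f(t_n, y_n)."""
    t, y = t0, y0
    for _ in range(n_steps):
        y += h * f(t, y)
        t += h
    return y

# y' = y, y(0) = 1, so y(1) = e; forward Euler slightly undershoots for finite h
approx = euler(lambda t, y: y, 0.0, 1.0, 0.001, 1000)
```

The global error is O(h), which is why the method is mainly a pedagogical baseline rather than a production integrator.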


-<a id="org75020aa"></a>
+<a id="orgb51f88e"></a>

## Newton–Cotes Method

Newton–Cotes formulae, also called the Newton–Cotes quadrature rules or simply Newton–Cotes rules, are a group of formulae for numerical integration (also called quadrature) based on evaluating the integrand at equally spaced points. They are named after Isaac Newton and Roger Cotes.
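As one member of this family, Simpson's rule (the closed three-point Newton–Cotes rule) can be sketched in composite form (illustrative Python, not code from this repository):

```python
def simpson(f, a, b, n):
    """Composite Simpson's rule on n equal subintervals; n must be even."""
    if n % 2:
        raise ValueError("n must be even")
    h = (b - a) / n
    s = f(a) + f(b)
    for k in range(1, n):
        # interior points alternate weights 4, 2, 4, 2, ...
        s += (4 if k % 2 else 2) * f(a + k * h)
    return s * h / 3

# Simpson's rule is exact for polynomials up to degree 3
area = simpson(lambda x: x ** 2, 0.0, 1.0, 10)
```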


-<a id="orgcf8f14e"></a>
+<a id="org2f8adfb"></a>

## Predictor–Corrector Method

Predictor–Corrector methods belong to a class of algorithms designed to integrate ordinary differential equations – to find an unknown function that satisfies a given differential equation. All such algorithms proceed in two steps:

-1. The initial, “prediction” step, starts from a function fitted to the function-values and derivative-values at a preceding set of points to extrapolate (“anticipate”) this function’s value at a subsequent, new point.
-2. The next, “corrector” step refines the initial approximation by using the predicted value of the function and another method to interpolate that unknown function’s value at the same subsequent point.
+1. The initial, *“prediction”* step, starts from a function fitted to the function-values and derivative-values at a preceding set of points to extrapolate (“anticipate”) this function’s value at a subsequent, new point.
+2. The next, *“corrector”* step refines the initial approximation by using the predicted value of the function and another method to interpolate that unknown function’s value at the same subsequent point.
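The two steps above can be sketched with Heun's method, one of the simplest predictor–corrector schemes: a forward-Euler predictor followed by a trapezoidal corrector (illustrative Python, not code from this repository):

```python
def heun(f, t0, y0, h, n_steps):
    """Heun's method: Euler 'predictor' step, then trapezoidal 'corrector'."""
    t, y = t0, y0
    for _ in range(n_steps):
        y_pred = y + h * f(t, y)                   # predict with forward Euler
        y += h / 2 * (f(t, y) + f(t + h, y_pred))  # correct with the trapezoid rule
        t += h
    return y

# y' = y, y(0) = 1, so y(1) = e; second-order accurate in h
approx = heun(lambda t, y: y, 0.0, 1.0, 0.01, 100)
```

The corrector lifts the global error from O(h) for plain Euler to O(h^2).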


-<a id="orgf561a2c"></a>
+<a id="org4dbe660"></a>

## Trapezoidal method

README.org

Lines changed: 8 additions & 2 deletions
@@ -1,6 +1,12 @@
#+TITLE: Numerical-analysis
#+AUTHOR: Jishnu Rajendran

+![GitHub](https://img.shields.io/github/license/jishnurajendran/Numerical-analysis?style=for-the-badge) ![GitHub forks](https://img.shields.io/github/forks/jishnurajendran/Numerical-analysis?style=for-the-badge) ![GitHub Repo stars](https://img.shields.io/github/stars/jishnurajendran/Numerical-analysis?style=for-the-badge) ![GitHub watchers](https://img.shields.io/github/watchers/jishnurajendran/Numerical-analysis?style=for-the-badge)
+
+:PROPERTIES:
+:TOC: :include all
+:END:
+
* Root Finding Methods
** [[https://en.wikipedia.org/wiki/Newton%27s_method][Newton's method]]
Newton's method (also known as the Newton–Raphson method) is a method for finding successively better approximations to the roots (or zeroes) of a real-valued function. The process is repeated as $$ x_{n+1}=x_{n}-{\frac {f(x_{n})}{f'(x_{n})}} $$
@@ -32,8 +38,8 @@ Newton–Cotes formulae, also called the Newton–Cotes quadrature rules or simp

** Predictor–Corrector Method
Predictor–Corrector methods belong to a class of algorithms designed to integrate ordinary differential equations – to find an unknown function that satisfies a given differential equation. All such algorithms proceed in two steps:
-1. The initial, "prediction" step, starts from a function fitted to the function-values and derivative-values at a preceding set of points to extrapolate ("anticipate") this function's value at a subsequent, new point.
-2. The next, "corrector" step refines the initial approximation by using the predicted value of the function and another method to interpolate that unknown function's value at the same subsequent point.
+1. The initial, /"prediction"/ step, starts from a function fitted to the function-values and derivative-values at a preceding set of points to extrapolate ("anticipate") this function's value at a subsequent, new point.
+2. The next, /"corrector"/ step refines the initial approximation by using the predicted value of the function and another method to interpolate that unknown function's value at the same subsequent point.
** Trapezoidal method
Trapezoidal rule is a technique for approximating the definite integral. The trapezoidal rule works by approximating the region under the graph of the function f(x) as a trapezoid and calculating its area.
$$ \int _{a}^{b}f(x)\,dx\approx \sum _{k=1}^{N}{\frac {f(x_{k-1})+f(x_{k})}{2}}\Delta x_{k}$$
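For equal subintervals the sum above reduces to a simple loop (illustrative Python, not code from this repository):

```python
import math

def trapezoid(f, a, b, n):
    """Composite trapezoidal rule with n equal subintervals of width h."""
    h = (b - a) / n
    s = (f(a) + f(b)) / 2  # endpoints carry weight 1/2
    for k in range(1, n):
        s += f(a + k * h)  # interior points carry weight 1
    return s * h

# Integral of sin over [0, pi] is exactly 2
area = trapezoid(math.sin, 0.0, math.pi, 1000)
```

The error shrinks as O(h^2), so halving the step size roughly quarters the error.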

num-ana.png

194 KB

0 commit comments
