.update at 0x7fab0cfbca60> with 3 widgets\n",
" whic…"
]
},
"execution_count": 12,
"metadata": {
},
"output_type": "execute_result"
},
{
"data": {
"application/vnd.jupyter.widget-view+json": {
"model_id": "9a40564d5b25446a8c685c7da517bcf3",
"version_major": 2,
"version_minor": 0
},
"text/plain": [
"FigureWidget({\n",
" 'data': [{'colorscale': [[0.0, 'lightblue'], [1.0, 'lightblue']],\n",
" 'opacity': …"
]
},
"execution_count": 12,
"metadata": {
},
"output_type": "execute_result"
}
],
"source": [
"gradient3d_interactive(f, (x, 0, 8), (y, 0, 8), (z, 0, 6), critpt_dict, scale=(1,1,1))"
]
},
{
"cell_type": "markdown",
"metadata": {
"collapsed": false,
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"## The gradient of a function $f\\!: \\mathbb{R}^n \\to \\mathbb{R}$\n",
"\n",
"\n",
"Consider a function $f$ of two (or more) variables, say $f(x, y)$. \n",
"\n",
"We know $f$ will have two partial derivatives: \n",
"$$ \\frac{\\partial f}{\\partial x} \\qquad \\text{and} \\qquad \\frac{\\partial f}{\\partial y} $$\n",
"\n",
"\n",
" And generally, each of these partial derivatives will also be a function from $\\mathbb{R}^2$ to $\\mathbb{R}$. So if we put these two functions together into a vector: \n",
"\n",
"$$ \\begin{bmatrix} \\frac{\\partial f}{\\partial x} \\\\ \\frac{\\partial f}{\\partial y} \\end{bmatrix} $$\n",
"\n",
"the result will be a function from from $\\mathbb{R}^2$ to $\\mathbb{R}^2$... a vector field!\n",
"\n",
"This vector field is called the gradient of $f$.\n",
"\n",
"
\n",
"\n",
"
\n"
]
},
{
"cell_type": "code",
"execution_count": 13,
"metadata": {
"collapsed": false,
"slideshow": {
"slide_type": "skip"
}
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"(0, 0): Eigenvalues [10, 40]\n",
"(40/3, 0): Eigenvalues [-1240/9, -10]\n",
"(0, 20/3): Eigenvalues [-310/9, -40]\n",
"(1/2*sqrt(133) + 3/2, -1/4*sqrt(133) + 3/4): Eigenvalues [9/16*sqrt(133) - 1/16*sqrt(-810*sqrt(133) + 276670) - 45/16, 9/16*sqrt(133) + 1/16*sqrt(-810*sqrt(133) + 276670) - 45/16]\n",
"(-1/2*sqrt(133) + 3/2, 1/4*sqrt(133) + 3/4): Eigenvalues [-9/16*sqrt(133) - 1/16*sqrt(810*sqrt(133) + 276670) - 45/16, -9/16*sqrt(133) + 1/16*sqrt(810*sqrt(133) + 276670) - 45/16]\n",
"(-8, -4): Eigenvalues [-sqrt(4177) + 15, sqrt(4177) + 15]\n",
"(5, 5/2): Eigenvalues [-35, 65/4]\n"
]
}
],
"source": [
"g(x, y) = 5*x^2 + 20*y^2 - 1/4*x^3 - 2*y^3 - 1/2*x^2*y^2\n",
"H = g.derivative(2)\n",
"for x0, y0 in solve(list(g.derivative()(x, y)), (x, y)):\n",
" x0, y0 = x0.rhs(), y0.rhs()\n",
" if not (x0 in RR and y0 in RR):\n",
" continue\n",
" print(f\"({x0}, {y0}): Eigenvalues {H(x0,y0).eigenvalues()}\")"
]
},
{
"cell_type": "markdown",
"metadata": {
"collapsed": false,
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"## An example of the gradient of a function\n",
"\n",
"\n",
"Define $f\\!: \\mathbb{R}^2 \\to \\mathbb{R}$ by \n",
"$$ f(x, y) = 5x^2 + 20y^2 - \\tfrac{1}{4} x^3 - 2y^3 - \\tfrac{1}{2} x^2 y^2 $$\n",
"\n",
"
The partial derivative of $f$ with respect to $x$ is$\\quad \\frac{\\partial f}{\\partial x} = 10x - \\tfrac{3}{4} x^2 - x y^2$\n",
"\n",
"
And the partial derivative of $f$ with respect to $y$ is $\\quad \\frac{\\partial f}{\\partial y} = 40y - 6y^2 - x^2 y$\n",
"\n",
"
\n",
"\n",
"So the gradient of $f$ is a function, which we'll write as $\\mathrm{grad}f$, defined by \n",
"\n",
"$$ \\mathrm{grad}f (\\begin{bmatrix} x \\\\ y \\end{bmatrix}) = \\begin{bmatrix} 10x - \\tfrac{3}{4} x^2 - x y^2 \\\\ 40y - 6y^2 - x^2 y \\end{bmatrix} $$\n",
"
\n",
"\n",
"
\n"
]
},
{
"cell_type": "code",
"execution_count": 14,
"metadata": {
"collapsed": false,
"slideshow": {
"slide_type": "slide"
}
},
"outputs": [
{
"data": {
"application/vnd.jupyter.widget-view+json": {
"model_id": "7ce4bf96120742aca15f6b928524c3f9",
"version_major": 2,
"version_minor": 0
},
"text/plain": [
"VBox(children=(HBox(children=(Text(value='(4,2)', description='Point:'), Button(description='Add vector', styl…"
]
},
"execution_count": 14,
"metadata": {
},
"output_type": "execute_result"
}
],
"source": [
"options = dict(scalefactor=0.005, axes_scale=(1,1), label=r\"\\textrm{grad}f\")\n",
"vectorfield_interactive(g.derivative(), (x, 3.6, 6.4), (y, 1.8, 3.2), **options)"
]
},
{
"cell_type": "markdown",
"metadata": {
"collapsed": false,
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"## Notation and a few details about the gradient\n",
"\n",
"\n",
"**Notation:** The gradient of $f$ is written as $\\ \\mathrm{grad}f \\ $ or $\\ \\vec{\\nabla}f$. \n",
"\n",
"So for our previous example, we could write \n",
"\n",
"$$ \\mathrm{grad}f (x, y) = \\begin{bmatrix} 10x - \\tfrac{3}{4} x^2 - x y^2 \\\\ 40y - 6y^2 - x^2 y \\end{bmatrix} \\qquad \\text{or} \\qquad \\vec{\\nabla}f (x, y) = \\begin{bmatrix} 10x - \\tfrac{3}{4} x^2 - x y^2 \\\\ 40y - 6y^2 - x^2 y \\end{bmatrix} $$\n",
"\n",
"
\n",
"\n",
"
\n",
"\n",
"Dimensions: If $f\\!: \\mathbb{R}^n \\to \\mathbb{R}$, then $\\mathrm{grad} f\\!: \\mathbb{R}^n \\to \\mathbb{R}^n$ is defined by \n",
"\n",
"$$ \\mathrm{grad}f = \\begin{bmatrix} \\frac{\\partial f}{\\partial x} \\\\ \\frac{\\partial f}{\\partial y} \\\\ \\vdots \\end{bmatrix} $$\n",
"\n",
"
\n",
"
\n"
]
},
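{
"cell_type": "code",
"execution_count": null,
"metadata": {
"collapsed": false,
"slideshow": {
"slide_type": "skip"
}
},
"outputs": [
],
"source": [
"# Illustrative sketch (not part of the original slides); assumes g(x, y) is\n",
"# defined as in the earlier cell. The gradient is just the vector of partial\n",
"# derivatives, so Sage's g.derivative() -- already used above as the vector\n",
"# field -- should reproduce the grad f formula displayed on this slide.\n",
"show(g.derivative()(x, y))"
]
},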
{
"cell_type": "markdown",
"metadata": {
"collapsed": false,
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"## The gradient, and critical points of $f$\n",
"\n",
"\n",
"
\n",
"Recall that to find the critical points of $f$, we set both partial derivatives to $0$, and solve simultaneously. \n",
"
\n",
"\n",
"
\n",
"Note that this is the same way we find equilibrium points of a system of differential equations. So now we can say that finding the critical points of $f$ is the same as finding the equilibrium points of the $\\,\\mathrm{grad}f \\,$ vector field! \n",
"
\n",
"\n",
"
\n"
]
},
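{
"cell_type": "code",
"execution_count": null,
"metadata": {
"collapsed": false,
"slideshow": {
"slide_type": "skip"
}
},
"outputs": [
],
"source": [
"# Illustrative sketch (not part of the original slides); assumes g(x, y) is\n",
"# defined as above. Setting both partial derivatives to 0 and solving\n",
"# simultaneously gives the critical points of g -- equivalently, the\n",
"# equilibrium points of the grad g vector field.\n",
"for soln in solve(list(g.derivative()(x, y)), (x, y)):\n",
"    print(soln)"
]
},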
{
"cell_type": "markdown",
"metadata": {
"collapsed": false,
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"## How to visualize the gradient vector field"
]
},
{
"cell_type": "code",
"execution_count": 15,
"metadata": {
"collapsed": false,
"slideshow": {
"slide_type": "-"
}
},
"outputs": [
{
"data": {
"application/vnd.jupyter.widget-view+json": {
"model_id": "83814a4455874e2f8c49bfdd8d66637a",
"version_major": 2,
"version_minor": 0
},
"text/plain": [
"Interactive function .update at 0x7fab0d00b820> with 3 widgets\n",
" whic…"
]
},
"execution_count": 15,
"metadata": {
},
"output_type": "execute_result"
},
{
"data": {
"application/vnd.jupyter.widget-view+json": {
"model_id": "8d30481e8a8a42a0bf95efefb6b98d00",
"version_major": 2,
"version_minor": 0
},
"text/plain": [
"FigureWidget({\n",
" 'data': [{'colorscale': [[0.0, 'lightblue'], [1.0, 'lightblue']],\n",
" 'opacity': …"
]
},
"execution_count": 15,
"metadata": {
},
"output_type": "execute_result"
}
],
"source": [
"gradient3d_interactive(f, (x, 0, 8), (y, 0, 8), (z, 0, 6), critpt_dict, scale=(1,1,1))"
]
},
{
"cell_type": "markdown",
"metadata": {
"collapsed": false,
"slideshow": {
"slide_type": "-"
}
},
"source": [
"
"
]
},
{
"cell_type": "markdown",
"metadata": {
"collapsed": false,
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"## How to visualize the gradient vector field"
]
},
{
"cell_type": "code",
"execution_count": 16,
"metadata": {
"collapsed": false,
"slideshow": {
"slide_type": "-"
}
},
"outputs": [
{
"data": {
"application/vnd.jupyter.widget-view+json": {
"model_id": "01300574aa65411a8106b0db5a492f93",
"version_major": 2,
"version_minor": 0
},
"text/plain": [
"HBox(children=(Button(description='Zoom in', style=ButtonStyle()), FigureWidget({\n",
" 'data': [{'fill': 'tosel…"
]
},
"execution_count": 16,
"metadata": {
},
"output_type": "execute_result"
}
],
"source": [
"gradient2d_interactive(f, (x, 0, 8), (y, 0, 8), critpt_dict, scale=(1,1))"
]
},
{
"cell_type": "markdown",
"metadata": {
"collapsed": false,
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"## The direction of the gradient vector field\n",
"\n",
"\n",
"At any point in the domain of $f$, there is a gradient vector. It points *in the direction in which the graph of $f$ has its steepest slope.* \n",
"\n",
"In other words, in short, it points in the “uphill” direction. \n",
"\n",
"
This will allow us to classify critical points of $f$:
\n",
"
\n",
" - A local maximum of $f$ will be a stable equilibrium point (sink) of the $\\,\\mathrm{grad}f \\,$ vector field.
\n",
" - A local minimum of $f$ will be an unstable equilibrium point (source) of the $\\,\\mathrm{grad}f \\,$ vector field.
\n",
" - A saddle point of $f$ will be a saddle point of the $\\,\\mathrm{grad}f \\,$ vector field.
\n",
"
\n",
"\n",
"
\n"
]
},
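{
"cell_type": "code",
"execution_count": null,
"metadata": {
"collapsed": false,
"slideshow": {
"slide_type": "skip"
}
},
"outputs": [
],
"source": [
"# Illustrative sketch (not part of the original slides); assumes g(x, y) is\n",
"# defined as above. Scanning unit directions u at a sample point, the\n",
"# directional derivative grad g . u is largest when u points along grad g --\n",
"# the \"uphill\" direction of steepest slope.\n",
"gradp = vector(RR, g.derivative()(4, 2))\n",
"best = max((vector(RR, [cos(t), sin(t)]) for t in srange(0, float(2*pi), 0.01)),\n",
"           key=lambda u: gradp.dot_product(u))\n",
"print(best)\n",
"print(gradp / gradp.norm())  # best should approximate this unit vector"
]
},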
{
"cell_type": "markdown",
"metadata": {
"collapsed": false,
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"# Conclusions: \n",
"\n",
"\n",
"- For a function $f\\!: \\mathbb{R}^n \\to \\mathbb{R}$, its gradient is the vector field $\\mathrm{grad}f\\!: \\mathbb{R}^n \\to \\mathbb{R}^n$ defined by \n",
"$$ \\mathrm{grad}f (x, y, \\dotsc) = \\begin{bmatrix} \\frac{\\partial f}{\\partial x} \\\\ \\frac{\\partial f}{\\partial y} \\\\ \\vdots \\end{bmatrix} $$\n",
"\n",
"- Therefore the critical points of $f$ are the same as the equilibrium points of the gradient vector field of $f$. \n",
"\n",
"- At any point in the domain of $f$, the direction of $\\mathrm{grad}f$ at that point is the direction in which $f$ increases the fastest. (greatest rate of change, highest slope) \n",
"\n",
"- Therefore we can use the gradient of $f$ to classify the critical points of $f$: \n",
" - A local maximum of $f$ will be a stable equilibrium point (sink) of the gradient vector field of $f$. \n",
" - A local minimum of $f$ will be an unstable equilibrium point (source) of the gradient vector field of $f$. \n",
" - A saddle point of $f$ will be a saddle point of the gradient vector field of $f$. \n",
"\n",
"
\n",
"
\n",
"
\n"
]
},
{
"cell_type": "code",
"execution_count": 0,
"metadata": {
"collapsed": false,
"slideshow": {
"slide_type": "skip"
}
},
"outputs": [
],
"source": [
]
}
],
"metadata": {
"celltoolbar": "Slideshow",
"hide_input": false,
"kernelspec": {
"display_name": "SageMath 9.3",
"language": "sagemath",
"metadata": {
"cocalc": {
"description": "Open-source mathematical software system",
"priority": 10,
"url": "https://www.sagemath.org/"
}
},
"name": "sage-9.3",
"resource_dir": "/ext/jupyter/kernels/sage-9.3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.7.3"
}
},
"nbformat": 4,
"nbformat_minor": 4
}