CodeMathFusion

🚀 Level 4 - Topic 4: Gradient, Directional Derivatives, and Optimization 🌟

Maximizing and Navigating Multivariable Landscapes

1) Introduction: Exploring Multivariable Changes 📚

Building on partial derivatives, this topic delves deeper into multivariable calculus with the gradient, directional derivatives, and optimization. These concepts help us understand the steepest ascent, the rate of change in any direction, and how to find maximum or minimum values of functions with multiple variables. They are crucial in fields like machine learning, physics, and economics for optimizing systems and analyzing slopes.

We’ll cover:

  • Gradient Vector: The direction of the steepest increase.
  • Directional Derivatives: Change along any specified direction.
  • Optimization: Finding critical points and extrema.
Let’s navigate this terrain step by step! 🎉

Quick Recap: Partial derivatives measure change with respect to one variable; now we combine them for multiple directions.

2) The Gradient Vector 🎓

The gradient of a function \( f(x, y) \) is a vector that points in the direction of steepest increase; its magnitude gives the maximum rate of change at that point. It is computed from the partial derivatives.

Definition 19.1: Gradient

The gradient of \( f(x, y) \) is \( \nabla f = \left( \frac{\partial f}{\partial x}, \frac{\partial f}{\partial y} \right) \). For \( f(x, y, z) \), it’s \( \nabla f = \left( \frac{\partial f}{\partial x}, \frac{\partial f}{\partial y}, \frac{\partial f}{\partial z} \right) \).

**Properties**: The gradient is perpendicular to level curves/surfaces, and \( \nabla f \cdot \mathbf{u} \) gives the directional derivative (more on this later).

Example 1: Computing the Gradient

Find \( \nabla f \) for \( f(x, y) = x^2 + y^2 \).

  • \( \frac{\partial f}{\partial x} = 2x \), \( \frac{\partial f}{\partial y} = 2y \).
  • \( \nabla f = (2x, 2y) \).

Answer: \( \nabla f = (2x, 2y) \).
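Gradients like this are easy to sanity-check numerically. The sketch below (pure Python; the helper name `grad_fd` is illustrative, not standard) approximates \( \nabla f \) with central finite differences and compares it with the analytic \( (2x, 2y) \):

```python
def f(x, y):
    return x**2 + y**2

def grad_fd(f, x, y, h=1e-6):
    """Approximate (df/dx, df/dy) with central differences."""
    fx = (f(x + h, y) - f(x - h, y)) / (2 * h)
    fy = (f(x, y + h) - f(x, y - h)) / (2 * h)
    return fx, fy

fx, fy = grad_fd(f, 3.0, -2.0)
print(fx, fy)  # close to the analytic (2*3, 2*(-2)) = (6, -4)
```

Central differences are exact for quadratics up to floating-point rounding, so the match here is very tight.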

Example 2: Gradient at a Point

Evaluate \( \nabla f \) at \( (1, -1) \) for \( f(x, y) = xy + y^2 \).

  • \( \frac{\partial f}{\partial x} = y \), \( \frac{\partial f}{\partial y} = x + 2y \).
  • At \( (1, -1) \): \( \nabla f = (-1, 1 + 2(-1)) = (-1, -1) \).

Answer: \( \nabla f(1, -1) = (-1, -1) \).

3) Directional Derivatives 📐

The directional derivative measures the rate of change of a function along a specific direction defined by a unit vector \( \mathbf{u} \). It generalizes partial derivatives.

Definition 19.2: Directional Derivative

The directional derivative of \( f \) at \( (x, y) \) in direction \( \mathbf{u} = (u_1, u_2) \) (unit vector) is \( D_{\mathbf{u}} f = \nabla f \cdot \mathbf{u} \), or \( D_{\mathbf{u}} f = \frac{\partial f}{\partial x} u_1 + \frac{\partial f}{\partial y} u_2 \).

**Steps**: Compute \( \nabla f \), ensure \( \mathbf{u} \) is a unit vector (\( |\mathbf{u}| = 1 \)), and take the dot product.

Example 3: Basic Directional Derivative

Find \( D_{\mathbf{u}} f \) for \( f(x, y) = x^2 + y^2 \) at \( (1, 1) \) in direction \( \mathbf{u} = \left(\frac{\sqrt{2}}{2}, \frac{\sqrt{2}}{2}\right) \).

  • \( \nabla f = (2x, 2y) \), at \( (1, 1) \): \( \nabla f = (2, 2) \).
  • \( D_{\mathbf{u}} f = (2, 2) \cdot \left(\frac{\sqrt{2}}{2}, \frac{\sqrt{2}}{2}\right) = 2 \cdot \frac{\sqrt{2}}{2} + 2 \cdot \frac{\sqrt{2}}{2} = \sqrt{2} + \sqrt{2} = 2\sqrt{2} \).

Answer: \( D_{\mathbf{u}} f = 2\sqrt{2} \).
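Example 3 can be replayed in a few lines: assuming only the analytic gradient computed above, the dot product with the unit vector reproduces \( 2\sqrt{2} \):

```python
import math

def grad_f(x, y):
    return (2 * x, 2 * y)  # analytic gradient of f(x, y) = x**2 + y**2

u = (math.sqrt(2) / 2, math.sqrt(2) / 2)  # unit vector along y = x
gx, gy = grad_f(1.0, 1.0)
d_u = gx * u[0] + gy * u[1]  # D_u f = grad f . u
print(d_u)  # 2*sqrt(2), about 2.828
```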

Example 4: Direction of Steepest Descent

For \( f(x, y) = x^2 - y^2 \) at \( (1, 1) \), find the directional derivative in \( \mathbf{u} = \left(-\frac{\sqrt{2}}{2}, \frac{\sqrt{2}}{2}\right) \).

  • \( \nabla f = (2x, -2y) \), at \( (1, 1) \): \( \nabla f = (2, -2) \).
  • \( D_{\mathbf{u}} f = (2, -2) \cdot \left(-\frac{\sqrt{2}}{2}, \frac{\sqrt{2}}{2}\right) = 2 \cdot -\frac{\sqrt{2}}{2} + (-2) \cdot \frac{\sqrt{2}}{2} = -\sqrt{2} - \sqrt{2} = -2\sqrt{2} \).
  • Interpretation: the negative value means \( f \) decreases along \( \mathbf{u} \).

Answer: \( D_{\mathbf{u}} f = -2\sqrt{2} \).

4) Optimization of Multivariable Functions 🔍

Optimization finds the maximum or minimum values of a function. For multivariable functions, we use critical points where the gradient is zero or undefined.

Definition 19.3: Critical Points and Extrema

Critical points occur where \( \nabla f = (0, 0) \) (or where \( \nabla f \) is undefined). To classify them, use the second derivative test with the Hessian determinant \( D = f_{xx}f_{yy} - (f_{xy})^2 \): if \( D > 0 \) and \( f_{xx} > 0 \), the point is a local minimum; if \( D > 0 \) and \( f_{xx} < 0 \), a local maximum; if \( D < 0 \), a saddle point; if \( D = 0 \), the test is inconclusive.

**Steps**: Set \( f_x = 0 \) and \( f_y = 0 \), solve for critical points, and apply the test.

Example 5: Finding Critical Points

Find and classify critical points of \( f(x, y) = x^2 + y^2 - 4x - 6y + 13 \).

  • \( f_x = 2x - 4 = 0 \), \( f_y = 2y - 6 = 0 \).
  • Solve: \( x = 2 \), \( y = 3 \).
  • Second derivatives: \( f_{xx} = 2 \), \( f_{yy} = 2 \), \( f_{xy} = 0 \).
  • Discriminant: \( D = 2 \cdot 2 - 0^2 = 4 > 0 \), \( f_{xx} = 2 > 0 \), so minimum.
  • Value: \( f(2, 3) = 4 + 9 - 8 - 18 + 13 = 0 \).

Answer: Minimum at \( (2, 3) \), value 0.
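The classification logic above is mechanical enough to script. A minimal sketch (the function names `classify` and `f` are illustrative) applies the second derivative test to the critical point found in Example 5:

```python
def classify(fxx, fyy, fxy):
    """Second derivative test: D = fxx*fyy - fxy**2."""
    D = fxx * fyy - fxy**2
    if D > 0:
        return "minimum" if fxx > 0 else "maximum"
    if D < 0:
        return "saddle"
    return "inconclusive"

def f(x, y):
    return x**2 + y**2 - 4 * x - 6 * y + 13

# Critical point (2, 3) with fxx = 2, fyy = 2, fxy = 0 from the text:
print(classify(2, 2, 0), f(2, 3))  # minimum 0
```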

Example 6: Real-World Optimization

Maximize \( P(x, y) = 100x + 150y - 2x^2 - y^2 \) subject to the budget constraint \( 2x + y \leq 50 \) (with \( x, y \geq 0 \)).

  • Unconstrained: \( P_x = 100 - 4x = 0 \), \( P_y = 150 - 2y = 0 \), giving \( x = 25 \), \( y = 75 \).
  • Check the constraint: \( 2 \cdot 25 + 75 = 125 > 50 \), so this point is infeasible; since \( P \) is concave, the maximum lies on the boundary \( 2x + y = 50 \).
  • Lagrange multipliers with \( g(x, y) = 2x + y \): solve \( \nabla P = \lambda \nabla g \), i.e. \( 100 - 4x = 2\lambda \) and \( 150 - 2y = \lambda \), together with \( 2x + y = 50 \).
  • Eliminate \( \lambda \): \( 100 - 4x = 2(150 - 2y) \) gives \( y = x + 50 \); substituting into \( 2x + y = 50 \) gives \( x = 0 \), \( y = 50 \).
  • Value: \( P(0, 50) = 7500 - 2500 = 5000 \); the other corner \( (25, 0) \) gives only \( P = 1250 \).

Answer: Maximum \( P = 5000 \) at \( (0, 50) \) under the constraint.
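Because the feasible boundary is one-dimensional, the Lagrange result can be cross-checked with a brute-force sweep along \( 2x + y = 50 \) (a sanity check, not a general method):

```python
def P(x, y):
    return 100 * x + 150 * y - 2 * x**2 - y**2

# Sweep x from 0 to 25 in steps of 0.01 along the boundary y = 50 - 2x.
best = max(((P(i / 100, 50 - 2 * (i / 100)), i / 100) for i in range(2501)),
           key=lambda t: t[0])
print(best)  # (5000.0, 0.0): maximum P = 5000 at x = 0, so y = 50
```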

5) Advanced Applications and Techniques 🔍

Advanced applications include constrained optimization (Lagrange multipliers) and gradient fields in physics. The negative gradient also drives gradient descent, the workhorse of optimization in machine learning.

Definition 19.4: Lagrange Multipliers

To maximize \( f(x, y) \) subject to \( g(x, y) = c \), solve \( \nabla f = \lambda \nabla g \) and \( g(x, y) = c \).

Example 7: Constrained Optimization

Maximize \( f(x, y) = xy \) subject to \( x + y = 10 \).

  • \( \nabla f = (y, x) \), \( \nabla g = (1, 1) \).
  • \( y = \lambda \), \( x = \lambda \), \( x + y = 10 \).
  • Solve: \( 2\lambda = 10 \), \( \lambda = 5 \), \( x = 5 \), \( y = 5 \).
  • Value: \( f(5, 5) = 25 \).

Answer: Maximum \( f = 25 \) at \( (5, 5) \).
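Example 7 can also be verified by substitution: with \( y = 10 - x \), the objective becomes \( x(10 - x) \), and a quick integer scan confirms the peak:

```python
def f_on_constraint(x):
    return x * (10 - x)  # f(x, y) = x*y with y = 10 - x substituted

best_x = max(range(11), key=f_on_constraint)
print(best_x, f_on_constraint(best_x))  # 5 25
```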

Example 8: Gradient in Physics

For \( V(x, y) = -x^2 - y^2 \), find the force field (negative gradient).

  • \( \nabla V = (-2x, -2y) \).
  • Force: \( \mathbf{F} = -\nabla V = (2x, 2y) \).

Answer: \( \mathbf{F} = (2x, 2y) \).
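The same finite-difference idea checks the force field: \( \mathbf{F} = -\nabla V \) evaluated numerically (the helper name `force` is illustrative) matches the analytic \( (2x, 2y) \):

```python
def V(x, y):
    return -x**2 - y**2

def force(x, y, h=1e-6):
    """F = -grad V, approximated with central differences."""
    Fx = -(V(x + h, y) - V(x - h, y)) / (2 * h)
    Fy = -(V(x, y + h) - V(x, y - h)) / (2 * h)
    return Fx, Fy

Fx, Fy = force(1.5, -0.5)
print(Fx, Fy)  # close to (2*1.5, 2*(-0.5)) = (3, -1)
```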

6) Practice Questions 🎯

Fundamental Practice Questions 🌱

Instructions: Compute the gradient or directional derivative for the given functions. 📚

\( f(x, y) = x^2 + y^2 \), find \( \nabla f \) at \( (1, 1) \)

\( f(x, y) = xy \), find \( \nabla f \) at \( (2, -1) \)

\( f(x, y) = e^x \sin(y) \), find \( \nabla f \) at \( (0, \pi/2) \)

\( f(x, y) = x^2 - 2xy \), \( \mathbf{u} = (1, 0) \), find \( D_{\mathbf{u}} f \) at \( (1, 1) \)

\( f(x, y) = \ln(x + y) \), find \( \nabla f \) at \( (1, 1) \)

\( f(x, y) = x^3 + y^3 \), \( \mathbf{u} = \left(\frac{1}{\sqrt{2}}, \frac{1}{\sqrt{2}}\right) \), find \( D_{\mathbf{u}} f \) at \( (1, 1) \)

\( f(x, y) = \cos(x) + \sin(y) \), find \( \nabla f \) at \( (0, 0) \)

\( f(x, y) = \frac{1}{x} + y^2 \), \( \mathbf{u} = \left(0, 1\right) \), find \( D_{\mathbf{u}} f \) at \( (1, 1) \)

\( f(x, y) = x^2 y \), find \( \nabla f \) at \( (2, 3) \)

\( f(x, y) = e^{-(x^2 + y^2)} \), \( \mathbf{u} = \left(\frac{\sqrt{3}}{2}, \frac{1}{2}\right) \), find \( D_{\mathbf{u}} f \) at \( (0, 0) \)

\( f(x, y) = \sqrt{x^2 + y^2} \), find \( \nabla f \) at \( (3, 4) \)

Challenging Practice Questions 🌟

Instructions: Solve these advanced problems involving gradients, directional derivatives, or optimization. 🧠

Find \( \nabla f \) and \( D_{\mathbf{u}} f \) for \( f(x, y) = x^2 y + y^3 \) at \( (1, 2) \) with \( \mathbf{u} = \left(\frac{1}{\sqrt{5}}, \frac{2}{\sqrt{5}}\right) \), and interpret.

Compute critical points of \( f(x, y) = 2x^2 + 3xy + y^2 - 4x - 5y + 3 \) and classify using the second derivative test.

Determine the direction of steepest descent for \( f(x, y) = e^{-x} \cos(y) \) at \( (0, \pi/2) \).

Evaluate \( \nabla f \) and find the maximum directional derivative for \( f(x, y) = x^3 - 3xy + y^2 \) at \( (1, 1) \).

Optimize \( f(x, y) = x^2 + 2y^2 - 2x - 4y + 1 \) subject to \( x + y = 1 \) using Lagrange multipliers.

7) Summary & Cheat Sheet 📋

7.1) Gradient

\( \nabla f = \left( \frac{\partial f}{\partial x}, \frac{\partial f}{\partial y} \right) \), direction of steepest increase.

7.2) Directional Derivative

\( D_{\mathbf{u}} f = \nabla f \cdot \mathbf{u} \), rate of change in direction \( \mathbf{u} \).

7.3) Optimization

Critical points from \( \nabla f = 0 \), classify with \( D = f_{xx}f_{yy} - (f_{xy})^2 \).

You’ve mastered gradients and optimization! Next, we’ll explore double integrals. 🎉