# Example Sage lab problems

## Calculus

### Calc I

Problem 1: We have interpreted $\lim_{x\to a}f(x)=L$ as "$f(x)$ gets arbitrarily close to $L$ as $x$ gets close to $a$." Here we want to see how close $x$ needs to be to $a$ in order to make $f(x)$ close to $L$ or discover that no matter how close $x$ is to $a$, $f(x)$ does not approach the alleged value or any single value.

1. For $f(x)=x^2$, we know that $\lim_{x\to 2}~x^2=4$. Estimate for which $x>0$ we have $|f(x)-4|<1$ by graphing the function over an appropriate range of values. Is your solution an interval containing $x=2$?

2. Estimate for which $x>0$ we have $|f(x)-4|<1$ by solving the inequality in Sage.

3. As in parts 1 and 2, pick 3 (very) small values $\epsilon>0$ and determine which $x>0$ make $|f(x)-4|<\epsilon$. Are your solutions intervals containing $x=2$?

4. For $g(x)=\sin\left(\frac{1}{x}\right)$, determine whether or not $\lim_{x\to 0}~g(x)$ can exist.

{{{id=201|
plot(x^2, (x, 1.75, 2.15))
///
}}}

{{{id=219|
# |x^2 - 4| < 1 expressed as a pair of inequalities
solve([x^2 - 4 < 1, x^2 - 4 > -1], x)
///
}}}
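For part 3, the intervals can also be found by hand: for $0<\epsilon<4$, $|x^2-4|<\epsilon$ with $x>0$ holds exactly when $\sqrt{4-\epsilon}<x<\sqrt{4+\epsilon}$. A quick plain-Python check of this (a sketch outside the Sage worksheet; the function name is ours):

```python
# For eps > 0, |x^2 - 4| < eps with x > 0 exactly when
# sqrt(4 - eps) < x < sqrt(4 + eps); each such interval contains x = 2.
import math

def interval_for_eps(eps):
    """Return the open interval of x > 0 where |x^2 - 4| < eps."""
    return (math.sqrt(4 - eps), math.sqrt(4 + eps))

for eps in (0.1, 0.01, 0.001):
    lo, hi = interval_for_eps(eps)
    print(eps, lo, hi, lo < 2 < hi)
```

Note how the intervals shrink toward $x=2$ as $\epsilon$ shrinks, which is exactly the behavior the limit definition describes.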

Problem 2: Using the Intermediate Value Theorem, we have seen how to test whether or not a continuous function $f(x)$ has a root, but the theorem does not (directly) give us a method to find it. A procedure called Newton's method is a tool that can help us find a root. It is given by defining a sequence (see also Section 4.8, pp. 269-272 of the book) $$x_{n+1}=x_n-\frac{f(x_n)}{f'(x_n)},$$ with the successive terms (ideally) getting closer and closer to the zero. The prerequisite for Newton's method is that we can make a "good guess" for the zero. In the following, let $h(x)=e^{\sqrt{x}}$ and $g(x)=x^4$.

1. Find an interval for which the Intermediate Value Theorem shows that the two functions must intersect.

2. Use the interval in part (a) to make a guess at a value for the desired $x$.

3. Apply Newton's method to get approximations for the desired value for $x$.

{{{id=220|
f(x) = exp(sqrt(x)) - x^4   # a root of f is an intersection of h and g
///
}}}

{{{id=210|
# the Newton iteration map; note that naming it N would shadow Sage's
# numerical-approximation function N()
newt(x) = x - f(x)/diff(f(x), x)
show(newt)
///
}}}

{{{id=222|
a = newt(newt(newt(1.75)))
show([a, f(a)])
///
}}}
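The same iteration can be sketched in plain Python (outside the Sage worksheet), with the derivative $f'(x)=\frac{e^{\sqrt{x}}}{2\sqrt{x}}-4x^3$ computed by hand; the function names here are ours:

```python
# Newton's method for f(x) = exp(sqrt(x)) - x^4, starting from x0 = 1.75.
import math

def f(x):
    return math.exp(math.sqrt(x)) - x**4

def fprime(x):
    # hand-computed derivative of f
    return math.exp(math.sqrt(x)) / (2 * math.sqrt(x)) - 4 * x**3

def newton(x0, steps=10):
    x = x0
    for _ in range(steps):
        x = x - f(x) / fprime(x)
    return x

root = newton(1.75)
print(root, f(root))   # f(root) should be very close to 0
```

A handful of iterations already drives $f(x_n)$ to (numerically) zero, illustrating the rapid convergence Newton's method is known for when the initial guess is good.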

### Calc II

Problem 1: The series $\sum_{n=1}^{\infty}\frac{1}{n}$ is called the harmonic series.

1. Determine, by trial and error, the smallest $N$ such that $\sum_{n=1}^{N}\frac{1}{n}~>~2$

2. Determine, by trial and error, the smallest $N$ such that $\sum_{n=1}^{N}\frac{1}{n}~>~6$

3. Determine, by trial and error, the smallest $N$ such that $\sum_{n=1}^{N}\frac{1}{n}~>~10$

4. Do you think this series converges or diverges?

{{{id=202|
RR(sum(1/x, x, 1, 10^5))
///
}}}

{{{id=223|
///
}}}
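The trial-and-error search in parts 1-3 can also be automated: keep adding terms until the partial sum first exceeds the threshold. A plain-Python sketch (the function name is ours):

```python
# Find the smallest N with H_N = 1 + 1/2 + ... + 1/N > threshold.
def smallest_N(threshold):
    total = 0.0
    n = 0
    while total <= threshold:
        n += 1
        total += 1.0 / n
    return n

print(smallest_N(2))   # -> 4
print(smallest_N(6))   # -> 227
print(smallest_N(10))
```

The rapidly growing answers hint at how slowly the partial sums grow, which is worth keeping in mind for part 4.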

Problem 2: For $|x|<1$, suppose we know $$\frac{1}{1-x}=\sum_{k=0}^{\infty}x^k.$$ Determine the Taylor series for each of the following functions $F(x)$ from the Taylor series for $\frac{1}{1-x}$.

1. $F(x)=(x+1)\ln(1+x)-x$.

2. $F(x)=\int\tan^{-1}(x)dx$.

{{{id=211|
F(x) = (x + 1)*log(1 + x) - x
F.taylor(x, 0, 6)
///
}}}

{{{id=227|
g(x) = 1/(1 + x)
s = g.taylor(x, 0, 5)
show(s)
///
}}}

{{{id=229|
# note: avoid naming this I, which would shadow Sage's imaginary unit
antider = integral(s, x)
expand((1 + x)*antider - x)
///
}}}

{{{id=231|
///
}}}
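For part 1, note that $F'(x)=\ln(1+x)$, so integrating $\ln(1+x)=\sum_{k\ge 1}(-1)^{k+1}x^k/k$ term by term gives $F(x)=\sum_{n\ge 2}(-1)^n\frac{x^n}{(n-1)n}$. A plain-Python numerical check of this identity (a sketch with our own helper names, not Sage output):

```python
# Compare (x+1)ln(1+x) - x against its claimed Taylor series
# sum_{n>=2} (-1)^n x^n / ((n-1)*n) at a sample point inside |x| < 1.
import math

def F(x):
    return (x + 1) * math.log(1 + x) - x

def F_series(x, terms=40):
    return sum((-1)**n * x**n / ((n - 1) * n) for n in range(2, terms + 2))

x0 = 0.3
print(F(x0), F_series(x0))   # the two values should agree closely
```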

Problem 3: Let $F(x)=\sum_{k=0}^{\infty}\frac{x^{2k}}{2^k\cdot k!}$.

1. Show that $F(x)$ has infinite radius of convergence.

2. Show that $y=F(x)$ is a solution of $y''=xy'+y$ with initial conditions $y(0)=1$ and $y'(0)=0$.

3. Plot the partial sums $S_N$ for $N=1,3,5,7$ on the same graph and describe any patterns or interesting attributes you see.

{{{id=212|
///
}}}
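For part 3, before plotting it can help to evaluate the partial sums $S_N(x)=\sum_{k=0}^{N}\frac{x^{2k}}{2^k\,k!}$ at a sample point and watch them settle down as $N$ grows. A plain-Python sketch (the function name is ours):

```python
# Evaluate the partial sums S_N(x) = sum_{k=0}^{N} x^(2k) / (2^k * k!)
# at x = 1 for the N-values used in the plotting part of the problem.
import math

def S(N, x):
    return sum(x**(2 * k) / (2**k * math.factorial(k)) for k in range(N + 1))

for N in (1, 3, 5, 7):
    print(N, S(N, 1.0))
```

The quick stabilization of these values for fixed $x$ is consistent with the infinite radius of convergence shown in part 1.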

### Multivariable Calc

Definition: The gradient of a function $f(x,y)$ is an ordered pair $\nabla f(x,y):=\left(\frac{\partial f}{\partial x}(x,y),\frac{\partial f}{\partial y}(x,y)\right)$.

Theorem: If $P=(a,b)$ is a point on the curve $g(x,y)=0$ at which $z=f(x,y)$ attains a maximum or minimum, then $\nabla f(P)=\lambda\nabla g(P)$ for some constant $\lambda$.

Main idea in the theorem: At a constrained maximum or minimum of $z=f(x,y)$, one can show that $\nabla f(P)$ is perpendicular to the contour line of $g(x,y)$ which contains the point $P$; likewise, $\nabla g(P)$ is perpendicular to that same contour line. Any two vectors in the $xy$-plane perpendicular to the same curve at the same point must be parallel.

Some useful commands:
{{{id=1|
x, y = var('x,y')

def contour_lines(G, x0, x1, y0, y1):
    # Takes as input a function G of x and y, starting x-value x0, ending
    # x-value x1, starting y-value y0, and ending y-value y1.
    # (The number of contour levels below is a placeholder; adjust as needed.)
    return contour_plot(G, (x, x0, x1), (y, y0, y1), contours=20, fill=False)

def gradplot(F, a, b, z):
    # Takes in a gradient F, x-input a, y-input b, and lastly a color choice
    # where 1 is red and 2 is green.  Plots the gradient vector F(a,b) as a
    # segment starting at the point (a,b).
    if z == 1:
        return parametric_plot([a + x*F(a, b)[0], b + x*F(a, b)[1]], (x, 0, 1), color="red")
    elif z == 2:
        return parametric_plot([a + x*F(a, b)[0], b + x*F(a, b)[1]], (x, 0, 1), color="green")
    else:
        print('Error: Last input must be 1 or 2')
///
}}}

Instructions: For each of the following problems, define the relevant functions, determine the gradients of the functions, and solve the given optimization problem (using the above theorem). After determining the optimal point $P=(a,b)$, plot the gradients at $P$ and the contour line which contains $P$ in the same $xy$-plane, and then plot the optimized function in 3-dimensional space.
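The parallel-gradients condition can be sanity-checked numerically. The example below is a hypothetical warm-up, not one of the lab problems: maximize $f(x,y)=xy$ subject to $g(x,y)=x+y-2=0$, whose maximum (by substituting $y=2-x$) is at $P=(1,1)$. A plain-Python sketch using finite-difference gradients:

```python
# Check that grad f and grad g are parallel at the constrained optimum
# P = (1, 1) of f(x,y) = x*y subject to g(x,y) = x + y - 2 = 0.
def grad(h, x, y, eps=1e-6):
    # central finite-difference approximation of the gradient of h at (x, y)
    gx = (h(x + eps, y) - h(x - eps, y)) / (2 * eps)
    gy = (h(x, y + eps) - h(x, y - eps)) / (2 * eps)
    return (gx, gy)

f = lambda x, y: x * y
g = lambda x, y: x + y - 2

gf = grad(f, 1, 1)
gg = grad(g, 1, 1)
# two plane vectors are parallel exactly when gf_x*gg_y - gf_y*gg_x = 0
print(gf, gg, gf[0] * gg[1] - gf[1] * gg[0])
```

Here the "cross product" comes out (numerically) zero, so $\nabla f(P)=\lambda\nabla g(P)$ with $\lambda=1$, matching the theorem.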