Math.StackExchange (MSE) is an extraordinary source of high-quality answers to almost any non-research-level math question. A recent question by the user belgi, called A list of basic integrals, got me thinking a bit. It is not in the general habit of MSE to allow such big-list or soft questions. But it is an unfortunate fact that many very good tidbits get lost in the sea of questions (over 55,000 questions now).

So I decided to begin a post collecting some of the gems on integration techniques that I come across. I don’t mean this to be a catch-all reference (for a generic integration reference, I again recommend Paul’s Online Math Notes and his Calculus Cheat Sheet). And I hope not to cross anyone, nor do I claim that mixedmath is to be the blog of MSE. But there are some really clever things done there to which I, for one, would like a quick reference.

Please note that this is one of those posts-in-progress. If you know of another really slick bit that I missed, please let me know. And as I come across more, I’ll update this page accordingly.

#### In the question A self-contained proof that $\displaystyle \sum_{n = 1}^\infty \frac{1}{n^p}$ converges for $p > 1$, by the user admchrch, there are two really nice responses:

First, joriki presented an argument that bounds the partial sums $S_{2k+1}$ in terms of themselves. In particular, he showed that

$\displaystyle S_{2k+1} = \sum_{n=1}^{2k+1} \frac{1}{n^p}$

$\displaystyle= 1+\sum_{i=1}^k\left(\frac{1}{(2i)^p}+\frac{1}{(2i+1)^p}\right)$

$\displaystyle < 1+\sum_{i=1}^k\frac{2}{(2i)^p}$

$\displaystyle =1+2^{1-p}S_k$

$\displaystyle \leq 1+2^{1-p}S_{2k+1}$

Solving for $S_{2k+1}$ shows that $S_{2k+1} < \frac{1}{1 - 2^{1-p}}$, which is independent of $k$ (and finite, since $2^{1-p} < 1$ when $p > 1$).
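This bound is easy to check numerically. Here is a quick sanity check of my own (not from the thread); the names `partial_sum`, `p`, and `bound` are just for illustration:

```python
import math

def partial_sum(N, p):
    """S_N = sum_{n=1}^N 1/n^p."""
    return sum(1.0 / n**p for n in range(1, N + 1))

p = 2.0
bound = 1.0 / (1.0 - 2.0**(1 - p))  # joriki's bound; equals 2 for p = 2
for N in (10, 1000, 100000):
    assert partial_sum(N, p) < bound  # every partial sum stays under the bound
```

For $p = 2$ the partial sums approach $\pi^2/6 \approx 1.645$, comfortably below the bound of $2$.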

Second, Pacciu wrote up an answer presenting a convergence test similar to Cauchy’s, but a bit different. His criterion states that if $(a_n)$ is a sequence of positive numbers, then the series $\sum a_n$ diverges if

$\displaystyle \lim \dfrac{\ln \frac{1}{a_n}}{\ln n} = l< 1$

and converges if the limit $l > 1$.
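To see the criterion in action, here is a small numerical illustration of my own; the helper `criterion_limit` is hypothetical, and it simply approximates the limit by evaluating the ratio at a single large $n$:

```python
import math

def criterion_limit(a, n=10**6):
    """Approximate l = lim ln(1/a_n) / ln(n) by evaluating at one large n."""
    return math.log(1.0 / a(n)) / math.log(n)

# a_n = 1/n^2: l = 2 > 1, so sum a_n converges.
l_conv = criterion_limit(lambda n: 1.0 / n**2)

# a_n = 1/sqrt(n): l = 1/2 < 1, so sum a_n diverges.
l_div = criterion_limit(lambda n: 1.0 / math.sqrt(n))
```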

#### In a question of the user James about showing that $\int_{-a}^a \frac{f(x)}{1 + e^x} \, dx = \int_0^a f(x) \, dx$ when $f$ is even, there was one particularly nice answer:

There is an exceedingly clever answer from a user who, I believe, wishes to remain anonymous, and whom I will call user9413. The answer is short and sweet:

$I =\int\limits_{-a}^{a}\frac{f(x)}{1+e^{x}} \ dx \quad \cdots(1)$
$I = \int\limits_{-a}^{a} \frac{f(x)}{1+e^{-x}} \ dx \qquad\qquad \Bigl[ \small\because \int\limits_{a}^{b}f(x) = \int\limits_{a}^{b}f(a+b-x) \ \Bigr] \quad \cdots (2)$
$\Longrightarrow 2I = \int\limits_{-a}^{a} \biggl[ \frac{f(x)}{1+e^{x}} + \frac{e^{x}\cdot f(x)}{1+e^{x}} \biggr] \ dx \quad\qquad \cdots (1) + (2)$
$=\int\limits_{-a}^{a} f(x) \ dx = 2 \int\limits_{0}^{a} f(x) \ dx \qquad \Bigl[ \small \text{since}\ f \ \text{is even so} \ \int\limits_{-a}^{a} f(x) = 2\int\limits_{0}^{a} f(x) \Bigr]$
I should note that the answers ([1] and [2]) of Jonas and Zarrax are good too.
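The identity itself is easy to test numerically. Below is a quick check of my own with $f(x) = x^2$ (any even $f$ would do; the hand-rolled `simpson` helper is just an illustrative choice):

```python
import math

def simpson(f, a, b, n=2000):
    """Composite Simpson's rule with n (even) subintervals."""
    h = (b - a) / n
    s = f(a) + f(b)
    for k in range(1, n):
        s += (4 if k % 2 else 2) * f(a + k * h)
    return s * h / 3

f = lambda x: x**2  # an even function, chosen for illustration
a = 3.0
lhs = simpson(lambda x: f(x) / (1 + math.exp(x)), -a, a)
rhs = simpson(f, 0, a)  # = a^3 / 3 = 9
assert abs(lhs - rhs) < 1e-6
```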

#### In a question of the user grestudying12345 on Integration Techniques, which was unfortunately largely a bust, there were two answers of note:

I hadn’t ever really thought to knock out repeated integration by parts all at once before, but that’s exactly what Hans Lundmark does in his answer. I don’t include the text of the answer here because it’s easy to reproduce, but it’s the idea that struck me. Why hadn’t I heard of or considered such a thing?
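I won’t spoil the answer, but the general shape of the idea, as I understand it, is the closed form for $n$-fold integration by parts: writing $G_1 = \int g$ and $G_{k+1} = \int G_k$,

$\displaystyle \int f g \, dx = f G_1 - f' G_2 + f'' G_3 - \cdots + (-1)^n \int f^{(n)} G_n \, dx$

so when $f$ is a polynomial the process terminates in one pass; for instance, $\int x^3 e^x \, dx = (x^3 - 3x^2 + 6x - 6)e^x + C$.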

Linda Collins Sandgren notes that when attempting to integrate rational expressions of sine and cosine, the substitution $u = \tan \frac{x}{2}$ always leads to a rational function in $u$, and in particular

$\sin{x}=2\cos{\frac{x}{2}}\sin{\frac{x}{2}}=\frac{2\cos{\frac{x}{2}}\sin{\frac{x}{2}}}{\cos^2{\frac{x}{2}}+\sin^2{\frac{x}{2}}}=\frac{2u}{1+u^2}$

$\cos{x}=\cos^2{\frac{x}{2}}-\sin^2{\frac{x}{2}}=\frac{\cos^2{\frac{x}{2}}-\sin^2{\frac{x}{2}}}{\cos^2{\frac{x}{2}}+\sin^2{\frac{x}{2}}}=\frac{1-u^2}{1+u^2}$
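These identities are easy to spot-check numerically (a quick check of my own):

```python
import math

# Spot-check the half-angle identities at a few sample points
# (avoiding odd multiples of pi, where u = tan(x/2) blows up).
for x in (0.3, 1.0, 2.5, -1.2):
    u = math.tan(x / 2)
    assert abs(math.sin(x) - 2 * u / (1 + u**2)) < 1e-12
    assert abs(math.cos(x) - (1 - u**2) / (1 + u**2)) < 1e-12
```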

#### The user Aryabhata asked a question about the convergence of $\sqrt{n}\, x_n$, where $x_{n+1} = \sin(x_n)$:

While Martin Sleziak wrote up an answer relying ultimately on an application of L’Hopital’s rule and the fact that $\lim (a_{n+1} - a_n) = a \implies \lim \dfrac{a_n}{n} = a$, I’d like to focus on David Speyer’s answer (found here), that goes a bit like this:

For small $x$, we know $\sin x = x - x^3/6 + O(x^5)$. Set $y_n = 1/x_n^2$. Then we have

$\displaystyle 1/x_{n+1}^2 = x_n^{-2}\left(1 - x_n^2/6 + O(x_n^4)\right)^{-2} = 1/x_n^2 + 1/3 + O(x_n^2)$

So $\displaystyle y_{n+1} = y_n + 1/3 + O(y_n^{-1})$ and $y_n = n/3 + O\left( \sum_{k=1}^n y_k^{-1} \right)$ and $\displaystyle \frac{1}{n}y_n = \frac{1}{3} + \frac{1}{n} O\left( \sum_{k=1}^n y_k^{-1}\right)$

We know $x_n \to 0$, so $y_n^{-1} \to 0$, and thus the average $\frac{1}{n} \sum_{k=1}^n y_k^{-1}$ goes to zero. Thus $\lim y_n / n = 1/3$, and using the continuity of $\frac{1}{\sqrt{t}}$ and transforming back, we get $\sqrt{n}\, x_n \to \sqrt{3}$.
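The limit is slow but visible numerically; here is a quick check of my own (the starting point $x_0 = 1$ is arbitrary):

```python
import math

x = 1.0
n = 100000
for _ in range(n):
    x = math.sin(x)  # iterate x_{k+1} = sin(x_k)

value = math.sqrt(n) * x  # approaches sqrt(3); the error decays roughly like log(n)/n
```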

David Speyer summarized my general reaction nicely in a comment: “PS This is a good example of why I find the $O()$ notation insanely more useful than limits.”

As an aside, there’s another good use of big-O notation by Qiaochu (whose blog Annoying Precision has a link on my sidebar) in his answer to kuch nahi’s question on expanding $\cos^{-1}(\cos^2 x)$.

#### Aryabhata wrote a slick answer to a classic integral asked by user9413 (again, the generic username), on why $\displaystyle\int_0^\infty \frac{\sin x}{x} \, dx = \frac{\pi}{2}$:

This is often covered in a first course on complex analysis, but it can be done classically. At the risk of plugging myself, I wrote a very late answer using infinite series but no complicated argument.

The answer by Aryabhata, however, is exceptional, especially in that I’d never seen it before. It all boils down to one idea: Notice that $\displaystyle \int_0^\infty e^{-xy} \sin x dy = \frac{\sin x }{x}$. Then one needs to justify switching the order of integration, $\int_{0}^{\infty} \Bigg(\int_{0}^{\infty} e^{-xy} \sin x \,dy \Bigg)\, dx = \int_{0}^{\infty} \Bigg(\int_{0}^{\infty} e^{-xy} \sin x \,dx \Bigg)\,dy$, and the right side can be solved easily by integration by parts.
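A quick numerical check of the inner evaluation (my own; the truncation point and Simpson’s-rule helper are illustrative choices): integration by parts gives $\int_0^\infty e^{-xy} \sin x \, dx = \frac{1}{1+y^2}$, and then $\int_0^\infty \frac{dy}{1+y^2} = \frac{\pi}{2}$.

```python
import math

def simpson(f, a, b, n=4000):
    """Composite Simpson's rule with n (even) subintervals."""
    h = (b - a) / n
    s = f(a) + f(b)
    for k in range(1, n):
        s += (4 if k % 2 else 2) * f(a + k * h)
    return s * h / 3

y = 0.7  # an arbitrary sample value of y
# Truncating at x = 60 is safe: the integrand decays like e^{-0.7 x}.
inner = simpson(lambda x: math.exp(-x * y) * math.sin(x), 0, 60)
assert abs(inner - 1 / (1 + y**2)) < 1e-6  # matches 1/(1 + y^2)
```

Integrating $\frac{1}{1+y^2}$ over $(0, \infty)$ then gives $\arctan y \big|_0^\infty = \frac{\pi}{2}$.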

#### Another challenging integral, $\displaystyle \int_0^\infty \ln(1 - e^{-x})dx$, was asked by Jack Rousseau. The solution itself is interesting, but the key method behind it is the common plan to expand in series:

Unfortunately, the magic of Anon’s posted answer is reduced once we know to expand in series, just as the satisfaction of solving an assigned homework problem is lost when the method of solution is prescribed (and, when done in primary and secondary schools, this perhaps contributes to the widespread belief that math is nothing more than arithmetic, with no creativity). But he did the following:

$\displaystyle -\int_0^\infty \ln(1-e^{-x})dx=\int_0^\infty\left(e^{-x}+\frac{e^{-2x}}{2}+\frac{e^{-3x}}{3}+\cdots\right)dx$

$\displaystyle =\int_0^\infty e^{-x}dx+\frac{1}{2}\int_0^\infty e^{-2x}dx+\frac{1}{3}\int_0^\infty e^{-3x}dx+\cdots$

$\displaystyle =1+\frac{1}{2}\cdot\frac{1}{2}+\frac{1}{3}\cdot\frac{1}{3}+\frac{1}{4}\cdot\frac{1}{4}+\cdots = \frac{\pi^2}{6}$

But it’s still a bit cool because the sum is the Riemann zeta value $\zeta(2)$, a pretty big bonus.
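The value is easy to confirm numerically (a check of my own): substituting $u = e^{-x}$ turns the integral into $\int_0^1 -\frac{\ln(1-u)}{u} \, du$, which a midpoint rule handles, since the midpoints never touch the singular endpoints.

```python
import math

N = 200000
h = 1.0 / N
total = 0.0
for k in range(N):
    u = (k + 0.5) * h  # midpoints avoid u = 0 and u = 1
    total += -math.log(1 - u) / u * h

# total is close to pi^2/6 ≈ 1.6449
```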

(probably still growing)