The Partial Fraction Algorithm

The following theorem is the heart of the partial fraction method. Applied recursively to the denominator, it produces the partial fraction expansion of any rational function.

Theorem 3   Let $ F$ be a field, and let $ g$, $ f_1,$ $ f_2$ be polynomials in $ F[X]$ such that $ f_1$ and $ f_2$ are coprime and $ \partial g < \partial f_1 + \partial f_2$. Then one may write
$\displaystyle \frac{g(x)}{f_1(x) f_2(x)} = \frac{g_1(x)}{f_1(x)} +
\frac{g_2(x)}{f_2(x)},
$

with $ \partial g_1 < \partial f_1$ and $ \partial g_2 < \partial f_2$.

Indeed, as $ f_1$, $ f_2$ are coprime, we may apply Theorem 2 and deduce that there exist polynomials $ h_1$, $ h_2$ such that

$\displaystyle g(x) = h_1(x) f_2(x) + h_2(x) f_1(x).
$

Now let $ t$ be an arbitrary polynomial (to be fixed later) and define

$\displaystyle g_1(x) = h_1(x) + t(x) f_1(x); \quad g_2(x) = h_2(x) - t(x) f_2(x).
$

Then
$\displaystyle g(x) = g_1(x) f_2(x) + g_2(x) f_1(x).
$

Now given $ h_1(x), f_1(x)$, Euclid's algorithm (Theorem 1) provides $ g_1(x), t(x)$ with $ g_1 = 0$ or $ \partial g_1 < \partial f_1$ such that
$\displaystyle h_1(x) = -t(x) f_1(x) + g_1(x),
$

and we take this for the definition of our polynomial $ t(x)$. It remains to prove that $ \partial g_2 < \partial f_2$. Suppose, for a contradiction, that $ \partial g_2 \ge \partial f_2$. Then $ \partial(g_2 f_1) \ge \partial f_2 + \partial f_1$. Now $ \partial(g_1 f_2) < \partial f_1 + \partial f_2$, so in the sum $ g = g_1 f_2 + g_2 f_1$ the leading term of $ g_2 f_1$ cannot cancel, and we have that $ \partial g \ge \partial f_1 +
\partial f_2$, a contradiction. This proves the theorem.
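The proof is directly effective, so it can be turned into code. Below is a minimal Python sketch (an illustration added here, not part of the original text) using exact rational coefficients; polynomials are lists of coefficients in increasing degree. Rather than computing $ h_1, h_2$ and then the reducing polynomial $ t$ explicitly, it uses the equivalent shortcut of taking $ g_1$ to be the residue of $ g \cdot v$ modulo $ f_1$, where $ u f_1 + v f_2 = 1$ comes from the extended Euclidean algorithm.

```python
from fractions import Fraction

def trim(p):
    """Drop trailing zero coefficients; [] represents the zero polynomial."""
    while p and p[-1] == 0:
        p = p[:-1]
    return p

def psub(a, b):
    """Polynomial subtraction a - b."""
    n = max(len(a), len(b))
    return trim([(a[i] if i < len(a) else 0) - (b[i] if i < len(b) else 0)
                 for i in range(n)])

def pmul(a, b):
    """Polynomial multiplication."""
    if not a or not b:
        return []
    out = [Fraction(0)] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            out[i + j] += ai * bj
    return trim(out)

def pdivmod(a, b):
    """Euclidean division: a = q*b + r with r = 0 or deg r < deg b."""
    a, b = trim(list(a)), trim(list(b))
    if not b:
        raise ZeroDivisionError("division by the zero polynomial")
    if len(a) < len(b):
        return [], a
    q = [Fraction(0)] * (len(a) - len(b) + 1)
    r = a
    while r and len(r) >= len(b):
        c = Fraction(r[-1]) / Fraction(b[-1])
        d = len(r) - len(b)
        q[d] = c
        r = psub(r, pmul([Fraction(0)] * d + [c], b))
    return trim(q), r

def xgcd(a, b):
    """Extended Euclid: returns (d, u, v) with u*a + v*b = d = gcd(a, b)."""
    old_r, r = trim(list(a)), trim(list(b))
    old_u, u = [Fraction(1)], []
    old_v, v = [], [Fraction(1)]
    while r:
        q, rem = pdivmod(old_r, r)
        old_r, r = r, rem
        old_u, u = u, psub(old_u, pmul(q, u))
        old_v, v = v, psub(old_v, pmul(q, v))
    return old_r, old_u, old_v

def pf_split(g, f1, f2):
    """Theorem 3: write g/(f1*f2) = g1/f1 + g2/f2 with g1 = 0 or
    deg g1 < deg f1, and g2 = 0 or deg g2 < deg f2, assuming f1, f2
    coprime and deg g < deg f1 + deg f2."""
    d, _, v = xgcd(f1, f2)            # u*f1 + v*f2 = d, a nonzero constant
    v = [c / d[0] for c in v]         # rescale so that v*f2 = 1 (mod f1)
    _, g1 = pdivmod(pmul(g, v), f1)   # g1 = g*v mod f1
    g2, rem = pdivmod(psub(g, pmul(g1, f2)), f1)  # exact division by f1
    assert rem == []
    return g1, g2

# 1/((x-1)(x+1)) splits as (1/2)/(x-1) + (-1/2)/(x+1)
g1, g2 = pf_split([Fraction(1)],
                  [Fraction(-1), Fraction(1)],   # x - 1
                  [Fraction(1), Fraction(1)])    # x + 1
```

The final call recovers the familiar decomposition of $ 1/(x^2-1)$, returning the constant numerators $ 1/2$ and $ -1/2$.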

We now flesh out the algorithm by giving the full partial fraction expansion.

Corollary 1   Let $ g, f$ be polynomials such that $ \partial g < \partial f$, and write $ f= f_1^{a_1} \ldots f_r^{a_r}$, where $ f_1, \ldots, f_r$ are pairwise coprime polynomials (that is, no two of them have a non-trivial common factor). Then one can express
$\displaystyle \frac{g(x)}{f(x)} = \sum_{i=1}^r \sum_{j=1}^{a_i}
\frac{g_{ij}(x)}{(f_i(x))^j},
$

for some polynomials $ g_{ij}$ such that $ \partial g_{ij} < \partial f_i$.

Using our partial fraction algorithm (Theorem 3), a simple induction on $ r$ gives

$\displaystyle \frac{g(x)}{f(x)} = \sum_{i=1}^r \frac{h_i(x)}{(f_i(x))^{a_i}},
$

for some polynomials $ h_i$ with $ h_i=0$ or $ \partial h_i < a_i \partial
f_i$. Consequently, it suffices to show that, given $ h$, $ f$ with $ \partial h < a \partial f$, one can express
$\displaystyle \frac{h(x)}{f(x)^a} = \sum_{j=1}^a \frac{g_j(x)}{f(x)^j},
$

for some polynomials $ g_j$ with $ \partial g_j < \partial f$.

This is trivial for $ a=1$, and for $ a>1$, we employ Euclid's algorithm to give

$\displaystyle h(x) = q(x) f(x) + r(x),
$

with $ r=0$ or $ \partial r < \partial f$. Then
$\displaystyle \frac{h(x)}{f(x)^a} = \frac{r(x)}{f(x)^a} + \frac{q(x)}{f(x)^{a-1}}.
$

As in the proof of Theorem 3, one can show that $ q
= 0 $ or $ \partial q < (a-1) \partial f$. Induction on $ a$ then gives our result.
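This repeated-division step is easy to mechanise. Here is a self-contained Python sketch (an illustrative helper, not from the original text), again representing a polynomial as its list of coefficients in increasing degree:

```python
from fractions import Fraction

def pdivmod(a, b):
    """Euclidean division of coefficient lists (lowest degree first):
    a = q*b + r with r = 0 (the empty list) or deg r < deg b."""
    a, b = list(a), list(b)
    while a and a[-1] == 0:
        a.pop()
    while b and b[-1] == 0:
        b.pop()
    if not b:
        raise ZeroDivisionError("division by the zero polynomial")
    if len(a) < len(b):
        return [], a
    q = [Fraction(0)] * (len(a) - len(b) + 1)
    r = a
    while r and len(r) >= len(b):
        c = Fraction(r[-1]) / Fraction(b[-1])
        d = len(r) - len(b)
        q[d] = c
        for i, bc in enumerate(b):   # subtract c * x^d * b from r
            r[d + i] -= c * bc
        while r and r[-1] == 0:
            r.pop()
    while q and q[-1] == 0:
        q.pop()
    return q, r

def expand_powers(h, f, a):
    """Express h/f^a as sum_{j=1}^{a} g_j/f^j with each g_j = 0 or
    deg g_j < deg f, by repeated Euclidean division as above."""
    gs = []
    for _ in range(a):
        h, r = pdivmod(h, f)  # h = q*f + r; r is the numerator over f^j
        gs.append(r)          # collected for j = a, a-1, ..., 1
    assert h == []            # final quotient vanishes since deg h < a*deg f
    gs.reverse()              # gs[j-1] is now the numerator g_j over f^j
    return gs

# x/(x+1)^2 = 1/(x+1) - 1/(x+1)^2
parts = expand_powers([Fraction(0), Fraction(1)],   # x
                      [Fraction(1), Fraction(1)],   # x + 1
                      2)
```

The example call expands $ x/(x+1)^2$ into $ 1/(x+1) - 1/(x+1)^2$, as the identity $ x = (x+1) - 1$ predicts.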

Finally, we can employ this result to recover the familiar high school partial fractions result for real polynomials.

Corollary 2   Let $ f$ be a polynomial over $ \mathbb{R}$ and let $ g$ be a polynomial, with $ \partial g < \partial f$, over some extension field of $ \mathbb{R}$ (e.g. $ \mathbb{R}$ itself, $ \mathbb{C}$, etc.). Write
$\displaystyle f = l_1^{a_1} \ldots l_r^{a_r} q_1^{b_1} \ldots q_s^{b_s},
$

where the $ l_i$ are linear polynomials and the $ q_i$ are quadratic polynomials, all of which are pairwise coprime. Then there exist constants $ A_{ij}, B_{ij}, C_{ij}$ in the extension field such that
$\displaystyle \frac{g(x)}{f(x)} = \sum_{i=1}^r \sum_{j=1}^{a_i}
\frac{A_{ij}}{l_i(x)^j} + \sum_{i=1}^s \sum_{j=1}^{b_i} \frac{B_{ij}+
C_{ij} x}{q_i(x)^j}.
$

The proof is left to the reader!
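As a concrete illustration (a worked example added here, not part of the original text), take $ g(x) = x$ and $ f(x) = (x-1)(x^2+1)$, so that $ r = s = 1$ and $ a_1 = b_1 = 1$. We seek constants with
$\displaystyle \frac{x}{(x-1)(x^2+1)} = \frac{A}{x-1} + \frac{B + Cx}{x^2+1}.
$

Clearing denominators gives $ x = A(x^2+1) + (B+Cx)(x-1)$. Setting $ x=1$ yields $ A = \frac{1}{2}$; comparing the $ x^2$ and constant coefficients then gives $ C = -\frac{1}{2}$ and $ B = \frac{1}{2}$, so
$\displaystyle \frac{x}{(x-1)(x^2+1)} = \frac{1/2}{x-1} + \frac{\frac{1}{2} - \frac{1}{2}x}{x^2+1}.
$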

Gihan Marasingha 2005-09-19