If you approximate a function, f(x), by a polynomial of degree n,
a_0 + a_1 (x-c) + a_2 (x-c)^2 + ... + a_n (x-c)^n,
then the remainder is simply
R_n(x) = f(x) - (a_0 + a_1 (x-c) + a_2 (x-c)^2 + ... + a_n (x-c)^n).
If f(x) actually is a polynomial of degree n, then R_n(x) = 0 and the coefficients are
a_k = f^(k)(c)/k! for k = 0, 1, 2, ..., n.
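If it helps to see that concretely, here is a minimal sympy sketch (the polynomial, the centre c, and n = 3 are arbitrary illustrative choices, not anything from the question): with a_k = f^(k)(c)/k!, the degree-n Taylor polynomial reproduces a degree-n polynomial exactly, so the remainder simplifies to zero.

```python
# Minimal sympy sketch (the polynomial, the centre c and n = 3 are arbitrary
# illustrative choices): with a_k = f^(k)(c)/k!, the degree-n Taylor
# polynomial reproduces a degree-n polynomial exactly, so R_n(x) is 0.
import sympy as sp

x, c = sp.symbols('x c')
f = 2*x**3 - 5*x + 1                     # example polynomial of degree n = 3
n = 3

taylor = sum(sp.diff(f, x, k).subs(x, c) / sp.factorial(k) * (x - c)**k
             for k in range(n + 1))
print(sp.simplify(f - taylor))           # prints 0: the remainder vanishes
```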
If we use these coefficients when f(x) is not a polynomial, R_n(x) is not identically zero. We don't expect an exact formula for the remainder (if we had one we wouldn't need the Taylor polynomial), but we would like to know whether the remainder is small enough for the polynomial approximation to be usable, or over what interval around c it is small enough. For that we need an estimate of the remainder.
Also, if we can prove that the remainder tends to zero everywhere (or maybe on some interval) as n tends to infinity, we can use the infinite Taylor series.
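As an illustration of that convergence (f = exp, c = 0, x = 2 are arbitrary choices here, not anything from the question), a quick numerical check shows |R_n(x)| shrinking toward zero as n grows:

```python
# Illustrative numeric check (f = exp, c = 0, x = 2 chosen arbitrarily):
# |R_n(x)| shrinks toward zero as n grows, so the Taylor series converges
# to f(x) at that point.
import math

def taylor_exp(x, c, n):
    # degree-n Taylor polynomial of exp about c; every derivative of exp is exp
    return sum(math.exp(c) * (x - c)**k / math.factorial(k) for k in range(n + 1))

x_pt, c_pt = 2.0, 0.0
for n in (2, 5, 10, 20):
    print(n, abs(math.exp(x_pt) - taylor_exp(x_pt, c_pt, n)))
```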
That's what the remainder is. You probably want to understand the various estimates of it. One approach is to think of c as a variable. This makes sense if we want the remainder at a given, fixed point x. Then
R_n(x, c) = f(x) - (f(c) + f'(c) (x-c)/1! + f''(c) (x-c)^2/2! + ... + f^(n)(c) (x-c)^n/n!)
and you can differentiate with respect to c. Most of the terms automagically cancel in pairs (the sum telescopes), and what survives gives the derivative of the remainder at x, as a function of c, in terms of the (n+1)th derivative of f: dR_n/dc = -f^(n+1)(c) (x-c)^n/n!.
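A quick symbolic check of that telescoping claim (f = sin and n = 3 are arbitrary illustrative choices):

```python
# Symbolic check of the telescoping (f = sin, n = 3 chosen arbitrarily):
# d/dc of R_n(x, c) equals -f^(n+1)(c) (x-c)^n / n!.
import sympy as sp

x, c = sp.symbols('x c')
f = sp.sin(x)
n = 3

taylor = sum(sp.diff(f, x, k).subs(x, c) / sp.factorial(k) * (x - c)**k
             for k in range(n + 1))
R = f - taylor

lhs = sp.diff(R, c)
rhs = -sp.diff(f, x, n + 1).subs(x, c) * (x - c)**n / sp.factorial(n)
print(sp.simplify(lhs - rhs))            # prints 0
```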
There are several ways to get a useful estimate that use some form of the mean value theorem. The simplest form says there is some unknown point in a given interval where the derivative is the same as the average rate of change across the interval. Applying that here gives an exact formula for the remainder at x in terms of the (n+1)th derivative at some unknown point t between c and x: the Lagrange form, R_n(x) = f^(n+1)(t) (x-c)^(n+1)/(n+1)!.
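To see that unknown point is really there, here is a small numerical sketch (f = cos, c = 0, x = 1 and n = 3 are arbitrary illustrative choices, not from the question) that locates a t between c and x satisfying the Lagrange remainder equation by bisection:

```python
# Numerical sketch of the Lagrange form (f = cos, c = 0, x = 1, n = 3 chosen
# arbitrarily): there is a point t between c and x with
#   R_n(x) = f^(n+1)(t) (x - c)^(n+1) / (n+1)!,
# and we can locate one such t by bisection.
import math

def deriv_cos(t, k):
    # k-th derivative of cos: cycles through cos, -sin, -cos, sin
    return (math.cos(t), -math.sin(t), -math.cos(t), math.sin(t))[k % 4]

c_pt, x_pt, n = 0.0, 1.0, 3

taylor = sum(deriv_cos(c_pt, k) * (x_pt - c_pt)**k / math.factorial(k)
             for k in range(n + 1))
remainder = math.cos(x_pt) - taylor

def g(t):
    # zero exactly when t satisfies the Lagrange remainder equation
    return deriv_cos(t, n + 1) * (x_pt - c_pt)**(n + 1) / math.factorial(n + 1) - remainder

lo, hi = c_pt, x_pt                      # g changes sign on [c, x] here
for _ in range(60):                      # plain bisection
    mid = 0.5 * (lo + hi)
    if g(lo) * g(mid) > 0:
        lo = mid
    else:
        hi = mid
print("t ~", 0.5 * (lo + hi), "which lies between", c_pt, "and", x_pt)
```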