To calculate a planet's space coordinates, we have to
solve the function
f(x) = x − 1 − 0.5 sin x
Let the base point be a = x_i = π/2 on the interval [0, π].
Determine the highest-order Taylor series expansion resulting
in a maximum error of 0.015 on the specified interval.
The error is equal to the absolute value of the difference
between the given function and the corresponding Taylor series
expansion, i.e., E_n(x) = |f(x) − T_n(x)|. (Hint: Solve graphically.)
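
As one way to carry out the graphical hint, here is a minimal Python sketch (assuming NumPy and matplotlib are available). It evaluates the Taylor polynomials T_n of f about a = π/2 for increasing order n and plots the error |f(x) − T_n(x)| on [0, π] against the 0.015 budget. For n ≥ 2 only the −0.5 sin x term contributes to the derivatives, and d^n/dx^n sin x = sin(x + nπ/2).

    import math
    import numpy as np
    import matplotlib.pyplot as plt

    a = np.pi / 2
    x = np.linspace(0.0, np.pi, 1001)

    def f(x):
        return x - 1.0 - 0.5 * np.sin(x)

    def deriv_at_a(n):
        # n-th derivative of f evaluated at the base point a
        if n == 0:
            return a - 1.0 - 0.5 * np.sin(a)
        if n == 1:
            return 1.0 - 0.5 * np.cos(a)
        # for n >= 2 only the -0.5*sin(x) term survives
        return -0.5 * np.sin(a + n * np.pi / 2)

    def taylor(n):
        # order-n Taylor polynomial of f about a, evaluated on the grid x
        return sum(deriv_at_a(k) * (x - a) ** k / math.factorial(k)
                   for k in range(n + 1))

    for n in range(6):
        err = np.abs(f(x) - taylor(n))
        print(f"order {n}: max error = {err.max():.4f}")
        plt.plot(x, err, label=f"n = {n}")

    plt.axhline(0.015, linestyle="--", color="k", label="error budget 0.015")
    plt.xlabel("x")
    plt.ylabel("|f(x) - T_n(x)|")
    plt.legend()
    plt.show()

With this setup, the error curve first stays below 0.015 at the fourth-order expansion, whose maximum error of roughly 0.01 occurs at the endpoints of the interval. Note that the third derivative, 0.5 cos x, vanishes at a = π/2, so the third-order polynomial is no better than the second.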