Abstract: 
Algorithms and applications are presented for the following data
approximation problem. Let {phi_i : i=1,2,...,n} be measurements
of smooth function values {f(x_i) : i=1,2,...,n}, where the measurements
are so rough that the number of sign changes in the sequence
{phi_{i+1} - phi_i : i=1,2,...,n-1}, say it is sigma, is much
greater than the number in the sequence {f(x_{i+1}) - f(x_i) :
i=1,2,...,n-1}. The problem of calculating the least squares
change to the data subject to the condition that the first differences
of the estimated values have at most k-1 sign changes is considered,
where k is a prescribed integer. The estimates form an n-vector
with k monotonic sections in its components, alternately increasing
and decreasing. The main difficulty in this optimization calculation
is that the optimal positions of the joins of the monotonic sections
have to be found automatically, but the number of all possible
combinations of positions can be of magnitude n^{k-1}, so that
it is not practicable to test each one separately. It is shown
that the case when k=1 is straightforward, and that the case
when k>1 reduces to partitioning the data into at most k disjoint
sets of adjacent data and solving a k=1 problem for each set.
It is shown also that the partition into suitable sets can be
done by a dynamic programming method, which can be made quite
efficient by taking advantage of some properties that are implied
by the optimization calculation. Two methodologically different
algorithms have been developed. The first one requires O(kn sigma)
computer operations. The other algorithm reduces the complexity
to O(n sigma + k sigma log_2 sigma) operations, when k >= 3, by taking
advantage of some ordering relations between the indices of the
joins during the data partition process. The complexity is only
O(n) when k=1 or 2. In relation to these algorithms, Fortran
software packages have been written by the author and some of
their numerical results will be given. The packages routinely manage
very large amounts of data. For example, they require a
few seconds to calculate a best fit with 10 or 100 monotonic
sections to 30000 very noisy data points on a common PC. The piecewise
monotonicity approximation method may have many applications,
in various contexts in several disciplines. For example, it is
highly suitable in estimating the turning points (peaks) of a
function from some noisy measurements of its values. Peak finding
is a subject of continuous interest in spectroscopy, chromatography
and signal processing. Other examples arise in medical imaging
and in data analysis, for instance. A
selection of application results will be presented.
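The dynamic programming idea described above can be sketched as follows: combine the error of the best monotonic fit to each candidate section (computed here by pool-adjacent-violators) with a recurrence over the number of sections. This brute-force O(k n^2) sketch, with an extra PAVA pass per table entry, is illustrative only and far slower than the author's algorithms; the function names are assumptions, not the Fortran packages.

```python
def sse_monotonic(seg, increasing=True):
    """Least-squares error of the best monotonic fit to seg (PAVA)."""
    data = list(seg) if increasing else [-v for v in seg]
    blocks = []  # each entry: [sum, count, sum of squares]
    for v in data:
        blocks.append([v, 1, v * v])
        while (len(blocks) > 1 and
               blocks[-2][0] * blocks[-1][1] > blocks[-1][0] * blocks[-2][1]):
            s, c, q = blocks.pop()
            blocks[-1][0] += s; blocks[-1][1] += c; blocks[-1][2] += q
    # SSE of fitting each block by its mean: q - s^2/c per block
    return sum(q - s * s / c for s, c, q in blocks)

def best_partition_error(phi, k):
    """Minimal least-squares error of a fit whose first differences have at
    most k-1 sign changes, first section increasing. E[m][j] is the best
    error for fitting phi[0:j] by m monotonic sections."""
    n = len(phi)
    INF = float("inf")
    E = [[INF] * (n + 1) for _ in range(k + 1)]
    E[0][0] = 0.0
    for m in range(1, k + 1):
        inc = (m % 2 == 1)  # sections alternate: increasing, decreasing, ...
        for j in range(1, n + 1):
            for t in range(m - 1, j):  # t is the join before section m
                if E[m - 1][t] < INF:
                    e = E[m - 1][t] + sse_monotonic(phi[t:j], inc)
                    if e < E[m][j]:
                        E[m][j] = e
    return min(E[m][n] for m in range(1, k + 1))  # at most k sections
```

The inner minimization over t is where the optimal join positions are found automatically; the paper's contribution is exploiting ordering relations between these joins to avoid the full O(n^2) search per stage.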
