the area of online algorithms. Smoothed complexity was introduced
by Spielman and Teng to explain the behaviour of algorithms that perform well in practice despite having a poor worst-case complexity. The idea is to add random noise to an input instance by slightly perturbing its values at random, and to analyze the performance of the algorithm on the perturbed instances.
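As a rough illustration (with hypothetical names, and one particular noise model), smoothing an instance might look like the following sketch; smoothed analyses also use other perturbation models, such as Gaussian noise or partial bit randomization:

```python
import random

def smooth(instance, eps, seed=0):
    """Replace each input value x by a value drawn uniformly from
    [x * (1 - eps), x * (1 + eps)].

    Hypothetical sketch: uniform relative noise is only one of the
    perturbation models used in smoothed analysis.
    """
    rng = random.Random(seed)
    return [rng.uniform(x * (1 - eps), x * (1 + eps)) for x in instance]

# Perturb the processing times of three jobs by up to 10%.
jobs = [10.0, 3.0, 7.0]
perturbed = smooth(jobs, 0.1)
```

One then studies the expected performance of the algorithm over the random choice of perturbation, for the worst initial instance.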
We apply this notion to two well-known online algorithms.
The first is the multi-level feedback algorithm (MLF), which minimizes
the average flow time of a sequence of jobs released over time when the processing times of the jobs are not known in advance. MLF is known to work very well in practice, though it has a poor competitive ratio.
We show that the smoothed competitive ratio of MLF improves
exponentially with the amount of random noise that is added; on average, MLF even admits a constant competitive ratio.
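To make the scheduling model concrete, here is a minimal sketch (hypothetical helper, integer time steps) of MLF: a job at queue level i receives at most 2**i units of service at that level before being demoted to level i + 1, and the scheduler always serves the lowest non-empty queue.

```python
from collections import deque

def mlf_avg_flow_time(jobs):
    """jobs: list of (release_time, processing_time), integer times.

    Simulates multi-level feedback: a job at level i is demoted to
    level i + 1 after 2**i time units of service at that level; the
    scheduler always runs the head of the lowest non-empty queue.
    Returns the average flow time (completion minus release).
    """
    order = sorted(range(len(jobs)), key=lambda j: jobs[j][0])
    queues = [deque()]            # queues[i]: [job_id, service_at_level]
    remaining = [p for _, p in jobs]
    finish = [0] * len(jobs)
    t, done, nxt = 0, 0, 0
    while done < len(jobs):
        while nxt < len(order) and jobs[order[nxt]][0] <= t:
            queues[0].append([order[nxt], 0])
            nxt += 1
        level = next((i for i, q in enumerate(queues) if q), None)
        if level is None:         # idle until the next release
            t = jobs[order[nxt]][0]
            continue
        entry = queues[level][0]
        jid = entry[0]
        remaining[jid] -= 1       # run the job for one time unit
        entry[1] += 1
        t += 1
        if remaining[jid] == 0:
            queues[level].popleft()
            finish[jid] = t
            done += 1
        elif entry[1] == 2 ** level:   # quantum exhausted: demote
            queues[level].popleft()
            if level + 1 == len(queues):
                queues.append(deque())
            queues[level + 1].append([jid, 0])
    return sum(finish[j] - jobs[j][0] for j in range(len(jobs))) / len(jobs)
```

Since a long job is repeatedly demoted, short jobs released later still get served quickly, which is why MLF keeps the average flow time low without knowing processing times in advance.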
The second algorithm that we consider is the work function algorithm
(WFA) for metrical task systems, a general framework to model online
problems.
Our analysis reveals that the smoothed competitive ratio of WFA is much better than its worst-case competitive ratio and that it depends on certain topological parameters of the underlying metric.
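For concreteness, the following sketch (hypothetical function name, ties broken toward the lowest-numbered state) implements WFA on a finite metrical task system: it maintains the work function w_t(s), the cheapest cost of serving the first t tasks and ending in state s, and after each task moves to a state minimizing w_t(s) + d(current, s).

```python
def wfa(dist, costs, start=0):
    """Work function algorithm for a metrical task system.

    dist: n x n metric (symmetric, triangle inequality).
    costs: costs[t][s] is the cost of serving task t in state s.
    Returns the sequence of states visited, starting with `start`.
    """
    n = len(dist)
    w = [dist[start][s] for s in range(n)]   # w_0(s) = d(start, s)
    state = start
    schedule = [start]
    for task in costs:
        # Work function update: serve the task in some state u,
        # then move to s: w_t(s) = min_u w_{t-1}(u) + task[u] + d(u, s).
        w = [min(w[u] + task[u] + dist[u][s] for u in range(n))
             for s in range(n)]
        # WFA rule: move to a state minimizing w_t(s) + d(current, s).
        state = min(range(n), key=lambda s: w[s] + dist[state][s])
        schedule.append(state)
    return schedule
```

The work function update is a simple dynamic program over the n states, so each step costs O(n^2) time; the competitive behaviour of the resulting schedule is what the smoothed analysis quantifies.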