The application I'm writing performs a lengthy algorithm which usually takes a few minutes to finish. During this time I'd like to show the user a progress bar which indicates, as precisely as possible, how much of the algorithm is done.
The algorithm is divided into several steps, each with its own typical timing. For instance:
initialization (500 ms)
reading inputs (5 sec)
step 1 (30 sec)
step 2 (3 minutes)
writing outputs (7 sec)
shutting down (10 ms)
Each step can report its progress quite easily by setting the range it's working on, say [0, 150], and then reporting the value it has completed in its main loop.
What I currently have set up is a scheme of nested progress monitors which form a sort of implicit tree of progress reporting.
All progress monitors inherit from an interface IProgressMonitor:
virtual void setRange(int from, int to) = 0;
virtual void setValue(int v) = 0;
The root of the tree is the ProgressMonitor which is connected to the actual GUI interface:
class GUIBarProgressMonitor : public IProgressMonitor
Every other node in the tree is a monitor which takes control of a piece of its parent's progress:
class SubProgressMonitor : public IProgressMonitor
SubProgressMonitor(IProgressMonitor *parent, int parentFrom, int parentLength)
A SubProgressMonitor takes control of the range [parentFrom, parentFrom+parentLength] of its parent.
With this scheme I am able to statically divide the top-level progress according to the expected relative share of each step in the overall timing. Each step can then be further subdivided into pieces, and so on.
The main disadvantage is that the division is static, and it becomes painful to adjust it based on variables that are only discovered at run time.
So the question: are there any known design patterns for progress monitoring that solve this issue?