class BatchScheduler:
Constructor: BatchScheduler(number_of_batches, batch_to_execute)
Facilitate running a benchmarking experiment in independent batches.
The batch scheduler crucially assumes that in each batch the same
problems are given in the same order when calling is_in_batch.
Pseudo code example:

    import cocoex

    batch_to_execute = 0  # set current batch to execute
    suite = cocoex.Suite('bbob', '', '')
    batcher = cocoex.BatchScheduler(4, batch_to_execute)
    for problem in suite:
        if not batcher.is_in_batch(problem):
            continue  # this problem belongs to another batch
        # ... run optimizer on problem ...
This code needs to be run, in accordance with the first argument to
BatchScheduler, four times overall (e.g., in parallel), with
batch_to_execute set to each value in (0, 1, 2, 3), to generate the full
experimental data.
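A minimal driver sketch follows. Reading the batch number from the
command line and the random-search "optimizer" are illustrative
assumptions, not part of the cocoex API:

    import sys
    import numpy as np
    import cocoex

    number_of_batches = 4
    batch_to_execute = int(sys.argv[1])  # e.g. `python experiment.py 2`

    suite = cocoex.Suite('bbob', '', '')
    batcher = cocoex.BatchScheduler(number_of_batches, batch_to_execute)
    for problem in suite:
        if not batcher.is_in_batch(problem):
            continue  # another batch takes care of this problem
        # placeholder optimizer: evaluate a handful of random points
        for x in np.random.uniform(problem.lower_bounds, problem.upper_bounds,
                                   (10, problem.dimension)):
            problem(x)

Launching the script with the arguments 0, 1, 2, and 3 (e.g., in
parallel) produces the data of all four batches.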
Details: to get a more even time distribution over all batches, it
seems advisable that the number of functions is not divisible by the
number of batches. That is, 4 (or 6 or 8 or 12) batches is not likely
to be ideal on the 'bbob' testbed of 24 functions.
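A back-of-the-envelope illustration of this point; it assumes that the
batch number is the running count of distinct (function, dimension)
pairs modulo the number of batches and that the suite enumerates all
functions within each dimension:

    number_of_functions = 24
    for number_of_batches in (4, 5):
        pairs = [(f, d) for d in (2, 3)  # first two 'bbob' dimensions
                 for f in range(1, number_of_functions + 1)]
        batch0 = [(f, d) for k, (f, d) in enumerate(pairs)
                  if k % number_of_batches == 0]
        print(number_of_batches, batch0)

With 4 batches, batch 0 receives functions 1, 5, 9, ... in every
dimension, so each batch always works on the same functions; with 5
batches the assignment rotates across dimensions and the per-batch
workload is spread more evenly.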
| Kind | Name | Description |
| --- | --- | --- |
| Method | __init__ | distribute over number_of_batches batches and execute the batch with number batch_to_execute |
| Method | is_in_batch | return True iff the batch number for problem equals batch_to_execute |
| Instance Variable | current | Undocumented |
| Instance Variable | current | Undocumented |
| Instance Variable | first | Undocumented |
| Instance Variable | params | Undocumented |
__init__(number_of_batches, batch_to_execute):
    distribute over number_of_batches batches and execute the batch with
    number batch_to_execute, which must obey
    0 <= batch_to_execute < number_of_batches.
is_in_batch(problem):
    return True iff the batch number for problem equals the
    batch_to_execute value given as constructor argument. Assumes that
    id_function and dimension are attributes of problem.

    The batch number for problem is assigned from
    (problem.id_function, problem.dimension) by order of appearance:
    the number is incremented whenever either id_function or dimension
    changes. Consecutive instances with the same function ID and
    dimension hence belong to the same batch.
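The following sketch illustrates this assignment logic; mapping the
running count to a batch via a modulo operation is an assumption about
the behavior described above, not a transcript of the implementation:

    class BatchSchedulerSketch:
        """illustration: count changes of (id_function, dimension) and
        compare the count modulo number_of_batches to batch_to_execute"""
        def __init__(self, number_of_batches, batch_to_execute):
            self.number_of_batches = number_of_batches
            self.batch_to_execute = batch_to_execute
            self._last_key = None  # last seen (id_function, dimension)
            self._count = -1       # distinct keys seen so far, minus one
        def is_in_batch(self, problem):
            key = (problem.id_function, problem.dimension)
            if key != self._last_key:  # new function or new dimension
                self._count += 1
                self._last_key = key
            return self._count % self.number_of_batches == self.batch_to_execute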