
Raw post-processing routines.

This module implements the class DataSet, the unit element of the post-processing, and the class DataSetList, a sequence of DataSet instances.

Furthermore, it implements methods for dealing with a third data structure, a dictionary of DataSetList instances, which is handy when comparing DataSetList instances from multiple algorithms.
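
The three data structures can be pictured as in the following sketch, which uses plain dictionaries as stand-ins for DataSet instances (attribute names such as algId, funcId and dim are taken from the summaries below; real DataSet objects carry much more information):

    # Sketch of the three data structures, with plain dicts standing in
    # for real DataSet instances.
    ds_f1_2d = {'algId': 'CMA-ES', 'funcId': 1, 'dim': 2}    # one "DataSet"
    ds_f1_10d = {'algId': 'CMA-ES', 'funcId': 1, 'dim': 10}

    data_set_list = [ds_f1_2d, ds_f1_10d]                    # a "DataSetList"

    dict_alg = {                            # dict of DataSetList instances,
        'CMA-ES': data_set_list,            # one entry per algorithm
        'BFGS': [{'algId': 'BFGS', 'funcId': 1, 'dim': 2}],
    }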

Class DataSet: Unit element for the COCO post-processing.
Class DataSetList: List of instances of DataSet.
Class DictAlg: Undocumented
Class RunlengthBasedTargetValues: A class instance call returns f-target values based on reference runlengths.
Class TargetValues: Store and retrieve a list of target function values.
Function align_list: Undocumented
Function asTargetValues: Undocumented
Function cocofy: Replaces cocopp references in pickle files with coco_pproc. This could become necessary for future backwards compatibility; it should, however, rather become a class method.
Function dictAlgByDim: Returns a dictionary with problem dimension as key from a dictionary of DataSet lists.
Function dictAlgByDim2: Returns a dictionary with problem dimension as key.
Function dictAlgByFun: Returns a dictionary with function id as key.
Function dictAlgByFuncGroup: Returns a dictionary with function group as key.
Function dictAlgByNoi: Returns a dictionary with noise group as key.
Function get_DataSetList: Try to load a pickle file or fall back to the DataSetList constructor.
Function parseinfo: Extract data from a header line in an index entry.
Function parseinfoold: Deprecated: Extract data from a header line in an index entry.
Function process_arguments: Undocumented
Function processInputArgs: Process command line arguments.
Function set_unique_algId: On return, elements in ds_list do not have an algId attribute value from taken_ids, or from ds_list_reference if taken_ids is None.
Function store_reference_values: Undocumented
Variable maximal_evaluations_only_to_last_target: Undocumented
Variable targets_displayed_for_info: Undocumented
Function _DataSet_complement_data: Insert a line for each target value; never used (detEvals(targets) does the job on the fly).
def align_list(list_to_process, evals):

Undocumented

def asTargetValues(target_values):

Undocumented

def cocofy(filename):

Replaces cocopp references in pickle files with coco_pproc. This could become necessary for future backwards compatibility; it should, however, rather become a class method.

def dictAlgByDim(dictAlg):

Returns a dictionary with problem dimension as key from a dictionary of DataSet lists.

The input argument is a dictionary with algorithm names as keys and lists of DataSet instances as values. The resulting dictionary has problem dimension as key and, as values, dictionaries with algorithm names as keys.
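
The regrouping can be sketched as follows, again with plain dictionaries standing in for DataSet instances (illustrative only, not the actual implementation):

    # Sketch of the regrouping performed by dictAlgByDim (illustrative only).
    dict_alg = {
        'CMA-ES': [{'algId': 'CMA-ES', 'dim': 2}, {'algId': 'CMA-ES', 'dim': 10}],
        'BFGS':   [{'algId': 'BFGS', 'dim': 2}],
    }

    by_dim = {}
    for alg, dsl in dict_alg.items():
        for ds in dsl:
            # key: problem dimension; value: dict with algorithm names as keys
            by_dim.setdefault(ds['dim'], {}).setdefault(alg, []).append(ds)

    print(sorted(by_dim))     # [2, 10]
    print(sorted(by_dim[2]))  # ['BFGS', 'CMA-ES']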

def dictAlgByDim2(dictAlg, remove_empty=False):

Returns a dictionary with problem dimension as key.

The difference from dictAlgByDim is that there is an entry for each algorithm, even if the resulting DataSetList is empty.

This function is meant to be used with an input argument that is a dictionary with algorithm names as keys and lists of DataSet instances as values. The resulting dictionary has dimension as key and, as values, dictionaries with algorithm names as keys.

def dictAlgByFun(dictAlg, agg_cons=False):

Returns a dictionary with function id as key.

This method is meant to be used with an input argument that is a dictionary with algorithm names as keys and lists of DataSet instances as values. The resulting dictionary has function id as key and, as values, dictionaries with algorithm names as keys.

def dictAlgByFuncGroup(dictAlg):

Returns a dictionary with function group as key.

This method is meant to be used with an input argument that is a dictionary with algorithm names as keys and lists of DataSet instances as values. The resulting dictionary has a string denoting the function group as key and, as values, dictionaries with algorithm names as keys.

def dictAlgByNoi(dictAlg):

Returns a dictionary with noise group as key.

This method is meant to be used with an input argument that is a dictionary with algorithm names as keys and lists of DataSet instances as values. The resulting dictionary has a string denoting the noise group ('noiselessall' or 'nzall') as key and, as values, dictionaries with algorithm names as keys.

def get_DataSetList(*args, **kwargs):

Try to load a pickle file or fall back to the DataSetList constructor.

Also write the pickle file if reading failed. Global side effect: testbedsettings.load_current_testbed is called, as it is in DataSet.__init__.

args[0] is expected to be either a list with a single element that is a repository filetype name, or the name itself. Otherwise, the fallback is executed.
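
The load-or-rebuild-and-cache behaviour described above can be sketched as follows; load_or_build is a hypothetical helper, not the actual get_DataSetList implementation:

    # Sketch of a "load pickle, else rebuild and cache" helper
    # (illustrative only, not the actual get_DataSetList implementation).
    import os
    import pickle

    def load_or_build(cache_path, build):
        """Return the unpickled object if possible, else rebuild and cache it."""
        if os.path.exists(cache_path):
            try:
                with open(cache_path, 'rb') as f:
                    return pickle.load(f)
            except Exception:
                pass  # unreadable cache: fall through and rebuild
        obj = build()                       # e.g. the DataSetList constructor
        with open(cache_path, 'wb') as f:   # write the pickle for next time
            pickle.dump(obj, f)
        return obj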

def parseinfo(s):

Extract data from a header line in an index entry.

Use a 'smarter' regular expression than parseinfoold. The header line should be a string of comma-separated pairs of key=value, for instance: key = value, key = 'value'

Keys should not use comma or quote characters.
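
A sketch of parsing such a header line with a regular expression (the expression and the example keys are illustrative, not the ones used by parseinfo):

    # Sketch of parsing a header line of comma-separated key = value pairs
    # (the regular expression is illustrative, not the one used by parseinfo).
    import re

    _pair = re.compile(r"\s*([^,=']+?)\s*=\s*('[^']*'|[^,]*)")

    def parse_header(line):
        """Return (key, value) pairs from a "key = value, key = 'value'" line."""
        return [(key.strip(), value.strip().strip("'"))
                for key, value in _pair.findall(line)]

    print(parse_header("funcId = 2, DIM = 10, algId = 'CMA-ES'"))
    # [('funcId', '2'), ('DIM', '10'), ('algId', 'CMA-ES')]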

def parseinfoold(s):

Deprecated: Extract data from a header line in an index entry.

Older but verified version of parseinfo.

The header line should be a string of comma-separated pairs of key=value, for instance: key = value, key = 'value'

Keys should not use comma or quote characters.

def process_arguments(args, current_hash, dictAlg, dsList, sortedAlgs):

Undocumented

def processInputArgs(args, process_background_algorithms=False):

Process command line arguments.

Returns several instances of DataSetList and a list of algorithms from a list of strings representing file and folder names; see below for details. This command operates folder-wise: one folder corresponds to one algorithm.

If a folder listed in args contains both :file:`info` files and the associated :file:`pickle` files, it is recommended to keep them in different locations for efficiency reasons.

Parameters
    args (list): string arguments for folder names
    process_background_algorithms (bool): option to process also background algorithms
Returns
    all_datasets: a list containing all DataSet instances, this is to prevent the regrouping done in instances of DataSetList. Caveat: algorithms with the same name are overwritten!?
    pathnames: a list of keys of datasetlists_per_alg with the ordering as given by the input argument args
    datasetlists_by_alg: a dictionary which associates each algorithm via its input path name to a DataSetList
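
A hedged usage sketch, assuming the module is importable as cocopp.pproc and that 'ALG1_results' and 'ALG2_results' are placeholder names for existing output folders, one folder per algorithm:

    # Hypothetical usage; the folder names are placeholders and must exist.
    from cocopp import pproc

    all_datasets, pathnames, datasetlists_by_alg = pproc.processInputArgs(
        ['ALG1_results', 'ALG2_results'])

    # Regroup by dimension for per-dimension comparisons (see dictAlgByDim).
    by_dim = pproc.dictAlgByDim(datasetlists_by_alg)
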
def set_unique_algId(ds_list, ds_list_reference, taken_ids=None):

On return, elements in ds_list do not have an algId attribute value from taken_ids, or from ds_list_reference if taken_ids is None.

If needed, for example, BFGS becomes BFGS 2, etc.
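
The renaming idea can be sketched as follows (illustrative logic only, not the actual implementation):

    # Sketch: append a counter until the id no longer clashes with the
    # already taken ids (illustrative only, not the actual implementation).
    def unique_id(alg_id, taken_ids):
        if alg_id not in taken_ids:
            return alg_id
        i = 2
        while '%s %d' % (alg_id, i) in taken_ids:
            i += 1
        return '%s %d' % (alg_id, i)

    print(unique_id('BFGS', {'BFGS'}))            # BFGS 2
    print(unique_id('BFGS', {'BFGS', 'BFGS 2'}))  # BFGS 3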

def store_reference_values(ds_list):

Undocumented

maximal_evaluations_only_to_last_target: bool =

Undocumented

targets_displayed_for_info: list =

Undocumented

def _DataSet_complement_data(self, step=10**0.2, final_target=1e-08):

Insert a line for each target value; never used (detEvals(targets) does the job on the fly).

To be resolved: old data sets don't have this method, therefore it must be global in the module.
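
Given the defaults step=10**0.2 and final_target=1e-08, the inserted target values are presumably spaced geometrically; a sketch of generating such a target list (illustrative only):

    # Sketch: geometrically spaced target values from `start` down to
    # `final_target`, dividing by `step` each time (illustrative only).
    def geometric_targets(start, step=10**0.2, final_target=1e-8):
        targets = []
        t = start
        while t >= final_target * (1 - 1e-12):  # tolerate rounding in the last step
            targets.append(t)
            t /= step
        return targets

    print(len(geometric_targets(1.0)))  # 41 targets, from 1e0 down to about 1e-8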