The Optimizer Composite Class

A composite class focused on training and evaluating a given model. Given the complexity of the class, several backend modules are used to improve performance during training. These modules and their purposes are discussed in the subsequent sections.

OptimizerWrapper

A cythonized wrapper class used to track and save the optimizer state while training a model.

class OptimizerWrapper(initial=None)
Parameters:

initial (dict) – A dictionary whose keys are class attributes and whose values are the defaults to assign to them.
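
For instance, default attribute values could be seeded at construction; a minimal sketch, where the dictionary keys mirror the Variables listed further below and the chosen values are purely illustrative:

    op = OptimizerWrapper(initial = {"Path": "./checkpoints", "RunName": "example-run"})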

setoptimizer()

Performs a preliminary check of the input settings (e.g. the optimizer name and its parameters) before initializing the torch optimizer.

setscheduler()

Performs a preliminary check of the scheduler parameters before initializing the torch scheduler.

save()

Saves the current optimizer and scheduler state of the trained model at the given epoch and k-fold.

load()

Loads the optimizer state at the given epoch and k-fold.
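
Since neither save() nor load() takes arguments, the sketch below assumes the target epoch and k-fold are selected through the Epoch and KFold attributes listed further below (an assumption based on the signatures, not confirmed by the source):

    op.Epoch, op.KFold = 5, 2
    op.load()    # restore the state saved at epoch 5, k-fold 2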

step()

Steps the optimizer.

zero()

Zeros the gradients of the model’s parameters.

stepsc()

Steps the scheduler.

Variables:
  • optimizer (optimizer) – Returns the torch optimizer.

  • scheduler (scheduler) – Returns the torch scheduler.

  • model (model) – Returns the original model.

  • Path (str) – The path to store the checkpoint data.

  • RunName (str) – Name of the training session (untitled by default).

  • Optimizer (str) – The name of the optimizer to use (SGD, ADAM, …).

  • Scheduler (str) – The name of the scheduler to use (ExponentialLR, …).

  • OptimizerParams (dict) – Additional parameters to define the optimizer.

  • SchedulerParams (dict) – Additional parameters to define the scheduler.

  • Epoch (int) – The current epoch of the optimizer.

  • Train (bool) – Set the optimizer to train mode.

  • KFold (int) – The current k-fold of the optimizer.
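
Below is a minimal sketch of how the wrapper might be driven during a training loop. The attribute and method names follow the documentation above; the stand-in model, data, and the assumption that OptimizerParams and SchedulerParams are forwarded as keyword arguments to torch are purely illustrative.

    import torch

    model = torch.nn.Linear(4, 2)                   # illustrative stand-in model
    data = [torch.randn(8, 4) for _ in range(3)]    # illustrative batches

    op = OptimizerWrapper()
    op.model = model
    op.Path = "./checkpoints"                       # where checkpoint data is stored
    op.RunName = "example-run"
    op.KFold = 0
    op.Optimizer = "ADAM"
    op.OptimizerParams = {"lr": 1e-3}               # assumed torch keyword arguments
    op.Scheduler = "ExponentialLR"
    op.SchedulerParams = {"gamma": 0.9}             # assumed torch keyword arguments
    op.setoptimizer()                               # validate and build the torch optimizer
    op.setscheduler()                               # validate and build the torch scheduler

    for epoch in range(2):
        op.Epoch = epoch
        for batch in data:
            op.zero()                               # zero the gradients of the model
            loss = model(batch).sum()               # placeholder loss
            loss.backward()
            op.step()                               # step the optimizer
        op.stepsc()                                 # step the scheduler once per epoch
        op.save()                                   # checkpoint the optimizer/scheduler state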

cOptimizer

class cOptimizer
length(self) → dict[str, int]
GetHDF5Hashes(str path) → bool
UseAllHashes(dict inpt) → None
MakeBatch(sampletracer, vector[string] batch, int kfold, int index, int max_percent = 80) → list[torch_geometric.Data]
UseTheseFolds(list inpt) → None
FetchTraining(int kfold, int batch_size) → list[str]
FetchValidation(int kfold, int batch_size) → list[str]
FetchEvaluation(int batch_size) → list[str]
AddkFold(int epoch, int kfold, dict inpt, dict out_map) → None
DumpEpochHDF5(int epoch, str path, list[int] kfolds) → None
RebuildEpochHDF5(int epoch, str path, int kfold) → None
BuildPlots(int epoch, str path)
Variables:
  • metric_plot (bool) – Whether to plot the learning, accuracy, and other training metrics.

  • kFolds (list[int]) – The specific k-Folds to train on. Useful for multiprocessing.
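
The backend is normally driven by the Optimizer class below, but a rough sketch of direct use is given here. The sample path, fold indices, epoch number, and batch size are illustrative, as is the assumption that GetHDF5Hashes returns False when no hash data is found.

    c = cOptimizer()
    c.kFolds = [0, 1]                          # restrict training to these folds
    c.metric_plot = True                       # enable metric plotting

    if c.GetHDF5Hashes("./samples/"):          # illustrative path to the hash data
        for kf in c.kFolds:
            train = c.FetchTraining(kf, 32)    # training hash batches for this fold
            valid = c.FetchValidation(kf, 32)  # validation hash batches for this fold
        c.BuildPlots(0, "./plots")             # build the metric plots for epoch 0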

RandomSamplers

class RandomSamplers
SaveSets(self, inpt, path) → None

Dumps the hashes of the samples to HDF5 files.

RandomizeEvents(self, Events, nEvents=None) → dict[str, None]

Randomly selects hashes from the input events; if nEvents is given, only that many events are selected.

MakeTrainingSample(self, Sample, TrainingSize=50) → dict[str, list[str]]

Splits the input sample into training and testing sets according to the specified TrainingSize percentage.

MakekFolds(self, sample, folds, shuffle=True, asHashes=False) → dict[str, dict[str, list[str]]]

Partitions the input sample into the given number of k-folds, optionally shuffling the events and returning them as hashes.

MakeDataLoader(self, sample, SortByNodes=False, batch_size=1) → list[torch_geometric.Data], dict[str, list[str]]

Builds the graph data loader from the input sample, optionally sorting the events by their number of nodes.
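
A short sketch of the sampling utilities follows, assuming the input is a dictionary of events keyed by hash; the events container, the chosen percentages, and the unpacking of MakeDataLoader's two return values are illustrative.

    rs = RandomSamplers()

    subset = rs.RandomizeEvents(events, nEvents = 100)          # draw 100 events at random
    split = rs.MakeTrainingSample(events, TrainingSize = 80)    # 80/20 train/test split
    folds = rs.MakekFolds(events, folds = 5, shuffle = True)    # five shuffled k-folds
    loader, hashes = rs.MakeDataLoader(events, SortByNodes = True)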

Optimizer

class Optimizer(inpt)
Inherited-members:

SampleTracer, _Interface, RandomSamplers

Parameters:

inpt (Union[SampleTracer, None]) – Set to None by default; otherwise expects an instance of a class inheriting from SampleTracer.

Start(sample)
Parameters:

sample (Union[SampleTracer, None]) – Set to None by default; otherwise expects an instance of a class inheriting from SampleTracer.
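
A minimal sketch of launching a training session is shown below, where ana stands for any instance of a class inheriting from SampleTracer that already carries the sample and model configuration (the name ana is illustrative).

    op = Optimizer()    # inpt defaults to None
    op.Start(ana)       # hand over the SampleTracer-derived instance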