AutoFlux API Reference

NaiveGAflux.AutoFlux.fitMethod
fit(x, y; cb)
fit(data::Tuple; cb, mdir)

Return a population of models fitted to the given data.

The type of model will depend on the shape of x.

The following model types are currently supported:

  • ImageClassifier: selected when x is a 4D array (see the ImageClassification section below).

Keyword cb can be used to supply a callback function which will be called each generation with the current population as input.

Keyword mdir is a directory which will be searched for serialized state from which the optimisation will resume operations. Particularly useful in combination with cb=persist.

source
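As an illustration, a call could look like the sketch below. The array sizes, labels, the "models" directory and the assumption that persist can be imported from NaiveGAflux are made up for the example.

    # Rough usage sketch; sizes, labels and the "models" directory are made up.
    using NaiveGAflux
    import NaiveGAflux: AutoFlux, persist   # persist is assumed to live in NaiveGAflux

    x = randn(Float32, 32, 32, 3, 100)   # 4D input => an image classifier is fitted
    y = rand(["cat", "dog"], 100)        # raw labels

    # Persist each generation and resume from "models" if it already holds state.
    population = AutoFlux.fit((x, y); cb=persist, mdir="models")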

ImageClassification

NaiveGAflux.AutoFlux.fitMethod
fit(c::ImageClassifier, x, y; cb, fitnesstrategy, evolutionstrategy, stopcriterion)

Return a population of image classifiers fitted to the given data.

Arguments

  • c::ImageClassifier: Type of models to train. See ImageClassifier.

  • x: Input data. Must be a 4D array.

  • y: Output data. Can either be a 1D array, in which case y is assumed to be the raw labels (e.g. ["cat", "dog", "cat", ...]), or a 2D array, in which case y is assumed to be one-hot encoded.

  • cb=identity: Callback function. After training and evaluating each generation, but before evolution, cb(population) will be called, where population is the array of candidates. Useful for persistence and plotting.

  • fitnesstrategy::AbstractFitnessStrategy=TrainSplitAccuracy(): Strategy for computing fitness from the data.

  • evolutionstrategy::AbstractEvolutionStrategy=EliteAndTournamentSelection(popsize=c.popsize): Strategy for evolution.

  • stopcriterion: Takes the current population and returns true if fitting shall stop. Candidate fitness is available by calling fitness(c) where c is a member of the population.

source
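For example, a minimal sketch assuming dummy data and assuming that insize/outsize accept the input array size and the number of classes (see ImageClassifier below):

    using NaiveGAflux
    import NaiveGAflux: AutoFlux, persist
    import NaiveGAflux.AutoFlux.ImageClassification: ImageClassifier

    x = randn(Float32, 28, 28, 1, 1000)   # 4D input data
    y = rand(0:9, 1000)                   # raw labels (1D)

    # Hypothetical hyperparameters; the insize/outsize format is an assumption.
    c = ImageClassifier(popsize=16, insize=size(x), outsize=10)
    population = AutoFlux.fit(c, x, y; cb=persist)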
NaiveGAflux.AutoFlux.fitMethod
fit(c::ImageClassifier, fitnesstrategy::AbstractFitness, evostrategy::AbstractEvolution, stopcriterion; cb)

Return a population of image classifiers fitted to the given data.

Lower-level version of fit, for use when fit(c::ImageClassifier, x, y) doesn't cut it.

Arguments

  • c::ImageClassifier: Type of models to train. See ImageClassifier.

  • fitnesstrategy: An AbstractFitness used to compute the fitness metric for a candidate.

  • evostrategy::AbstractEvolution: Evolution strategy to use. Population p will be evolved through p = evolve(evostrategy, p).

  • stopcriterion: Takes the current population and returns true if fitting shall stop. Candidate fitness is available by calling fitness(c) where c is a member of the population.

  • cb=identity: Callback function. After training and evaluating each generation, but before evolution, cb(population) will be called, where population is the array of candidates. Useful for persistence and plotting.

source
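A rough sketch of how the pieces fit together, under the assumption that AbstractEvolution, evolve and fitness are reachable from NaiveGAflux as described above. IdentityEvolution and the 0.9 threshold are invented for illustration, and AccuracyVsSize is documented further down this page; a real run would use one of NaiveGAflux's evolution types.

    using NaiveGAflux
    import NaiveGAflux: AutoFlux, fitness
    import NaiveGAflux.AutoFlux.ImageClassification: ImageClassifier, AccuracyVsSize

    # Dummy one-hot validation batches for the accuracy-based fitness.
    valiter = [(randn(Float32, 28, 28, 1, 32), rand(Float32, 10, 32))]
    fitstrat = AccuracyVsSize(valiter)

    # Minimal stand-in honouring the p = evolve(evostrategy, p) contract;
    # a real run would use one of NaiveGAflux's evolution types instead.
    struct IdentityEvolution <: NaiveGAflux.AbstractEvolution end
    NaiveGAflux.evolve(::IdentityEvolution, pop) = pop

    # Stop once any candidate reaches 90% fitness (fitness(c) per the docs above).
    stopcriterion = pop -> maximum(fitness, pop) >= 0.9

    c = ImageClassifier(popsize=8, insize=(28, 28, 1, 32), outsize=10)
    population = AutoFlux.fit(c, fitstrat, IdentityEvolution(), stopcriterion)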
NaiveGAflux.AutoFlux.ImageClassification.ImageClassifierType
ImageClassifier
ImageClassifier(popinit, popsize, seed)
ImageClassifier(;popsize=50, seed=1, newpop=false, mdir=defaultdir("ImageClassifier"), insize, outsize)

Type to make AutoFlux.fit train an image classifier with an initial population of popsize models, using random seed seed.

Load models from mdir if directory contains models. If persistence is used (e.g. by providing cb=persist to fit) candidates will be stored in this directory.

If newpop is true the process will start with a new population and existing state in the specified directory will be overwritten.

source
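A construction sketch; the directory name, sizes and population size are arbitrary, and the exact insize/outsize format is an assumption:

    import NaiveGAflux.AutoFlux.ImageClassification: ImageClassifier

    # Start a fresh population of 32 models for 32x32 RGB inputs and 10 classes,
    # storing serialized state under "myclassifier".
    c = ImageClassifier(popsize=32, seed=1, newpop=true, mdir="myclassifier",
                        insize=(32, 32, 3, 1), outsize=10)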
NaiveGAflux.AutoFlux.ImageClassification.TrainSplitAccuracyType
struct TrainSplitAccuracy{T} <: AbstractFitnessStrategy
    TrainSplitAccuracy(;split, accuracyconfig, accuracyfitness, trainconfig, trainfitness)

Strategy to train models on a subset of the training data and measure fitness as accuracy on the rest.

The size of the subset used for the accuracy fitness is ceil(Int, split * nobs), where nobs is the size along the last dimension of the input (the number of observations).

Arguments

  • accuracyconfig (default BatchedIterConfig()) determines how to iterate over the accuracy subset.
  • accuracyfitness (default AccuracyVsSize) determines how to measure fitness based on accuracyconfig.
  • trainconfig (default TrainIterConfig()) determines how to iterate over the training subset.
  • trainfitness is a function accepting the iterator produced by trainconfig and the fitness strategy produced by accuracyfitness and returning the AbstractFitness to be used (default TrainThenFitness wrapped in GpuFitness).

source
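For instance, a sketch which holds out 10% of the observations for the accuracy measurement (c, x and y as in the fit sketch above):

    import NaiveGAflux.AutoFlux.ImageClassification: TrainSplitAccuracy

    fitstrat = TrainSplitAccuracy(split=0.1)
    # population = AutoFlux.fit(c, x, y; fitnesstrategy=fitstrat)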
NaiveGAflux.AutoFlux.ImageClassification.TrainAccuracyVsSizeType
struct TrainAccuracyVsSize <: AbstractFitnessStrategy
TrainAccuracyVsSize(;trainconfig, trainfitness)

Produces an AbstractFitness which measures fitness as accuracy on the training data combined with the number of parameters, in the same way as is done for AccuracyVsSize.

Arguments

  • trainconfig (default TrainIterConfig()) determines how to iterate over the training subset.
  • trainfitness is a function accepting the iterator produced by trainconfig and returning the AbstractFitness to be used (default TrainAccuracyFitness wrapped in GpuFitness).

Beware that fitness as accuracy on training data will make evolution favour overfitted candidates.

source
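Sketch (same placeholders as in the fit example above; note the overfitting caveat):

    import NaiveGAflux.AutoFlux.ImageClassification: TrainAccuracyVsSize

    # Fitness is measured on the training data itself; cheaper, but may favour
    # overfitted candidates as noted above.
    fitstrat = TrainAccuracyVsSize()
    # population = AutoFlux.fit(c, x, y; fitnesstrategy=fitstrat)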
NaiveGAflux.AutoFlux.ImageClassification.AccuracyVsSizeFunction
AccuracyVsSize(data, accdigits=2, accwrap=identity)

Produces an AbstractFitness which measures fitness as accuracy on data combined with the number of parameters.

The two are combined so that a candidate a which achieves higher accuracy than a candidate b, when accuracy is rounded to accdigits digits, will always have better fitness.

If the accuracies rounded to accdigits digits are equal, the candidate with fewer parameters gets the higher fitness.

Accuracy part of the fitness is calculated by accwrap(AccuracyFitness(data)).

source
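A sketch of the ranking described above; the validation batches are dummies:

    import NaiveGAflux.AutoFlux.ImageClassification: AccuracyVsSize

    # Dummy one-hot validation batches.
    valiter = [(randn(Float32, 28, 28, 1, 32), rand(Float32, 10, 32))]

    # Accuracy rounded to 3 digits decides first; fewer parameters breaks ties.
    fitnessfun = AccuracyVsSize(valiter, 3)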
NaiveGAflux.AutoFlux.ImageClassification.BatchedIterConfigType
struct BatchedIterConfig{T, V}
BatchedIterConfig(;batchsize=32, dataaug=identity, iterwrap=identity)

Configuration for creating batch iterators from array data.

The function dataiter(s::BatchedIterConfig, x, y) creates an iterator which returns a tuple of batches from x and y respectively.

More specifically, the result of s.iterwrap(zip(s.dataaug(bx), by)) will be returned where bx and by are BatchIterators.

source
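Usage sketch with dummy arrays; whether dataiter can be imported as below is an assumption:

    import NaiveGAflux.AutoFlux.ImageClassification: BatchedIterConfig, dataiter

    x = randn(Float32, 28, 28, 1, 256)
    y = rand(Float32, 10, 256)            # one-hot style targets

    bconf = BatchedIterConfig(batchsize=64)
    for (xbatch, ybatch) in dataiter(bconf, x, y)
        # each iteration yields one batch of 64 observations from x and y
    end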
NaiveGAflux.AutoFlux.ImageClassification.ShuffleIterConfigType
struct ShuffleIterConfig{T, V}
ShuffleIterConfig(;batchsize=32, seed=123, dataaug=identity, iterwrap=identity)

Configuration for creating shuffled batch iterators from array data. Data will be re-shuffled every time the iterator restarts.

The function dataiter(s::ShuffleIterConfig, x, y) creates an iterator which returns a tuple of batches from x and y respectively.

More specifically, the result of s.iterwrap(Iterators.map(((x,y),) -> (s.dataaug(x), y), iter)) will be returned where iter is a BatchIterator over x and y with shuffle=true.

Note that there is no upper bound on how many generations are supported, as the returned iterator cycles the data indefinitely. Use e.g. Iterators.take(itr, cld(nepochs * nbatches_per_epoch, nbatches_per_gen)) to limit it to nepochs epochs.

source
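A sketch of the epoch-limiting pattern mentioned above, expressed through iterwrap (dummy data; the dataiter import is an assumption as before):

    import NaiveGAflux.AutoFlux.ImageClassification: ShuffleIterConfig, dataiter

    x = randn(Float32, 28, 28, 1, 256)
    y = rand(Float32, 10, 256)

    nepochs = 10
    nbatches_per_epoch = cld(size(x, ndims(x)), 32)

    # Cap the otherwise endless shuffled iterator at roughly nepochs epochs.
    sconf = ShuffleIterConfig(batchsize=32,
                              iterwrap=itr -> Iterators.take(itr, nepochs * nbatches_per_epoch))
    iter = dataiter(sconf, x, y)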
NaiveGAflux.AutoFlux.ImageClassification.GlobalOptimiserMutationType
struct GlobalOptimiserMutation{S<:AbstractEvolutionStrategy, F} <: AbstractEvolutionStrategy
GlobalOptimiserMutation(base::AbstractEvolutionStrategy)
GlobalOptimiserMutation(base::AbstractEvolutionStrategy, optfun)

Maps the optimiser of each candidate in a population through optfun (default randomlrscale()).

Basically a thin wrapper for global_optimiser_mutation.

Useful for applying the same mutation to every candidate, e.g. global learning rate schedules which all models follow.

source
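For example, a sketch combining it with the tournament selection documented below (popsize is arbitrary):

    import NaiveGAflux.AutoFlux.ImageClassification: GlobalOptimiserMutation, EliteAndTournamentSelection

    # Apply the same random learning-rate scaling to every candidate on top of
    # elite + tournament selection.
    evostrategy = GlobalOptimiserMutation(EliteAndTournamentSelection(popsize=32))
    # population = AutoFlux.fit(c, x, y; evolutionstrategy=evostrategy)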
NaiveGAflux.AutoFlux.ImageClassification.EliteAndSusSelectionType
struct EliteAndSusSelection <: AbstractEvolutionStrategy
EliteAndSusSelection(popsize, nelites, evolve)
EliteAndSusSelection(;popsize=50, nelites=2, evolve = crossovermutate())

Standard evolution strategy.

Selects nelites candidates to move on to the next generation without any mutation.

Also selects popsize - nelites candidates out of the whole population using SusSelection to evolve by applying random mutation.

Mutation operations are applied both to the model itself (change sizes, add/remove vertices/edges) and to the optimiser (change learning rate and optimiser algorithm).

Finally, models are renamed so that the name of each vertex in the model of candidate i is prefixed with "model" followed by the index i (e.g. "model1", "model2", ...).

source
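Sketch (popsize and nelites are arbitrary):

    import NaiveGAflux.AutoFlux.ImageClassification: EliteAndSusSelection

    # Keep the 4 best candidates untouched and fill the remaining 28 slots via
    # stochastic universal sampling followed by mutation.
    evostrategy = EliteAndSusSelection(popsize=32, nelites=4)
    # population = AutoFlux.fit(c, x, y; evolutionstrategy=evostrategy)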
NaiveGAflux.AutoFlux.ImageClassification.EliteAndTournamentSelectionType
struct EliteAndTournamentSelection <: AbstractEvolutionStrategy
EliteAndTournamentSelection(popsize, nelites, k, p, evolve)
EliteAndTournamentSelection(;popsize=50, nelites=2, k=2, p=1.0, evolve = crossovermutate())

Standard evolution strategy.

Selects nelites candidates to move on to the next generation without any mutation.

Also selects popsize - nelites candidates out of the whole population using TournamentSelection to evolve by applying random mutation.

Mutation operations are determined by evolve and are applied both to the model itself (change sizes, add/remove vertices/edges) and to the optimiser (change learning rate and optimiser algorithm).

Finally, models are renamed so that the name of each vertex in the model of candidate i is prefixed with "model" followed by the index i (e.g. "model1", "model2", ...).

source
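Sketch (the tournament parameters are arbitrary):

    import NaiveGAflux.AutoFlux.ImageClassification: EliteAndTournamentSelection

    # Tournaments of size 3 where the fitter candidate wins with probability 0.8.
    evostrategy = EliteAndTournamentSelection(popsize=32, nelites=2, k=3, p=0.8)
    # population = AutoFlux.fit(c, x, y; evolutionstrategy=evostrategy)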