Iterators

NaiveGAflux.RepeatPartitionIterator - Type
RepeatPartitionIterator
RepeatPartitionIterator(base, nrep)

Iterates over iterators, each yielding a subset of nrep elements from base.

Generally useful for training all models in a population with the same data in each evolution epoch.

Tailored for situations where iterating over models is more expensive than iterating over data, for example if candidates are stored in host RAM or on disk and need to be transferred to the GPU for training.

Example training loop:


# iter is a RepeatPartitionIterator wrapping the training data (see the sketch below)
for partiter in iter

    # All models are trained on the same data since each partiter yields
    # the same elements every time it is iterated.
    for model in population
        train!(model, partiter)
    end

    evolvepopulation(population)
end
source
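A minimal construction sketch to go with the loop above; the data, batch size and nrep below are illustrative assumptions, not part of the API:

using NaiveGAflux

# Hypothetical data: 10 features per sample, 256 samples, one label per sample.
features, labels = randn(Float32, 10, 256), rand(0:1, 256)

# Each partition yields 2 batches of 32 samples and yields the same batches
# every time it is iterated, so all models in the population see identical
# data within one evolution epoch.
iter = RepeatPartitionIterator(BatchIterator((features, labels), 32), 2)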
NaiveGAflux.SeedIterator - Type
SeedIterator
SeedIterator(base; rng=rng_default, seed=rand(rng, UInt32))

Iterator which has the random seed of an AbstractRNG as state.

Calls Random.seed!(rng, seed) every iteration so that wrapped iterators which depend on rng will produce the same sequence.

Useful in conjunction with RepeatPartitionIterator and BatchIterator and/or random data augmentation so that all candidates in a generation are trained with identical data.

source
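A rough composition sketch, reusing the illustrative data from above; the keyword values are assumptions for the example:

using NaiveGAflux, Random

rng = MersenneTwister()
features, labels = randn(Float32, 10, 256), rand(0:1, 256)

# BatchIterator shuffles with rng, and SeedIterator reseeds that same rng,
# so every pass over data_iter produces the same shuffled batches.
data_iter = SeedIterator(BatchIterator((features, labels), 32; shuffle=rng); rng=rng, seed=1)
iter = RepeatPartitionIterator(data_iter, 2)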
NaiveGAflux.GpuIterator - Function
GpuIterator(itr)

Return an iterator which sends values from itr to the GPU.

Will often be used automatically when training a model with parameters on the GPU.

source
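A small sketch of explicit use (normally not needed, as noted above), again with illustrative data:

using NaiveGAflux

features, labels = randn(Float32, 10, 256), rand(0:1, 256)

# Batches are moved to the GPU as they are produced
# (typically a no-op when no functional GPU is available).
gpu_iter = GpuIterator(BatchIterator((features, labels), 32))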
NaiveGAflux.BatchIterator - Type
BatchIterator{R, D}
BatchIterator(data, batchsize; shuffle)

Return an iterator which iterates batchsize samples along the last dimension of data, or along the last dimension of each element if data is a Tuple (e.g. (features, labels)).

Will shuffle examples if shuffle is true or an AbstractRNG. Shuffling will be different each time iteration starts (subject to implementation of shuffle(rng,...)).

source
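A brief usage sketch with illustrative array sizes:

using NaiveGAflux

# 10 features per sample, 128 samples, one label per sample.
features, labels = randn(Float32, 10, 128), rand(0:1, 128)

for (x, y) in BatchIterator((features, labels), 32; shuffle=true)
    # x is a 10×32 feature matrix and y holds the corresponding 32 labels.
    @assert size(x, 2) == length(y) == 32
end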
NaiveGAflux.TimedIterator - Type
TimedIterator{F,A,I}
TimedIterator(;timelimit, patience, timeoutaction, accumulate_timeouts, base)

Measures the time between iterations and calls timeoutaction() once this time has exceeded timelimit patience number of times.

Intended use is to quickly abort training of models which take a very long time to train, typically also assigning them a very low fitness in case of a timeout.

By default, calling timeoutaction() will not stop the iteration as this would break otherwise convenient functions like length, collect and map. Let timeoutaction() return TimedIteratorStop to stop iteration.

If accumulate_timeouts is false then counting will reset when time between iterations is shorter than timelimit, otherwise it will not.

source
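A hedged sketch of stopping a slow iterator; the wrapped generator, the time limit (assumed here to be in seconds) and the other settings are illustrative only:

using NaiveGAflux

slow_iter = (begin sleep(0.2); i end for i in 1:10)

titer = TimedIterator(;
          timelimit = 0.1,                         # assumed seconds between iterations
          patience = 2,                            # tolerate two slow steps
          timeoutaction = () -> TimedIteratorStop, # stop iteration on timeout
          accumulate_timeouts = false,
          base = slow_iter)

collect(titer)  # stops early once the limit has been exceeded patience times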