Is it possible to have a dynamic batch size in Keras?
It is possible if you train in a loop rather than with a single call to fit. An example:
from random import shuffle

dataSlices = [(0, 104), (104, 186), (186, 218)]

for epoch in range(10):
    shuffle(dataSlices)
    for start, end in dataSlices:
        # each slice produces a batch of a different size
        x, y = X[start:end, :], Y[start:end, :]
        model.fit(x, y, epochs=1, batch_size=x.shape[0])
        # or, as suggested by Daniel Moller:
        # model.train_on_batch(x, y)
This assumes your data are 2-D NumPy arrays. The idea can be further expanded by using fit_generator() in place of the for loop, if you so choose (see the docs).
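To illustrate the generator variant, here is a minimal sketch of a generator that yields variable-size batches from the same slice boundaries. The data arrays and their shapes are hypothetical, chosen only to match the slice indices above; you would pass the generator to model.fit_generator (or model.fit in newer Keras versions) with steps_per_epoch=len(dataSlices).

```python
import numpy as np
from random import shuffle

def variable_batch_generator(X, Y, slices):
    """Yield (x, y) batches of varying size, looping forever as Keras expects."""
    while True:
        shuffle(slices)  # reshuffle the batch order each epoch
        for start, end in slices:
            yield X[start:end], Y[start:end]

# Hypothetical data sized to match the slice boundaries above
X = np.random.rand(218, 4)
Y = np.random.rand(218, 2)
dataSlices = [(0, 104), (104, 186), (186, 218)]

gen = variable_batch_generator(X, Y, dataSlices)
# model.fit_generator(gen, steps_per_epoch=len(dataSlices), epochs=10)
```

Each pass through the three slices constitutes one epoch, with batch sizes of 104, 82, and 32 respectively.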