Split dataframe into relatively even chunks according to length
You can take the floor division of a range of integers up to the number of rows in the dataframe, and use the result as a groupby key, splitting the dataframe into fixed-size chunks (the last chunk holds the remainder):
import numpy as np

n = 400  # chunk row size; `test` is the dataframe to split (1111 rows, 2 columns here)
for g, df in test.groupby(np.arange(len(test)) // n):
    print(df.shape)
# (400, 2)
# (400, 2)
# (311, 2)
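To see why this produces fixed-size chunks, it helps to look at the key array itself: floor-dividing each row position by n maps the first n positions to 0, the next n to 1, and so on, so groupby collects consecutive runs of n rows. A small self-contained sketch, using a toy 10-row frame and n = 4 purely for illustration:

```python
import numpy as np
import pandas as pd

df = pd.DataFrame({"a": range(10), "b": range(10)})  # toy 10-row frame
n = 4

keys = np.arange(len(df)) // n  # one group key per row
print(keys.tolist())
# [0, 0, 0, 0, 1, 1, 1, 1, 2, 2]

# grouping on these keys yields chunks of at most n rows
shapes = [chunk.shape for _, chunk in df.groupby(keys)]
print(shapes)
# [(4, 2), (4, 2), (2, 2)]
```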
A more Pythonic way to break a large dataframe into chunks with a fixed number of rows is a list comprehension over slice boundaries:
n = 400  # chunk row size
list_df = [test.iloc[i:i + n] for i in range(0, test.shape[0], n)]
[df.shape for df in list_df]
Output:
[(400, 2), (400, 2), (311, 2)]
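If you need this in more than one place, the list comprehension can be wrapped in a small helper; `chunk_df` below is a hypothetical name, not a pandas API, and the 1111-row frame is constructed just to mirror the shapes above:

```python
import pandas as pd

def chunk_df(df, n):
    """Split df into consecutive chunks of at most n rows (hypothetical helper)."""
    return [df.iloc[i:i + n] for i in range(0, len(df), n)]

# usage on a toy 1111-row, 2-column frame
test = pd.DataFrame({"x": range(1111), "y": range(1111)})
print([c.shape for c in chunk_df(test, 400)])
# [(400, 2), (400, 2), (311, 2)]
```

Each chunk is a view-like slice of the original frame; call `.copy()` on a chunk before mutating it to avoid pandas' SettingWithCopy warnings.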