How to upload folders to Google Colab?
The easiest way to do this, if the folder/file is on your local drive:
- Compress the folder into a ZIP file.
- Upload the zipped file to Colab using the upload button in the Files section. Yes, there is a Files section; see the left side of the Colab screen.
- Use this code to extract the file. Note: the file path is the one shown in Colab's Files section.
from zipfile import ZipFile

file_name = 'your_folder.zip'  # path copied from Colab's Files section
with ZipFile(file_name, 'r') as zip_ref:
    zip_ref.extractall()
print('Done')
- Click Refresh in Colab's Files section.
- Access the files in your folder through their file paths, as in the example below.
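A minimal sketch, assuming the archive contained a folder my_folder with a data.csv inside (both names are placeholders for whatever your archive holds):
import pandas as pd

# Read one of the extracted files by the path shown in the Files section
df = pd.read_csv('my_folder/data.csv')
print(df.head())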
Downside: the files are deleted once the runtime ends.
You can reuse most of these steps if your file is on Google Drive; just upload the zipped file to Colab from Google Drive.
You can zip the folder, upload the archive, then unzip it:
!unzip file.zip
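A minimal sketch of that workflow in a single Colab cell (file.zip and the data/ folder are placeholders; google.colab.files opens an upload dialog in the browser):
from google.colab import files

# Opens a file picker; the chosen ZIP is saved to the current working directory
uploaded = files.upload()

# Extract quietly into a subfolder
!unzip -q file.zip -d data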
I suggest not uploading them only to Colab, since you will lose them whenever the runtime restarts (you just need to re-upload them, but that can be an issue on slow connections).
I suggest using the google.colab package to manage files and folders in Colab. Just upload everything you need to your Google Drive, then mount it:
from google.colab import drive
drive.mount('/content/gdrive')
This way, you just need to log in to your Google account through the Google authentication prompt, and you can use your files and folders as if they had been uploaded to Colab.
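As a minimal sketch, a file on Drive can then be read by its mounted path (the folder and file names below are placeholders; newer runtimes mount the Drive root at /content/gdrive/MyDrive, older ones at /content/gdrive/My Drive):
import os

# Drive contents appear under the mount point passed to drive.mount()
data_dir = '/content/gdrive/MyDrive/my_folder'  # placeholder folder name
print(os.listdir(data_dir))

with open(os.path.join(data_dir, 'notes.txt')) as f:  # placeholder file name
    print(f.read())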
EDIT May 2022:
As pointed out in the comments, using Google Drive as storage for a large number of files to train a model is painfully slow, as described here: Google Colab is very slow compared to my PC. A better solution in that case is to zip the files, upload the archive to Colab, and then unzip it using
!unzip file.zip
More unzip options here: https://linux.die.net/man/1/unzip
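If the ZIP itself lives on your Drive, one common variant (sketched below with placeholder paths) is to copy it to the runtime's local disk and unzip it there, so training reads from fast local storage:
from google.colab import drive
drive.mount('/content/gdrive')

# Copy the archive from Drive to the runtime's local disk, then extract it there
!cp '/content/gdrive/MyDrive/file.zip' /content/
!unzip -q /content/file.zip -d /content/data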