How to upload a large file into GCP Cloud Storage?
This happens because the array returned by Files.readAllBytes is larger than the maximum allowed size: Java arrays are capped at Integer.MAX_VALUE elements, so Files.readAllBytes fails with an OutOfMemoryError for files of roughly 2 GB or more.
One workaround is to split the file into multiple byte-array chunks, upload them as separate objects to the bucket, and join them with the gsutil compose command.
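As a sketch of that splitting step (the class and method names here are illustrative, and the chunk size is up to you), the file can be cut into byte-array pieces locally before each piece is uploaded as its own object:

```java
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.List;

public class FileSplitter {

    // Cuts the source file into byte-array chunks of at most chunkSize bytes.
    // Each chunk can then be uploaded as its own object (e.g. name-000, name-001, ...)
    // and recombined server-side with `gsutil compose`.
    static List<byte[]> splitIntoChunks(Path source, int chunkSize) throws IOException {
        List<byte[]> chunks = new ArrayList<>();
        try (InputStream in = Files.newInputStream(source)) {
            byte[] buffer = new byte[chunkSize];
            int read;
            while ((read = in.read(buffer)) > 0) {
                // Copy only the bytes actually read into a right-sized chunk.
                byte[] chunk = new byte[read];
                System.arraycopy(buffer, 0, chunk, 0, read);
                chunks.add(chunk);
            }
        }
        return chunks;
    }
}
```

Note that gsutil compose works on objects already in the bucket, and composite objects are limited to 32 source components per compose call.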
Version of storage used:
<dependency>
    <groupId>com.google.cloud</groupId>
    <artifactId>google-cloud-storage</artifactId>
    <version>1.63.0</version>
</dependency>
Preparation:
BlobId blobId = BlobId.of(BUCKET_NAME, date.format(BASIC_ISO_DATE) + "/" + prefix + "/" + file.getName());
BlobInfo blobInfo = BlobInfo.newBuilder(blobId).setContentType("application/gzip").build();
uploadToStorage(storage, file, blobInfo);
Main method:
private void uploadToStorage(Storage storage, File uploadFrom, BlobInfo blobInfo) throws IOException {
    // For small files:
    if (uploadFrom.length() < 1_000_000) {
        byte[] bytes = Files.readAllBytes(uploadFrom.toPath());
        storage.create(blobInfo, bytes);
        return;
    }

    // For big files:
    // When content is not available or large (1 MB or more), it is recommended
    // to write it in chunks via the blob's channel writer.
    try (WriteChannel writer = storage.writer(blobInfo)) {
        byte[] buffer = new byte[10_240];
        try (InputStream input = Files.newInputStream(uploadFrom.toPath())) {
            int limit;
            while ((limit = input.read(buffer)) >= 0) {
                writer.write(ByteBuffer.wrap(buffer, 0, limit));
            }
        }
    }
}
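The chunked loop above is not specific to Cloud Storage: WriteChannel extends java.nio's WritableByteChannel, so the same pattern can be exercised locally against any channel before wiring it to storage.writer. A minimal sketch (the class and method names are illustrative):

```java
import java.io.IOException;
import java.io.InputStream;
import java.nio.ByteBuffer;
import java.nio.channels.WritableByteChannel;
import java.nio.file.Files;
import java.nio.file.Path;

public class ChunkedCopy {

    // Streams the source file through a 10 KiB buffer into the destination
    // channel, mirroring the WriteChannel loop used for large uploads.
    // Returns the total number of bytes written.
    static long copyInChunks(Path source, WritableByteChannel dest) throws IOException {
        long written = 0;
        byte[] buffer = new byte[10_240];
        try (InputStream input = Files.newInputStream(source)) {
            int limit;
            while ((limit = input.read(buffer)) >= 0) {
                written += dest.write(ByteBuffer.wrap(buffer, 0, limit));
            }
        }
        return written;
    }
}
```

Because only one buffer's worth of data is held in memory at a time, this approach never allocates an array the size of the whole file, which is what made Files.readAllBytes fail.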