You can use Python, e.g. from a Cloud Function:
from google.cloud import storage
from zipfile import ZipFile, is_zipfile
import io

def zipextract(bucketname, zipfilename_with_path):
    storage_client = storage.Client()
    bucket = storage_client.get_bucket(bucketname)

    destination_blob_pathname = zipfilename_with_path

    # Pull the whole zip into memory as a file-like object
    blob = bucket.blob(destination_blob_pathname)
    zipbytes = io.BytesIO(blob.download_as_bytes())  # download_as_bytes replaces the deprecated download_as_string

    if is_zipfile(zipbytes):
        with ZipFile(zipbytes, 'r') as myzip:
            for contentfilename in myzip.namelist():
                contentfile = myzip.read(contentfilename)
                # Write each member back to the bucket, nested under the zip's own path
                blob = bucket.blob(zipfilename_with_path + "/" + contentfilename)
                blob.upload_from_string(contentfile)
zipextract("mybucket", "path/file.zip") # if the file is gs://mybucket/path/file.zip
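If you want this to run automatically whenever a zip lands in the bucket, you can wrap it in a Cloud Function entry point. A minimal sketch, assuming a 1st-gen background function deployed with a google.storage.object.finalize trigger (the function name zip_gcs_trigger is mine, just for illustration):

# Hypothetical trigger wrapper, not part of the answer above.
def zip_gcs_trigger(event, context):
    # Storage trigger events carry the bucket and object name
    if event['name'].endswith('.zip'):
        zipextract(event['bucket'], event['name'])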
Django-storages has a backend for Google Cloud Storage; it isn't documented, as I realised looking through the repo, but I got it working with this setup:
DEFAULT_FILE_STORAGE = 'storages.backends.gs.GSBotoStorage'
GS_ACCESS_KEY_ID = 'YourID'
GS_SECRET_ACCESS_KEY = 'YourKEY'
GS_BUCKET_NAME = 'YourBucket'
STATICFILES_STORAGE = 'storages.backends.gs.GSBotoStorage'
To get YourKEY and YourID you should create keys in the Settings tab, under Interoperability.
Hope it helps and you don't have to learn it the hard way :)
Ah, in case you haven't installed them yet, the dependencies are:
pip install django-storages
pip install boto
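Once those settings are in place, Django's storage API writes straight to the bucket. A minimal sketch to verify it works (the 'uploads/hello.txt' path is mine, just for illustration):

from django.core.files.base import ContentFile
from django.core.files.storage import default_storage

# default_storage now resolves to GSBotoStorage, so this writes to YourBucket
path = default_storage.save('uploads/hello.txt', ContentFile(b'hello'))
print(default_storage.url(path))  # URL served from Cloud Storage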
The solution, I believe, is to use the file.createWriteStream functionality that the bucket.upload function wraps in the Google Cloud Node SDK.
I've got very little experience with streams, so try to bear with me if this doesn't work right off.
First of all, we need to take the base64 data and drop it into a stream. For that, we're going to include the stream library, create a buffer from the base64 data, and add the buffer to the end of the stream.
var stream = require('stream');
var bufferStream = new stream.PassThrough();
bufferStream.end(Buffer.from(req.body.base64Image, 'base64'));
More on decoding base64 and creating the stream.
We're then going to pipe the stream into a write stream created by the file.createWriteStream function.
// Newer versions of the SDK export a Storage class instead of a callable;
// older versions used require('@google-cloud/storage')({ ... }) directly.
var { Storage } = require('@google-cloud/storage');
var gcs = new Storage({
  projectId: 'grape-spaceship-123',
  keyFilename: '/path/to/keyfile.json'
});

// Define bucket.
var myBucket = gcs.bucket('my-bucket');
// Define file & file name.
var file = myBucket.file('my-file.jpg');

// Pipe the 'bufferStream' into the write stream returned by 'file.createWriteStream'.
bufferStream.pipe(file.createWriteStream({
    metadata: {
      contentType: 'image/jpeg',
      metadata: {
        custom: 'metadata'
      }
    },
    public: true,
    validation: "md5"
  }))
  .on('error', function(err) {
    // Handle the upload error, e.g. log it or respond with a 500.
  })
  .on('finish', function() {
    // The file upload is complete.
  });
Info on bucket.upload, the File docs, file.createWriteStream, and the bucket.upload method code in the Node SDK.
So the way the above code works is to define the bucket you want to put the file in, then define the file and the file name; we don't set upload options at this point. We then pipe the bufferStream variable we just created into the file.createWriteStream method we discussed before, and it's in its options that we define the metadata and anything else you want to apply. It was very helpful to look directly at the Node SDK code on GitHub to figure out how bucket.upload breaks down into file.createWriteStream, and I recommend you do so as well. Finally, we attach a couple of events for when the upload finishes and when it errors out.