google cloud storage - gsutil rsync with gzip compression


I'm hosting publicly available static resources in a Google Storage bucket, and I want to use the gsutil rsync command to sync our local version to the bucket, saving bandwidth and time. Part of our build process is to pre-gzip these resources, but gsutil rsync has no way to set the Content-Encoding header. That means we must run gsutil rsync, then run gsutil setmeta to set the headers on all of the gzipped file types, which leaves the bucket in a bad state until the headers are set. Another option is to use gsutil cp, passing the -z option, but that requires re-uploading the entire directory structure every time, and since the directory includes a lot of image files and other non-gzipped resources, it wastes time and bandwidth.
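The two-step workaround described above might look like the following sketch (the bucket name gs://your-bucket and the *.js pattern are hypothetical placeholders); note the window between the two commands during which clients receive raw gzip bytes without the header:

```shell
# Non-atomic two-step approach (hypothetical bucket name and file pattern).
# 1. Sync the pre-gzipped files; they are uploaded without Content-Encoding.
gsutil rsync -r source-dir gs://your-bucket
# 2. Fix up the headers afterwards; until this completes, clients fetching
#    the gzipped objects get no Content-Encoding: gzip header.
gsutil setmeta -h "Content-Encoding:gzip" gs://your-bucket/**/*.js
```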

Is there an atomic way to accomplish the rsync and set the proper Content-Encoding headers?

Assuming you're starting with gzipped source files in source-dir, you can do:

gsutil -h content-encoding:gzip rsync -r source-dir gs://your-bucket 

Note: if you do this and then run rsync in the reverse direction, it will decompress and copy the objects back down:

gsutil rsync -r gs://your-bucket source-dir  

which may not be what you want to happen. Basically, the safest way to use rsync is to synchronize the objects as-is between source and destination, and not to try to set content encodings on the objects.
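For completeness, here is a minimal sketch of the pre-gzip build step the answer assumes: each asset is compressed in place but keeps its original filename (the build/ directory and app.js are made-up names), so that rsync with -h content-encoding:gzip uploads correctly named, already-compressed objects:

```shell
# Pre-gzip assets in place, keeping original filenames (hypothetical build/ dir).
mkdir -p build
printf 'console.log("hi");\n' > build/app.js   # sample asset
for f in build/*.js; do
  gzip -9 "$f"       # writes "$f.gz" and removes "$f"
  mv "$f.gz" "$f"    # restore the original name; contents are now gzipped
done
```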

