Log compression on archival #4803
MichielDeMey started this conversation in Ideas
@MichielDeMey this sounds like a feature request?
-
For some artifact repositories (such as S3 and GCS) we can leverage Gzip compression to reduce the log storage size.
Since text is highly compressible, I believe we can greatly reduce storage size and storage cost.
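To illustrate how much repetitive log text shrinks under Gzip, here is a minimal sketch using only Python's standard library. The sample log line is made up for illustration and is not a measurement of real workflow logs:

```python
import gzip

# A synthetic, highly repetitive log (made-up content for illustration only).
log = ('time="2021-01-01T00:00:00Z" level=info msg="step finished"\n' * 1000)
raw = log.encode("utf-8")

compressed = gzip.compress(raw)
print(len(raw), len(compressed))  # repetitive text compresses dramatically

# The round trip is lossless, so nothing is sacrificed for the savings.
assert gzip.decompress(compressed) == raw
```

Real logs are less uniform than this, but plain-text logs still tend to compress by a large factor.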
Both support Gzip compression; all we need to do is compress the logs before sending them to the storage provider.
We do need to tell the storage provider that the incoming file is Gzip compressed, but most storage SDKs provide a way for you to do this.
It usually comes down to setting the Content-Encoding header along with the correct Content-Type.
As for downloading a Gzipped log file, we should not need to modify any existing code; GCS can automatically decompress the Gzipped file.
Documentation: https://cloud.google.com/storage/docs/transcoding#decompressive_transcoding
As for S3, it should be similar, although I have no practical experience with it.
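As a rough sketch of what the upload side could look like for S3: the helper below compresses the log and builds the arguments for a boto3 put_object call, which accepts ContentEncoding and ContentType parameters. The function and bucket/key names are illustrative, not part of any existing implementation:

```python
import gzip

def build_put_object_args(bucket, key, log_text):
    """Build arguments for an S3 put_object call that uploads a
    gzip-compressed log. Names here are illustrative only."""
    return {
        "Bucket": bucket,
        "Key": key,
        "Body": gzip.compress(log_text.encode("utf-8")),
        # Tell the storage provider the payload is gzip-compressed text.
        "ContentEncoding": "gzip",
        "ContentType": "text/plain; charset=utf-8",
    }

# With boto3 this would then be sent as (hypothetical wiring):
#   s3 = boto3.client("s3")
#   s3.put_object(**build_put_object_args("my-bucket", "logs/main.log", logs))
```

For GCS, the Python client exposes the same idea by setting a blob's content_encoding attribute to "gzip" before uploading the compressed bytes.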
Happy to discuss!