Images from Azure Blob Storage #889
Replies: 4 comments 3 replies
-
Any insights here?
-
Could you provide more details?
-
My current test case outputs 45 A4 pages. My end goal is a much larger document, presumably 2000+ pages with around 4,000 images of 200–300 KB each. I have modified my code to use temporary storage instead, and to compress the images by 50% with SkiaSharp, since I am forcing smaller images in the document anyway. This allowed the small test case to complete, but generation is rather slow. I could chunk the export and merge the parts afterwards if that would be a better approach.
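The SkiaSharp compression step described above might look roughly like this. This is only a sketch: the 50% scale factor, the JPEG output format, and the quality value of 75 are illustrative assumptions, not values from the original code.

```csharp
using SkiaSharp;

static byte[] CompressImage(byte[] original, float scale = 0.5f, int quality = 75)
{
    // Decode the original image bytes into a bitmap.
    using var bitmap = SKBitmap.Decode(original);

    // Downscale to half the original dimensions (assumed 50% reduction).
    var info = new SKImageInfo(
        (int)(bitmap.Width * scale),
        (int)(bitmap.Height * scale));
    using var resized = bitmap.Resize(info, SKFilterQuality.Medium);

    // Re-encode as JPEG; the quality value here is an assumption.
    using var image = SKImage.FromBitmap(resized);
    using var data = image.Encode(SKEncodedImageFormat.Jpeg, quality);
    return data.ToArray();
}
```

Re-encoding as JPEG at reduced dimensions usually shrinks 200–300 KB photos considerably, at the cost of the decode/encode time per image, which may account for some of the slowdown mentioned above.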
-
For my sample of 45 pages, these are the results of the overall process using both versions; 2023.12.x is faster. v2024.3.6 v2023.12.6
-
I am generating a PDF, at times rather large, which includes a number of dynamic images stored in Azure Blob Storage.
My current approach is to load them up front and reference them in my document generation.
```csharp
var imagesData = new Dictionary<string, byte[]>();
var imageMessages = /* TotalMessages from my model */;
```
and then reference them like this:
```csharp
string imageKey = $"{fileId}/Input/{message.MediaUrl}";
byte[] imageData = imagesData.ContainsKey(imageKey) ? imagesData[imageKey] : null;

if (imageData != null)
{
    innerColumn.Item()
        .Width(200)
        .Image(imageData)
        .FitArea()
        .WithCompressionQuality(ImageCompressionQuality.Medium);
}
else
{
    _logger.LogError($"Unable to load image from path: {imageKey}");
    innerColumn.Item().Text("Image could not be loaded.");
}
```
This works fine locally, but not when published to my Azure App Service: I run out of memory.
What's the best approach here?
I need my method to support long complex pdf documents with dynamic images.
Please help.
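One possible way to reduce peak memory would be to avoid holding every image's bytes in a dictionary, and instead download each blob to a temporary file just before it is needed, passing the file path to QuestPDF. The sketch below assumes a `containerClient` (`BlobContainerClient` from Azure.Storage.Blobs) and reuses the `fileId`, `message`, and `innerColumn` names from the snippets above; error handling and temp-file cleanup are omitted.

```csharp
using Azure.Storage.Blobs;
using QuestPDF.Fluent;
using QuestPDF.Infrastructure;

string imageKey = $"{fileId}/Input/{message.MediaUrl}";
string tempPath = Path.Combine(Path.GetTempPath(), Path.GetRandomFileName());

// Download this one blob to disk instead of accumulating all images in memory.
BlobClient blob = containerClient.GetBlobClient(imageKey);
await blob.DownloadToAsync(tempPath);

// Pass the file path rather than a byte[] to QuestPDF.
innerColumn.Item()
    .Width(200)
    .Image(tempPath)
    .FitArea()
    .WithCompressionQuality(ImageCompressionQuality.Medium);
```

Whether this fully resolves the App Service memory limit depends on how the generated document itself grows, so chunking the export into several smaller PDFs and merging them afterwards may still be worth testing alongside this change.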