
Next Gen Image Creation

This is a small collection of shell scripts that handles a few image and S3 tasks. I am not sure if it will work on non-Mac operating systems, but it should, with slight adjustments to what I have written.

What you need

  1. ImageMagick: brew install imagemagick or the like.

  2. This is optional: hook up an AWS account to your CLI (only needed for the S3 scripts). A minimal setup sketch follows this list.

  3. Node, if you use the utility script GetJSONData.sh. Otherwise, you don't need it.

  4. Run this so you do not have to use sudo, or run sudo if you prefer.

    • cd Scripts && chmod +x * && cd .. && cd Javascript && chmod +x * && cd ..
    • This makes the scripts executable; the scripts do not do anything outside of this project directory. All that happens is image conversion and pushing to S3, as mentioned before.
    • If you use sudo instead, you'll just have to enter your computer's password several times.
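For step 2, hooking the AWS CLI up to your account usually just means running aws configure with an access key pair. A minimal sketch, with placeholder values (not real keys, and not necessarily your region):

```bash
# Interactive prompt for access key, secret key, default region, and output format.
aws configure

# Or set each value non-interactively (placeholder values shown).
aws configure set aws_access_key_id AKIAXXXXXXXXXXXXXXXX
aws configure set aws_secret_access_key "your-secret-access-key"
aws configure set region us-east-1
```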

How to start

  1. Clone this repository and store it wherever is fitting. You can do several things here:

    • Create images in next-gen formats (as Chrome (Blink) and Safari (WebKit) are each optimised to handle specific image types)
      • WebP for Chrome
      • JPEG 2000 (jp2) for Safari
    • Create placeholders for your images (for lazy loading)
    • Push this content to an AWS S3 bucket
  2. After cloning, you have several files to work with. They are all shell scripts; think of them as utility files. Below is a quick rundown of each one.


Something to know (common pitfalls)

  1. Only run the scripts from the root directory.
  2. Other stuff to think of in the future...?

CreateS3Bucket.sh

Creates an S3 bucket with public read access. Public read access means that anyone can access an image from its S3 URL, but people cannot publicly Post, Put, or Delete. Only Get.

Do not put sensitive data in this bucket. AWS generally recommends that people do not grant public read access, as sensitive data such as voter and military information has been publicly exposed this way. This should not matter here, as the bucket is solely for images.
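For reference, the AWS CLI side of this is roughly the following. This is a sketch, not the script's exact contents; the bucket name is a placeholder and the test image path matches the console output described below:

```bash
# Placeholder bucket name; S3 bucket names are globally unique, so pick your own.
BUCKET="somebucketname"

# Create the bucket with a public-read canned ACL.
# (Outside us-east-1 you also need --create-bucket-configuration LocationConstraint=<region>.)
aws s3api create-bucket --bucket "$BUCKET" --acl public-read

# Upload the test image, also publicly readable.
aws s3 cp ./Images/duckJPG.jpg "s3://$BUCKET/duckJPG.jpg" --acl public-read

echo "Check if it works by clicking this url: https://$BUCKET.s3.amazonaws.com/duckJPG.jpg"
```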

What to do before calling it

  • Set up an AWS account and add credentials to your terminal if you have not already. Other than that, think of a good bucket name.

How to call it

  • bash ./Scripts/CreateS3Bucket.sh

Options

  • None

What you'll see in the console

  1. {"Location": "/somebucketname"}
    • Which isn't important, it comes automatically from the AWS CLI.
  2. Then this: upload: Images/duckJPG.jpg to s3://somebucketname/duckJPG.jpg
    • Which again is something from the AWS CLI.
  3. A string that says this: Check if it works by clicking this url: https://somebucketname.s3.amazonaws.com/duckJPG.jpg. Which, when copied into your browser, should show a duck image, as that is the test image. You should take a peek in your AWS console as well.
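If you prefer the terminal to a browser, a quick header check with curl works too (the bucket name below is just the example one; substitute your own):

```bash
# An "HTTP/1.1 200 OK" status line means the object is publicly readable.
curl -I https://somebucketname.s3.amazonaws.com/duckJPG.jpg
```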

What to know

  • You now have this bucket in your AWS account. It's not optimised for security or the like, but it works and is accessible. Best of all, you can now push images to it.

ConvertFiles.sh

Converts a group of images to next-gen formats.

What to do before calling it

  • Add all the images you want to convert to the Images folder in the root directory. That's your main folder for images. After conversion, all WebP images will be in a WebpFiles folder, all JPEG 2000 (JP2) images will be in a JP2Files folder, and placeholders will be in the Placeholders folder.
    • You should also resize your images prior to doing this; the fewer bytes an image contains, the faster it loads for the client, and images usually do not need to be large for high-quality rendering. See the sketch after this list.
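To give an idea of the ImageMagick commands involved, here is a minimal sketch for a single file. It assumes a hypothetical Images/duck.jpg and the folder names mentioned above; the actual script loops over the whole Images folder and may use different options:

```bash
# Assumes ImageMagick is installed (use `magick` instead of `convert` on ImageMagick 7 if needed).
mkdir -p WebpFiles JP2Files Placeholders

# Optional: shrink the source first ("1200x>" only downsizes, never enlarges).
convert Images/duck.jpg -resize '1200x>' Images/duck.jpg

# Next-gen formats: WebP for Chrome, JPEG 2000 for Safari.
convert Images/duck.jpg -quality 80 WebpFiles/duck.webp
convert Images/duck.jpg -quality 80 JP2Files/duck.jp2

# Tiny blurred placeholder for lazy loading.
convert Images/duck.jpg -resize 20x -blur 0x2 Placeholders/duck.jpg
```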

How to call it

  • bash ./Scripts/ConvertFiles.sh

Options

  • None

What you'll see in the console

  • You shouldn't see anything at the moment.

What to know

  • You now have jp2, webp, and placeholder versions of your original images.

MoveImagesToS3.sh

This moves your images to the S3 bucket of your choice.
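Under the hood this is essentially a series of recursive aws s3 cp calls. A minimal sketch, assuming the folders produced by ConvertFiles.sh; the jp2Images and fallback prefixes appear in the console output below, while the other prefix names here are just guesses:

```bash
# Hypothetical bucket name; the real script takes it via bucket="somebucketname".
BUCKET="somebucketname"

# Upload each converted folder under its own prefix, publicly readable.
aws s3 cp ./JP2Files     "s3://$BUCKET/jp2Images"    --recursive --acl public-read
aws s3 cp ./WebpFiles    "s3://$BUCKET/webpImages"   --recursive --acl public-read
aws s3 cp ./Placeholders "s3://$BUCKET/placeholders" --recursive --acl public-read
aws s3 cp ./Images       "s3://$BUCKET/fallback"     --recursive --acl public-read
```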

How to call it

  • bash ./Scripts/MoveImagesToS3.sh bucket="somebucketname"

Options

  • bucket="somebucketname"
    • Required

What you'll see in the console

  • A bunch of lines like this (one per file written to S3): upload: ./duck.jp2 to s3://somebucketname/jp2Images/duck.jp2
    • This comes from the AWS CLI.
  • A message about how to check the images with your URL, like this: This is the url to see the images
  • Then the url: https://somebucketname.s3.amazonaws.com/some-extension
  • And this: some-extension is the path and filename, something like fallback/someImage.jpg.
  • And lastly this: ie. https://somebucketname.s3.amazonaws.com/fallback/someImage.jpg

GetJSONData.sh

This creates a file with the corresponding URLs for your images, for easy reference. No need to use the AWS console.
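For a rough idea of how such a file can be produced with the AWS CLI plus Node (which is why Node is listed as a requirement), here is a sketch. The bucket name is a placeholder and the output format is an assumption; the real script may structure imageUrls.json differently:

```bash
# Placeholder bucket name; the real script takes it via bucket="somebucketname".
BUCKET="somebucketname"

# List every object key in the bucket as JSON, then let Node turn keys into full URLs.
aws s3api list-objects-v2 --bucket "$BUCKET" --query 'Contents[].Key' --output json \
  | BUCKET="$BUCKET" node -e '
    let input = "";
    process.stdin.on("data", chunk => (input += chunk));
    process.stdin.on("end", () => {
      const keys = JSON.parse(input) || [];
      const urls = keys.map(key => "https://" + process.env.BUCKET + ".s3.amazonaws.com/" + key);
      require("fs").writeFileSync("imageUrls.json", JSON.stringify(urls, null, 2));
      console.log("The file has been saved!");
    });
  '
```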

How to call it

  • bash ./Scripts/GetJSONData.sh bucket="somebucketname"

Options

  • bucket="somebucketname"
    • Required

What you'll see in the console

  • The file has been saved! It is stored as imageUrls.json in the home directory.

Additional info

  • Change the scripts to whatever you want! These options are not set in stone. Read more about the AWS S3 CLI or ImageMagick CLI to see additional possibilities.

And now for something completely different
