No nodes created by this plugin #332

Open
nanorepublica opened this issue Jul 3, 2019 · 11 comments · Fixed by #334 · May be fixed by #339
Labels: bug (Something isn't working), question (Further information is requested)

@nanorepublica

Hi there,

I am trying to use this plugin with a private S3 bucket to download (& display) about 350 photos. However, I am currently getting the following warning message whenever I run gatsby develop.

...
warning Failed to process remote content https://bucket.s3.amazonaws.com/...
...

warning The gatsby-source-s3-image plugin has generated no Gatsby nodes. Do you need it?

My first impression was that I had a permissions problem, but some searching around turned up this issue comment (gatsbyjs/gatsby#12848 (comment)) on the main Gatsby repo, which points to a problem between lodash and promises.

I fully accept that it is likely my code/config that is the issue, so any insight into what I might be doing wrong would be appreciated.

My package.json is as follows:

  "dependencies": {
    "@fortawesome/fontawesome-svg-core": "^1.2.19",
    "@fortawesome/free-solid-svg-icons": "^5.9.0",
    "@fortawesome/react-fontawesome": "^0.1.4",
    "boxpack": "^0.1.0",
    "dotenv": "^8.0.0",
    "gatsby": "^2.4.2",
    "gatsby-image": "^2.0.41",
    "gatsby-plugin-manifest": "^2.1.1",
    "gatsby-plugin-offline": "^2.1.0",
    "gatsby-plugin-react-helmet": "^3.0.12",
    "gatsby-plugin-remote-images": "^1.0.3",
    "gatsby-plugin-sharp": "^2.0.36",
    "gatsby-source-filesystem": "^2.0.33",
    "gatsby-source-s3": "^0.0.0",
    "gatsby-source-s3-image": "^1.4.16",
    "gatsby-transformer-sharp": "^2.1.19",
    "potpack": "^1.0.1",
    "prop-types": "^15.7.2",
    "react": "^16.8.6",
    "react-dom": "^16.8.6",
    "react-helmet": "^5.2.1"
  }
@jessestuart
Owner

Hey @nanorepublica! Sorry to hear you’re having trouble. Happy to help debug: I regularly test this source plugin against both private S3 buckets for my prod site and a self-hosted Minio bucket when developing locally (just to speed things up a bit), plus I make liberal use of lodash / promises etc., so I’m confident we can get this working for you :)

Here’s what jumps out at me: how are you authenticating to your private bucket? I don’t see the aws-sdk listed in your dependencies, and IMO it’s going to be the easiest way to get you authenticated on AWS’ side. I have a few lines in my site’s gatsby-config.js that set this up — something like:

const AWS = require('aws-sdk')
AWS.config.update({
  accessKeyId: process.env.ACCESS_KEY_ID,
  secretAccessKey: process.env.SECRET_KEY_ID,
})
// ^ This is the key part!
module.exports = {
  plugins: [
    // …
    {
      resolve: 'gatsby-source-s3-image',
      options: { bucketName: 'my-private-bucket' },
    },
  ],
}

You can read up on other AWS authentication options in their docs, but I’ve never had any problems with the AWS.config.update({ /* … */ }) method. [Side note: you’ll need to ensure that the IAM credentials you use have the proper permissions to access the bucket.]
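For completeness, those env vars typically come from a `.env` file loaded via `dotenv` (which I see is already in your dependency list) at the very top of `gatsby-config.js`; the names match the snippet above and the values below are placeholders, not real credentials:

```shell
# gatsby-config.js, first line:
#   require('dotenv').config()
#
# .env at the project root (make sure it's gitignored):
ACCESS_KEY_ID=AKIAXXXXXXXXXXXXXXXX
SECRET_KEY_ID=xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
```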

Hope this helps — let me know if this gets you going, so I can update the documentation to be more explicit re: authentication requirements. If not: it’d be helpful to see the relevant portion(s) of your gatsby-config.js.

Cheers,
-JS

@jessestuart jessestuart added bug Something isn't working question Further information is requested labels Jul 5, 2019
@jessestuart jessestuart self-assigned this Jul 5, 2019
@nanorepublica
Author

I do have that in my config (see below), but no luck :(

const AWS = require('aws-sdk');

AWS.config.update({
  accessKeyId: process.env.AWS_ACCESS_KEY_ID,
  secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY,
});

module.exports = {
  siteMetadata: {
    title: '365 mosaic',
    description: 'Website to show off a 365 photo project',
    author: '@nanorepublica',
  },
  plugins: [
    'gatsby-plugin-react-helmet',
    {
      resolve: 'gatsby-source-s3-image',
      options: {
        bucketName: 'william-365',
        protocol: 'https',
      },
    },

It would seem that createRemoteFileNode is the function that throws the warning message I described.

Could you share an example of the AWS IAM policy for your user? That is the only other thing I can see being a configuration issue.

@nanorepublica
Author

A quick update from me: it is definitely a permissions issue. Looking at the package.json from your site, I realised I had an outdated Gatsby version. Updating it led to a very clear HTTP 403 error message for my images.

It would be great to see an example of an IAM S3 policy for a user.

jessestuart added a commit that referenced this issue Jul 9, 2019
Accounts for AWS’ requirement that buckets created outside of `us-east-1` **post** 2019-03-20 cannot be referenced via the `s3.amazonaws.com` virtual host.

(Hopefully?) fixes #332.

> You can use this client [s3.amazonaws.com] to create a bucket in any AWS
> Region that was launched until March 20, 2019. To create a bucket in
> Regions that were launched after March 20, 2019, you must create a
> client specific to the Region in which you want to create the bucket.
> For more information about enabling or disabling an AWS Region, see AWS
> Regions and Endpoints in the AWS General Reference.
> [...]
> Buckets created in Regions launched after March 20, 2019 are not
> reachable via the https://bucket.s3.amazonaws.com naming scheme.

Source: https://docs.aws.amazon.com/AmazonS3/latest/dev/UsingBucket.html

Signed-off-by: Jesse Stuart <hi@jessestuart.com>
@jessestuart jessestuart reopened this Jul 9, 2019
jessestuart pushed a commit that referenced this issue Jul 9, 2019
# [1.5.0](v1.4.22...v1.5.0) (2019-07-09)

### Features

* **aws:** Refactor S3 logic for new (2019) region handling. ([#334](#334)) ([b8176b9](b8176b9)), closes [#332](#332)
@jessestuart
Owner

Dope, that helps to clarify the issue.

Not sure if this affects you (I noticed your profile shows you’re UK based), but I did some investigating over the weekend and discovered a bug in how the AWS SDK handles S3 buckets created outside us-east-1 after 20 March 2019 (I kid you not). I did some testing with a freshly created bucket in eu-west-2, and the fix seems to be working and backwards compatible.
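In other words, the reachable hostname now depends on the bucket’s region. Here’s a quick sketch of the naming rule (the function name is mine, just for illustration; it’s not part of the plugin’s API):

```javascript
// Virtual-hosted-style S3 object URL. Per AWS' docs, buckets in regions
// launched after 2019-03-20 are only reachable via the region-specific
// hostname; older buckets also answer on the legacy global
// `bucket.s3.amazonaws.com` host.
function s3ObjectUrl(bucket, region, key, protocol = 'https') {
  const host =
    region === 'us-east-1'
      ? `${bucket}.s3.amazonaws.com` // legacy global endpoint
      : `${bucket}.s3.${region}.amazonaws.com`; // region-specific endpoint
  return `${protocol}://${host}/${key}`;
}
```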

Re: the S3 policy 403 errors: yeah, I’ve seen those too while testing. Take a look at AWS’ policy generator, a bare-bones webapp that generates the policy JSON from a few inputs. Here’s an example policy that I got working for a Gatsby site using an EU-based bucket:

{
    "Version": "2012-10-17",
    "Id": "Policy1562526525963",
    "Statement": [
        {
            "Sid": "Stmt1562526522757",
            "Effect": "Allow",
            "Principal": "*",
            "Action": "s3:GetObject",
            "Resource": "arn:aws:s3:::js-london-test/*"
        }
    ]
}

(Screenshot of where this is configured in the AWS console.)

Note: the wildcard principal effectively makes the bucket’s objects publicly accessible to anyone with the URL (hence no more 403s); there’s definitely a way to lock this down (e.g., specifying an individual IAM user as the principal), but that’s not something I’ve tested, since in all my use cases the images end up public regardless.



Hope this helps! Keep me posted on how you get on with this.

@robinmetral

robinmetral commented Jul 28, 2019

> Note: the wildcard principal effectively makes the bucket object publicly accessible, given the URL (thus no more 403’s); there’s definitely a way to lock this down (e.g., specifying an individual IAM user as principal) but that’s not something I’ve tested since in all my use cases the images end up public regardless.

Well, if you set the Principal to "*" you don't even need this bit (because the bucket is public):

const AWS = require('aws-sdk')
AWS.config.update({
  accessKeyId: process.env.ACCESS_KEY_ID,
  secretAccessKey: process.env.SECRET_KEY_ID,
})
// ^ This is the key part!

I tried setting the principal in my bucket policy to my user, but I keep getting 403s. In my case I also don't mind my bucket being public, but it would still be nice to document how to keep a bucket private while working with this plugin 🙂

@nanorepublica
Author

@robinmetral I have just opened PR #339 which should allow private buckets to be used. It is still a work in progress right now, but the basics look to be working for me.

@rahul-nath
Copy link

I really want to use this plugin (or any working plugin to source S3), but I've been trying for a month and can't get past the issue here.

I'm located in New York, my bucket is private, and I am seeing `warning The gatsby-source-s3-image plugin has generated no Gatsby nodes.`

Correspondingly, when I try a page query on the node supposedly created, I see the error: Cannot query field "allS3Image" on type "Query".

I update my AWS configuration (even though it's already configured in my gatsby-browser file) like this within gatsby-config.js:

const AWS = require('aws-sdk')
AWS.config.update({
  accessKeyId: process.env.GATSBY_AWS_ACCESS_KEY_ID,
  secretAccessKey: process.env.GATSBY_AWS_SECRET_ACCESS_KEY,
})

My query in pages/index.js looks like:

export const getAllPerformers = graphql`
query MyQuery {
  allS3Image {
    edges {
      node {
        Key
        Url
      }
    }
  }
}
`

And I don't see allS3Image listed under nodes in GraphiQL.

I include it directly in the plugins within gatsby-config.js:

  plugins: [
    {
      resolve: `gatsby-source-filesystem`,
      options: {
        name: `images`,
        path: path.join(__dirname, `src`, `images`),
      },
    },
    `gatsby-plugin-sharp`,
    `gatsby-transformer-sharp`,
    {
      resolve: `gatsby-plugin-layout`,
      options: {
        component: require.resolve(`./src/layouts/baseLayout.js`),
      },
    },
    {
      resolve: 'gatsby-source-s3-image',
      options: {
        bucketName: 'app-images-dev',
        protocol: 'http', // [optional] Defaults to `https`.
      },
    },
  ]

I'm using the latest version of this plugin, and the latest version of Gatsby.

@rahul-nath

rahul-nath commented May 21, 2020

I'm not using TypeScript -- could that be the issue? I have also tried using http vs. https (working on localhost)

@nanorepublica
Author

Hi @rahul-nath,

I was working on the PR above, but other priorities (and a Gatsby issue I couldn't get my head around) have prevented me from continuing to work on it.
Additionally, I realised two things:

  1. This source plugin does not work with private buckets, as can be seen from this issue.
  2. No source plugin could fully support a private S3 bucket anyway, in the sense that the resulting Gatsby site would not be able to use the S3 URLs unless the objects/bucket were public.

The idea behind the PR above was to generate temporary URLs during the build process so the source plugin could fetch each image, do the local processing, and then present the data to Gatsby.
There are two issues with this approach:

  1. The URLs would eventually expire, or the expiry would be so long that they are essentially public.
  2. The processed images would be public anyway, defeating the purpose of keeping the bucket private.

Therefore I would suggest crafting a bucket policy that makes public the images you want to pull from your S3 bucket. For an initial bucket policy, see the comment above (#332 (comment)).

@robinmetral

robinmetral commented May 21, 2020

@rahul-nath lucky timing, I'm actually building private bucket support into @robinmetral/gatsby-source-s3 right now (PR at gatsby-uc/gatsby-source-s3#43, now merged and available in @robinmetral/gatsby-source-s3@2.0.1 🎉).

I agree with @nanorepublica above that private bucket support is not a huge priority, since the images will likely be publicly accessible online anyway, but it's still a nice feature, just because S3 buckets are private by default; supporting them removes one config step.

For context, I made this other S3 source plugin because I needed it, and both other community plugins are not actively maintained anymore. Happy to help with any issues over there 🙂

@rahul-nath

rahul-nath commented May 22, 2020

@robinmetral I actually started using your plugin yesterday! I really like its simplicity, and I haven't had any trouble with public images.
