core(robots): use new fetcher to get robots.txt #12423

Merged: 11 commits merged into master from robots-fetcher on May 4, 2021

Conversation

adamraine (Member):

Fixes #10225

Use the new fetcher to get robots.txt. This way, the page's CSP should not interfere with the request.
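
For context, a minimal sketch of the approach with hypothetical names (fetchResource and its return shape are assumptions for illustration; the real changes live in lighthouse-core/gather/fetcher.js and lighthouse-core/gather/gatherers/seo/robots-txt.js): the gatherer asks the driver's fetcher for /robots.txt over the DevTools Protocol instead of calling fetch() inside the page, where a restrictive connect-src CSP directive can block the request.

const Gatherer = require('../gatherer.js'); // path as in lighthouse-core/gather/gatherers/seo

// Sketch only, not the merged implementation.
class RobotsTxt extends Gatherer {
  /**
   * @param {{driver: Driver, url: string}} passContext
   * @return {Promise<{status: number|null, content: string|null}>}
   */
  async afterPass(passContext) {
    const robotsUrl = new URL('/robots.txt', passContext.url).href;
    // The request is issued over the DevTools Protocol rather than from the
    // page context, so the page's CSP cannot block it the way it can block
    // an in-page fetch().
    return passContext.driver.fetcher.fetchResource(robotsUrl);
  }
}

module.exports = RobotsTxt;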

adamraine requested a review from a team as a code owner (April 28, 2021 21:44).
adamraine requested review from patrickhulce and removed the request for the team (April 28, 2021 21:44).
google-cla bot added the cla: yes label (Apr 28, 2021).
Comment on lines +161 to +164
return {
stream: networkResponse.resource.success ? (networkResponse.resource.stream || null) : null,
status: networkResponse.resource.httpStatusCode || null,
};
adamraine (Member, Author):

I used null to match the expectations of the RobotsTxt artifact. I think null makes more sense than undefined, but it leads to some ugly code like this.
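
For reference, the stream in the snippet above is a CDP IO stream handle; a fetcher can drain it with the IO.read and IO.close protocol commands roughly like this (a sketch: readIOStream is a made-up helper name, and sendCommand stands in for Lighthouse's protocol wrapper):

/**
 * Read a CDP IO stream handle to completion and return its contents.
 * Sketch only, not Lighthouse source.
 * @param {object} session Protocol session exposing sendCommand(method, params).
 * @param {string} handle IO stream handle, e.g. from Network.loadNetworkResource.
 * @return {Promise<string>}
 */
async function readIOStream(session, handle) {
  let content = '';
  let eof = false;
  while (!eof) {
    // IO.read returns the next chunk and whether the stream is exhausted.
    const response = await session.sendCommand('IO.read', {handle});
    content += response.base64Encoded ?
      Buffer.from(response.data, 'base64').toString('utf-8') :
      response.data;
    eof = response.eof;
  }
  await session.sendCommand('IO.close', {handle});
  return content;
}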

patrickhulce (Collaborator):

I have had 3 different versions of a comment, with varying levels of concern, written here in various tabs that I've since lost, but at this point it just SGTM 😆

Member:

What are the failure cases? ERR_BLOCKED_BY_CLIENT? Is success set by netError and error status codes, or just netError? If it's just netError, maybe we should mimic real fetch() and throw in that case?

patrickhulce (Collaborator):

If it's just netError, maybe we should mimic real fetch() and throw in that case?

In one iteration of my comment, I discuss with my lost tab that it seems odd we would throw an exception rather than issue a warning for a notApplicable case. It's not a bug, but rather a known edge case, so the new null behavior makes more sense. That does suggest we actually make it a warning in the RobotsTxt audit.

I'm fine with maintaining the status quo as well and throwing if others prefer that.

Member:

Yeah, I was thinking in terms of the fetcher on its own. The gatherer could then do as it saw fit with the response.

adamraine (Member, Author):

success is set to false for error status codes, and there is no stream handle to use if content is served with an error status code.

Member:

there is no stream handle to use if content is served with an error status code.

That seems like a CDP bug, but probably not one worth pursuing until we have a reason to want to look at response bodies from requests that 404 or whatever :)

I think it could still make sense for the fetcher to return:

  • good statusCode: Promise.resolve({status: number, content: string})
  • bad statusCode: Promise.resolve({status: number, content: null})
  • netError: Promise.reject(new Error(netErrorName))

(and e.g. the RobotsTxt gatherer could catch the error and augment the artifact with a failure reason to add to the audit explanation string)

but I don't think this is critical today, so I'm fine with this version as well.
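
A sketch of that proposed contract, built on the same Network.loadNetworkResource CDP command the snippet above uses, and reusing the readIOStream helper sketched earlier (names and the exact netError check are illustrative, not the merged code):

// Sketch of the proposed fetcher contract, not the merged implementation.
async function fetchResource(session, frameId, url) {
  const {resource} = await session.sendCommand('Network.loadNetworkResource', {
    frameId,
    url,
    options: {disableCache: true, includeCredentials: true},
  });

  // Net error (DNS failure, blocked request, ...): reject, mimicking fetch().
  if (!resource.success && resource.httpStatusCode === undefined) {
    throw new Error(resource.netErrorName || 'Unknown network error');
  }
  // Bad status code: resolve with the status but no content (no stream handle
  // is provided for error responses).
  if (!resource.success || !resource.stream) {
    return {status: resource.httpStatusCode || null, content: null};
  }
  // Good status code: read the body out of the IO stream handle.
  const content = await readIOStream(session, resource.stream);
  return {status: resource.httpStatusCode || null, content};
}

The RobotsTxt gatherer could then try/catch the call and fold netErrorName into the audit's explanation string, per the parenthetical above.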

Review thread on lighthouse-core/gather/gatherers/seo/robots-txt.js (outdated, resolved).
patrickhulce (Collaborator) left a review:

Nothing blocking here, LGTM, nice work! :)

Does this method trigger the download UI if the MIME type isn't set, like the previous fetching approach did? I bet that's much more common for .txt than for js/json files, and it's something to be aware of and/or prioritize fixing on the DevTools side.


Other review threads, resolved:
  • lighthouse-core/gather/gatherers/seo/robots-txt.js (outdated)
  • lighthouse-core/gather/fetcher.js
  • lighthouse-core/gather/fetcher.js (outdated)
  • lighthouse-core/gather/gatherers/seo/robots-txt.js (outdated)
adamraine (Member, Author):

Does this method trigger the download UI if the MIME type isn't set, like the previous fetching approach did?

No

adamraine (Member, Author):

https://github.com/GoogleChrome/lighthouse/pull/12423/checks?check_run_id=2470752493

We're getting a diagnostic message "Getting browser version" in the DevTools integration tests. Does that happen for every call to Browser.getVersion?

adamraine merged commit 03adece into master (May 4, 2021).
adamraine deleted the robots-fetcher branch (May 4, 2021 15:11).

Successfully merging this pull request may close this issue: Lighthouse unable to download robots.txt (#10225).