Proof of principle: linkcheck exclude matched documents #9894
Conversation
I'm not sure your use case is commonly needed. I feel broken links are not acceptable, even in historical documents. Are there any examples? I need to know why such a feature is needed.
Hmh, I am honestly not sure how widely useful this would be. In my research I found this user on the sphinx mailing list who seems to have another use case: https://www.mail-archive.com/sphinx-users@googlegroups.com/msg04536.html In my case, I am pondering whether to add linkcheck to our CI to ensure all links stay valid. This, however, comes at a cost:
I agree that, ideally, one would ensure that all links stay valid, including those in historical documents (more concretely, this minutes section). However, I feel enforcing the validity of all our historical documents is just not a good use of our very limited (human) resources. What do you think?
I’m unsure about this idea. My initial gut feeling is the same as #9894 (comment): all links should be working.
These links look like they could reasonably be ignored with a pattern. Perhaps a not-so-uncommon use case: a listing of contributors with links to their profile pages.
Hi @francoisfreitag,
Wouldn't
Thanks for clarifying, I misunderstood the intent with the
Because the use case seems uncommon, it’s tempting to refuse to support it to avoid feature creep. OTOH, it has a sort of symmetry with the existing linkcheck ignore options. I’m +0 on adding it. If @tk0miya agrees, I’ll help refine the patch.
Thank you for your clarification. The case of the minutes is a good example. I agree that it makes no sense to maintain their hyperlinks. Okay, let's go forward.
In addition to this, documentation and test cases are needed.
Thanks for the heads up! I will have a look at the docs and tests later this weekend 👍
This should now be ready for a round of reviews!
Otherwise looking pretty good, thanks!
A couple of small things.
This is getting a bit messy :-) You will squash-merge when everything is ready, so I don't have to clean up, right?
Sure can!
From my side, this should be ready now :-)
LGTM!
@francoisfreitag Please merge this if you are also ok!
Work has been busy, I’ll give it a look (and squash the commits) later today or tomorrow. |
Thanks!
Subject: Proof of principle: linkcheck exclude matched documents
Purpose
In our use case, certain parts of the documentation are merely historical records which do not need maintenance. We would benefit from telling linkcheck to ignore these parts of the documentation entirely.
Here I implement a system modelled after linkcheck_anchors_ignore. The new option, linkcheck_documents_ignore, ignores links in documents that match one or more user-given document exclusion patterns.

I would be interested to hear whether you would consider merging this into the main repo, or whether building a custom builder for our use case is a better idea. This is at the moment merely a proof of principle; if you plan to move forward with this, I will finish it up.