Set unlimited depth within the domain and reduce verbosity
precondition committed Jan 13, 2025
1 parent c57d13d commit f4f0551
Showing 1 changed file with 5 additions and 5 deletions: .github/workflows/broken-links-crawler.yml
@@ -18,10 +18,10 @@ jobs:
        id: check-broken-links
        uses: ScholliYT/Broken-Links-Crawler-Action@<version>
        with:
-         # We would need a high `max_depth` for the crawler to naturally find all the keymapdb pages but we do not want the crawler to go deep
-         # in unrelated websites, so in order to keep the `max_depth` low but the keymapdb page coverage high, we manually list them here.
-         website_url: "https://keymapdb.com,http://keymapdb.com/page/2/,http://keymapdb.com/page/3/,http://keymapdb.com/page/3/,http://keymapdb.com/page/4/,http://keymapdb.com/page/5/,http://keymapdb.com/page/6/,http://keymapdb.com/page/7/"
+         website_url: "https://keymapdb.com"
          exclude_url_prefix: "/assets,https://mechdb.net"
          max_retries: 2
-         max_depth: 3
-         verbose: debug
+         # The crawler stops going deeper once it leaves the keymapdb domain
+         # so it won't accidentally crawl the whole web even if you put -1.
+         max_depth: -1
+         verbose: true
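For reference, the workflow step after this commit would look roughly like the sketch below. Only the `with:` inputs and their comments come from the diff above; the workflow name, trigger, job name, and runner are illustrative assumptions, and the action's version tag is elided in this page capture.

    name: Broken Links Crawler          # assumed workflow name

    on:
      schedule:
        - cron: "0 4 * * *"             # assumed: run daily; the real trigger is not shown in the diff

    jobs:
      check-links:                      # assumed job name
        runs-on: ubuntu-latest          # assumed runner
        steps:
          - id: check-broken-links
            uses: ScholliYT/Broken-Links-Crawler-Action@<version>  # version elided in the capture
            with:
              website_url: "https://keymapdb.com"
              exclude_url_prefix: "/assets,https://mechdb.net"
              max_retries: 2
              # The crawler stops going deeper once it leaves the keymapdb domain
              # so it won't accidentally crawl the whole web even if you put -1.
              max_depth: -1
              verbose: true

The tradeoff behind the change: instead of pinning `max_depth` and hand-listing every paginated URL, the crawl is bounded by the keymapdb.com domain itself, so pages added later (page/8/ and beyond) are covered without touching the workflow, and switching `verbose` from `debug` to `true` trims the log output.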
