n8n-nodes-crawlbase
Crawlbase node for n8n. Crawl web pages with native Crawlbase API credentials and the Crawling API — no need to wire a generic HTTP Request node by hand.
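Under the hood, the node wraps Crawlbase's Crawling API for you. As a rough sketch (the token is a placeholder, and the exact request the node builds may include more parameters), the equivalent raw request URL looks like this:

```javascript
// Sketch of the raw Crawling API URL the node constructs for you.
// TOKEN is a placeholder; the target URL must be percent-encoded.
const TOKEN = "YOUR_CRAWLBASE_TOKEN";
const target = "https://example.com";

const apiUrl = `https://api.crawlbase.com/?token=${TOKEN}&url=${encodeURIComponent(target)}`;
console.log(apiUrl);
// https://api.crawlbase.com/?token=YOUR_CRAWLBASE_TOKEN&url=https%3A%2F%2Fexample.com
```

The node handles this URL construction, encoding, and response parsing so you don't have to wire it up in an HTTP Request node.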
Features
- Native credentials — Add your Crawlbase API token once and use it across workflows; Test connection validates the token.
- Crawling API — Single node for GET/POST/PUT with URL (from parameter or from each input item).
- Options — Optional Crawling API parameters (for example `format`, `page_wait`, `country`, `request_headers`, `cookies`, `device`, `scraper`, `screenshot`, `store`, `async`, and JS rendering helpers), request body for POST/PUT, and an HTTP timeout on the client (not a Crawlbase query parameter). See Crawling API parameters for the full list and behavior.
- Normalized output — Each item returns `statusCode`, `headers`, `body`, and `metadata` (including `originalStatus`, `cbStatus`, and `url` where provided by the API).
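To make the normalized output concrete, here is a sketch of what a single output item might look like. All values are illustrative, and the `metadata` fields appear only where the API provides them:

```javascript
// Illustrative shape of one normalized output item (values are made up):
const outputItem = {
  statusCode: 200,
  headers: { "content-type": "text/html; charset=utf-8" },
  body: "<!doctype html><html>...</html>",
  metadata: {
    originalStatus: 200, // status the target site returned to Crawlbase
    cbStatus: 200,       // Crawlbase's own processing status
    url: "https://example.com",
  },
};
console.log(outputItem.statusCode);
```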
Installation
In n8n (Community nodes)
- In n8n, go to Settings → Community nodes → Install a community node.
- Enter `n8n-nodes-crawlbase`.
- Install and restart if prompted.
From source
```bash
npm install
npm run build
```

Then in n8n, add the path to this package (the directory containing this README) as a community node in Settings → Community nodes, or use `npm run dev` to run n8n with this node loaded locally.
Credentials
- Add a Crawlbase API credential (search for “Crawlbase” in the credential list).
- Enter your API Token from the Crawlbase dashboard.
- Click Test connection to confirm the token works.
First crawl
- Add a Crawlbase node to your workflow.
- Select your Crawlbase API credential.
- Enter a URL (e.g. `https://example.com`) or choose From input item field and set the field name.
- Choose Method (GET/POST/PUT) and Response format (HTML or JSON).
- Run the workflow. The node outputs `statusCode`, `headers`, `body`, and `metadata` for each URL.
Item-list mode
- Set URL Source to From input item field and specify the field that contains the URL (e.g. `url`).
- Connect an input that provides one item per URL. The node runs one Crawling API request per item and returns one output item per input item.
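As a sketch, input items for this mode follow n8n's usual item shape, with the URL in whichever field you configured (the field name `url` below is just an example):

```javascript
// Illustrative n8n input items for item-list mode. Each item carries its
// data under a "json" key, per n8n's item convention; the "url" field name
// must match what you configured in the node.
const items = [
  { json: { url: "https://example.com" } },
  { json: { url: "https://example.org" } },
];

// One Crawling API request per item: two items in, two output items out.
console.log(items.length);
```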
Rate limits and retries
Crawlbase applies rate limits depending on your plan. To avoid failures:
- Use n8n’s Retry On Fail on the Crawlbase node (node settings).
- Set Wait Between Tries to at least 1 second (or higher if you hit limits).
- For many URLs, consider splitting work (e.g. Loop Over Items / batching) so you don’t send bursts of requests.
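n8n's Retry On Fail waits a fixed interval between tries. If you orchestrate requests yourself (for example in a Code node between batches), a growing delay is gentler on rate limits. A minimal sketch, not part of this node:

```javascript
// Exponential backoff schedule: each retry waits twice as long as the
// previous one. With baseMs = 1000 this yields 1s, 2s, 4s, 8s.
function backoffDelays(baseMs, maxTries) {
  const delays = [];
  for (let attempt = 0; attempt < maxTries; attempt++) {
    delays.push(baseMs * 2 ** attempt);
  }
  return delays;
}

console.log(backoffDelays(1000, 4)); // delays of 1000, 2000, 4000, 8000 ms
```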
Example workflow
See example-workflow.json for a workflow that crawls a single public URL and returns HTML. Import it in n8n via Workflows → Import from file.
Verification and catalog
To get the node verified and listed in n8n’s integration catalog:
- Publish this package to npm as `n8n-nodes-crawlbase`.
- Ensure it meets n8n’s verification guidelines.
- Submit the node via the n8n Creator Portal.
License
MIT
Copyright 2026 Crawlbase