What customizations can be applied to the crawler?
This Confluence instance is now read-only; please head over to the Algolia Confluence instance for more up-to-date information.
What level of customization can be applied to how Search.io crawls my site?
Answer
You can set up a variety of rules for crawling your site.
From your Console, you can choose which domains are stored in your collection and whether crawling is active for each domain.
You can also create exclude rules based on URL structure, domain, and a variety of metadata.
Exclude rules remove all matching records from your collection, and Search.io will not re-crawl any records that match the rule in the future.
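To illustrate how exclude rules of this kind are typically evaluated, here is a minimal sketch in Python. The rule shapes (a domain rule and a wildcard URL rule) and the function name are assumptions for illustration only, not Search.io's actual API; the real rules are configured in the Console.

```python
import fnmatch
from urllib.parse import urlparse

# Hypothetical exclude rules, mirroring the kinds described above:
# one matching an entire domain, one matching a URL pattern.
EXCLUDE_RULES = [
    {"type": "domain", "value": "staging.example.com"},
    {"type": "url", "value": "https://example.com/private/*"},
]

def is_excluded(url: str) -> bool:
    """Return True if the URL matches any exclude rule."""
    parsed = urlparse(url)
    for rule in EXCLUDE_RULES:
        if rule["type"] == "domain" and parsed.netloc == rule["value"]:
            return True
        if rule["type"] == "url" and fnmatch.fnmatch(url, rule["value"]):
            return True
    return False

print(is_excluded("https://staging.example.com/home"))    # True
print(is_excluded("https://example.com/private/report"))  # True
print(is_excluded("https://example.com/blog/post"))       # False
```

A matching record would be dropped from the collection and skipped on future crawls, as described above.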
Related content
Does the crawler automatically crawl my website content?
How do I prevent pages from being crawled?
Can I index and crawl password protected sites?
How does crawling and indexing work?
Can I use my Search.io account with multiple domains, sites, or applications?
How do I exclude a directory from search results?