A Perl-compatible regular expression that restricts the pages the connector crawls. The full URL of a page must match the regular expression; otherwise the page is neither crawled nor ingested.
Type: String
Default: (none)
Required: No
Configuration Section: TaskName or FetchTasks or Default
Example: SpiderUrlMustHaveRegex=.*data\.mywebsite\.com.*|.*public\.mywebsite\.com.*
See Also: SpiderUrlCantHaveRegex
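The full-match requirement can be illustrated with a short sketch. This uses Python's re module as an approximation of PCRE semantics (the two dialects are close but not identical), and the should_crawl helper is purely illustrative, not part of the connector:

```python
import re

# The value from the Example row above; backslashes escape the literal dots.
SPIDER_URL_MUST_HAVE_REGEX = r".*data\.mywebsite\.com.*|.*public\.mywebsite\.com.*"

def should_crawl(url: str, pattern: str = SPIDER_URL_MUST_HAVE_REGEX) -> bool:
    """Return True only if the entire URL matches the pattern.

    re.fullmatch anchors the pattern at both ends, mirroring the
    requirement that the full URL must match the regular expression.
    """
    return re.fullmatch(pattern, url) is not None

# Matches the first alternative, so the page would be crawled.
print(should_crawl("https://data.mywebsite.com/reports/q1"))
# Matches neither alternative, so the page would be skipped.
print(should_crawl("https://intranet.mywebsite.com/home"))
```

Because the whole URL must match, the leading and trailing `.*` in the example value are what allow the host name to appear anywhere in the URL; a pattern such as `data\.mywebsite\.com` alone would match nothing, since no full URL consists of only that host name.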