Discussion on: Building a Polite Web Crawler

sudhersan

What about the UserAgent in the crawl settings? Can I provide one like this: "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/78.0.3904.97 Safari/537.36"?

James Turner

Yep, you can supply any user agent in the crawl settings (see example). Providing a user agent like that will work perfectly fine, but it circumvents the site's ability to direct which content is and isn't accessible to your crawler via its robots.txt file, since those rules are matched against the user agent string you send.
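
To illustrate why that matters, here is a minimal sketch using Python's standard `urllib.robotparser` (not the crawler library's own API): the robots.txt content, the bot name `MyCrawlerBot`, and the URL are made up for illustration. Because robots.txt rules are grouped by `User-agent` token, a crawler that identifies itself as a desktop browser gets matched against different rules than one that sends its own bot name.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt a site might serve: it restricts a named bot
# but allows everything for all other user agents.
ROBOTS_TXT = """\
User-agent: MyCrawlerBot
Disallow: /private/

User-agent: *
Disallow:
"""

BROWSER_UA = ("Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 "
              "(KHTML, like Gecko) Chrome/78.0.3904.97 Safari/537.36")
BOT_UA = "MyCrawlerBot/1.0"

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

url = "https://example.com/private/page"
print(parser.can_fetch(BOT_UA, url))      # False: the bot-specific group applies
print(parser.can_fetch(BROWSER_UA, url))  # True: falls through to the '*' group
```

In other words, sending a browser user agent doesn't break anything technically, but the site can no longer address your crawler specifically, which works against the "polite" part of polite crawling.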