Have you ever spent 30 minutes searching for something on Google and found nothing useful?
I used to be the same. Then I learned Google dorking — advanced search operators that turn Google into a precision research tool.
Here are 10 tricks that changed everything for me.
1. Find Exact File Types
filetype:pdf machine learning cheatsheet
This finds PDFs only. Works with csv, xlsx, pptx, doc too. I use this to find free research papers, datasets, and templates.
Pro tip: Try filetype:csv salary data 2026 — you will be surprised by what is publicly available.
2. Search Within a Specific Site
site:github.com web scraping python
Forget GitHub search — it is terrible. Google indexes GitHub repos better than GitHub itself.
Real example: site:github.com awesome-list scraping found me repositories I never would have discovered through GitHub search.
3. Find Login Pages and Dashboards
intitle:"dashboard" site:your-competitor.com
This reveals which tools your competitors use — their analytics dashboards, admin panels, staging environments.
Disclaimer: Only look at publicly accessible pages. Never try to access anything requiring authentication.
4. Exclude Noise From Results
python web scraping tutorial -selenium -"beautiful soup"
The minus operator removes results containing the term right after it. Quote multi-word terms like "beautiful soup" so both words are excluded together, otherwise only the first word is excluded. Perfect when you want to find alternatives to popular tools.
My favorite: best CRM -salesforce -hubspot — shows you CRMs that are not just the big players.
5. Find Pages With Specific Words in the URL
inurl:api-pricing cloud services
This finds pricing pages directly. Skip the marketing fluff and go straight to what things cost.
Power combo: inurl:pricing filetype:pdf saas — finds PDF pricing guides.
6. Search Within a Date Range
web scraping tools after:2025-01-01 before:2026-04-01
Google results get stale fast. This forces fresh content only.
Use case: I use this when researching tools — best project management tools after:2026-01-01 gives me only current reviews.
7. Find Related Sites
related:dev.to
This shows sites Google considers similar. Great for finding new platforms, communities, or competitors.
Try it: related:producthunt.com or related:ycombinator.com — you will discover platforms you did not know existed.
8. Find Exposed Directories and Files
intitle:"index of" datasets csv
This finds open directories with files. Researchers and universities often leave data publicly accessible.
For data scientists: intitle:"index of" .csv training data — goldmine for finding datasets.
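Once you land on an open directory, you can pull out just the CSV links with the standard library instead of clicking through by hand. A minimal sketch (the helper names and sample HTML are mine, not from any particular directory):

```python
from html.parser import HTMLParser


class LinkCollector(HTMLParser):
    """Collect href values from an 'Index of' HTML listing page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def csv_links(index_html):
    """Return the .csv hrefs found in an open-directory listing."""
    parser = LinkCollector()
    parser.feed(index_html)
    return [href for href in parser.links if href.endswith(".csv")]


# Hypothetical snippet of an "Index of" page for illustration
sample = (
    '<html><body><h1>Index of /data</h1>'
    '<a href="train.csv">train.csv</a> '
    '<a href="notes.txt">notes.txt</a></body></html>'
)
print(csv_links(sample))  # ['train.csv']
```

From there, feeding each link to your downloader of choice is one loop away.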
9. Combine Operators for Power Searches
site:reddit.com inurl:comments "best tool for" web scraping -ads
Reddit comments are the most honest reviews on the internet. This combo finds discussion threads about specific topics.
My go-to research method: When evaluating any tool, I search Reddit comments first.
10. Wildcard Search for Unknown Terms
"how to * a web scraper in *"
The asterisk matches any word. Perfect when you know the pattern but not the specifics.
Creative use: "the best * for web scraping in 2026" reveals what types of tools people write about.
Bonus: My Research Workflow
When I research a topic, I chain these together:
- Broad search — get the landscape
- site:reddit.com — get honest opinions
- filetype:pdf — find detailed reports
- after:2025-01-01 — filter for freshness
- -[big players] — discover hidden gems
This workflow saves me hours every week.
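If you run this workflow a lot, the query assembly is easy to script. A minimal Python sketch (the helper names are my own, nothing official):

```python
import urllib.parse


def build_dork(topic, site=None, filetype=None, after=None, exclude=()):
    """Assemble a Google query string from the operators above."""
    parts = [topic]
    if site:
        parts.append(f"site:{site}")
    if filetype:
        parts.append(f"filetype:{filetype}")
    if after:
        parts.append(f"after:{after}")
    # Prefix each excluded term with the minus operator
    parts.extend(f"-{term}" for term in exclude)
    return " ".join(parts)


def google_url(query):
    """URL-encode the query into a pasteable Google search link."""
    return "https://www.google.com/search?q=" + urllib.parse.quote_plus(query)


query = build_dork(
    "web scraping tools",
    site="reddit.com",
    after="2025-01-01",
    exclude=("selenium",),
)
print(query)  # web scraping tools site:reddit.com after:2025-01-01 -selenium
print(google_url(query))
```

Swap the arguments per step of the workflow and you get the whole chain of searches in seconds.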
What is your favorite Google search trick?
I am always looking for new operators and combos. Drop your best search trick in the comments — I will add the best ones to this list with credit!
If you found this useful, I write about developer productivity, web scraping, and hidden APIs every week. Follow me for more.
Check out my curated list of 130+ web scraping tools — updated weekly.
More from me: 10 Dev Tools I Use Daily | 77 Scrapers on a Schedule | 150+ Free APIs