
dev.to staff for The DEV Team


Join us for the Bright Data Web Scraping Challenge: $3,000 in Prizes!

We are excited to team up with Bright Data to bring the community a new challenge.

Running through December 29, the Bright Data Web Scraping Challenge provides an opportunity to access public web data and build tools and applications powered by web scraping.

Bright Data offers dedicated endpoints for extracting fresh, structured web data from over 100 popular domains, as well as a Scraping Browser that dramatically reduces the overhead of maintaining scraping and browser infrastructure.

If you’ve ever been curious about optimizing your web scraping and data collection process, this challenge is for you! We hope you give it a try.

Our Prompts

Prompt 1: Scrape Data from Complex, Interactive Websites

Create a project where you need to scrape data from sites with dynamic content and user interactions (e.g., infinite scroll or login-protected pages). Use Bright Data’s Scraping Browser for seamless handling of JavaScript-heavy and interactive websites.

Here is the submission template for anyone who wants to jump right in, but please review all challenge rules on the official challenge page before submitting.

Prompt 1 Submission Template
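If you're sketching an approach to this prompt: remote scraping browsers are typically driven over the Chrome DevTools Protocol with a tool like Playwright. The endpoint host, port, and credential format below are illustrative assumptions, not confirmed Bright Data values — check their docs for your account's real connection string.

```python
# Sketch: driving a remote scraping browser over CDP with Playwright.
# The host/port and credential-in-URL format are placeholders.

def cdp_endpoint(username: str, password: str,
                 host: str = "brd.superproxy.io", port: int = 9222) -> str:
    """Build a wss:// CDP URL with credentials embedded (assumed format)."""
    return f"wss://{username}:{password}@{host}:{port}"

def scrape_title(url: str, endpoint: str) -> str:
    """Open a page through the remote browser and return its <title>."""
    from playwright.sync_api import sync_playwright  # pip install playwright
    with sync_playwright() as p:
        browser = p.chromium.connect_over_cdp(endpoint)
        page = browser.new_page()
        # "networkidle" gives JS-rendered content a chance to settle,
        # which is the point of using a real browser for dynamic sites.
        page.goto(url, wait_until="networkidle")
        title = page.title()
        browser.close()
        return title
```

Usage would look like `scrape_title("https://example.com", cdp_endpoint("USER", "PASS"))` — from there you can add scrolling, clicks, or logins with regular Playwright calls.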

Prompt 2: Build a Web Scraper API to Solve Business Problems

Use a Web Scraper API to tackle common business challenges like aggregating product prices, monitoring competitors, or collecting reviews across platforms. Use Bright Data’s Web Scraper API for efficient and scalable data collection.

Here is the submission template for anyone who wants to jump right in, but please review all challenge rules on the official challenge page before submitting.

Prompt 2 Submission Template
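As a rough illustration of the aggregation half of this prompt: once a scraper API hands back structured records, price monitoring can be plain Python. The record shape below is a made-up example, not Bright Data's actual response schema.

```python
# Sketch: summarizing product prices collected from multiple stores.
from collections import defaultdict
from statistics import mean

# Example records as a scraper API might return them (invented schema).
records = [
    {"product": "Widget", "store": "A", "price": 19.99},
    {"product": "Widget", "store": "B", "price": 17.49},
    {"product": "Gadget", "store": "A", "price": 34.00},
]

def price_summary(records):
    """Group prices by product and report min/max/average per product."""
    by_product = defaultdict(list)
    for r in records:
        by_product[r["product"]].append(r["price"])
    return {
        name: {"min": min(ps), "max": max(ps), "avg": round(mean(ps), 2)}
        for name, ps in by_product.items()
    }
```

A real submission would replace the hard-coded `records` with live API results and re-run the summary on a schedule to spot price changes.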

Prompt 3: Most Creative Use of Web Data for AI Models

Design a pipeline that collects and structures web data to fine-tune an AI model—for example, creating custom chatbots or sentiment analysis tools. Leverage Bright Data’s Web Scraper API or Scraping Browser to collect real-time web data and create innovative, AI-driven solutions.

Prompt 3 Submission Template
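To make the "structure web data for fine-tuning" idea concrete, here is a minimal cleaning step that turns scraped review text into JSONL prompt/completion pairs for a sentiment model. The field names and the star-based labeling rule are illustrative assumptions, not part of the challenge spec or any particular fine-tuning API.

```python
# Sketch: scraped reviews -> JSONL fine-tuning examples (invented schema).
import json
import re

def clean(text: str) -> str:
    """Strip leftover HTML tags and collapse whitespace."""
    text = re.sub(r"<[^>]+>", " ", text)
    return re.sub(r"\s+", " ", text).strip()

def to_jsonl(reviews) -> str:
    """One JSON object per line: a prompt and a coarse sentiment label."""
    lines = []
    for r in reviews:
        example = {
            "prompt": f"Classify the sentiment of this review: {clean(r['body'])}",
            "completion": "positive" if r["stars"] >= 4 else "negative",
        }
        lines.append(json.dumps(example))
    return "\n".join(lines)
```

The interesting part of a real submission is everything around this: deduplication, balancing labels, and validating that the scraped text is clean enough to train on.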

Judging Criteria and Prizes

All three prompts will be judged on the following:

  • Use of underlying technology
  • Usability and User Experience
  • Accessibility
  • Creativity

The winner of each prompt will receive:

  • $1,000 USD
  • 6-month DEV++ Membership
  • Exclusive DEV Badge
  • A gift from the DEV Shop

All participants with a valid submission will receive a completion badge on their DEV profile.

How To Participate

To participate, you will need to create a Bright Data account through this link and publish a post using the submission template associated with your chosen prompt.

Bright Data Credits

Participants will receive $15 in testing credits upon signing up through our dedicated sign-up link. If additional credits are required, participants can email noah@brightdata.com with the subject line “DEV Challenge - Credit Required,” including the email they used to sign up and details about their use case.

Important Note: Use of Data Provided by Bright Data

If you receive data from Bright Data as part of this challenge, it is solely for use in your project submission. This data is not intended for reuse, resale, or redistribution at any point. Data provided for the competition will be accessed through an account created by Bright Data and credited using a DEV.to promotion code. The promotion code will provide the necessary credits to complete your project as part of the challenge. Misuse of the data or credits may result in disqualification from the competition and/or revocation of access.

Please review our additional rules, guidelines, and FAQ page before submitting so you understand our participation guidelines and official contest rules, such as eligibility requirements.

Need Help or Inspiration?

You can get to know the Bright Data platform through their docs and tutorials.

Important Dates

  • December 11: Bright Data Web Scraping Challenge begins!
  • December 29: Submissions due at 11:59 PM PDT
  • January 9: Winners Announced

We can’t wait to see what you build! Questions about the challenge? Ask them below.

Good luck and happy coding!

Top comments (12)

Bryan Doss

This one was fun, thanks @noahbrinker @thepracticaldev!

Noah Brinker

Can't wait to see everyone's submissions!

Saurabh Rai

This sounds like a nice idea to work with!

Fahmi Noor Fiqri

Cool hackathon! Can't wait to submit my project idea.

One problem: I didn't get the $15 credit after signing up using the provided link, I only got $2 in trial credits. I signed up using Google login.

sahra 💫 • Edited

@thepracticaldev @jess For the second prompt, at the heading, it states "Build a web scraper API..." and at the start of the text below it says "Use a Web Scraper API..." This has me a bit confused. Are we supposed to build an API that returns data scraped by Bright's web scraper API?

Dela • Edited

For the third prompt, do we need to submit a running, working AI model fine tuned with the data (which would require us to host the model at our own expense), or can we submit just the data normalization pipeline, up to the point where the normalized data is about to be fed into the model?

Jess Lee

Hey @delaaja, you can just submit the pipeline and not the fully tuned model.

Alex Anie

So I have a question. Is it compulsory to include the GitHub repo of the project, or just the project's live demo? @noahbrinker @thepracticaldev

Ben Ndora

Let's do this, guys!

sahra 💫

This should be fun : )

Trix Cyrus

Gonna be fun let's do this
