Interacting with the Dev.to Article API

Steve Layton on March 09, 2019

 

Hey there!

I'll point out a couple of things. First, in the following snippet:

// FormatArticleRequest returns http.Request ready to do() and get an article
func (dtc DevtoClient) FormatArticleRequest(i int32) (r *http.Request, err error) {
  URL := fmt.Sprintf(dtc.DevtoAPIURL+"articles/%d", i)
  r, err = http.NewRequest(http.MethodGet, URL, nil)
  if err != nil {
    return nil, err
  }
  return r, nil
}

You are returning a *http.Request and an error, which means you can simplify it like this:

// FormatArticleRequest returns http.Request ready to do() and get an article
func (dtc DevtoClient) FormatArticleRequest(i int32) (*http.Request, error) {
  URL := fmt.Sprintf(dtc.DevtoAPIURL+"articles/%d", i)
  return http.NewRequest(http.MethodGet, URL, nil)
}

Later on, in your worker, you forget to check the error returned by json.Unmarshal(body, &articles).
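For what it's worth, the check can be as small as this sketch; the Article struct and its fields here are hypothetical stand-ins for your own types:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// Article is a minimal stand-in for the article type in the post;
// the fields here are hypothetical.
type Article struct {
	ID    int32  `json:"id"`
	Title string `json:"title"`
}

// decodeArticles checks the json.Unmarshal error instead of discarding it;
// a malformed body would otherwise leave the slice silently empty.
func decodeArticles(body []byte) ([]Article, error) {
	var articles []Article
	if err := json.Unmarshal(body, &articles); err != nil {
		return nil, fmt.Errorf("unmarshal articles: %w", err)
	}
	return articles, nil
}

func main() {
	articles, err := decodeArticles([]byte(`[{"id": 1, "title": "Hello"}]`))
	if err != nil {
		fmt.Println(err)
		return
	}
	fmt.Println(articles[0].Title)
}
```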

Other than that, good job parameterizing the HTTP client instance. Were I to use your API client, I would certainly love to set my own timeouts and so on, so this is the right way to go!

Good job!

 

Thanks for the comment! Ungh! I keep forgetting that I can shorten the returns when I do it that way. One of these days! Good catch on the json.Unmarshal; not sure how that snuck through. I'll make sure to fix it up before putting the code on GitHub.

 

Cool, another post about the Dev.to API! 🙂

Correct me if I'm wrong, but that will go over every single page until it runs out of data, right? If so, you might want to be careful actually running it: there would be hundreds of pages, and you're grabbing the contents of each article on every page. Maybe chuck a "max articles" count on there somewhere? Or limit it to a specific tag?
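Something like a capped crawl loop is all it takes. This is a sketch, not the post's actual worker; fetchPage stands in for the real per-page request and just reports how many articles a page yielded:

```go
package main

import "fmt"

// crawl pages until either the API runs out of data or maxPages is hit.
// fetchPage returns the number of articles a page yielded (0 = no more data).
func crawl(fetchPage func(page int) int, maxPages int) int {
	total := 0
	for page := 1; page <= maxPages; page++ {
		n := fetchPage(page)
		if n == 0 { // out of data: stop early
			break
		}
		total += n
	}
	return total
}

func main() {
	// Pretend the API serves 30 articles a page and runs dry after page 3.
	fake := func(page int) int {
		if page > 3 {
			return 0
		}
		return 30
	}
	fmt.Println(crawl(fake, 5)) // stops at the data
	fmt.Println(crawl(fake, 2)) // stops at the cap
}
```

The cap is the safety net: even if the "no more data" signal never arrives, the loop can't walk every page on the site.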

 

The articles API endpoint looks to only return 30 articles per page and seems to cut off around ~460 pages. The idea is indeed to grab all public posts; I need a good amount of text for something else I'm toying with, so in this case I have a rough idea of how many articles we'll be getting. However, we're only fetching 30 articles' worth of content in one go, which shouldn't cause any issues server-side or locally. You definitely wouldn't want to run this repeatedly in a short span, though.

 

Yeah, as it is that's still about 13,000 requests; if it ran often, it could easily hammer the site.

The articles endpoint is cached as far as I know, so as-is the code shouldn't cause any significant load on Varnish. Heh. But yeah, imagine trying to simply download the articles one at a time, api/articles/1 and so on; that's what, like ~90,000 hits? Yuck.

 

Is there documentation for the API somewhere? I can't seem to find it anywhere. I want to retrieve my articles.

 

There is no documentation as of yet; I pulled the details out of the code itself on GitHub. If all you need is a dump of your own articles, you can request an emailed export at the bottom of the settings page, dev.to/settings/misc.
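From poking at the code, the articles endpoint also appears to accept a username query parameter to filter by author; treat that as an assumption, since the API is undocumented. Building the URL is just a couple of lines (the username here is a placeholder):

```go
package main

import (
	"fmt"
	"net/url"
)

// articlesURL builds a request URL for a single user's articles.
// The username parameter is how the endpoint appears to filter by
// author; that's an assumption, since the API is undocumented.
func articlesURL(username string, page int) string {
	q := url.Values{}
	q.Set("username", username)
	q.Set("page", fmt.Sprint(page))
	return "https://dev.to/api/articles?" + q.Encode()
}

func main() {
	fmt.Println(articlesURL("some_user", 1))
}
```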
