We're looking for anyone who wants to write a scraper for their country or state's laws: in any programming language, for any government and place on Earth.
The project is Public.Law ("Law is code; open-source it.") github.com/public-law/ We're increasing access to justice ("A2J").
If you want to write your own code from scratch, we'd love it: wherever you live, you can help make your local government more accessible. We're building a collection of web scrapers and data import scripts that turn a set of laws into JSON. Here's an example in Haskell, for the U.S. state of Nevada: github.com/public-law/nevada-revis.... But like I said, any language is great. It doesn't matter if you're a beginner or advanced. The most important thing is to get something that works, for every place on Earth. Since it'll run on Ubuntu and simply output JSON, anyone can use these data feeds to make innovative apps.
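To make the scrape-to-JSON idea concrete, here's a minimal sketch in Python using only the standard library. Everything in it is hypothetical: the HTML structure and the JSON fields are made up for illustration and are not Public.Law's actual schema (a real scraper would fetch live pages and follow whatever output format the project settles on).

```python
import json
from html.parser import HTMLParser

class StatuteParser(HTMLParser):
    """Collect (section number, title) pairs from <h2 class="section"> tags.

    The tag/class names here are invented for the example; adapt them to
    whatever markup your government's site actually uses.
    """
    def __init__(self):
        super().__init__()
        self.sections = []
        self._in_section = False
        self._buffer = []

    def handle_starttag(self, tag, attrs):
        # attrs is a list of (name, value) tuples
        if tag == "h2" and ("class", "section") in attrs:
            self._in_section = True
            self._buffer = []

    def handle_data(self, data):
        if self._in_section:
            self._buffer.append(data)

    def handle_endtag(self, tag):
        if tag == "h2" and self._in_section:
            self._in_section = False
            # Split "1.010 Definitions" into number and title
            number, _, title = "".join(self._buffer).strip().partition(" ")
            self.sections.append({"number": number, "title": title})

# Stand-in for a fetched statute page
SAMPLE_HTML = """
<html><body>
  <h2 class="section">1.010 Definitions</h2>
  <h2 class="section">1.020 Scope of chapter</h2>
</body></html>
"""

parser = StatuteParser()
parser.feed(SAMPLE_HTML)
print(json.dumps(parser.sections, indent=2))
```

The point is just the shape of the pipeline: fetch a page, pull out the structure, emit plain JSON that anyone can consume downstream.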
But if you just want to contribute a little bit of documentation, or maybe do some refactoring, we have a bunch of open-source repos already up.
Hey. I'm absurdly good at scraping, but like Hafidz, I don't get the bigger picture either.
Do you have any type of DTD? Specifics? Does it need to be parsed to a useful format once (do you just want the data?), or is it expected to run periodically?
(Sources change, and so the parser needs to change too.)
Hi Robb, I'm just someone who keeps checking for project updates in this post, and I stopped by yours. To be honest, I'm just a novice coder. I'm currently interested in building a pre-processing framework for web data, before it's inserted into a database.
I've visited all your links, but I just can't catch the big picture: what kind of data needs to be scraped, the output criteria, the design rules, etc. Can you give me more information?
FYI: I'm from Indonesia, so there isn't much open legal data here compared to the USA.