This is using the US counties TopoJSON from d3js.org and d3 version 5.9.5. I have posted the solution to the GitHub repo for d3, but I wanted to get this out to as many people as possible so no one has to waste time trying to figure it out.
So, I have an issue and a "hacked" solution that worked for me.
I have roughly 100 cities, and I want to color the county each city falls in based on a score assigned to that city in an array.
I call geoContains with the county and the city's point: geoContains(county, point). When mapped, several counties now "contain" New York City (pretty much the Rust Belt and counties along the northern border).
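For context, here is a minimal sketch of the kind of setup that produced those bad matches for me. The file name, the city, and the variable names are placeholders, not my actual code:

```js
// Sketch only: assumes d3 v5 and topojson-client are loaded, and that
// "counties-10m.json" stands in for the d3js.org US TopoJSON file.
d3.json("counties-10m.json").then(us => {
  // Convert the TopoJSON topology into GeoJSON county features.
  const counties = topojson.feature(us, us.objects.counties).features;

  // A hypothetical city point: [longitude, latitude] for New York City.
  const nyc = [-74.0060, 40.7128];

  // Naive check: which counties "contain" the city?
  const matches = counties.filter(county => d3.geoContains(county, nyc));
  console.log(matches.length); // you would expect 1, but I was getting many more
});
```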
I dug through the code posted here and noticed the following issue when comparing it to the JSON topology data:
geoContains, line 37: this is where the polygon's coordinates are passed to polygonContains (object.coordinates);
polygonContains, line 2: this takes the length of the polygon array that was passed in (object.coordinates). For every polygon in the above-referenced JSON file, that array has a length of 1: it contains a single inner array holding all the coordinates that make up the polygon's ring. This is where the error occurs.
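In other words, a GeoJSON Polygon's coordinates property is an array of rings rather than a bare list of points; for these counties it holds a single outer ring. A tiny illustration (the coordinate values are made up):

```js
// geometry.coordinates is an array of rings; coordinates[0] is the outer ring.
const geometry = {
  type: "Polygon",
  coordinates: [
    [ [-73.9, 40.9], [-73.8, 40.9], [-73.8, 41.0], [-73.9, 40.9] ] // outer ring
  ]
};
console.log(geometry.coordinates.length);    // 1 (just the one ring)
console.log(geometry.coordinates[0].length); // number of points in that ring
```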
I've gotten around this by passing the inner coordinates array (the ring itself) to polygonContains directly, rather than passing the array that wraps it or going through geoContains at all. It works beautifully.
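Here is a sketch of that workaround using d3.polygonContains on the outer ring. Keep in mind that d3.polygonContains is a planar point-in-polygon test, so it treats [longitude, latitude] pairs as flat x/y coordinates; that assumption is fine for something as small as a county. The function and variable names here are placeholders:

```js
// Workaround sketch: test the point against the outer ring itself
// instead of handing the whole geometry to geoContains.
function countyContains(county, point) {
  const geom = county.geometry;
  if (geom.type === "Polygon") {
    // coordinates[0] is the outer ring: an array of [lon, lat] pairs.
    return d3.polygonContains(geom.coordinates[0], point);
  }
  if (geom.type === "MultiPolygon") {
    // Some counties are split into several polygons (islands, etc.).
    return geom.coordinates.some(rings => d3.polygonContains(rings[0], point));
  }
  return false;
}

// Usage: find the county that contains a given city.
const home = counties.find(county => countyContains(county, [-74.0060, 40.7128]));
```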
I wanted to point this out as an issue and also give people the cause and an eventual solution to the problem. I'm just a newbie programmer (less than six months of programming, self-taught, and one week working in d3), so I'm not sure I'm doing this correctly. I spent a lot of time combing the internet trying to figure out what was wrong, so I'm hoping the above helps someone else out.