[GIS] D3.js cannot render the GeoJSON file

Tags: d3, geojson, topojson, united-kingdom

I've been trying to draw maps using the d3.js library in a manner similar to that below.

<script>
        //Width and height
        var w = 500;
        var h = 300;

        //Define path generator
        var path = d3.geo.path();

        //Create SVG element
        var svg = d3.select("body").append("svg").attr({width: w, height: h});

        //Load in GeoJSON data
        d3.json("us.json", function(json) {

            //Bind data and create one path per GeoJSON feature
            svg.selectAll("path")
               .data(json.features)
               .enter()
               .append("path")
               .attr("d", path)
               .attr("fill","#666666");

        });

    </script>

All of this works absolutely FINE for the tutorial's US maps; the GeoJSON the author uses is spot on for this d3 library.

Here are the steps I took to build my own dataset, since I wanted regional maps of the UK:

  1. Downloaded shapefiles from the Ordnance Survey website.
  2. Converted the files to GeoJSON using sites such as mapshaper.org (this worked; the map displays on their site).
  3. Used my own GeoJSON files to draw the map.

Something does draw, and my file seems to be in the right format, but the whole map comes out as one solid block of my fill colour. It's almost as if the coordinates are not correct; presumably they aren't in the format d3 expects.

Here is my shapefile open in mapshaper: http://i.imgur.com/lz40BDn.png

The GeoJSON that works: filedropper.com/us
The GeoJSON that doesn't work: filedropper.com/regionsgeo

I've tried using JSON parsers to compare the structures of the two files, and they do seem very similar, with the exception that the regions file contains many extra fields; that seems to be something mapshaper generates.

Does anyone have an idea why the map isn't rendering as it should? Could it be something to do with the coordinate system my JSON uses compared to the one d3.js requires? Have I missed a step I should have taken when converting the shapefiles?

Best Answer

UPDATE:

Your problem seems to be with your data. Here is a snippet of your file:

      },
      "geometry": {
        "type": "Polygon",
        "coordinates": [
          [
            [
              585951.8,
              181704.9
            ],
            [
              576293.9,
              181299.8
            ],

If I'm not mistaken, those coordinates use the Ordnance Survey National Grid reference system. By default, GeoJSON expects coordinates to be WGS84 (longitude, latitude). Try converting the shapefiles again and make sure that the output reference system is WGS84. You can do this in QGIS by saving the file as GeoJSON and selecting WGS84 as the output coordinate reference system.
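Alternatively, if you have GDAL installed, ogr2ogr can reproject and convert in one step. This is a minimal sketch, assuming your source data is EPSG:27700 (British National Grid) and using placeholder file names; you can omit -s_srs if the shapefile ships with a correct .prj:

    ogr2ogr -f GeoJSON -s_srs EPSG:27700 -t_srs EPSG:4326 regions-wgs84.json Regions.shp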

The resulting GeoJSON should work fine. So to answer your question: the format is correct; it's the coordinates that aren't right.
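As a quick sanity check (a minimal sketch; "regions.json" is a placeholder for your file, and it assumes a Polygon geometry like the snippet above), log the first coordinate pair. For the UK, WGS84 longitudes should fall roughly between -8 and 2 and latitudes between 49 and 61, not in the hundreds of thousands:

    d3.json("regions.json", function(json) {
        // First vertex of the first feature's outer ring
        var first = json.features[0].geometry.coordinates[0][0];
        console.log(first); // expect something like [-0.12, 51.5], not [585951.8, 181704.9]
    });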

Here is the GeoJSON spec for your reference next time.


I think the problem is that your code expects certain attributes to be present in the GeoJSON; the exercise code makes specific assumptions about the contents of the file. You can't just swap in your own GeoJSON and expect it to work: it is highly unlikely that the attributes in your file match the tutorial's, especially when the two come from different sources. You wouldn't expect code written to visualize one CSV to work with a different CSV file without some tweaks, would you? The data has changed, and so must the code.

Start by comparing the contents of the two GeoJSON files and checking the differences. You can do this by loading them in a text editor, or by loading them into QGIS and seeing which attributes differ. Then look at what the code expects to find in the file and tweak it accordingly.

The GeoJSON format only specifies how geographic data structures are encoded; the contents are up to the data producer. In much the same way, CSV only tells you how to encode tabular data; what the column names and values are is up to whoever produced the file.
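For example (a minimal sketch; the file name is a placeholder), you can dump the attribute names of the first feature in each file and compare them:

    d3.json("regions.json", function(json) {
        // List which attributes this file actually carries
        console.log(Object.keys(json.features[0].properties));
    });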