[GIS] PostgreSQL, PostGIS and GeoServer publishing 1.7 million records

Tags: geoserver, postgis, postgresql

I have polygon geodata in PostgreSQL with about 1.7 million records and I am publishing it out to GeoServer. How long is too long for this processing to take?

EDIT: Forgot to mention that I successfully published a database spatial view subset of this same data.

Side question: what is PostgreSQL actually doing during this process?

@Devdatta Tengshe, I have a polygon dataset with about 1.7 million records and I am just trying to publish the data to GeoServer. I published a spatial view joining a small table to the large polygon dataset, and that layer displays fine in OpenLayers. I just published another spatial view from a different small table joined to the same large polygon dataset; it published, but it will not finish loading in OpenLayers.

Best Answer

There are several things to think about with this amount of data:

1) Create a spatial index in PostgreSQL/PostGIS, like this:

CREATE INDEX table_geom_idx ON tablename USING GIST (geometry_column);
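As a quick check that the planner actually uses the new index (a minimal sketch; tablename, geometry_column and the EPSG:4326 bounding box are placeholders to replace with your own names and extent):

VACUUM ANALYZE tablename;  -- refresh planner statistics after building the index

EXPLAIN ANALYZE
SELECT count(*)
FROM tablename
WHERE geometry_column && ST_MakeEnvelope(-1.0, 50.0, 0.0, 51.0, 4326);

-- The plan should show an Index Scan or Bitmap Index Scan on table_geom_idx
-- rather than a Seq Scan over all 1.7 million rows.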

2) What scales are you showing the data at? Something like 1.7 million features probably needs to be shown only within a set scale range. In GeoServer you can use min and max scale denominators, e.g.

<MinScaleDenominator>16000</MinScaleDenominator>
<MaxScaleDenominator>32000</MaxScaleDenominator>

You will need to add this to your SLD; look at the GeoServer SLD Cookbook for further examples:

http://docs.geoserver.org/latest/en/user/styling/sld-cookbook/index.html

3) Can you generalise your data at other scales?
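If so, one option (just a sketch; the table/column names and the 0.001 tolerance are assumptions, and the tolerance must suit your data's units and target scales) is to pre-simplify the polygons into a second table and publish that for zoomed-out views:

-- Build a generalised copy of the polygons for small-scale display
CREATE TABLE tablename_simplified AS
SELECT id, ST_SimplifyPreserveTopology(geometry_column, 0.001) AS geometry_column
FROM tablename;

-- Index the generalised copy as well
CREATE INDEX tablename_simplified_geom_idx ON tablename_simplified USING GIST (geometry_column);

Serve the simplified table at small scales and the full-resolution table only at large scales, using the scale denominators from point 2.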

4) Tweak postgresql.conf to make sure it is tuned for performance. The best resource I found was this:

http://wiki.postgresql.org/wiki/Tuning_Your_PostgreSQL_Server
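If you would rather set the usual parameters from that page through SQL instead of editing postgresql.conf by hand, ALTER SYSTEM (PostgreSQL 9.4+) writes them to postgresql.auto.conf; the values here are only illustrative and depend entirely on your hardware:

ALTER SYSTEM SET shared_buffers = '2GB';        -- roughly 25% of RAM; needs a server restart
ALTER SYSTEM SET effective_cache_size = '6GB';  -- planner hint for how much data the OS cache can hold
ALTER SYSTEM SET work_mem = '64MB';             -- per-sort memory; keep modest with many concurrent GeoServer requests

SELECT pg_reload_conf();  -- picks up the parameters that do not require a restart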

I easily publish tens of millions of vector features with PostGIS and GeoServer, but you need the tweaks above to get it working efficiently.
