I'm building a real-time traffic application. I read AVL car data from a web service and insert the records into a PostgreSQL table.
A trigger then calculates the closest link, and if there is a previous position for the same car, I calculate the route between the two points to get the distance travelled in that time frame, and from that the average speed over the route.
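The per-record calculation is just routed distance over elapsed time. A minimal sketch of that step (the function name and units are my own, not from the post):

```python
def average_speed_kmh(route_length_m: float, elapsed_s: float) -> float:
    """Average speed over a route segment, in km/h.

    route_length_m: routed distance between the two AVL positions, in metres
    elapsed_s: time between the two position reports, in seconds
    """
    return (route_length_m / 1000.0) / (elapsed_s / 3600.0)

print(average_speed_kmh(500.0, 60.0))  # 500 m in one minute, roughly 30 km/h
```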
The thing is, I get around 3,000 points every 5 minutes, so I need to process ~600 records per minute (10 per second). I can only just keep up with those numbers, and that's because I filter the road network before calling the routing function, so the routes stay very small.
I understand that PostgreSQL doesn't support multiple threads within one connection (each connection is served by a single backend process), but it does support multiple concurrent connections, which could use the other CPU cores.
I could make a different table for each state (25 of them), but how can I partition the problem so that each trigger uses a new connection and more CPU cores are engaged instead of only one?
Best Answer
See this help document for creating multiple connections.
This simple code will create multiple connections:
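(The answer's code blocks did not survive, so what follows is a hypothetical sketch of this step. It uses Python's standard-library sqlite3 purely so the example is self-contained and runnable; against PostgreSQL you would open the connections with a driver such as psycopg2 in the same pattern.)

```python
import sqlite3

NUM_WORKERS = 4  # e.g. one connection per CPU core

# One independent connection per worker. With PostgreSQL, each connection
# is served by its own backend process, so they can run on separate cores.
connections = [sqlite3.connect(":memory:", check_same_thread=False)
               for _ in range(NUM_WORKERS)]
print(len(connections))  # 4
```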
You can then choose a connection to use:
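(This code block is also missing. One common way to "choose" a connection is to map each record deterministically, for example by car id or state, so that all updates for the same car go through the same connection. A hypothetical sketch, again with stdlib sqlite3 as a stand-in for the PostgreSQL driver:)

```python
import sqlite3
import zlib

connections = [sqlite3.connect(":memory:") for _ in range(4)]

def pick_connection(record_key: str) -> sqlite3.Connection:
    # Stable hash (unlike Python's randomized hash()) so the same car
    # always lands on the same connection across runs
    index = zlib.crc32(record_key.encode()) % len(connections)
    return connections[index]

conn = pick_connection("car-1234")
```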
Finally you need to disconnect all of your connections:
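(The disconnect code is missing too; the cleanup amounts to closing every connection you opened. Sketch, sqlite3 stand-in again:)

```python
import sqlite3

connections = [sqlite3.connect(":memory:") for _ in range(4)]

# ... dispatch work across the connections ...

# Close every connection once the batch is processed
for conn in connections:
    conn.close()
```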
With one connection per worker, PostgreSQL serves each in its own backend process, so the load spreads across multiple CPU cores automatically and doesn't require you to create redundancies, such as duplicate per-state tables.