I haven't had a chance to test out the Nominatim and Geocoder US geocoders. My understanding, though, is that Geocoder US and Nominatim can't be run directly in the database, which to me is a big disadvantage because it makes them difficult to use in things like triggers or for batch updates directly in the database.
The PostGIS geocoder, being a pure PostGIS/PostgreSQL set of PL/pgSQL functions, runs completely in the database. I would expect the Geocoder US and PostGIS geocoder results to be on par, and from what I have tested using the web interfaces, they are. Google is a bit better since it takes advantage of place names.
I think Nominatim, since it uses OSM data, uses TIGER data indirectly: much of the US OSM import comes from TIGER, with user-contributed corrections. I am not sure what vintage (year) of TIGER data OSM is currently on. From playing with the web interfaces online, it takes advantage of place names too, and it has an interesting twist in that it allows you to specify the zoom level of the geocoding, which allows for faster geocoding by setting the zoom to the precision you need.
Full disclosure -- I've been doing a lot of work on the PostGIS geocoder and wrote the online manual for it. One bug I am working on is that I think the point it interpolates lands on the wrong side of the street; I'm working on fixing that. If you couldn't care less about which side of the street you end up on (or at least sometimes), then that may be a non-issue for you.
It's probably worthwhile to test with the online versions and compare some address results; e.g., you can test Nominatim here: http://open.mapquestapi.com/nominatim/v1/search.php
For my use cases I have found the fuzzy checking of Nominatim is not as good as what the PostGIS geocoder has. For example, my vanity street address (mailing address) is 1 Devonshire Place, Boston MA. PostGIS returns an answer which is close, as I recall; Google returns an answer; but I can't get Nominatim to return an answer at all. To be fair, even Boston parcel records have no clue where this is, and it gets listed in parcel records as Washington Street. PostGIS can find it since it does various levels of checking, intersections of cross streets, etc. I've tested other cases where I purposely typed the zip wrong or something, and PostGIS came back with an accurate set of options. Google does too.
Like everybody else, I could give you an answer with code, but I don't think anybody has explained to you that you are doing something fundamentally wrong.
Why are you hitting this error? Because you are calling geocode every time somebody views your page, and you are not caching the results anywhere in the db!
The reason that limit exists is to prevent abuse of Google's resources (whether willful or not), which is exactly what you are doing :)
Although Google's geocoder is fast, if everybody used it like this, it would take their servers down. The reason Google Fusion Tables exist is to do a lot of the heavy server-side lifting for you: the geocoding and tile caching are done on their servers. If you do not want to use that, then you should cache the results on your own server.
If 2,500 requests a day is still too little, then you have to look at a Google Maps Premier (paid) license, which gives you 100,000 geocoding requests per day for something around 10k a year (that is a lot; with server-side caching you should not be reaching this limit unless you are a huge site or are doing heavy data processing). Without server-side caching, and using your current approach, you would only be able to serve about 800 pageviews a day!
Once you realize that other providers charge per geocode, you'll understand why you should cache the results in the db: with your current approach, it could cost you about 10 US cents per page view!
Your question is: can you work around the throttle limit that Google gives you? Sure. Just make requests from different IP addresses. Heck, you could proxy the calls through Amazon elastic IPs and would always have a fresh allotment of 2,500 calls. But of course, besides violating the Google Maps terms of service (you would effectively be circumventing the restriction it places on you), you would be using a hack to cover up an inherent design flaw in your system.
So what is the right way for this use case? Before you call the Google geocoding API, send the address to your server and check whether it is in your cache. If it is not, call the geocoder, store the result in your cache, and return it.
There are other approaches, but this should get you started in the right direction.
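The cache-before-geocode flow above can be sketched in a few lines. This is a minimal illustration, not production code: the `google_geocode`-style function is a hypothetical stand-in for the real API call, and an in-memory dict stands in for what should be a database table keyed by the normalized address.

```python
def make_cached_geocoder(geocode_fn):
    """Wrap a geocoder so each distinct address triggers only one real request."""
    cache = {}  # normalized address -> (lat, lon); in production, a db table

    def geocode(address):
        key = address.strip().lower()  # normalize so near-duplicates hit the cache
        if key not in cache:
            cache[key] = geocode_fn(address)  # cache miss: one real API call
        return cache[key]

    return geocode

# Usage with a stub in place of the real Google call:
calls = []
def fake_google_geocode(address):
    calls.append(address)           # record that a "real" request happened
    return (42.35, -71.05)          # pretend lat/lon

geocode = make_cached_geocoder(fake_google_geocode)
geocode("1 Devonshire Place, Boston MA")
geocode("1 Devonshire Place, Boston MA")  # second page view: served from cache
print(len(calls))  # 1 real request despite two lookups
```

The same pattern works regardless of where the cache lives; swapping the dict for a database table is what makes it survive across page views.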
Update: your comments below say you are using PHP, so here is a code sample on how to do it correctly (a recommendation from the Google team itself): https://developers.google.com/maps/articles/phpsqlsearch_v3
I can suggest Gisgraphy, which you can install locally and use for as many requests as you want, without any limitation. The data come from GeoNames and OpenStreetMap, so it is totally free. You can also add, edit, or remove data via a GUI (the kind of thing you cannot do with Google ;)