Hacker News

Tracking visitors/customers comes to mind. Definitely not uncommon to have 1000 uniques a day :)


I imagine caching netblocks rather than running individual IP lookups would cut down on the load quite a bit.
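That caching idea can be sketched in a few lines: collapse each address to its /24 netblock and memoize one lookup per block. The `lookup_ip` function here is a hypothetical stand-in for whatever geolocation backend you actually call.

```python
from functools import lru_cache
from ipaddress import ip_network

def lookup_ip(ip: str) -> str:
    # Hypothetical placeholder: in practice this would query your
    # geolocation service or database for the given address.
    return "US"

@lru_cache(maxsize=65536)
def lookup_netblock(netblock: str) -> str:
    # One backend lookup per /24 block instead of per individual IP.
    return lookup_ip(netblock.split("/")[0])

def geolocate(ip: str) -> str:
    # Collapse the address to its /24 so repeat visitors from the
    # same netblock hit the cache instead of the backend.
    block = ip_network(f"{ip}/24", strict=False)
    return lookup_netblock(str(block))
```

With this, `geolocate("203.0.113.7")` and `geolocate("203.0.113.42")` resolve to the same cached `203.0.113.0/24` entry, so only the first one costs a real lookup.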


If you are geolocating, I would suggest spending $370 to buy the MaxMind GeoIPCity database. There's an nginx module for it and any major language will have code to run queries against it.

http://www.maxmind.com/en/city http://nginx.org/en/docs/http/ngx_http_geoip_module.html
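For reference, a minimal sketch of the nginx side, assuming the geoip module is compiled in; the database path and backend name are illustrative. The module exposes the lookup results as variables like $geoip_city_country_code, which you can forward to your application as headers:

```nginx
# In the http block: point the module at the MaxMind city database
# (path is illustrative).
geoip_city /usr/share/GeoIP/GeoIPCity.dat;

server {
    location / {
        # Pass the per-request lookup results to the backend.
        proxy_set_header X-Geo-Country $geoip_city_country_code;
        proxy_set_header X-Geo-City    $geoip_city;
        proxy_pass http://backend;
    }
}
```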


So something along the lines of tracking incoming requests, then doing a lookup on the incoming IP? Isn't that something like Google Analytics could do for you? And in that case, would this be more for people who are avoiding GA?


Google Analytics can give you this data in the reports but won't allow you to get access to it in real time in your application.

Imagine you want to redirect users to the correct country page on your site. In this case you need access to the IP geolocation in real time. GA won't help you there.
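That redirect logic is simple once the lookup is in-process. A minimal sketch, where the country-to-page mapping and `country_for_ip` are hypothetical stand-ins for your routes and your geo-IP database query:

```python
# Hypothetical mapping from country code to localized site section.
COUNTRY_PAGES = {"DE": "/de/", "FR": "/fr/", "US": "/us/"}
DEFAULT_PAGE = "/en/"

def country_for_ip(ip: str) -> str:
    # Placeholder: a real implementation would query a geo-IP
    # database (e.g. MaxMind) here instead of this toy rule.
    return "DE" if ip.startswith("185.") else "US"

def redirect_target(ip: str) -> str:
    # Resolve the visitor's country in real time, then pick the page
    # to redirect to, falling back to a default section.
    return COUNTRY_PAGES.get(country_for_ip(ip), DEFAULT_PAGE)
```

The whole decision happens inside your request handler, with no third-party call on the hot path.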


If your goal is geo-redirection then a 3rd-party service is usually a bad choice (latency + downtime). An in-memory geo-IP database is the best way to go - but this is usually also part of what you pay for when you buy the non-free version.


Maxmind have a free database (GeoLite2) that's updated weekly. Here's an overview of the accuracy per country:

http://www.maxmind.com/en/geolite_city_accuracy

It's less accurate than their commercial offering, but good enough in many cases. For ease of use, there's a nice C API (https://github.com/maxmind/geoip-api-c) and wrappers like pygeoip for Python.



