Our crawler
Near Here automatically checks hundreds of venue websites, council listing pages, and event feeds to find what's happening locally. This page explains how we do that, and what you can do to help us find your events more reliably.
Our bot
When we visit your site, we identify ourselves with the following user agent:
NearHereEventsBot/0.2 (+https://nearhere.events)
If you spot this in your server logs, that's us. We crawl politely and infrequently — we're not trying to hammer your server, just keep our listings fresh.
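If you want to check whether we've been by, you can search your access log for our user agent. A quick sketch, using a hypothetical log line written to a local file (against your real server, point grep at your actual log, e.g. /var/log/nginx/access.log):

```shell
# Hypothetical access-log line showing how our user agent appears.
printf '%s\n' '203.0.113.7 - - [12/May/2025:10:04:11 +0000] "GET /whats-on HTTP/1.1" 200 5120 "-" "NearHereEventsBot/0.2 (+https://nearhere.events)"' > access.log

# Count matching requests — substitute your real log path here.
grep -c 'NearHereEventsBot' access.log
```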
In your robots.txt, use NearHereEventsBot as the user agent name — you don't need to include the version number.
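For example, a rule naming us directly might look like this (the /members-area/ path is purely illustrative — use whichever paths you want to keep us out of):

```
User-agent: NearHereEventsBot
Disallow: /members-area/
```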
Helping us find your events
The single most useful thing you can do is add a Sitemap: directive to your robots.txt. We read sitemap files to discover your event pages, which means we can find them quickly without having to crawl your whole site.
If your site already has a sitemap, just make sure it's referenced from your robots.txt:
# robots.txt
Sitemap: https://www.example.com/sitemap.xml
You can reference multiple sitemaps if you have one specifically for events:
# robots.txt
Sitemap: https://www.example.com/sitemap.xml
Sitemap: https://www.example.com/sitemap-events.xml
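If you don't have a sitemap yet, it's just an XML list of URLs following the sitemaps.org protocol. A minimal sketch (the example.com URLs are placeholders for your own event pages):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/events/summer-fair</loc>
    <lastmod>2025-05-01</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/events/open-mic-night</loc>
    <lastmod>2025-05-03</lastmod>
  </url>
</urlset>
```

The optional lastmod date helps crawlers like ours decide which pages to revisit first.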
Slowing things down
We respect the Crawl-delay directive. If you'd like us to crawl more slowly, add it to your robots.txt — we honour values up to 10 seconds between requests.
# Apply a crawl delay for all bots
User-agent: *
Crawl-delay: 5

# Or target us specifically
User-agent: NearHereEventsBot
Crawl-delay: 5
Don't want to be listed?
No problem at all. If you'd like your venue removed from Near Here, let us know via our contact form and we'll sort it out promptly.