I noticed today that my Icecast servers are getting hit by Google quite regularly. Strange because there isn’t really any indexable content.
So today I added a robots.txt to Icecast's web root (/usr/share/icecast2/web on Debian) that disallows all robots:
User-agent: *
Disallow: /
So hopefully I won’t get robots on my Icecast server. This also offers some protection against people using wget to record streams, though only partially: wget honours robots.txt during recursive retrieval, but a direct download of a single stream URL ignores it, and users can disable the check entirely with -e robots=off.
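For reference, here is a sketch of creating the file from the shell. It writes to a temporary directory as a stand-in, since the real /usr/share/icecast2/web needs root:

```shell
# Stand-in for /usr/share/icecast2/web (writing there requires root)
WEBROOT=$(mktemp -d)

# Write a robots.txt that blocks all crawlers
cat > "$WEBROOT/robots.txt" <<'EOF'
User-agent: *
Disallow: /
EOF

# Show what Icecast would serve at http://yourserver:8000/robots.txt
cat "$WEBROOT/robots.txt"
```

On a live server you would copy the file into /usr/share/icecast2/web instead, then fetch http://yourserver:8000/robots.txt to confirm Icecast is serving it.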