feat: add robots.txt and sitemap.xml for SEO (#820)
* feat: add robots.txt and sitemap.xml for SEO
- API robots.txt: block all crawlers (it's an API, not a website)
- frontend robots.txt: allow search engines, block AI training bots
(GPTBot, ClaudeBot, CCBot, Google-Extended), allow AI search bots
(ChatGPT-User, Claude-User, PerplexityBot), disallow private paths
- sitemap.xml: dynamic generation via +server.ts route
  - includes static pages plus tracks, artists, and albums
  - backend /sitemap-data endpoint provides the minimal data needed to build URLs
  - CDN-cached for 1 hour (s-maxage=3600)
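A frontend robots.txt matching the policy above might look like the following sketch (the private path and the Sitemap URL are placeholders, not the committed values):

```txt
# AI training crawlers: blocked
User-agent: GPTBot
Disallow: /

User-agent: ClaudeBot
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: Google-Extended
Disallow: /

# AI search/assistant agents: allowed
User-agent: ChatGPT-User
Allow: /

User-agent: Claude-User
Allow: /

User-agent: PerplexityBot
Allow: /

# Everyone else: allowed, except private paths (path below is illustrative)
User-agent: *
Disallow: /private/

Sitemap: https://example.com/sitemap.xml
```

The API variant is the two-line equivalent of "block everything": `User-agent: *` followed by `Disallow: /`.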
🤖 Generated with [Claude Code](https://claude.com/claude-code)
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
* fix: move deferred imports to top of file
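The follow-up fix keeps all imports at the top of the +server.ts module rather than deferring them inside the handler. A minimal, framework-free sketch of what such a sitemap route might contain (the entry shape, helper names, and example domain are my assumptions, not taken from the diff; a real SvelteKit route would export a `GET` handler):

```typescript
// Hypothetical sketch of a dynamic sitemap route. Entry shape and helper
// names are assumptions; only the cache directive comes from the commit.

type SitemapEntry = { loc: string; lastmod?: string };

// Render a sitemaps.org-compliant <urlset> document from a list of URLs.
function buildSitemap(entries: SitemapEntry[]): string {
  const urls = entries
    .map((e) => {
      const lastmod = e.lastmod ? `<lastmod>${e.lastmod}</lastmod>` : "";
      return `<url><loc>${e.loc}</loc>${lastmod}</url>`;
    })
    .join("\n");
  return (
    '<?xml version="1.0" encoding="UTF-8"?>\n' +
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n' +
    urls +
    "\n</urlset>"
  );
}

// In the real +server.ts, the GET handler would wrap the XML in a Response
// carrying these headers, so the CDN caches the sitemap for one hour:
const sitemapHeaders = {
  "Content-Type": "application/xml",
  "Cache-Control": "public, s-maxage=3600", // CDN-cached for 1 hour
};
```

`s-maxage` applies only to shared caches (the CDN), so browsers still revalidate while the CDN absorbs repeated crawler fetches.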
Authored by zzstoatzz.io