After realizing that Googlebot was not actually indexing our single page app, we decided to whip up a very quick server-side solution:
- Use grunt-html-snapshot to generate static pages and put them in a /static/snapshots/ folder in Django.
- Have nginx detect when a request comes from a crawler.
- If a crawler is detected, keep the same file path but prepend the /static/snapshots/ route, i.e. example.com/about -> example.com/static/snapshots/about (see the sketch after this list).
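A minimal sketch of what that nginx rule looked like conceptually. The server name, proxy port, static path, and user-agent list here are illustrative assumptions, not our exact production config:

```nginx
server {
    listen 80;
    server_name example.com;

    location / {
        set $prerender 0;

        # Crude bot detection based on the User-Agent header
        # (assumed list of crawlers, not exhaustive).
        if ($http_user_agent ~* "googlebot|bingbot|yandex|baiduspider") {
            set $prerender 1;
        }

        # If a crawler is detected, keep the original path but prefix it
        # with /static/snapshots/ (e.g. /about -> /static/snapshots/about).
        if ($prerender = 1) {
            rewrite ^(.*)$ /static/snapshots$1 last;
        }

        # Normal visitors get the single page app as usual
        # (assumed: Django app served on port 8000).
        proxy_pass http://127.0.0.1:8000;
    }

    location /static/ {
        # Snapshots generated by grunt-html-snapshot get collected into
        # Django's static root on each deploy (assumed path).
        alias /srv/app/static/;
    }
}
```

The `rewrite ... last` restarts location matching, so crawler requests fall through to the `/static/` block and get the pre-rendered HTML instead of the JS shell.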
This was a bad solution for a variety of reasons:
- It’s not dynamic. If the crawler hits a user profile or a dynamic page, we would not serve up the correct content.
- Google doesn’t like it when you serve the crawler something different from your site’s real content.
- We were still sending a soft 404.
- We would have to generate new snapshots on each deploy and add/remove/edit routes anytime something changed.
- Even in their official docs, Google doesn’t sound all that thrilled about using redirects for SEO.
To recap, attempts at SEO:
1. Relying on Googlebot executing JS
2. Using a 302 redirect.