Thankfully, Google finally suggested a solution for this:
if you use a hashbang (#!) in your URLs, Googlebot makes a request to your backend with an _escaped_fragment_ query parameter carrying the URL fragment (the part after the hashbang), so Google can index the response you return.
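To make the mapping concrete, here is a minimal sketch of how a backend could reconstruct the original hashbang URL from Googlebot's request. The function name and the plain-dict query representation are my own illustration, not part of any framework:

```python
from urllib.parse import unquote

def escaped_fragment_to_url(base, query):
    # Googlebot turns http://example.com/#!key=value into a request for
    # http://example.com/?_escaped_fragment_=key%3Dvalue — here we map
    # the query parameter back to the hashbang URL the user would see.
    frag = unquote(query.get("_escaped_fragment_", ""))
    return base + "#!" + frag
```

Your backend would use this reconstructed URL to decide which client-side state to render server-side.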
On the backend, you can render your frontend page with a tool built on one of the layout engines or native libraries commonly used for HTML unit tests, then serve the serialized DOM you obtain as the response.
At this point, for performance, I have two tips, independent of the tool you prefer:
- Do not render CSS, and make sure you are not making requests for CSS files.
- Do not request images, Flash, and so on.
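The two tips above boil down to a filter on outgoing resource requests. A minimal sketch of such a filter in Python (the extension list and function name are my own; in PhantomJS itself you would do the equivalent in an onResourceRequested handler):

```python
# Extensions we never want the renderer to fetch: stylesheets,
# images, and Flash contribute nothing to the indexable DOM.
BLOCKED_EXTENSIONS = (".css", ".png", ".jpg", ".jpeg", ".gif", ".swf")

def should_load(url):
    # Strip the query string before checking the extension,
    # so style.css?v=3 is still blocked.
    path = url.split("?", 1)[0].lower()
    return not path.endswith(BLOCKED_EXTENSIONS)
```

Scripts still load (you need them to build the DOM), but everything purely visual is skipped.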
After running into performance problems anyway, we took an alternative approach: we wrote a crawler that creates and saves HTML snapshots of the rendered DOM, and we served the resulting folder statically to Googlebot.
For this, we used PhantomJS for rendering and wrote a small Python program to crawl the site:
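The core of such a crawler can be sketched as below. This is not our actual program, just an illustration of the idea: render.js stands for a hypothetical PhantomJS script that prints the page's DOM to stdout, and the snapshots directory layout is one possible convention:

```python
import os
import subprocess
from urllib.parse import urlparse

def snapshot_path(url, out_dir="snapshots"):
    # Map http://example.com/#!products to snapshots/products.html,
    # and a URL with no fragment to snapshots/index.html.
    frag = urlparse(url).fragment.lstrip("!").strip("/") or "index"
    return os.path.join(out_dir, frag + ".html")

def save_snapshot(url, render_script="render.js"):
    # render.js (hypothetical) loads the URL in PhantomJS, waits for
    # the AJAX content, and prints the rendered DOM to stdout.
    html = subprocess.check_output(["phantomjs", render_script, url])
    path = snapshot_path(url)
    os.makedirs(os.path.dirname(path), exist_ok=True)
    with open(path, "wb") as f:
        f.write(html)
    return path
```

The resulting folder can then be served statically whenever a request arrives with the _escaped_fragment_ parameter.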
For now, the project only satisfies our own application's requirements; feel free to fork it and extend it to your needs.
When I googled the keywords 'ajax' and 'seo', there was plenty of advice but no working application. I hope this project fills that gap.