How are things at Google
Posted: Wed Jan 22, 2025 8:59 am
Double work: you have to write both a "classic" site and a JavaScript layer that works via AJAX. This increases the cost of site development.
More complex internal logic. This increases the cost of support.
Google has been working on indexing AJAX pages for a long time.
In general, a search robot tries to imitate human behavior: it loads pages and scripts just like a regular web browser.
That is why it is better to let search robots index all resources: not only pages, but also scripts.
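For example, the site's robots.txt should not block the directories that hold the scripts. A minimal sketch, assuming hypothetical /js/, /ajax/ and /admin/ paths (they are illustrative, not taken from the article):

# robots.txt — keep scripts and AJAX endpoints open to crawlers;
# only genuinely private sections are blocked
# (directory names below are assumptions for illustration)
User-agent: *
Allow: /js/
Allow: /ajax/
Disallow: /admin/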
Case: Successful Indexing of a JavaScript and Flash Website
Back in 2006, Toyota Center Ekaterinburg Vostok launched a remarkable website.
[Image: the Toyota Center Ekaterinburg Vostok website in 2006]
Problem
Promotion proved problematic because the site was built entirely with "non-text" technologies: JavaScript and Flash. As a result, search robots saw only program code instead of the expected text.
Solution
Method #3 was used (the other two did not exist yet): for each page, a text-only "substrate" was placed inside a <noscript> block. Users saw the site "in all its glory," while search engines received only the text content; a sketch of the markup is shown below.
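A minimal sketch of how such a page might be structured; the file names and page text are assumptions for illustration, not taken from the original site:

<!DOCTYPE html>
<html>
<head>
  <title>Toyota Center Ekaterinburg Vostok</title>
</head>
<body>
  <!-- Visitors with JavaScript and Flash see the interactive version -->
  <div id="app"></div>
  <script src="/js/site.js"></script> <!-- hypothetical script that renders the page via AJAX -->

  <!-- The "substrate": a plain-text copy of the page content for search robots
       and for browsers that cannot run scripts -->
  <noscript>
    <h1>Toyota Center Ekaterinburg Vostok</h1>
    <p>Model range, equipment options and current prices at the dealership.</p>
    <a href="/models/">Model catalog</a>
  </noscript>
</body>
</html>

The key point is that the <noscript> content duplicates the meaning of the dynamic version in plain text, so the robot and the human visitor see equivalent information.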
Result: 3rd position in Yandex for the query “Toyota” throughout Russia (at that time there was still a general federal search without division into regions).
Conclusion
All the methods described achieve the same thing: letting the search robot see plain text.
If your AJAX site provides information to robots in a way that they can understand, you get:
a user-friendly site (after all, that's what you wanted when you decided to use AJAX),
indexing by search engines.
This article will help you when AJAX decides to enter your life.