Of course, with any new technology I try to build or incorporate into something bigger, I first test it on my own site, the-spot.net, and then implement it elsewhere as intended.
I had learned most of this beforehand by reading “Professional Ajax,” writing my XMLHttpRequest calls manually, and creating my own classes and such. But this time around, it was much simpler.
So I took it one step further, read up on the various ways to make use of these frameworks, and ended up calling all the various modules on the My Spot page of the-spot.net. This does, however, create a problem for SEO-driven development.
If you’re looking for content that updates automatically without the user ever reloading the page, you can use this code (once you include the prototype.js script in your page):
new Ajax.PeriodicalUpdater('products', '/some_url', { frequency: 2, decay: 2 });
But while this code does the job of updating your content automatically, at the given frequency in seconds (multiplied by the decay each cycle the content comes back unchanged), the Google crawlers aren’t going to find it – because all that is on your page is this:

<div id="products"></div>

…and that empty container has no content in it for the Google crawlers and other search engine bots to read.
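To make the frequency/decay interplay concrete, here is a small illustrative sketch (not Prototype’s actual source, and `nextInterval` is a hypothetical helper name): the updater waits `frequency` seconds between requests, and each time a response comes back unchanged it multiplies the current wait by `decay`.

```javascript
// Illustrative model of Prototype's PeriodicalUpdater back-off behavior.
// After n consecutive unchanged responses, the wait before the next
// request is frequency * decay^n seconds.
function nextInterval(frequency, decay, unchangedCycles) {
  return frequency * Math.pow(decay, unchangedCycles);
}

console.log(nextInterval(2, 2, 0)); // 2  – first poll after 2 seconds
console.log(nextInterval(2, 2, 3)); // 16 – after three unchanged responses
```

With `decay: 1` (the default) the updater polls at a constant rate; a decay above 1 makes it back off when nothing is changing, which saves requests on quiet pages.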
I’m still grappling with how to get content into that box for the crawlers to read, but not for my users – and since my site is PHP-driven, I should be able to include the normal template includes enclosed in <noscript> tags:
<!-- INCLUDE "modules/module.html" -->
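Put together, the markup might look something like this – a sketch assuming a phpBB-style template engine that expands the INCLUDE comment server-side before the page is sent:

```html
<!-- Empty container that Ajax.PeriodicalUpdater fills in for JS-enabled visitors -->
<div id="products"></div>

<!-- Fallback for crawlers and no-JavaScript users; the template engine is
     assumed to replace the include below with the module's static HTML -->
<noscript>
<!-- INCLUDE "modules/module.html" -->
</noscript>
```

Browsers with JavaScript ignore the noscript block and show the live-updating div, while search engine bots see the statically included module content.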