
JavaScript, XHTML, & AJAX…again

Well, I haven’t done much that was technically inclined lately, except for some coding forays into the world of JavaScript and XHTML, which consequently led me back to AJAX and XMLHttpRequests. But I didn’t bother to post anything about that, since it was work-related, and not website-related…kind of.

Of course, with any new technology I try to build or incorporate into something bigger, I first test it on my own site, the-spot.net, and then implement it elsewhere as intended.

Most of what I knew I had learned beforehand by reading “Professional AJAX,” doing the XMLHttpRequests manually, and creating my own classes and such. But this time around, it was much simpler.
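For reference, doing it “manually” looked roughly like this – a minimal sketch (with a hypothetical helper name, URL, and callback, not the book’s exact code) of the cross-browser plumbing involved:

```javascript
// Hand-rolled XHR helper (hypothetical names, for illustration only)
function createXHR() {
  if (typeof XMLHttpRequest !== 'undefined') {
    return new XMLHttpRequest();                    // standards-based browsers
  }
  if (typeof ActiveXObject !== 'undefined') {
    return new ActiveXObject('Microsoft.XMLHTTP');  // older Internet Explorer
  }
  return null;                                      // no XHR support at all
}

function loadFragment(url, onDone) {
  var xhr = createXHR();
  if (!xhr) return;
  xhr.open('GET', url, true);                       // asynchronous GET
  xhr.onreadystatechange = function () {
    if (xhr.readyState === 4 && xhr.status === 200) {
      onDone(xhr.responseText);                     // hand the HTML fragment back
    }
  };
  xhr.send(null);
}
```

Every request meant repeating (or wrapping up) that boilerplate yourself, which is exactly what the frameworks take off your hands.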

We implemented the Lightbox image overlay script, and found that it made use of the Prototype.js framework (for handling the asynchronous JavaScript calls) and Script.aculo.us for handling the animation of the overlaid image.

So I took it one step further, decided to read up on the various ways to make use of these frameworks, and ended up calling all the various modules on the My Spot page of the-spot.net via AJAX. That works nicely, but it does create a problem for SEO-driven development.
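With prototype.js on the page, pulling one of those modules into a placeholder div is essentially a one-liner via Ajax.Updater, which fetches a URL and replaces the element’s contents with the response. A sketch – the container id and module URL here are made up for illustration:

```javascript
// Assumes prototype.js is already included on the page;
// the id and URL below are hypothetical examples.
function loadModule(containerId, url) {
  new Ajax.Updater(containerId, url, { method: 'get' });
}

// e.g. loadModule('recent-posts', '/modules/recent_posts.php');
```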

If you’re looking for automatically updating content on a page, without the page itself having to change for it to happen, then you can use code like this (once you include the prototype.js script in your page):

new Ajax.PeriodicalUpdater('products', '/some_url',
{
  method: 'get',
  insertion: Insertion.Top,
  frequency: 1,
  decay: 2
});

But while this code does the job of getting your content updated automatically – at the given frequency in seconds, multiplied by the decay each cycle the response comes back unchanged – the Google crawlers aren’t going to find it, because all that is on your page is this:
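To make the frequency/decay interaction concrete, here is a plain-JavaScript sketch of the back-off schedule: with frequency: 1 and decay: 2, each cycle where the response comes back unchanged doubles the wait (and, as I understand Prototype’s behavior, a changed response resets it to the base frequency):

```javascript
// Delay (in seconds) before each successive poll while the content stays unchanged
function pollDelays(frequency, decay, cycles) {
  var delays = [];
  var current = frequency;
  for (var i = 0; i < cycles; i++) {
    delays.push(current);
    current = current * decay;  // unchanged response: back off further
  }
  return delays;
}

pollDelays(1, 2, 5);  // → [1, 2, 4, 8, 16]
```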

<div id="products"></div>

…and that does not have any content in it for the Google crawlers and other search engine bots to read.

I’m still grappling with how to get content into that box for the crawlers to read, but not my users – and since my site is PHP-driven, I should be able to wrap the normal template includes in <noscript> tags:

<noscript>
<!-- INCLUDE "modules/module.html" -->
</noscript>

…in order for my pages to show content for the crawlers, but not my visitors – unless they have JavaScript turned off, of course. I’ll give it a shot though, and post more information later. I have to go back and re-edit my pages and templates to make sure they are still compliant with the original template architecture.
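The flip side of that plan would be to have JavaScript users fetch the same module markup once the page has loaded, so both audiences end up with the content. A sketch, assuming prototype.js is loaded and reusing the 'products' div and '/some_url' from the example above:

```javascript
// Fetch the real content into the (empty-to-crawlers) div once the page loads
function enhanceProducts() {
  new Ajax.Updater('products', '/some_url', { method: 'get' });
}

// Guarded so the snippet is inert outside a browser with prototype.js loaded
if (typeof Event !== 'undefined' && Event.observe) {
  Event.observe(window, 'load', enhanceProducts);
}
```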

By [[Neo]]

I am a web programmer, system integrator, and photographer. I have been writing code since high school, when I had only a TI-83 calculator. I enjoy getting different systems to talk to each other, coming up with ways to mimic human processes using technology, and explaining how complicated things work.

Of my many blogs, this one is purely about the technology projects, ideas, and solutions that I have come across in my internet travels. It's also the place for technical updates related to my other sites that are part of The-Spot.Network.
