Best use of Template::Toolkit for Search Engine Optimization?

by punch_card_don (Curate)
on Oct 24, 2008 at 15:40 UTC

punch_card_don has asked for the wisdom of the Perl Monks concerning the following question:

Mesmerizing Monks,

I use Template::Toolkit to serve dynamically generated web pages. My basic set-up is a MySQL db, a bunch of T::T templates, and a Perl script to extract data, populate the T::T variables, and spit out the appropriate page.

I'm about to start a new website and plan to use T::T and a db, BUT this time search engine ranking will be important, whereas it never has been for the private, internal sites I usually work on. So I'm thinking about how using T::T might affect this.

For example, if a page is called via a form that uses POST to my_central_script.pl, then every page on the site has the same url, www.my_domain.com/cgi-bin/my_central_script.pl. Not very search-engine friendly, not to mention that I doubt spiders are going to submit my forms to find these pages.

This can be corrected by using either GET for forms or links of the form

<a href="../cgi-bin/my_tt_script.pl?a_id=xxx&b_id=yyy">Some page</a>
And http://www.my_domain.com/cgi-bin/my_tt_script.pl?a_id=xxx&b_id=yyy is the url of that page.

Question is, what does this do to search engine indexing, if anything? Do the major search engines like this kind of url?

Now, the data in the db will change only very rarely, and there will only be a few hundred (ok, under 1,000 anyway) possible combinations of data, that is, under 1,000 possible distinct web pages, so I'm also wondering what impact the urls have in that case. For this site I could, in theory, use T::T as a generator of pre-formatted static html pages: enter the data into the db, make the templates, write a script that cycles through generating all possible combinations of data, and store the output pages as static html files. Perhaps the search engines prefer nice, ordinary urls like

http://www.my_domain.com/aid_xxx_bid_yyy.htm

with links on them to other pages like

<a href="aid_xxx_bid_zzz.htm">Page keywords</a>

So, what is the best way to use T::T when SEO is important?

Thanks.




Time flies like an arrow. Fruit flies like a banana.

Re: Best use of Template::Toolkit for Search Engine Optimization?
by moritz (Cardinal) on Oct 24, 2008 at 15:48 UTC
    Question is, what does this do to search engine indexing, if anything? Do the major search engines like this kind of url?

    Here is the answer from the "most major" search engine.

    Perhaps the search engines prefer nice, ordinary urls like http://www.my_domain.com/aid_xxx_bid_yyy.htm

    First ask yourself what's best for the user. IMHO the example without URL parameters looks nicer and is friendlier to the user.

    Second point is, just because the URL has no obvious params and ends with .htm doesn't mean it has to be a static page. What I'd do is use URLs ending in .htm(l), but still generate them dynamically, either by having a script that handles all requests and looks at PATH_INFO, or with mod_rewrite.
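    A rough sketch of the PATH_INFO variant (the dispatch script, URL scheme and template name are only examples):

    #!/usr/bin/perl
    # e.g. /cgi-bin/tt.pl/aid_xxx_bid_yyy.htm, or /aid_xxx_bid_yyy.htm rewritten
    # to this script; parse PATH_INFO and render the matching template
    use strict;
    use warnings;
    use Template;

    my $path = $ENV{PATH_INFO} || '';
    my ($a_id, $b_id) = $path =~ m{^/aid_(\w+)_bid_(\w+)\.html?$}
        or do { print "Status: 404 Not Found\r\n\r\n"; exit };

    print "Content-type: text/html\r\n\r\n";
    my $tt = Template->new({ INCLUDE_PATH => 'templates' });
    $tt->process('page.tt', { a_id => $a_id, b_id => $b_id })
        or die $tt->error;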

    I don't see how the underlying template engine affects this decision at all.

Re: Best use of Template::Toolkit for Search Engine Optimization?
by ccn (Vicar) on Oct 24, 2008 at 15:51 UTC
    The Apache module mod_rewrite can help you deal with search engines.

    Using it you can have SEO-friendly urls like
    http://www.my_domain.com/aid_xxx_bid_yyy.htm
    which are internally translated for your T::T script into
    http://www.my_domain.com/cgi-bin/tt.pl?aid=xxx&bid=yyy
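    Something like this would do the translation (a guess; exact placement and flags depend on your Apache setup):

    # in the vhost config (in .htaccess, drop the leading slash in the pattern)
    RewriteEngine On
    RewriteRule ^/aid_(\w+)_bid_(\w+)\.htm$ /cgi-bin/tt.pl?aid=$1&bid=$2 [PT,L,QSA]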

      Rather than mod_rewrite (which indeed will help), I'd suggest not patching around the problem but solving it. Catalyst can handle these "SEO friendly URLs" with no problem and makes maintaining the website in general _a lot_ easier (once you get to understand the concept).

      --
      b10m
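      A rough idea of how such a controller could look in Catalyst (the class, action and template names are only a guess):

      package MyApp::Controller::Page;
      use strict;
      use warnings;
      use base 'Catalyst::Controller';

      # hypothetical action matching /aid_xxx_bid_yyy.htm
      sub page : Regex('^aid_(\w+)_bid_(\w+)\.htm$') {
          my ($self, $c) = @_;
          my ($a_id, $b_id) = @{ $c->req->captures };
          $c->stash(a_id => $a_id, b_id => $b_id, template => 'page.tt');
          # a TT view (Catalyst::View::TT) then renders page.tt
      }

      1;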
Re: Best use of Template::Toolkit for Search Engine Optimization?
by ww (Archbishop) on Oct 24, 2008 at 23:11 UTC
    Contrarian that I (sometimes) am, I like the static page scheme (option 3) given the comparatively small number of possible combinations of data and relative rarity of changes you expect in the db.

    Your 3rd option can do more for you than provide user- and SE-friendly URLs. If you generate the thousand or fewer pages on a local machine, you can readily (with a bit of a big workload up front) tweak the DESCRIPTION and KEYWORDS metas, which may do more for your search engine ranking than many other optimizing techniques.
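    For instance, the page template could pull per-page metas straight from the db row, along these lines (the field names are made up):

    <html>
    <head>
      <title>[% data.title | html %]</title>
      <meta name="description" content="[% data.description | html %]">
      <meta name="keywords" content="[% data.keywords | html %]">
    </head>
    <body>
      [% data.body %]
    </body>
    </html>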

    On the other hand, there are so many parameters that go into SE ranking (how fresh is the "Latest Update" or "Last modified" date? how well do the keywords match the actual content? etc.) that this suggestion may be an example of something comparable to "premature optimization."

      Another option is to generate a static sitemap.xml file describing all 1,000 entries and combinations, and then tell search spiders where to find it via robots.txt:

      User-Agent: *
      Disallow:
      Sitemap: /sitemap.xml

      For the sitemap syntax, see the sitemap protocol documentation.
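      A quick sketch of generating it (the domain and the page list are placeholders; in practice the list would come from the same db query that drives page generation):

      #!/usr/bin/perl
      use strict;
      use warnings;

      my @pages = ( [ 'xxx', 'yyy' ], [ 'xxx', 'zzz' ] );   # placeholder (a_id, b_id) pairs

      open my $fh, '>', 'htdocs/sitemap.xml' or die $!;
      print {$fh} qq{<?xml version="1.0" encoding="UTF-8"?>\n};
      print {$fh} qq{<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n};
      for my $page (@pages) {
          my ($a_id, $b_id) = @$page;
          print {$fh} "  <url><loc>http://www.my_domain.com/aid_${a_id}_bid_${b_id}.htm</loc></url>\n";
      }
      print {$fh} "</urlset>\n";
      close $fh;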
