Knowledgebase articles and information about running a website, cPanel, and various hints and tips. Here you will find tutorials on PHP, MySQL, .htaccess, cron, SEO, search engines, CHMOD, FTP, CSS, HTML, and various other hints and tips on running and administering a website.
#1
Index of Document:
1) Intro to .htaccess
2) Error Documents
3) Error Codes
4) Password Protection
5) Enabling SSI via .htaccess
6) Blocking Users via IP
7) Blocking Users/Sites by Referrer
8) Blocking Bad Sites & Ripping Tools
9) Setting Default Directory
10) Using Redirects
11) Prevent viewing of .htaccess
12) Setting MIME Types
13) Prevent Hotlinking
14) Prevent viewing Directory Content
15) Conclusion & More Info

Canonical URLs and SEO

OK, I'm moving this up to the top of the list, before the intro, because many people are searching for ways to clean up their canonical URLs. "Canonical" is a fancy word for having multiple URLs pointing at the same page or content. This is bad because your PageRank, or authority, gets distributed between multiple copies of a page instead of consolidating on, and boosting, the one version. These canonical URLs occur on nearly every site, largely because Apache can serve different documents at the www and non-www versions of a URL. Very few people actually make use of this, so we wind up with the same page appearing at both. The other cause is mismanaged coding, or a CMS that is not very SEO-friendly.

The Fix:

Code:
Options +FollowSymLinks
RewriteEngine on
RewriteCond %{HTTP_HOST} ^mysite\.com
RewriteRule (.*) http://www.mysite.com/$1 [R=301,L]
RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /index\.html\ HTTP/
RewriteRule ^index\.html$ http://www.mysite.com/ [R=301,L]

What this code does is rewrite:

http://mysite.com
http://mysite.com/index.html
http://www.mysite.com/index.html

all to simply http://www.mysite.com. Change the .html to .php if your site uses an index.php extension instead. Enjoy!

Introduction to .htaccess

I am sure that most of you have heard of htaccess, if just vaguely, and you may think you have a fair idea of what can be done with an htaccess file. You are more than likely mistaken about that, however. Regardless, even if you have never heard of htaccess and what it can do for you, the intention of this tutorial is to get the two of you moving along nicely together. If you have heard of htaccess, chances are it has been in relation to implementing custom error pages or password-protected directories. But there is much more available to you through the marvelously simple .htaccess file.

A Few General Ideas

An htaccess file is a simple ASCII file, such as you would create through a text editor like Notepad or SimpleText. Many people seem to have some confusion over the naming convention for the file, so let me get that out of the way: .htaccess is the file's entire name. It is not file.htaccess or somepage.htaccess; it is simply named .htaccess, with nothing before the dot.

In order to create the file, open up a text editor and save an empty page as .htaccess (or type in one character, as some editors will not let you save an empty page). Chances are that your editor will append its default file extension to the name (Notepad, for example, would call the file .htaccess.txt). You need to remove the .txt (or other) extension in order to get yourself htaccessing--yes, I know that isn't a word, but it sounds keen, don't it? You can do this by right-clicking on the file and renaming it, removing anything that doesn't say .htaccess. You can also rename it via telnet or your FTP program, and you should be familiar enough with one of those so as not to need explaining.

htaccess files must be uploaded in ASCII mode, not BINARY. You may need to CHMOD the htaccess file to 644 (rw-r--r--).
This makes the file usable by the server. You also want to make sure the file cannot be read by a browser, since that can seriously compromise your security. For example, if you have password-protected directories and a browser can read the htaccess file, it can get the location of the authentication file and then reverse-engineer the list to get full access to any portion that you previously had protected. There are different ways to prevent this: one is to place all your authentication files above the root directory so that they are not www-accessible; the other is an htaccess series of commands that prevents the file itself from being accessed by a browser (see the sketch at the end of this post).

Most commands in htaccess are meant to be placed on one line only, so if you use a text editor that word-wraps, make sure it is disabled, or it might throw in a few characters that annoy Apache to no end. Apache is not forgiving of malformed content in an htaccess file; a stray syntax error will typically take the affected directories down with a 500 Internal Server Error.

htaccess is an Apache thing, not an NT thing. There are similar capabilities for NT servers, though in my professional experience and personal opinion, NT's ability in these areas is severely handicapped. But that's not what we're here for.

htaccess files affect the directory they are placed in and all sub-directories; that is, an htaccess file located in your root directory (yoursite.com) would affect yoursite.com/content, yoursite.com/content/contents, and so on. It is important to note that this can be prevented (if, for example, you did not want certain htaccess commands to affect a specific directory) by placing a new htaccess file within the directory you don't want affected, and leaving out of it the specific command(s) you do not want applied there. In short, the nearest htaccess file to the current directory is treated as the htaccess file. If the nearest htaccess file is your global htaccess located in your root, then it affects every single directory in your entire site.

Before you go off and plant htaccess everywhere, read through this and make sure you don't do anything redundant, since it is possible to cause an infinite loop of redirects or errors if you place something weird in the htaccess. Also, some sites do not allow the use of htaccess files at all, since, depending on what they are doing, htaccess files can slow down a server that is overloaded with domains all using them. I can't stress this enough: you need to make sure you are allowed to use htaccess before you actually use it. Some things htaccess can do can compromise a server configuration that has been specifically set up by the admin, so don't get in trouble.
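As for the series of commands that stops a browser from reading the htaccess file itself, the usual approach is a Files block; a minimal sketch, using the same Apache access-control syntax as the rest of this thread:

Code:
# Deny all browser access to the .htaccess file itself
<Files .htaccess>
order allow,deny
deny from all
</Files>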
#2
Error Documents

This seems to be what people think htaccess was meant for, but it is only part of the general use. We'll be getting into progressively more advanced stuff after this.

In order to specify your own ErrorDocuments, you need to be slightly familiar with the server-returned error codes (see the list in the next post). You do not need to specify error pages for all of these; in fact, you shouldn't. An ErrorDocument for code 200 would cause an infinite loop whenever a page was found... this would not be good.

You will probably want to create an error document for codes 404 and 500; at the least 404, since this gives you a chance to handle requests for pages not found, while 500 helps you out with internal server errors in any scripts you have running. You may also want to consider ErrorDocuments for 401 - Authorization Required (when somebody tries to enter a protected area of your site without the proper credentials), 403 - Forbidden (when a file whose permissions do not allow the user to access it is requested), and 400 - Bad Request, which is one of those generic errors people reach by doing some weird stuff with your URL or scripts.

In order to specify your own customized error documents, you simply need to add the following command, on one line, within your htaccess file:

ErrorDocument code /directory/filename.ext

or

ErrorDocument 404 /errors/notfound.html

This would cause any request resulting in a 404 to be forwarded to yoursite.com/errors/notfound.html. Likewise with:

ErrorDocument 500 /errors/internalerror.html

You can name the pages anything you want (I'd recommend something that prevents you from forgetting what the page is being used for), and you can place the error pages anywhere you want within your site, so long as they are web-accessible (through a URL). The initial slash in the directory location represents the root directory of your site, that being where your default page for your first-level domain is located. I typically prefer to keep them in a separate directory for maintenance purposes, and in order to better control spiders indexing them through a robots.txt file (a sketch follows below), but it is entirely up to you.

If you were to use an error document handler for each of the error codes I mentioned, the htaccess file would look like the following (note each command is on its own line):

Code:
ErrorDocument 400 /errors/badrequest.html
ErrorDocument 401 /errors/authreqd.html
ErrorDocument 403 /errors/forbid.html
ErrorDocument 404 /errors/notfound.html
ErrorDocument 500 /errors/serverr.html

You can also specify HTML, believe it or not!

Code:
ErrorDocument 401 "<body bgcolor=#ffffff><h1>You have to actually <b>BE</b> a <a href='#'>member</a> to view this page, Colonel!

And again, that should all be on one line, no naughty word wrapping! Next, we are moving on to password protection, that last frontier before I dunk you into the true capabilities of htaccess. If you are familiar with setting up your own password-protected directories via htaccess, you may feel like skipping ahead.
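Before we get to passwords, one footnote on the error-page directory idea above: steering spiders away from it is a one-line robots.txt exclusion. A minimal sketch, assuming the /errors/ directory used in the examples:

Code:
# robots.txt, placed in the site root
User-agent: *
Disallow: /errors/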
#3
Error Codes:

Successful Client Requests
200 OK
201 Created
202 Accepted
203 Non-Authoritative Information
204 No Content
205 Reset Content
206 Partial Content

Client Request Redirected
300 Multiple Choices
301 Moved Permanently
302 Moved Temporarily
303 See Other
304 Not Modified
305 Use Proxy

Client Request Errors
400 Bad Request
401 Authorization Required
402 Payment Required (not used yet)
403 Forbidden
404 Not Found
405 Method Not Allowed
406 Not Acceptable (encoding)
407 Proxy Authentication Required
408 Request Timed Out
409 Conflicting Request
410 Gone
411 Content Length Required
412 Precondition Failed
413 Request Entity Too Large
414 Request URI Too Long
415 Unsupported Media Type

Server Errors
500 Internal Server Error
501 Not Implemented
502 Bad Gateway
503 Service Unavailable
504 Gateway Timeout
505 HTTP Version Not Supported
#4
Password Protection

Ever wanted a specific directory in your site to be available only to people you want it to be available to? Ever gotten frustrated with the seeming holes in client-side options for this, which allow virtually anyone with enough skill to mess around in your source to get in? htaccess is the answer!

There are numerous methods of password-protecting areas of your site, some server-language based (such as ASP, PHP or Perl) and some client-side based, such as JavaScript. JavaScript is not as secure or foolproof as a server-side option; a server-side challenge/response is always more secure than a client-dependent challenge/response. htaccess is about as secure as you can or need to get in everyday life, though there are ways above and beyond even that of htaccess. If you aren't comfortable enough with htaccess, you can password-protect your pages any number of ways, and JavaScript Kit has plenty of password protection scripts for your use.

The first thing you will need to do is create a file called .htpasswd. I know, you might have problems with the naming convention, but it is the same idea behind naming the htaccess file itself, and you should be able to do that by this point. In the htpasswd file, you place the username and password (which is encrypted) for those whom you want to have access. For example, for a username and password of wsabstract (and I do not recommend making the username the same as the password), the htpasswd file would look like this:

Code:
wsabstract:y4E7Ep8e7EYV

For security, you should not upload the htpasswd file to a directory that is web-accessible (yoursite.com/.htpasswd); it should be placed above your www root directory. You'll be specifying the location of it later on, so be sure you know where you put it. Also, this file, as with htaccess, should be uploaded as ASCII and not BINARY.

Create a new htaccess file and place the following code in it:

Code:
AuthUserFile /usr/local/you/safedir/.htpasswd
AuthGroupFile /dev/null
AuthName EnterPassword
AuthType Basic
require user wsabstract

The second-to-last line, require user, is where you enter the username of those who you want to have access to that portion of your site. Note that using this will allow only that specific user to access that directory. This applies if you had an htpasswd file with multiple users set up in it and you wanted each one to have access to an individual directory. If you wanted the entire list of users to have access to that directory, you would replace require user wsabstract with require valid-user.

The AuthName is the name of the area you want to access. It could be anything, such as "EnterPassword". You can change the name of this 'realm' to whatever you want, within reason. We are using AuthType Basic because we are using basic HTTP authentication.
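A quick sketch of the multi-user case just described; the second user and both hashes are made up for illustration (if you have shell access, Apache's htpasswd utility can generate real entries for you):

Code:
# .htpasswd with several users (hypothetical hashes)
wsabstract:y4E7Ep8e7EYV
friend:Fg3tRq9w2LmZc

and the matching htaccess, letting any listed user in:

Code:
AuthUserFile /usr/local/you/safedir/.htpasswd
AuthGroupFile /dev/null
AuthName EnterPassword
AuthType Basic
require valid-user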
#5
Enabling SSI via .htaccess

Many people want to use SSI but don't seem to have the ability to do so with their current web host. You can change that with htaccess. A note of caution first: definitely ask permission from your host before you do this. It can be considered 'hacking', or a violation of your host's TOS, so be safe rather than sorry.

Code:
AddType text/html .shtml
AddHandler server-parsed .shtml
Options Indexes FollowSymLinks Includes

And that's it: you should have SSI enabled. But wait... don't feel like renaming all of your pages to .shtml in order to take advantage of this neat little toy? Me either! Just add this line to the fragment above, between the first and second lines:

Code:
AddHandler server-parsed .html

Some people prefer to allow SSI in .html pages anyway, so that the page extension doesn't tip anyone off that they are using SSI; servers can be compromised through SSI hacks, so there is something to be said for not advertising it. Either way, you now have the knowledge to do both (see the include sketch below for what an SSI page actually looks like).

If, however, you are going to keep SSI pages with the extension of .shtml, and you want to use SSI on your index pages, you need to add the following line to your htaccess:

Code:
DirectoryIndex index.shtml index.html
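Once SSI is parsing your pages, an include directive pulls a shared fragment into any page. A minimal sketch, assuming a header fragment saved at /includes/header.html:

Code:
<!--#include virtual="/includes/header.html" -->

Place that comment-style directive wherever the header should appear; the server replaces it with the file's contents before the page is sent to the browser.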
#6
Blocking Users by IP

Is there a pesky person perpetrating pain upon you? Stalking your site from the vastness of the electron void? Block 'em! In your htaccess file, add the following code, changing the IPs to suit your needs, each command on its own line:

Code:
order allow,deny
deny from 123.45.6.7
deny from 012.34.5.
allow from all

Notice that the second deny ends with a dot: leaving off the last octet like that blocks the entire range, in this case every IP beginning with 012.34.5. You can also set deny from all, which would of course deny everyone, and you can allow or deny by domain name rather than IP address, as sketched below.
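A sketch of the domain-name variant, using a made-up hostname; note that this relies on the server doing reverse DNS lookups on visitors, which costs a little speed and depends on the visitor's IP resolving to a hostname at all:

Code:
# Block every visitor whose hostname resolves under badsite.com
order allow,deny
deny from badsite.com
allow from all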
#7
Blocking Users and/or Sites by Referrer

Blocking users or sites that originate from a particular domain is another useful trick of .htaccess. Let's say you check your logs one day and see tons of referrals from a particular site, yet upon inspection you can't find a single visible link to your site on theirs. The referral isn't a "legitimate" one; the site is most likely hotlinking to certain files on your site, such as images, .css files, or files you can't even make out. Remember, your logs will generate a referrer entry for any kind of reference to your site that has a traceable origin.

Before I get to the code itself, it's important to note that blocking access by referrer in .htaccess requires the help of the Apache module mod_rewrite to make out the referrer first. This module is installed by default on most servers (ask your host if you're not sure). So, to deny access to all traffic that originates from a particular domain (referrer), use the following code.

Block traffic from a single referrer:

Code:
RewriteEngine on
# Options +FollowSymlinks
RewriteCond %{HTTP_REFERER} badsite\.com [NC]
RewriteRule .* - [F]

Block traffic from multiple referrers:

Code:
RewriteEngine on
# Options +FollowSymlinks
RewriteCond %{HTTP_REFERER} badsite\.com [NC,OR]
RewriteCond %{HTTP_REFERER} anotherbadsite\.com
RewriteRule .* - [F]

The last line in each block specifies that the action to take when a match is found is to fail the request, meaning the referrer traffic will hit a 403 Forbidden error. The only difference between blocking a single referrer and blocking multiple referrers is the [NC,OR] flag in the latter case, which is appended to every domain but the last (see the three-domain sketch below).

Now, you may have noticed the line "Options +FollowSymlinks" above, which is commented out. Uncomment this line if your server isn't configured with FollowSymLinks in its <Directory> section in httpd.conf and you get a 500 Internal Server Error when using the code above as is.
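To make the flag placement concrete, here is a sketch blocking three hypothetical domains; every condition carries [OR] except the last, which only keeps the case-insensitive [NC]:

Code:
RewriteEngine on
# Options +FollowSymlinks
RewriteCond %{HTTP_REFERER} badsite\.com [NC,OR]
RewriteCond %{HTTP_REFERER} anotherbadsite\.com [NC,OR]
RewriteCond %{HTTP_REFERER} thirdbadsite\.com [NC]
RewriteRule .* - [F]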
#8
Blocking Bad Websites and Site-Ripping Tools

The definition of a "bad bot" varies depending on who you ask, but most would agree they are the spiders that do a lot more harm than good on your site (e.g. an email harvester). Site rippers, on the other hand, are offline browsing programs that a surfer may unleash on your site to crawl and download every one of its pages for offline viewing. In both cases, your site's bandwidth and resource usage are jacked up as a result, sometimes to the point of crashing your server. Bad bots typically ignore the wishes of your robots.txt file, so you'll want to ban them using means such as .htaccess. The trick is to identify a bad bot.

Below is a useful code block you can insert into your .htaccess file for blocking a lot of the known bad bots and site rippers currently out there. Yeah, there are a lot of malicious tools out there. Note that each RewriteCond goes on its own line:

Code:
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} ^BlackWidow [OR]
RewriteCond %{HTTP_USER_AGENT} ^Bot\ mailto:craftbot@yahoo.com [OR]
RewriteCond %{HTTP_USER_AGENT} ^ChinaClaw [OR]
RewriteCond %{HTTP_USER_AGENT} ^Custo [OR]
RewriteCond %{HTTP_USER_AGENT} ^DISCo [OR]
RewriteCond %{HTTP_USER_AGENT} ^Download\ Demon [OR]
RewriteCond %{HTTP_USER_AGENT} ^eCatch [OR]
RewriteCond %{HTTP_USER_AGENT} ^EirGrabber [OR]
RewriteCond %{HTTP_USER_AGENT} ^EmailSiphon [OR]
RewriteCond %{HTTP_USER_AGENT} ^EmailWolf [OR]
RewriteCond %{HTTP_USER_AGENT} ^Express\ WebPictures [OR]
RewriteCond %{HTTP_USER_AGENT} ^ExtractorPro [OR]
RewriteCond %{HTTP_USER_AGENT} ^EyeNetIE [OR]
RewriteCond %{HTTP_USER_AGENT} ^FlashGet [OR]
RewriteCond %{HTTP_USER_AGENT} ^GetRight [OR]
RewriteCond %{HTTP_USER_AGENT} ^GetWeb! [OR]
RewriteCond %{HTTP_USER_AGENT} ^Go!Zilla [OR]
RewriteCond %{HTTP_USER_AGENT} ^Go-Ahead-Got-It [OR]
RewriteCond %{HTTP_USER_AGENT} ^GrabNet [OR]
RewriteCond %{HTTP_USER_AGENT} ^Grafula [OR]
RewriteCond %{HTTP_USER_AGENT} ^HMView [OR]
RewriteCond %{HTTP_USER_AGENT} HTTrack [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^Image\ Stripper [OR]
RewriteCond %{HTTP_USER_AGENT} ^Image\ Sucker [OR]
RewriteCond %{HTTP_USER_AGENT} Indy\ Library [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^InterGET [OR]
RewriteCond %{HTTP_USER_AGENT} ^Internet\ Ninja [OR]
RewriteCond %{HTTP_USER_AGENT} ^JetCar [OR]
RewriteCond %{HTTP_USER_AGENT} ^JOC\ Web\ Spider [OR]
RewriteCond %{HTTP_USER_AGENT} ^larbin [OR]
RewriteCond %{HTTP_USER_AGENT} ^LeechFTP [OR]
RewriteCond %{HTTP_USER_AGENT} ^Mass\ Downloader [OR]
RewriteCond %{HTTP_USER_AGENT} ^MIDown\ tool [OR]
RewriteCond %{HTTP_USER_AGENT} ^Mister\ PiX [OR]
RewriteCond %{HTTP_USER_AGENT} ^Navroad [OR]
RewriteCond %{HTTP_USER_AGENT} ^NearSite [OR]
RewriteCond %{HTTP_USER_AGENT} ^NetAnts [OR]
RewriteCond %{HTTP_USER_AGENT} ^NetSpider [OR]
RewriteCond %{HTTP_USER_AGENT} ^Net\ Vampire [OR]
RewriteCond %{HTTP_USER_AGENT} ^NetZIP [OR]
RewriteCond %{HTTP_USER_AGENT} ^Octopus [OR]
RewriteCond %{HTTP_USER_AGENT} ^Offline\ Explorer [OR]
RewriteCond %{HTTP_USER_AGENT} ^Offline\ Navigator [OR]
RewriteCond %{HTTP_USER_AGENT} ^PageGrabber [OR]
RewriteCond %{HTTP_USER_AGENT} ^Papa\ Foto [OR]
RewriteCond %{HTTP_USER_AGENT} ^pavuk [OR]
RewriteCond %{HTTP_USER_AGENT} ^pcBrowser [OR]
RewriteCond %{HTTP_USER_AGENT} ^RealDownload [OR]
RewriteCond %{HTTP_USER_AGENT} ^ReGet [OR]
RewriteCond %{HTTP_USER_AGENT} ^SiteSnagger [OR]
RewriteCond %{HTTP_USER_AGENT} ^SmartDownload [OR]
RewriteCond %{HTTP_USER_AGENT} ^SuperBot [OR]
RewriteCond %{HTTP_USER_AGENT} ^SuperHTTP [OR]
RewriteCond %{HTTP_USER_AGENT} ^Surfbot [OR]
RewriteCond %{HTTP_USER_AGENT} ^tAkeOut [OR]
RewriteCond %{HTTP_USER_AGENT} ^Teleport\ Pro [OR]
RewriteCond %{HTTP_USER_AGENT} ^VoidEYE [OR]
RewriteCond %{HTTP_USER_AGENT} ^Web\ Image\ Collector [OR]
RewriteCond %{HTTP_USER_AGENT} ^Web\ Sucker [OR]
RewriteCond %{HTTP_USER_AGENT} ^WebAuto [OR]
RewriteCond %{HTTP_USER_AGENT} ^WebCopier [OR]
RewriteCond %{HTTP_USER_AGENT} ^WebFetch [OR]
RewriteCond %{HTTP_USER_AGENT} ^WebGo\ IS [OR]
RewriteCond %{HTTP_USER_AGENT} ^WebLeacher [OR]
RewriteCond %{HTTP_USER_AGENT} ^WebReaper [OR]
RewriteCond %{HTTP_USER_AGENT} ^WebSauger [OR]
RewriteCond %{HTTP_USER_AGENT} ^Website\ eXtractor [OR]
RewriteCond %{HTTP_USER_AGENT} ^Website\ Quester [OR]
RewriteCond %{HTTP_USER_AGENT} ^WebStripper [OR]
RewriteCond %{HTTP_USER_AGENT} ^WebWhacker [OR]
RewriteCond %{HTTP_USER_AGENT} ^WebZIP [OR]
RewriteCond %{HTTP_USER_AGENT} ^Wget [OR]
RewriteCond %{HTTP_USER_AGENT} ^Widow [OR]
RewriteCond %{HTTP_USER_AGENT} ^WWWOFFLE [OR]
RewriteCond %{HTTP_USER_AGENT} ^Xaldon\ WebSpider [OR]
RewriteCond %{HTTP_USER_AGENT} ^Zeus
RewriteRule ^.* - [F,L]
#9
Your Default Directory Page

Some of you may be wondering, just what in the world is a DirectoryIndex? Well, grasshopper, this is a command which allows you to specify a file that is to be loaded as your default page whenever a request comes in for a directory or URL without naming a specific page. Tired of having yoursite.com/index.html come up when you go to yoursite.com? Want yoursite.com/ILikePizzaSteve.html to come up instead? No problem!

Code:
DirectoryIndex filename.html

You can also list several files, in order of preference; the server will serve the first one on the list that it finds in the directory:

Code:
DirectoryIndex filename.html index.cgi index.pl default.htm

Every once in a while, I use this method for the following need. Say I keep all my include files in a directory called include, and all my image files in a directory called images, and I don't want people to be able to browse through those directories (even though we can prevent that through another htaccess trick; more on that later). I would put a DirectoryIndex entry, in a specific htaccess file for those two directories, pointing to /redirect/index.pl, a redirect page that sends any request for those directories to the homepage (a sketch of this setup follows below). Or I could just specify a directory index of index.pl and upload an index.pl file to each of those directories. Or I could just stick in an htaccess redirect page, which is our next subject!
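Here is what that first setup might look like; /redirect/index.pl stands in for whatever redirect page you actually use:

Code:
# .htaccess placed inside the images/ directory:
# send bare directory requests to a redirect page instead of a file listing
DirectoryIndex /redirect/index.pl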
#10
Using Redirects

Ever gone through the nightmare of changing significant portions of your site, then having to deal with the problem of people finding their way from the old pages to the new? It can be nasty. There are different ways of redirecting pages: through http-equiv, JavaScript, or any of the server-side languages. And then you can do it through htaccess, which is probably the most effective, considering the minimal amount of work required to do it.

htaccess uses the Redirect command to look for any request for a specific page (or a non-specific location, though this can cause infinite loops) and, if it finds that request, forward it to a new page you have specified:

Code:
Redirect /olddirectory/oldfile.html http://yoursite.com/newdirectory/newfile.html

Note that the whole command goes on one line: first the old file's location as a path relative to your root, then the full URL of its new home. Using this method, you can redirect any number of pages no matter what you do to your directory structure. It is the fastest method, and it takes effect globally.
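Redirect will also take a whole directory rather than a single file, and you can make the redirect explicitly permanent so search engines carry the old address's standing over to the new one. A sketch with made-up paths:

Code:
# Permanently (301) forward everything under /olddirectory to its new home
Redirect 301 /olddirectory http://yoursite.com/newdirectory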