A website that looks good, is easy to navigate and is not too ‘deep’ is pleasant for customers to use and is likely to earn good rankings in search results.
But if you are a bit more technically minded, or have a web guru on hand, there are some deeper things you can add to your site to maximise its SEO and help the search engines push it towards the top.
Here are some technical tips:
Create a Robots.txt File
Creating a robots.txt file is a way of speaking directly to the search-engine spiders when they arrive at your site. Perhaps you would rather the spiders not visit certain sections of your site, or maybe you want to instruct them to visit every single page. You may also want to suggest how frequently the spiders visit your site (some engines honour a Crawl-delay directive for this, although Google ignores it).
You can also use a robots.txt file to prevent search-engine spiders from consuming excessive amounts of bandwidth on your server and to help prevent potential copyright infringements.
The file resides in the root directory of your Web server.
Tip: A robots.txt file can also be used to tell the search-engine spiders where your site map is located, with a line such as: Sitemap: http://www.example.com/sitemap.xml
You can use the robots.txt generator at: www.mcanerin.com/EN/search-engine/robots-txt.asp to simplify the robots.txt creation task.
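Whichever way you create it, a simple robots.txt file might look something like this (the directory name is just a placeholder):

  # Rules for all spiders
  User-agent: *
  # Keep spiders out of a private section of the site
  Disallow: /private/
  # Ask spiders to pause between requests; honoured by some engines but ignored by Google
  Crawl-delay: 10

  # Tell spiders where the site map lives
  Sitemap: http://www.example.com/sitemap.xml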
Using the Nofollow Attribute
Attaching a nofollow attribute to a link tells the search-engine spiders that they should not follow that link or give it any weight when determining rankings.
The nofollow attribute can also be used on links to other Web sites that are not directly related to the content of your own Web site. Google has asked that all paid links be tagged with the nofollow attribute to indicate that they should not pass ranking influence.
Tip: Essentially, adding the nofollow attribute to a link tells the search engines not to use that link as a positive factor in their ranking algorithm.
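In the HTML itself, nofollow is added as a value of the link’s rel attribute; a paid or unrelated link might look like this (the address and anchor text are placeholders):

  <a href="http://www.example.com/sponsored-offer" rel="nofollow">Sponsored offer</a>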
Structure URLs correctly
Both search engines and search-engine users appreciate static-looking, descriptive URLs. The search engines also take into consideration the keywords and phrases contained within your URLs and use these to influence your rankings.
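For example, a descriptive URL makes a page’s topic obvious to both spiders and searchers, while a dynamic query string reveals very little (both addresses here are hypothetical):

  Hard to read:  http://www.example.com/products.php?cat=12&id=3489
  Descriptive:   http://www.example.com/garden-furniture/teak-dining-table/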
Tip: Structuring your URLs correctly provides benefits beyond improving your search-engine optimization efforts. One example is a paid search, or pay-per-click, advertising campaign: a properly categorized Web site gives you a head start in developing the keyword lists needed to construct such a campaign, and you may even see a price decrease in your paid search efforts.
Protect yourself with an .htaccess file
An .htaccess file is a per-directory configuration file for the Apache Web server. It is a straightforward yet powerful text file that can accomplish a wide variety of functions.
Although normally left to expert server administrators, an .htaccess file can help you avoid several potential problems.
An .htaccess file lets you rewrite URLs and redirect Web traffic, and it supports numerous forms of protection: password-protecting directories, banning visitors from certain sources, and preventing bandwidth theft by sites that link directly to your images. Rules in an .htaccess file uploaded to your images directory take precedence over the file in your root directory. You may also want to stop malicious use of the Linux wget command, which can be used to retrieve your entire Web site.
You can use .htaccess to block direct linking to your images and even send a replacement image instead.
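As a rough sketch, the mod_rewrite rules for doing this might look like the following, assuming your domain is example.com and the replacement image is /images/hotlink-warning.png:

  # Switch on the rewrite engine
  RewriteEngine On
  # Let requests with no referer through (direct visits and some proxies)
  RewriteCond %{HTTP_REFERER} !^$
  # Let requests coming from your own pages through
  RewriteCond %{HTTP_REFERER} !^https?://(www\.)?example\.com/ [NC]
  # Never rewrite the replacement image itself, or the rule would loop
  RewriteCond %{REQUEST_URI} !hotlink-warning\.png$ [NC]
  # Send a replacement image for every other image request
  RewriteRule \.(jpe?g|png|gif)$ /images/hotlink-warning.png [R,L]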
Tip: An excellent tool located at www.htaccesstools.com allows you to generate a variety of .htaccess code blocks.
The tool automatically generates .htaccess authentication code, along with an encrypted .htpasswd file, to password protect directories. This kind of protection has become especially important with the advent of image search engines like Google Images.
The tool also creates .htaccess code to block unwanted bot traffic or traffic from certain IP addresses; you should regularly check for excessive traffic from random bots or excessive visits from any particular IP address.
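For illustration, here are two separate sketches of the kind of code such a tool produces; the file path, IP address and user-agent string are placeholders, and the syntax assumes an Apache 2.2-style server:

  # 1) For a directory you want to password protect (needs a matching .htpasswd file)
  AuthType Basic
  AuthName "Restricted Area"
  AuthUserFile /full/path/to/.htpasswd
  Require valid-user

  # 2) For your site root: block one IP address and a hypothetical bad bot
  SetEnvIfNoCase User-Agent "BadBot" bad_bot
  Order Allow,Deny
  Allow from all
  Deny from 203.0.113.42
  Deny from env=bad_bot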