Bangla Radio


Friday, June 11, 2010

How to create robots.txt and .htaccess files

Use of the .htaccess file

The .htaccess file can be used on Apache servers running Linux or Unix to increase your web site's security and customize the way your web site behaves. The main uses of the .htaccess file are to redirect visitors to custom error pages, stop directory listings, ban robots that harvest email addresses for spam, ban visitors from certain countries and IP addresses, block visitors arriving from bad referring (warez) sites, protect your web site from image hotlinking and bandwidth theft, redirect visitors from a requested page to a new page, and password protect directories. Use the information in this article as a starting point to optimize and protect your web site.
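For example, a small .htaccess file along these lines covers several of those uses. This is only a minimal sketch: it assumes an Apache server with mod_rewrite enabled, and example.com, the error page paths, and the .htpasswd path are placeholders you would replace with your own.

    # Custom error pages (the error files are placeholders - create them yourself)
    ErrorDocument 404 /errors/404.html
    ErrorDocument 500 /errors/500.html

    # Stop directory listings
    Options -Indexes

    # Redirect visitors from an old page to a new one
    Redirect 301 /old-page.html http://www.example.com/new-page.html

    # Block hot linking of images (example.com stands in for your own domain)
    RewriteEngine On
    RewriteCond %{HTTP_REFERER} !^$
    RewriteCond %{HTTP_REFERER} !^http://(www\.)?example\.com/ [NC]
    RewriteRule \.(gif|jpe?g|png)$ - [F,NC]

    # Password protect this directory (requires an .htpasswd file made with the htpasswd tool)
    AuthType Basic
    AuthName "Members Only"
    AuthUserFile /full/path/to/.htpasswd
    Require valid-user

Banning individual IP addresses (for example with a Deny from directive) or blocking bad referrers works the same way. Add one directive at a time and reload a page to test it, because a single typo in .htaccess can take the whole site down with a 500 error.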
More details on .htaccess


How to create a robots.txt file

The robots.txt file is a simple ASCII text file used to indicate which of a web site's files and directories should not be indexed. Many webmasters choose not to revise their robots.txt file because they are uncertain how the changes could affect their rankings. However, a poorly written robots.txt file can allow your complete web site to be indexed, exposing information such as passwords, email addresses, hidden links, membership areas, and confidential files. A poorly written robots.txt file could also cause portions of your web site to be ignored by the search engines.
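As a starting point, a simple robots.txt along these lines keeps well-behaved crawlers out of the areas you do not want indexed. The directory names, the crawler name, and the sitemap URL below are placeholders; the file must sit in the root of your web site and be named exactly robots.txt.

    # Allow all robots to crawl the site except the listed directories
    User-agent: *
    Disallow: /cgi-bin/
    Disallow: /members/
    Disallow: /admin/

    # Block one specific crawler from the whole site (the name is illustrative)
    User-agent: BadBot
    Disallow: /

    # Optional: tell crawlers where your sitemap lives
    Sitemap: http://www.example.com/sitemap.xml

Remember that robots.txt is only a request: reputable search engines honour it, but it is not a security measure, so anything truly confidential should be protected with .htaccess passwords rather than just a Disallow line.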

More details on robots.txt
