What is a robots.txt file
robots.txt is a plain text file with the .txt extension. It tells search engine crawlers (such as Google, Yahoo, and Bing) which pages of your site should not be crawled and indexed. It is commonly used to keep crawlers out of areas such as feeds, trackbacks, and the WordPress admin pages, and to avoid duplicate content being indexed, since those pages add no value for visitors arriving from search.
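As an illustration, a robots.txt for a WordPress site covering the areas mentioned above might look like this (the paths here are common examples, not fixed rules; adjust them to your own site):

```
User-agent: *
Disallow: /wp-admin/
Disallow: /feed/
Disallow: /trackback/
```

The `User-agent: *` line means the rules apply to all crawlers; each `Disallow` line blocks one path prefix.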
How to create a robots.txt file
i) Create a new text document (for example, in Notepad).
ii) Write your rules. For example:
User-agent: *
Disallow:
(This record lets all crawlers access the whole site; a record needs at least one Disallow line, even an empty one.)
iii) Name the file robots. Since a text document already carries the .txt extension, it will be saved as robots.txt.
iv) Upload this file to the root directory of your website.
How to upload the robots.txt file to the root directory
i) Open an FTP client such as FileZilla.
ii) Log in to your web server.
iii) Go to the public_html folder.
iv) Upload your robots.txt file there.
v) Verify it by opening http://yourwebsite.com/robots.txt in a browser.
It should display the contents of your robots.txt file.
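Besides opening the file in a browser, you can also check which URLs your rules block programmatically. Here is a small sketch using Python's standard urllib.robotparser module; the rules and paths below are assumptions for illustration, not your actual file:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical rules, similar to a typical WordPress robots.txt.
rules = """
User-agent: *
Disallow: /wp-admin/
Disallow: /feed/
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)  # parse the rules directly instead of fetching them

# Ask whether a generic crawler ("*") may fetch each path.
print(parser.can_fetch("*", "/wp-admin/options.php"))  # False (blocked)
print(parser.can_fetch("*", "/about/"))                # True (allowed)
```

To test the live file instead, call `parser.set_url("http://yourwebsite.com/robots.txt")` followed by `parser.read()` before using `can_fetch`.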