Lecture

SEO Tools - robots.txt, sitemap, RSS

In SEO, several supporting files are used to present a website to search engines effectively.

In this lesson, we will explore three commonly used SEO tools: robots.txt, sitemaps, and RSS.


robots.txt

robots.txt is a text file located at the root of your website that tells search engine crawlers which pages should or should not be crawled.

Sample robots.txt
User-agent: *
Disallow: /private/
Allow: /public/

The above robots.txt instructs all search engines not to access the /private/ directory while allowing access to the /public/ directory.
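As a quick sketch, the rules above can be checked with Python's built-in urllib.robotparser; the URLs below are placeholders matching the example:

```python
from urllib.robotparser import RobotFileParser

# The sample robots.txt from above, fed directly to the parser
robots_txt = """\
User-agent: *
Disallow: /private/
Allow: /public/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# A well-behaved crawler would ask before fetching each URL
print(parser.can_fetch("*", "https://example.com/private/page.html"))  # False
print(parser.can_fetch("*", "https://example.com/public/page.html"))   # True
```

In practice, crawlers download robots.txt from the site root and apply these checks before requesting any page.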


Sitemap

A sitemap is a file that lists a website's pages and shows the relationships between them.

A sitemap acts as a map for search engines, describing the site's structure and the pages it contains.

Sample sitemap
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/page1</loc>
    <lastmod>2023-10-15</lastmod>
    <changefreq>daily</changefreq>
  </url>
  <url>
    <loc>https://example.com/page2</loc>
  </url>
</urlset>

The sitemap example above lists two web pages, with page1 also including the optional last-modification date and change frequency.
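To see how a crawler reads such a file, here is a small sketch using Python's standard xml.etree.ElementTree to extract each page's URL from the sample sitemap (note the sitemap XML namespace must be given explicitly):

```python
import xml.etree.ElementTree as ET

# The sample sitemap from above as a string
sitemap_xml = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/page1</loc>
    <lastmod>2023-10-15</lastmod>
    <changefreq>daily</changefreq>
  </url>
  <url>
    <loc>https://example.com/page2</loc>
  </url>
</urlset>"""

# Sitemap elements live in this namespace, so lookups must qualify it
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

root = ET.fromstring(sitemap_xml)
for url in root.findall("sm:url", ns):
    loc = url.findtext("sm:loc", namespaces=ns)
    lastmod = url.findtext("sm:lastmod", default="(none)", namespaces=ns)
    print(loc, lastmod)
```

Running this prints each page URL along with its last-modification date, or "(none)" when the optional tag is absent.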


RSS

RSS is a feed format that automatically notifies subscribers when a website's content is updated.

It is most commonly used by blogs, newsletters, and web services.

Sample rss
<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>My Website News</title>
    <link>https://example.com</link>
    <description>Latest updates from My Website</description>
    <item>
      <title>New Article</title>
      <link>https://example.com/new-article</link>
      <description>This is a new article.</description>
    </item>
  </channel>
</rss>

In this example, the channel titled My Website News publishes updates such as the item New Article; feed readers poll this file to detect new items.
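The feed above can be read with the same standard library; as a sketch, this extracts the channel title and each item (RSS 2.0 uses no XML namespace, so plain tag names work):

```python
import xml.etree.ElementTree as ET

# The sample RSS feed from above as a string
rss_xml = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>My Website News</title>
    <link>https://example.com</link>
    <description>Latest updates from My Website</description>
    <item>
      <title>New Article</title>
      <link>https://example.com/new-article</link>
      <description>This is a new article.</description>
    </item>
  </channel>
</rss>"""

root = ET.fromstring(rss_xml)
channel = root.find("channel")
print(channel.findtext("title"))  # the channel title

# List every item's title and link, as a feed reader would
for item in channel.findall("item"):
    print(item.findtext("title"), "->", item.findtext("link"))
```

A subscriber's feed reader fetches this file periodically and shows any items it has not seen before.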

Mission

What is the file that represents the list of pages on a website and their relationships?

sitemap

rss

robots.txt

metadata
