A sitemap is a simple list that helps search engines find the pages you care about. It does not replace good navigation or clean linking, but it reduces the chance that new or lightly linked URLs are missed. A small generator makes it easy to assemble absolute URLs with hints like last modified dates, change frequency, and priority. You can paste the output into your site root or submit it in Search Console so crawlers discover updates faster.
Quick workflow - add URLs, confirm details, export
Collect a set of absolute URLs. Add each one with a lastmod date that reflects the day the page content actually changed. Set a reasonable changefreq and a priority that reflects importance within your site rather than a guess about global value. Generate the XML and save it as sitemap.xml. Upload to your site root or any public path and reference it in robots.txt with a Sitemap line. Submit the path in Search Console so your file is on record.
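The assembly step above can be sketched in a few lines of Python using the standard library. This is a minimal sketch, not a full generator: the `pages` entries and the example.com URLs are placeholders you would replace with your own data.

```python
from xml.etree import ElementTree as ET

def build_sitemap(entries):
    """Build sitemap XML from dicts with loc, lastmod, changefreq, priority."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for entry in entries:
        url = ET.SubElement(urlset, "url")
        # Emit only the fields present; loc is the one required element.
        for tag in ("loc", "lastmod", "changefreq", "priority"):
            if entry.get(tag):
                ET.SubElement(url, tag).text = str(entry[tag])
    return ('<?xml version="1.0" encoding="UTF-8"?>\n'
            + ET.tostring(urlset, encoding="unicode"))

# Placeholder data - substitute your real pages.
pages = [
    {"loc": "https://example.com/", "lastmod": "2024-05-01",
     "changefreq": "weekly", "priority": "0.8"},
]
xml = build_sitemap(pages)
```

Save the result as sitemap.xml, upload it, and add a `Sitemap: https://example.com/sitemap.xml` line to robots.txt so crawlers find it without a manual submission.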
Absolute URLs and lastmod - small choices that matter
Absolute URLs avoid confusion with relative paths and allow engines to fetch correctly regardless of context. lastmod should be the content change date, not the deploy date of your CSS. Setting accurate lastmod helps engines spend time on pages that truly changed. If a page has not changed in years, keep the date stable. Inflating lastmod without real edits can waste crawl budget and slow down indexing of new work.
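Two small helpers cover both points: resolving every path against the site root so only absolute URLs reach the file, and formatting lastmod as a plain `YYYY-MM-DD` date. The `BASE` value is a hypothetical site root for illustration.

```python
from datetime import date
from urllib.parse import urljoin

BASE = "https://example.com"  # hypothetical site root - use your own

def absolutize(path: str) -> str:
    """Resolve a relative path against the site root.
    Already-absolute URLs pass through unchanged."""
    return urljoin(BASE + "/", path.lstrip("/"))

def lastmod(d: date) -> str:
    """Format a content-change date as YYYY-MM-DD."""
    return d.isoformat()
```

Keeping lastmod a stored date per page, rather than `date.today()` at generation time, is what keeps old pages' dates stable between builds.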
Changefreq and priority - hints, not promises
These fields are advisory. Changefreq describes how often a page is likely to change, and priority ranks pages within your own site from 0.0 to 1.0. Do not set everything to daily or 1.0. A blog home may change weekly, while an archive page may be monthly. Use a simple scheme - 0.8 for core hubs, 0.5 for evergreen content, 0.3 for long tail pages. Engines treat these as hints alongside real observations like crawl history and internal links.
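The 0.8/0.5/0.3 scheme above is easy to encode as a lookup so every page of a given type gets consistent hints. The bucket names here are hypothetical labels; map your own templates or sections onto them.

```python
# Hypothetical page-type buckets implementing the scheme from the text.
HINTS = {
    "hub":       {"changefreq": "weekly",  "priority": "0.8"},
    "evergreen": {"changefreq": "monthly", "priority": "0.5"},
    "longtail":  {"changefreq": "monthly", "priority": "0.3"},
}

def hints_for(page_type: str) -> dict:
    """Return changefreq/priority hints, with a conservative default
    for pages that do not fit a bucket."""
    return HINTS.get(page_type, {"changefreq": "monthly", "priority": "0.5"})
```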
When to split sitemaps - size and structure
A single sitemap can hold up to 50,000 URLs or 50 MB uncompressed. Large sites should split files by section or type and optionally reference them from a sitemap index. Grouping by type helps with debugging - you can watch error counts for product, blog, or docs separately. Small sites rarely need an index. Keep your files simple until you need more structure.
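Splitting and indexing can be sketched as two small functions: one to chunk a URL list at the 50,000-entry limit, one to render the index that references the child files. The child file URLs passed to `sitemap_index` are whatever public paths you chose when uploading.

```python
MAX_URLS = 50_000  # per-file limit in the Sitemaps protocol

def chunk(urls, size=MAX_URLS):
    """Split a URL list into protocol-sized groups, one per sitemap file."""
    return [urls[i:i + size] for i in range(0, len(urls), size)]

def sitemap_index(sitemap_urls):
    """Render a sitemap index that points at the child sitemap files."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    rows = "\n".join(f"  <sitemap><loc>{u}</loc></sitemap>"
                     for u in sitemap_urls)
    return ('<?xml version="1.0" encoding="UTF-8"?>\n'
            f'<sitemapindex xmlns="{ns}">\n{rows}\n</sitemapindex>')
```

Chunking by section (blog, product, docs) instead of by count gives the same structure plus the per-section error visibility mentioned above.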
Comparison - manual curation vs automatic generation
Aspect | Manual curation | Automatic generation |
---|---|---|
Control | High | Tied to templates |
Freshness | Depends on habit | Updates with deploy |
Error risk | Typos possible | Template bugs |
Best for | Small sites or critical sets | Large sites with stable CMS |
Bullet notes - sitemap hygiene that ages well
- List canonical URLs only - avoid parameterized or duplicate paths.
- Ensure every URL returns the intended status - 200 for live pages, not 3xx or 4xx.
- Update lastmod when content changes, not when styling changes.
- Keep the file reachable and reference it in robots.txt for easy discovery.
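The status-code check in the list above can be automated before each publish. A sketch using only the standard library: `check_status` issues a HEAD request without following redirects, so a 301 shows up as a 301 rather than as the target's 200, and `classify` turns the code into an action.

```python
import http.client
from urllib.parse import urlsplit

def check_status(url: str, timeout: float = 10.0) -> int:
    """HEAD the URL and return the raw status code.
    Redirects are NOT followed, so 3xx responses stay visible."""
    parts = urlsplit(url)
    cls = (http.client.HTTPSConnection if parts.scheme == "https"
           else http.client.HTTPConnection)
    conn = cls(parts.netloc, timeout=timeout)
    try:
        conn.request("HEAD", parts.path or "/")
        return conn.getresponse().status
    finally:
        conn.close()

def classify(status: int) -> str:
    """Map a status code to sitemap hygiene advice."""
    if status == 200:
        return "keep"
    if 300 <= status < 400:
        return "list the redirect target instead"
    return "fix or drop"
```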
Debugging - read errors as helpful signals
If Search Console flags errors, open the reported URLs in a browser. Status mismatches and blocked robots rules are common culprits. Fix the underlying page first, then regenerate the sitemap entry. Remember that submitting a sitemap does not force indexing. It simply queues discovery. Strong internal links and useful content do the rest. Use the URL Inspection tool to check specific pages and request recrawls after significant fixes.
Special cases - images, video, and news
Sitemaps can include extensions for images, video, and news to provide richer metadata. If a big part of your site relies on these, consider specialized sitemaps alongside a general urlset. For images, list key images for each page to improve discovery. For video, provide title, description, and thumbnail. For news, follow freshness and eligibility guidelines. Keep separate files tidy and test with validation tools before publishing.
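As one concrete example, the image extension adds `image:image` children to a `url` entry. A minimal sketch, assuming Google's image sitemap namespace; the gallery URL and image paths are placeholders, and the enclosing `urlset` must also declare `xmlns:image`.

```python
# Google's image sitemap extension namespace (declare on the <urlset>).
IMAGE_NS = "http://www.google.com/schemas/sitemap-image/1.1"

def url_with_images(loc, image_urls):
    """Render one <url> entry listing key images for the page."""
    imgs = "\n".join(
        f"    <image:image><image:loc>{u}</image:loc></image:image>"
        for u in image_urls)
    return f"  <url>\n    <loc>{loc}</loc>\n{imgs}\n  </url>"

entry = url_with_images("https://example.com/gallery",
                        ["https://example.com/a.jpg"])
```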
References that help without fluff
The Sitemaps protocol site documents the core format and limits clearly. It is the fastest way to confirm whether your file meets the basic rules. Google's documentation explains how sitemaps interact with crawling and indexing and how to use Search Console to submit and monitor them. Bookmark both to save your team time when questions repeat. Clear references beat guesswork and keep your implementation aligned with current expectations.
Two questions before you publish
- Do the URLs you listed represent the pages you actually want in search - and are they canonical and indexable?
- Will your process keep lastmod dates honest so crawlers learn which sections truly changed each release?
Sitemaps are not glamorous, but they reduce friction. With a few careful choices and a small, repeatable process, you help search engines find the right pages sooner and spend their time where it helps readers. Combine accurate sitemaps with strong internal links and steady publishing, and indexing tends to follow without drama.
References
Sitemaps protocol - official documentation
Google Search Central - Build and submit a sitemap