DirEasy

SEO

Directories are a true cheat code for SEO, so don't sleep on them!

Metadata

We've configured all the pages with Metadata for you. Everything you need is already pre-filled from your lib/config/seo.ts file:

  • title
  • description
  • type
  • url
  • OpenGraph
  • Twitter Card

And so on ...

You can customize this SEO metadata for each page.
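As a sketch of what a per-page override could look like, the snippet below spreads shared defaults and replaces only the fields that differ. The export names and the shape of the defaults object are assumptions, not DirEasy's actual `lib/config/seo.ts` contents; adapt them to your config.

```typescript
// Assumed shape of the shared SEO defaults (a subset of what
// lib/config/seo.ts might export; field names are illustrative).
type PageMetadata = {
  title: string;
  description: string;
  openGraph?: { title?: string; url?: string; type?: string };
};

// Defaults as they might be defined in lib/config/seo.ts.
const seoDefaults: PageMetadata = {
  title: "DirEasy",
  description: "A directory boilerplate",
};

// Per-page customization: keep the defaults, override what differs.
const pageMetadata: PageMetadata = {
  ...seoDefaults,
  title: "Pricing – DirEasy",
  description: "Plans and pricing for DirEasy.",
};
```

In a Next.js page you would typically export such an object as `metadata` (or build it in `generateMetadata`) so the framework renders the corresponding tags.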

Sitemap

DirEasy automatically generates the sitemap for the website using Next.js's built-in support for generating multiple sitemaps (sitemap.ts). The generation process scans different directories in the project and creates sitemap entries for various types of content:

1. Page Collection

If you add new pages or change the routes of your pages, you need to make sure the sitemap reflects these changes.

You can find the configuration for the sitemap in the lib/config/seo.ts:

lib/config/seo.ts
  sitemapConfig: {
    staticPages: [
      {
        url: "/page1",
        changeFrequency: "weekly",
        priority: 0.8,
      },
    ],
  },

2. Content Collection

Blog Posts

  • Reads all files in the content/blog directory

Docs Posts

  • Reads all files in the content/docs directory
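The scan over content/blog and content/docs boils down to mapping filenames to URLs. A minimal sketch, assuming Markdown/MDX files whose names are the slugs (the helper name and URL pattern are illustrative, not DirEasy's actual code):

```typescript
// Turn the filenames found in a content directory into sitemap entries.
// basePath would be "/blog" for content/blog and "/docs" for content/docs.
function contentEntries(filenames: string[], basePath: string): { url: string }[] {
  return filenames
    .filter((f) => /\.mdx?$/.test(f)) // keep only .md / .mdx files
    .map((f) => ({ url: `${basePath}/${f.replace(/\.mdx?$/, "")}` }));
}

const blogEntries = contentEntries(["hello.mdx", "guide.md", "notes.txt"], "/blog");
// → [{ url: "/blog/hello" }, { url: "/blog/guide" }]
```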

3. Projects

  • Each project submitted by a user is added to the sitemap

4. Collections

  • Each collection created by admin is added to the sitemap
The sitemap is automatically regenerated during each build, ensuring it stays up to date with your content.

  • A sitemap.xml file submitted to Google Search Console cannot exceed 5000 items. If your directory's project submissions do not exceed 5000, the generated sitemaps will be https://your-domain.com/sitemap/0.xml and https://your-domain.com/sitemap/1.xml
  • Page Collection, Content Collection, and Collections are listed in the 0.xml file.
  • Projects are listed in the 1.xml file, with 2.xml for the next 5000 items, and so on.
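The splitting described above, static pages in 0.xml and projects chunked 5000 at a time into 1.xml, 2.xml, and so on, can be sketched as plain array chunking. The helper below is illustrative, not DirEasy's actual implementation; in Next.js the chunk count would feed `generateSitemaps` and each chunk would be returned by `sitemap({ id })`.

```typescript
const SITEMAP_LIMIT = 5000; // max items per sitemap file

type SitemapEntry = { url: string; changeFrequency?: string; priority?: number };

// Split a flat list of entries into sitemap-sized chunks.
function chunkEntries(entries: SitemapEntry[], limit = SITEMAP_LIMIT): SitemapEntry[][] {
  const chunks: SitemapEntry[][] = [];
  for (let i = 0; i < entries.length; i += limit) {
    chunks.push(entries.slice(i, i + limit));
  }
  return chunks;
}

// Example: 12,000 project URLs → three chunks (1.xml, 2.xml, 3.xml),
// while 0.xml holds the static pages and content collections.
const projects: SitemapEntry[] = Array.from({ length: 12000 }, (_, i) => ({
  url: `https://your-domain.com/projects/${i}`,
}));
const projectChunks = chunkEntries(projects);
// → chunks of 5000, 5000, and 2000 entries
```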

Robots

The robots.txt file helps crawlers (Google, Bing) understand which pages you do and don't want indexed.

DirEasy automatically generates the robots.txt for the website using Next.js's built-in robots generation functionality (robots.ts).

The robots.txt is automatically regenerated during each build.
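A minimal sketch of what such a robots.ts can return, modeled on the object shape Next.js expects from app/robots.ts (the type is defined inline here instead of importing `MetadataRoute.Robots`, and the disallowed paths are illustrative, not DirEasy's actual rules):

```typescript
// Inline stand-in for Next.js's MetadataRoute.Robots shape.
type RobotsConfig = {
  rules: { userAgent: string; allow?: string; disallow?: string[] }[];
  sitemap: string[];
};

function robots(): RobotsConfig {
  return {
    rules: [
      {
        userAgent: "*",
        allow: "/",
        disallow: ["/admin/", "/api/"], // illustrative paths to keep unindexed
      },
    ],
    // Point crawlers at the generated sitemap files.
    sitemap: [
      "https://your-domain.com/sitemap/0.xml",
      "https://your-domain.com/sitemap/1.xml",
    ],
  };
}
```

In an actual Next.js app this function would be the default export of app/robots.ts, and the framework serializes the returned object into the served robots.txt.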
