# Remix SEO
A fork of https://github.com/balavishnuvj/remix-seo with added bug fixes and features.

A collection of SEO utilities, such as sitemap and robots.txt generation, for a Remix application.
## Features
- Generate a sitemap
- Generate a robots.txt file
## Installation
To use it, install it from npm (or yarn):

```sh
npm install @nasa-gcn/remix-seo
```
## Usage
Add a sitemap and a robots.txt file to your site by adding resource routes for them, as explained below.
### Sitemap
Add a route module called `app/routes/sitemap[.]xml.ts` to your project with the following contents:

```ts
import { routes } from "@remix-run/dev/server-build";
import type { LoaderFunctionArgs } from "@remix-run/node";
import { generateSitemap } from "@nasa-gcn/remix-seo";

export function loader({ request }: LoaderFunctionArgs) {
  return generateSitemap(request, routes, {
    siteUrl: "https://balavishnuvj.com",
  });
}
```
`generateSitemap` takes three parameters: `request`, `routes`, and `SEOOptions`.
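For reference, the response follows the standard sitemap XML format defined by the sitemaps.org protocol. For a site with a single route, the output would look roughly like this (illustrative only; the exact entries depend on your routes and options):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://balavishnuvj.com/</loc>
  </url>
</urlset>
```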
### Configuration
`SEOOptions` lets you configure the sitemap:

```ts
export type SEOOptions = {
  siteUrl: string; // URL where the site is hosted, e.g. https://balavishnuvj.com
  headers?: HeadersInit; // Additional headers
  /*
    e.g.:
    headers: {
      "Cache-Control": `public, max-age=${60 * 5}`,
    },
  */
};
```
To exclude a route from the sitemap, export a `handle` whose `getSitemapEntries` returns `null`:

```ts
// in your routes/url-that-doesnt-need-sitemap
import type { LoaderFunction } from "@remix-run/node";
import { SEOHandle } from "@nasa-gcn/remix-seo";

export let loader: LoaderFunction = ({ request }) => {
  /**/
};

export const handle: SEOHandle = {
  getSitemapEntries: () => null,
};
```
To generate sitemap entries for dynamic routes:

```ts
// routes/blog/$blogslug.tsx
export const handle: SEOHandle = {
  getSitemapEntries: async (request) => {
    const blogs = await db.blog.findMany();
    return blogs.map((blog) => {
      return { route: `/blog/${blog.slug}`, priority: 0.7 };
    });
  },
};
```
### Robots
Add a new route module called `app/routes/robots[.]txt.ts` with the following contents:

```ts
import { generateRobotsTxt } from "@nasa-gcn/remix-seo";

export function loader() {
  return generateRobotsTxt([
    { type: "sitemap", value: "https://balavishnuvj.com/sitemap.xml" },
    { type: "disallow", value: "/admin" },
  ]);
}
```
`generateRobotsTxt` takes two arguments. The first is an array of policies:

```ts
export type RobotsPolicy = {
  type: "allow" | "disallow" | "sitemap" | "crawlDelay" | "userAgent";
  value: string;
};
```
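To make the policy format concrete, here is a minimal sketch (not the library's implementation) of how an array of `RobotsPolicy` entries maps onto robots.txt directives:

```typescript
type RobotsPolicy = {
  type: "allow" | "disallow" | "sitemap" | "crawlDelay" | "userAgent";
  value: string;
};

// Directive names as they appear in a robots.txt file
const directives: Record<RobotsPolicy["type"], string> = {
  allow: "Allow",
  disallow: "Disallow",
  sitemap: "Sitemap",
  crawlDelay: "Crawl-delay",
  userAgent: "User-agent",
};

// Simplified rendering of policies to robots.txt text, one directive per line
function renderPolicies(policies: RobotsPolicy[]): string {
  return policies.map((p) => `${directives[p.type]}: ${p.value}`).join("\n");
}

console.log(
  renderPolicies([
    { type: "userAgent", value: "*" },
    { type: "allow", value: "/" },
    { type: "disallow", value: "/admin" },
  ])
);
// User-agent: *
// Allow: /
// Disallow: /admin
```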
The second parameter, `RobotsConfig`, provides additional configuration:

```ts
export type RobotsConfig = {
  appendOnDefaultPolicies?: boolean; // Whether the default policies should be appended
  /*
    Default policies:
    const defaultPolicies: RobotsPolicy[] = [
      {
        type: "userAgent",
        value: "*",
      },
      {
        type: "allow",
        value: "/",
      },
    ];
  */
  headers?: HeadersInit; // Additional headers
  /*
    e.g.:
    headers: {
      "Cache-Control": `public, max-age=${60 * 5}`,
    },
  */
};
```
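Putting the two arguments together, a loader that opts out of the default policies and caches the response might look like this (a sketch; the routes, URL, and cache lifetime are illustrative):

```ts
import { generateRobotsTxt } from "@nasa-gcn/remix-seo";

export function loader() {
  return generateRobotsTxt(
    [
      { type: "userAgent", value: "*" },
      { type: "disallow", value: "/admin" },
      { type: "sitemap", value: "https://example.com/sitemap.xml" },
    ],
    {
      appendOnDefaultPolicies: false, // skip the default User-agent/Allow policies
      headers: {
        "Cache-Control": `public, max-age=${60 * 5}`,
      },
    }
  );
}
```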