This content originally appeared on DEV Community and was authored by Patryk Zdunowski
If you keep release notes by hand, you know the drill: scroll through commits, group them, paste links, hunt down authors… and lose an hour you’ll never get back.
I wrote a zero-dependency Node script that turns your Conventional Commit history into clean, publish-ready Markdown release notes. It:
- Groups commits by type (`feat`, `fix`, `docs`, …)
- Detects breaking changes and surfaces Highlights
- Appends a short hash with a link to the commit
- Resolves a GitHub @handle for the author (PR author fallback)
- Works with tags like `v1.2.3`, `1.2.3`, and monorepo-style tags like `mypackage@1.2.3`
For a real example of the output, see the project’s GitHub releases.
Below is how it works, how to use it, and the full script.
What the output looks like
Here’s a realistic example of the Markdown it prints:
## 🚀 Release v1.4.0
## Highlights
- drop Node 16 support (breaking) by @devin ([a1b2c3d](https://github.com/owner/repo/commit/a1b2c3d))
## Features
- add CSV export to dashboard by @marta ([d4e5f6a](https://github.com/owner/repo/commit/d4e5f6a))
- support OAuth device flow by @liam ([1122334](https://github.com/owner/repo/commit/1122334))
## Bug Fixes
- fix flaky timezone parsing by @sara ([778899a](https://github.com/owner/repo/commit/778899a))
## Performance
- cache user settings lookups by @tom ([99aa001](https://github.com/owner/repo/commit/99aa001))
## Documentation
- clarify env vars in README by @lea ([55cc66d](https://github.com/owner/repo/commit/55cc66d))
## Other Changes
- bump internal tooling by unknown ([abc1234](https://github.com/owner/repo/commit/abc1234))
“unknown” appears if neither the commit nor its PR provides an associated GitHub login (e.g., when the email isn’t linked to a GitHub account).
Quick start
- Save the script as `scripts/generate-release-notes.mjs` (full code at the end).
- Provide a GitHub token with `repo` scope (Actions’ automatic `GITHUB_TOKEN` works).
export GITHUB_TOKEN=ghp_XXXXXXXXXXXXXXXXXXXXXXXXXXXX
node scripts/generate-release-notes.mjs v1.4.0
Usage
# Basic: between the latest relevant previous tag and HEAD
node scripts/generate-release-notes.mjs <version>
# Explicit previous tag or hash
node scripts/generate-release-notes.mjs <version> <previous_tag_or_hash>
# Explicit range
node scripts/generate-release-notes.mjs <version> <from_hash_or_tag> <to_hash_or_tag>
Tag schemas supported
- `v1.2.3` (default)
- `1.2.3`
- `mypackage@1.2.3` (monorepo packages)

For stable releases (no `-rc`, `-beta`, etc.), the script compares only against other stable tags. For pre-releases, it compares against the latest matching tag, pre-release or stable, as appropriate. If no previous tag is found, it falls back to the repository’s first commit.
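To make that selection rule concrete, here is a simplified sketch (the tag names are made up, and this is not the exact code; the real logic lives in `latestMatchingTag` in the full script below):

// Simplified illustration of previous-tag selection for v* tags.
// Tags are assumed newest-first, as `git tag --sort=-version:refname` returns them.
const tags = ["v1.4.0-rc.1", "v1.3.2", "v1.3.0", "v1.2.5"]; // hypothetical

function previousTagFor(version, sortedTags) {
  const isStable = !version.includes("-");
  return sortedTags.find(t =>
    /^v\d/.test(t) &&                        // same v* schema
    (isStable ? !t.includes("-") : true) &&  // stable releases skip pre-release tags
    t !== version
  ) || "";
}

console.log(previousTagFor("v1.4.0", tags));      // "v1.3.2" (the rc is skipped)
console.log(previousTagFor("v1.4.0-rc.2", tags)); // "v1.4.0-rc.1" (pre-releases may compare to pre-releases)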
Add it to package.json
{
"scripts": {
"release:notes": "node scripts/generate-release-notes.mjs"
}
}
Run:
GITHUB_TOKEN=${GITHUB_TOKEN} npm run release:notes -- v1.4.0
GitHub Actions example
Automatically print release notes when a tag is pushed:
name: Release Notes
on:
push:
tags:
- "v*"
jobs:
notes:
runs-on: ubuntu-latest
permissions:
contents: read
steps:
- uses: actions/checkout@v4
- uses: actions/setup-node@v4
with:
node-version: 20
- run: node scripts/generate-release-notes.mjs ${{ github.ref_name }}
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
You can pipe the output into a file:
node scripts/generate-release-notes.mjs v1.4.0 > RELEASE_NOTES.md
…and attach it to a GitHub Release or commit it to `CHANGELOG.md`.
How it works (and why it’s reliable)
1) Repository & range detection
- Reads the `origin` remote (SSH or HTTPS) to infer `OWNER/REPO`.
- Figures out the previous tag automatically based on your versioning scheme: `v*`, `*`, or `pkg@*` (monorepo).
- Stable vs. pre-release logic ensures stable tags compare to the previous stable tag.
- Supports overriding the range with CLI args.
2) Commit collection via GitHub REST
- Calls `GET /repos/{owner}/{repo}/compare/{base}...{head}` to list commits (a minimal sketch of this call follows the list).
- Extracts `sha`, `html_url`, the commit message, and the author login if available.
- If a commit author login is missing (squash merges, bots, or non-linked emails), it calls `GET /repos/{owner}/{repo}/commits/{sha}/pulls` and uses the PR author as a fallback.
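Here is that compare call in isolation, stripped of the script’s `gh()` helper; a minimal sketch where the owner, repo, and refs are placeholders (run it as an `.mjs` file with `GITHUB_TOKEN` set):

// Minimal sketch: list commits between two refs via the compare endpoint.
const owner = "owner", repo = "repo";   // placeholders
const base = "v1.3.2", head = "v1.4.0"; // placeholders

const res = await fetch(
  `https://api.github.com/repos/${owner}/${repo}/compare/${base}...${head}`,
  {
    headers: {
      Authorization: `Bearer ${process.env.GITHUB_TOKEN}`,
      Accept: "application/vnd.github+json",
      "User-Agent": "release-notes",
    },
  }
);
if (!res.ok) throw new Error(`${res.status} ${res.statusText}`);

const data = await res.json();
for (const c of data.commits || []) {
  console.log(c.sha.slice(0, 7), c.commit.message.split("\n")[0], c.author?.login ?? "unknown");
}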
3) Conventional Commit parsing
- Parses headers like `type(scope)!: subject` (see the sketch after this list).
- Groups into sections (`feat`, `fix`, `perf`, `refactor`, `docs`, `chore`, `test`, `build`, `ci`, `style`, `revert`), with a clean title map.
- Unknown or non-conventional commits fall into Other Changes.
- Anything marked `!` or containing `BREAKING CHANGE:` lands in Highlights.
- Final order: common types in a predictable sequence → any dynamic types A→Z → Other Changes.
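To make the parsing rules concrete, here is a tiny sketch that runs the same header regex the script uses (the sample commit messages are made up):

// Same header pattern as parseHeader() in the full script below.
const HEADER = /^(\w+)(?:\([^)]*\))?(!)?:\s*(.+)$/;

for (const msg of [
  "feat(auth): support OAuth device flow",
  "fix: flaky timezone parsing",
  "chore!: drop Node 16 support",
  "update readme", // non-conventional, falls into Other Changes
]) {
  const m = msg.match(HEADER);
  if (!m) {
    console.log("other  ", msg);
    continue;
  }
  const [, type, bang, subject] = m;
  console.log(type.padEnd(7), bang ? "(breaking)" : "", subject);
}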
4) Output
- Emits tidy Markdown with section headings.
- Each line includes: `subject by @handle ([abcdefg](link))`.
Edge cases handled
- Monorepos: Use `mypackage@1.2.3` tags to scope previous-tag detection to that package.
- Squash merges: Falls back to the PR’s author handle.
- No previous tag: Compares from the first commit.
- Non-Conventional commits: Safely grouped into Other Changes.
Customize it
A few quick tweaks you might want:
- Change section titles/order: edit `TITLE_MAP` and `COMMON_ORDER`.
- Hide noise (e.g., `chore`, `style`): filter those buckets before emitting (see the sketch after this list).
- Change “Highlights” rules: adjust the breaking detection regex.
- Write to a file by default: replace the final `console.log` with a file write (a sketch follows the full script below).
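For the “hide noise” tweak, one way to do it is to drop noisy types from the `items` array before the grouping step; a small self-contained sketch with made-up data:

// Sketch: hide noisy commit types before grouping/emitting (sample data is made up).
const NOISY = new Set(["chore", "style"]);

const items = [
  { bucketKey: "feat",  breaking: false, subject: "add CSV export" },
  { bucketKey: "chore", breaking: false, subject: "bump internal tooling" },
  { bucketKey: "chore", breaking: true,  subject: "drop Node 16 support" },
];

// Keep breaking changes even if their type is noisy, so Highlights still sees them.
const visible = items.filter(i => i.breaking || !NOISY.has(i.bucketKey));
console.log(visible.map(i => i.subject)); // [ 'add CSV export', 'drop Node 16 support' ]

In the real script you would apply the same filter to `items` right after the main loop, before `HIGHLIGHTS` and the buckets are built.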
Full script
Save as `scripts/generate-release-notes.mjs` and make it executable (`chmod +x`).
/**
* Generate release notes with @handles using GitHub REST.
*
* Usage:
* node scripts/generate-release-notes.mjs <version> [previous_tag_or_hash]
* node scripts/generate-release-notes.mjs <version> <from_hash> <to_hash>
* Env:
* GITHUB_TOKEN
*/
import { execSync } from "node:child_process";
import process from "node:process";
const VERSION = process.argv[2] || "";
const PREVIOUS_TAG_ARG = process.argv[3] || "";
const TO_HASH_ARG = process.argv[4] || "";
const TOKEN = process.env.GITHUB_TOKEN;
if (!VERSION) { console.error("❌ Provide version"); process.exit(1); }
if (!TOKEN) { console.error("❌ Set GITHUB_TOKEN"); process.exit(1); }
const sh = (cmd) =>
execSync(cmd, { stdio: ["ignore", "pipe", "ignore"] }).toString().trim();
// Detect owner/repo from origin (supports .git or not)
const remote = (() => { try { return sh("git remote get-url origin"); } catch { return ""; } })();
if (!remote) { console.error("❌ No 'origin' remote"); process.exit(1); }
let OWNER = "", REPO = "";
{
const ssh = remote.match(/^git@github\.com:([^/]+)\/(.+?)(?:\.git)?$/);
const https = remote.match(/^https:\/\/github\.com\/([^/]+)\/(.+?)(?:\.git)?$/);
if (ssh) { OWNER = ssh[1]; REPO = ssh[2]; }
else if (https) { OWNER = https[1]; REPO = https[2]; }
else { console.error(`❌ Not a GitHub remote: ${remote}`); process.exit(1); }
}
// --- Determine range -------------------------------------------------
const isExplicitRange = Boolean(PREVIOUS_TAG_ARG && TO_HASH_ARG);
let BASE = "";
let HEAD = "";
if (isExplicitRange) {
// version + from + to: honor exactly; skip any tag/version heuristics
BASE = PREVIOUS_TAG_ARG;
HEAD = TO_HASH_ARG;
} else {
// Heuristics only when explicit range isn't provided
function detectSchema(v) {
// v1.2.3 or v1.2.3-rc.1
if (/^v\d+\.\d+\.\d+(?:-[A-Za-z0-9.-]+)?$/.test(v)) return { schema: "v*", prefix: "v" };
// monorepo/package tag: mypackage@1.2.3 (no slash)
if (/^[A-Za-z0-9-]+@\d+\.\d+\.\d+(?:-[A-Za-z0-9.-]+)?$/.test(v)) {
const pkg = v.replace(/@.*$/, "");
return { schema: `${pkg}@*`, prefix: `${pkg}@` };
}
// plain 1.2.3 or 1.2.3-rc.1
if (/^\d+\.\d+\.\d+(?:-[A-Za-z0-9.-]+)?$/.test(v)) return { schema: v, prefix: "" };
return { schema: v, prefix: "" };
}
const { schema: TAG_SCHEMA, prefix: TAG_PREFIX } = detectSchema(VERSION);
const firstSha = () => { try { return sh("git rev-list --max-parents=0 HEAD").split("\n")[0]; } catch { return ""; } };
function latestMatchingTag(exclude) {
const all = sh("git tag --sort=-version:refname").split("\n").filter(Boolean);
const isStableVersion = !VERSION.includes("-"); // no pre-release suffix
if (TAG_SCHEMA === "v*") {
// v* tags
return all.find(t =>
/^v[0-9]/.test(t) &&
(isStableVersion ? !t.includes("-") : true) &&
t !== exclude
) || "";
}
if (TAG_SCHEMA.includes("@")) {
// mypackage@* tags
const prefix = TAG_PREFIX;
return all.find(t =>
t.startsWith(prefix) &&
(isStableVersion ? !t.includes("-") : true) &&
t !== exclude
) || "";
}
// plain 1.2.x line
const mm = (VERSION.match(/^(\d+\.\d+)\./) || [, ""])[1];
if (mm) {
return all.find(t =>
t.startsWith(`${mm}.`) &&
(isStableVersion ? !t.includes("-") : true) &&
t !== exclude
) || "";
}
return "";
}
const PREVIOUS_TAG = PREVIOUS_TAG_ARG || latestMatchingTag(VERSION);
BASE = PREVIOUS_TAG || firstSha();
HEAD = "HEAD";
}
if (!BASE) { console.error("❌ Could not determine base ref"); process.exit(1); }
if (!HEAD) { console.error("❌ Could not determine head ref"); process.exit(1); }
// --- GitHub REST (no deps) ------------------------------------------
async function gh(path) {
const res = await fetch(`https://api.github.com${path}`, {
headers: {
Authorization: `Bearer ${TOKEN}`,
Accept: "application/vnd.github+json",
"User-Agent": "release-notes",
},
});
if (!res.ok) throw new Error(`${res.status} ${res.statusText}: ${await res.text()}`);
return res.json();
}
async function commitsBetween(base, head) {
const data = await gh(`/repos/${OWNER}/${REPO}/compare/${encodeURIComponent(base)}...${encodeURIComponent(head)}`);
return (data.commits || []).map(c => ({
sha: c.sha,
url: c.html_url,
msg: c.commit.message,
login: c.author?.login || null,
}));
}
async function prAuthorLogin(sha) {
try {
const data = await gh(`/repos/${OWNER}/${REPO}/commits/${sha}/pulls`);
return data?.[0]?.user?.login || null;
} catch {
return null;
}
}
// --- Conventional Commit parsing & rendering -------------------------
const COMMON_ORDER = ["feat", "fix", "perf", "refactor", "docs", "chore", "test", "build", "ci", "style", "revert"];
const TITLE_MAP = {
feat: "## Features",
fix: "## Bug Fixes",
perf: "## Performance",
refactor: "## Refactors",
docs: "## Documentation",
chore: "## Chores",
test: "## Tests",
build: "## Build",
ci: "## CI",
style: "## Style",
revert: "## Reverts",
};
const toTitle = (type) => TITLE_MAP[type] || `## ${type.charAt(0).toUpperCase()}${type.slice(1)}`;
const clean = (s) => s.trim().replace(/\s+/g, " ");
function parseHeader(m) {
const first = (m || "").split("\n")[0];
const r = first.match(/^(\w+)(?:\([^)]*\))?(!)?:\s*(.+)$/);
if (!r) return { matched: false, type: "other", breaking: false, subject: first.trim() };
const [, type, bang, subject] = r;
return { matched: true, type: type.toLowerCase(), breaking: !!bang, subject: subject.trim() };
}
// --- Main ------------------------------------------------------------
(async () => {
const commits = await commitsBetween(BASE, HEAD);
const items = [];
for (const c of commits) {
const parsed = parseHeader(c.msg);
let login = c.login || await prAuthorLogin(c.sha);
const bucketKey = parsed.matched ? parsed.type : "other";
items.push({
bucketKey,
matched: parsed.matched,
breaking: parsed.breaking || /(^|\n)BREAKING CHANGE:/i.test(c.msg || ""),
subject: clean(parsed.subject),
handle: login ? `@${login}` : "unknown",
sha: c.sha,
url: c.url,
});
}
// Highlights (any breaking / major)
const HIGHLIGHTS = items.filter(i => i.breaking || /\b(major|breaking)\b/i.test(i.subject));
// Section order: common -> dynamic types -> other
const dynamicTypes = Array.from(new Set(
items
.filter(i => i.bucketKey !== "other")
.map(i => i.bucketKey)
.filter(t => !COMMON_ORDER.includes(t))
)).sort();
const sectionOrder = [
...COMMON_ORDER.filter(t => items.some(i => i.bucketKey === t)),
...dynamicTypes,
"other",
];
const buckets = new Map(sectionOrder.map(k => [k, []]));
for (const it of items) {
if (!buckets.has(it.bucketKey)) buckets.set(it.bucketKey, []);
buckets.get(it.bucketKey).push(it);
}
// Emit markdown
const out = [];
out.push(`## 🚀 Release ${VERSION}`);
if (HIGHLIGHTS.length) {
out.push("", "## Highlights");
for (const b of HIGHLIGHTS) {
out.push(`- ${b.subject} by ${b.handle} ([${b.sha.slice(0, 7)}](${b.url}))`);
}
}
for (const key of sectionOrder) {
const arr = buckets.get(key) || [];
if (!arr.length) continue;
const title = key === "other" ? "## Other Changes" : toTitle(key);
out.push("", title);
for (const i of arr) {
out.push(`- ${i.subject} by ${i.handle} ([${i.sha.slice(0, 7)}](${i.url}))`);
}
}
console.log(out.join("\n"));
})().catch(err => { console.error("❌ Failed:", err.message || err); process.exit(1); });
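And for the last item under Customize it: if you’d rather write to a file than print to stdout, add an `fs` import next to the existing imports and swap the final `console.log` for a write. A sketch (the filename is just an example):

// Near the other imports at the top of the script:
import { writeFileSync } from "node:fs";

// ...and instead of console.log(out.join("\n")) at the end of the async IIFE:
writeFileSync("RELEASE_NOTES.md", out.join("\n") + "\n", "utf8");
console.error("✅ Wrote RELEASE_NOTES.md");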
Final notes
- Auth: `GITHUB_TOKEN` is required; Actions’ built-in token is enough for public/private repos.
- Conventional Commits FTW: Consistent commit messages pay off here.
- PR authorship: If you squash-merge, the script still credits the PR author.
If you try it, I’d love feedback or ideas for small improvements (like filtering buckets, publishing directly to a GitHub Release, or generating HTML).