Building a Robust JS/TS Monorepo: Best Practices with Yarn, NX and Changesets
Introduction
Why Monorepo?
Software development is evolving rapidly: teams are growing and projects are becoming more complex, and companies spend significant resources maintaining codebases scattered across many repositories. Enter the monorepo – a single, unified repository that brings all of your code together. Far from being a passing trend, the monorepo has become an established architectural approach to housing an entire codebase in one place. Teams gain shared context, smoother collaboration, and a structure that naturally encourages code reuse.
Setting Up Yarn Workspaces
Note: Throughout this article, whenever “Yarn” is mentioned, it specifically refers to Yarn v4—the latest version offering enhanced capabilities and improved performance.
What are Yarn Workspaces?
Workspaces are the individual packages that make up the monorepo. They help you manage multiple packages in a single repository effortlessly. With workspaces, you can:
- Share Dependencies Easily: Share common dependencies across your project seamlessly.
- Simplify Dependency Management: Yarn automatically links local packages, reducing duplication and easing development.
- Accelerate Installations: Benefit from Yarn’s performance optimizations and caching mechanisms (e.g., built-in Plug’n’Play).
- Improve Control Over the Monorepo: Define constraints (rules) and use dozens of available plugins to maintain consistency.
While Yarn is the package manager chosen for this article thanks to its simplicity, speed, and extensive configuration options, it’s important to note that the right choice depends on your project’s specific needs, team preferences, and overall workflow. For instance, pnpm and Turborepo are other modern tools that offer comparable feature sets.
Initial Configuration
Yarn setup is a straightforward process. Follow the official guide to install and configure Yarn in your project: Yarn Installation Guide.
Once you’ve completed the installation, let’s move on to configuration. Since we’re using plug’n’play, you need to ensure that your IDE correctly recognizes dependencies. If you’re using VSCode, run:
# Typescript is required for VSCode SDK to set up correctly
yarn add -D typescript@^5
yarn dlx @yarnpkg/sdks vscode
If you’re using another code editor, check for the available SDKs here: Yarn Editor SDKs.
At this point, you’re all set to start using Yarn.
Organizing the Monorepo Structure
Now that the package manager is configured, it’s time to design a scalable project organization. A clear, well-defined structure not only makes the repository easier to navigate but also promotes better code reuse. In this example, we’ll divide the codebase into three major categories:
- Apps:
  - Client: Contains the final, deployable client products.
  - Server: Contains the final, deployable server products.
- Features:
  - Client: For standalone UI widgets.
  - Server: For standalone backend business logic pieces.
- Libs: Houses shared code such as design system components, constants, assets, and utilities. This is the context-free zone for storing reusable logic.
To demonstrate the power of this folder structure, let’s start by adding these major folders to Yarn’s workspaces list. In your root package.json, add the following:
"workspaces": [
  "apps/**",
  "features/**",
  "libs/**"
]
This configuration tells Yarn to treat packages in these folders as local packages. Subsequent installations will ensure that dependencies for each package are properly set up and linked.
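To illustrate how local packages reference one another once this is in place, a feature package’s package.json can declare a dependency on a shared lib via the workspace: protocol, which tells Yarn to always resolve it to the local copy. The exact fields below are a hypothetical sketch, not a file from the example repository:

```json
{
  "name": "@robust-monorepo-yarn-nx-changesets/sign-in-handler",
  "version": "1.0.0",
  "main": "dist/index.js",
  "dependencies": {
    "@robust-monorepo-yarn-nx-changesets/validator": "workspace:^"
  }
}
```

At publish time, Yarn replaces the `workspace:^` range with the actual version of the local package, so published artifacts remain installable outside the monorepo.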
Bootstrapping Codebase
In this section, we’ll walk through a minimal codebase example that illustrates how to bootstrap the monorepo. Instead of including full code snippets, I’ll provide short examples with links to the complete files in the repository created specifically for this article.
Bootstrapping Server Application
We begin with a simple Express API for user authentication. This server application exposes a single endpoint (/auth/signIn) that utilizes a handler from another package.
import express from "express";
import cors from "cors";
import { signInHandler } from "@robust-monorepo-yarn-nx-changesets/sign-in-handler";

const app = express();
const port = process.env.PORT || 1234;

app.use(express.json());
app.use(
  cors({
    origin: process.env.CORS_ORIGIN || "http://localhost:3000",
  })
);

app.post("/auth/signIn", signInHandler);

app.listen(port, () => {
  console.log(`Server is running at http://localhost:${port}`);
});
Link to the package
As you can see, the /auth/signIn endpoint uses a handler imported from another package. That brings us to our next component: the server feature.
Bootstrapping Server Feature
The server feature encapsulates the authentication logic. In this package, we define the sign-in handler, which leverages a shared validation utility from the libs.
import type { RequestHandler } from "express";
import {
  passwordValidator,
  usernameValidator,
} from "@robust-monorepo-yarn-nx-changesets/validator";

const signInHandler: RequestHandler = (req, res) => {
  if (!req.body) {
    res.status(422).send("Request body is missing");
    return;
  }
  if (typeof req.body !== "object") {
    res.status(422).send("Request body expected to be an object");
    return;
  }
  const { username, password } = req.body;
  const usernameValidationResult = usernameValidator(username);
  if (typeof usernameValidationResult === "string") {
    res
      .status(422)
      .send("Invalid username format: " + usernameValidationResult);
    return;
  }
  const passwordValidationResult = passwordValidator(password);
  if (typeof passwordValidationResult === "string") {
    res
      .status(422)
      .send("Invalid password format: " + passwordValidationResult);
    return;
  }
  // Emulate a successful sign-in
  if (username === "test" && password === "test1234") {
    res.status(200).send("Sign in successful");
    return;
  }
  res.status(422).send("Username or password is incorrect");
};

export default signInHandler;
Link to the package
This approach sums up the authentication logic within its own package, allowing it to be developed and maintained independently. Notice how the validator utilities are imported from the shared lib.
Bootstrapping Client Application
Next, let’s look at the client side. In our client application, we build a simple website that enables user authentication by invoking the server API.
"use client";

import { SignInForm } from "@robust-monorepo-yarn-nx-changesets/sign-in-form";

const API_URL = process.env.NEXT_PUBLIC_API_URL || "http://localhost:1234";

export default function Home() {
  const handleSubmit = async (username: string, password: string) => {
    const response = await fetch(`${API_URL}/auth/signIn`, {
      method: "POST",
      body: JSON.stringify({ username, password }),
      headers: {
        "Content-Type": "application/json",
      },
    });
    if (response.status === 200) {
      alert("Sign in successful");
      return;
    }
    if (response.status === 422) {
      alert("Sign in failed: " + (await response.text()));
      return;
    }
    alert("Sign in failed");
  };
  return <SignInForm onSubmit={handleSubmit} />;
}
Link to the package
In this example, the SignInForm component is imported from a client feature package, which leads us to our final component.
Bootstrapping Client Feature
The client feature package provides the authentication form along with the shared validation logic. This avoids duplicating code and ensures consistency.
import {
  passwordValidator,
  usernameValidator,
} from "@robust-monorepo-yarn-nx-changesets/validator";

interface SignInFormProps {
  onSubmit: (username: string, password: string) => void;
}

const SignInForm = ({ onSubmit }: SignInFormProps) => {
  const handleSubmit = (event: React.FormEvent) => {
    event.preventDefault();
    const username = (event.currentTarget[0] as HTMLInputElement).value;
    const usernameValidationResult = usernameValidator(username);
    if (typeof usernameValidationResult === "string") {
      alert(usernameValidationResult);
      return;
    }
    const password = (event.currentTarget[1] as HTMLInputElement).value;
    const passwordValidationResult = passwordValidator(password);
    if (typeof passwordValidationResult === "string") {
      alert(passwordValidationResult);
      return;
    }
    onSubmit(username, password);
  };
  return (
    <form onSubmit={handleSubmit}>
      <input type="text" placeholder="Username" />
      <input type="password" placeholder="Password" />
      <button type="submit">Sign In</button>
    </form>
  );
};

export default SignInForm;
Link to the package
Here, we again see the usage of the validator from our shared libs, ensuring that validation logic is centralized and easily maintained.
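The shared validator lib itself isn’t shown above; the snippets only reveal its contract – each validator returns `true` for valid input, or an error message string otherwise (hence the `typeof result === "string"` checks). A minimal hypothetical sketch, with validation rules chosen purely for illustration:

```typescript
// Hypothetical sketch of the shared validator lib. The return convention
// (true when valid, an error string otherwise) is inferred from how the
// handlers above consume it; the concrete rules are illustrative only.
export const usernameValidator = (username: unknown): true | string => {
  if (typeof username !== "string" || username.length === 0) {
    return "username is required";
  }
  if (!/^[a-zA-Z0-9_]{3,32}$/.test(username)) {
    return "username must be 3-32 alphanumeric characters";
  }
  return true;
};

export const passwordValidator = (password: unknown): true | string => {
  if (typeof password !== "string" || password.length < 8) {
    return "password must be at least 8 characters";
  }
  return true;
};
```

Because both the server feature and the client feature import the same functions, the validation rules can never silently diverge between frontend and backend.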
That’s it for our minimal codebase example. Keep in mind that this code is a simplified illustration meant to demonstrate the basic structure and interconnection between Apps, Features, and Libs in a monorepo. You can expand upon these examples as needed to fit your project’s specific requirements.
Running Scripts with NX
Managing scripts in a monorepo can be challenging. While Yarn allows you to run scripts across multiple packages using various conditions, it may require custom scripting for more granular control. This is where NX comes in: it provides an out-of-the-box solution for efficient, targeted script execution.
Introduction to NX
NX is a build system optimized for monorepos with advanced CI capabilities. With NX, you can:
- Run tasks efficiently in parallel: Leverage concurrency to speed up your builds.
- Identify dependency relationships: Understand connections among packages and scripts.
- Cache script execution results: Avoid redundant work by caching outputs.
- Customize behavior with plugins: Extend functionality through a rich ecosystem of plugins.
Targeted Script Execution
To harness NX’s capabilities, we first need to create an nx.json file to define a set of rules for our scripts. Below is an example configuration:
{
  "targetDefaults": {
    "build": {
      "dependsOn": ["^build"],
      "outputs": ["{projectRoot}/dist"],
      "cache": true
    },
    "typecheck": {
      "dependsOn": ["^build", "^typecheck"]
    },
    "lint": {
      "dependsOn": ["^build", "^lint"]
    }
  },
  "defaultBase": "main"
}
In plain English, this configuration means:
- Build: The build script for a package depends on the successful build of its dependencies, and its output is cached.
- Typecheck: The typecheck script for a package depends on both the build and typecheck scripts of its dependencies.
- Lint: The lint script for a package depends on both the build and lint scripts of its dependencies.
Now, let’s add scripts to the root package.json:
"scripts": {
  "build:all": "yarn nx run-many -t build",
  "build:affected": "yarn nx affected -t build --base=${BASE:-origin/main} --head=${HEAD:-HEAD}",
  "typecheck:all": "yarn nx run-many -t typecheck",
  "typecheck:affected": "yarn nx affected -t typecheck --base=${BASE:-origin/main} --head=${HEAD:-HEAD}",
  "lint:all": "yarn nx run-many -t lint",
  "lint:affected": "yarn nx affected -t lint --base=${BASE:-origin/main} --head=${HEAD:-HEAD}",
  "quality:all": "yarn nx run-many --targets=typecheck,lint",
  "quality:affected": "yarn nx affected --targets=typecheck,lint --base=${BASE:-origin/main} --head=${HEAD:-HEAD}"
}
Here, we define four types of execution scripts – build, typecheck, lint, and quality. Each script has two variations:
- all: Runs the script on all packages.
- affected: Runs the script only on packages affected by recent changes. The BASE and HEAD environment variables allow you to specify a range (defaulting to origin/main and the current HEAD), enabling granular execution on pull requests. This can significantly save time and resources.
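The `${BASE:-origin/main}` syntax in these scripts is plain shell default expansion: the variable’s value is used when it is set, otherwise the fallback after `:-` kicks in. A quick demonstration:

```shell
# ${VAR:-fallback} expands to $VAR when it is set and non-empty,
# otherwise to the fallback value.
unset BASE
echo "base=${BASE:-origin/main}"   # prints: base=origin/main

BASE=origin/release
echo "base=${BASE:-origin/main}"   # prints: base=origin/release
```

So `yarn quality:affected` compares against origin/main by default, while CI jobs can export BASE and HEAD to narrow the comparison range for a specific pull request.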
Managing Circular Dependencies
NX also provides a built-in command to generate a dependency graph, which can help detect dependency cycles. The following script uses the NX graph output to check for circular dependencies and fails if any are found.
Create a file at scripts/check-circulardeps.mjs with the following content:
import { execSync } from "child_process";
import path from "path";
import fs from "fs";

const hasCycle = (node, graph, visited, stack, path) => {
  if (visited.has(node)) return false;
  visited.add(node);
  stack.add(node);
  path.push(node);
  const dependencies = graph.dependencies[node] || [];
  for (const dep of dependencies) {
    const depNode = dep.target;
    if (
      !visited.has(depNode) &&
      hasCycle(depNode, graph, visited, stack, path)
    ) {
      return true;
    }
    if (stack.has(depNode)) {
      path.push(depNode);
      return true;
    }
  }
  stack.delete(node);
  path.pop();
  return false;
};

const getGraph = () => {
  const cwd = process.cwd();
  const tempOutputFilePath = path.join(cwd, "nx-graph.json");
  execSync(`nx graph --file=${tempOutputFilePath}`, {
    encoding: "utf-8",
  });
  const output = fs.readFileSync(tempOutputFilePath, "utf-8");
  fs.rmSync(tempOutputFilePath);
  return JSON.parse(output).graph;
};

const checkCircularDeps = () => {
  const graph = getGraph();
  const visited = new Set();
  const stack = new Set();
  for (const node of Object.keys(graph.dependencies)) {
    const path = [];
    if (hasCycle(node, graph, visited, stack, path)) {
      console.error("🔴 Circular dependency detected:", path.join(" → "));
      process.exit(1);
    }
  }
  console.log("✅ No circular dependencies detected.");
};

checkCircularDeps();
This script:
- Executes the NX command to generate a dependency graph.
- Reads the graph from a temporary JSON file.
- Recursively checks for cycles.
- Logs an error and exits if a circular dependency is detected.
Validating Dependencies with Yarn Constraints
As projects grow, maintaining consistency across dependencies becomes challenging. Enforcing strict rules around dependencies, Node versions, and other configurations is essential to avoid unnecessary technical debt. Yarn Constraints offer a way to automate these validations.
Understanding Yarn Constraints
Yarn Constraints are a set of rules for the packages in your monorepo. A significant advantage of using them is that you define these rules yourself. For example, you can create a rule forcing all packages to use the same React version. Once it’s in place, you’ll never run into the problem of a host application being unable to use a feature or lib built against a higher React version.
While migrating a large monorepo to a new major version of a dependency might be complex, using constraints ultimately brings consistency and stability to the entire project.
Enforcing Consistency
In our example repository, we use a yarn.config.cjs file to enforce consistency for:
- Node Version
- Yarn Version
- Dependencies’ Versions
To allow for flexibility during transitions, you can define exclusions to temporarily bypass certain checks. For instance:
const workspaceCheckExclusions = [];
const dependencyCheckExclusions = [];
These constants let you exclude specific workspaces or dependencies from the validation process, ensuring smooth migrations when necessary.
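As an illustration, a constraint forcing every workspace to use a single React version and a consistent Node engine could be sketched with Yarn v4’s JavaScript constraints API. This is a hypothetical sketch, not the example repository’s actual file, and the pinned versions are arbitrary:

```javascript
// yarn.config.cjs – illustrative sketch; adjust versions to your project
module.exports = {
  async constraints({ Yarn }) {
    // Pin every declared react dependency across all workspaces to one version
    for (const dep of Yarn.dependencies({ ident: "react" })) {
      dep.update("18.3.1");
    }
    // Require the same Node engine range in every workspace
    for (const workspace of Yarn.workspaces()) {
      workspace.set("engines.node", ">=20.0.0");
    }
  },
};
```

Running yarn constraints reports any violations, and yarn constraints --fix applies the prescribed updates automatically.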
Managing Versioning with Changesets
Another problem you may face as the repository grows is version management and releasing. Changesets provide an elegant solution to automate this process, ensuring that every change is tracked, versioned, and released.
Introduction to Changesets
Changesets is an open-source tool designed to manage versioning in monorepos. It simplifies the process of keeping track of changes by recording them in small, human-readable documents that capture the intent of each change. These documents are called changesets. Key benefits include:
- Clear Documentation: Each changeset outlines the changes made, which helps both developers and consumers understand what to expect in a new release.
- Granular Version Control: Each package is versioned independently, ensuring that only the affected packages are updated. This minimizes the risk of empty version bumps and dependency breaks.
- Collaboration-Friendly: As every change is recorded through a changeset, teams can review and approve updates before the actual release.
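Concretely, a changeset is just a small markdown file with YAML frontmatter that lives in the .changeset directory until a release consumes it. A hypothetical example for this article’s validator package:

```markdown
---
"@robust-monorepo-yarn-nx-changesets/validator": patch
---

Tighten username validation to reject empty strings.
```

Running yarn changeset scaffolds such a file interactively; the bump type (patch, minor, or major) drives the version calculation for the listed package and its dependents.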
Automating Releases
One of the most powerful features of Changesets is the ability to automate the process. You can integrate Changesets into your CI/CD pipeline and forget about manual version changes and NPM publishing.
Take a look at the release.yaml workflow in the example repository. Its create-release-pull-request-or-publish step, backed by the changesets/action GitHub Action, does all the magic. You only need to set up an NPM_TOKEN for publishing your packages. Then, every push to the main branch will:
- Check if there are any changeset documents. If changeset documents are present, the action creates a pull request with the necessary version bumps and changelog updates. If no changes are detected, nothing happens.
- Check if there are any packages ready to publish. If packages are ready to be released, the action publishes the new versions to NPM using the provided NPM_TOKEN. If there are no packages ready to publish, the action exits without making changes.
By automating these tasks, Changesets ensure that your releases are consistent and reliable, reducing the potential for human error and streamlining your development workflow.
Workflow Integration with GitHub Actions
This section delves into how to unleash the power of the architecture we’ve just built. Using GitHub Actions, we’ll automate PR quality checks, version releases for libraries and features, and application deployments. The focus is on maximizing automation while maintaining code quality and job granularity.
Verify PR Quality
To ensure that pull request code remains consistent and stable, we create a dedicated quality.yaml workflow. This workflow performs several tasks, such as ensuring that manual version changes aren’t introduced (since versioning is managed by Changesets):
- id: check_version
  name: Check version changes
  run: |
    BASE_BRANCH=${{ github.event.pull_request.base.ref }}
    git fetch origin $BASE_BRANCH
    CHANGED_FILES=$(git diff --name-only origin/$BASE_BRANCH HEAD)
    VERSION_CHANGED=false
    for FILE in $CHANGED_FILES; do
      if [[ $FILE == */package.json ]]; then
        # Skip package.json files deleted in this PR
        if [ ! -f "$FILE" ]; then
          continue
        fi
        HEAD_VERSION=$(grep '"version":' "$FILE" | awk -F '"' '{print $4}')
        if git cat-file -e origin/$BASE_BRANCH:$FILE 2>/dev/null; then
          BASE_VERSION=$(git show origin/$BASE_BRANCH:$FILE | grep '"version":' | awk -F '"' '{print $4}')
        else
          BASE_VERSION=$HEAD_VERSION
        fi
        if [ "$BASE_VERSION" != "$HEAD_VERSION" ]; then
          VERSION_CHANGED=true
          echo "Version change detected in $FILE"
        fi
      fi
    done
    if [ "$VERSION_CHANGED" = true ]; then
      echo "Manual version changes are prohibited. Use changesets instead."
      exit 1
    fi
  env:
    GITHUB_REF: ${{ github.ref }}
Alongside this check, the check-quality job installs dependencies, validates constraints, checks for circular dependencies, and verifies overall code quality using the scripts we defined earlier with NX:
- id: install-dependencies
  name: Install dependencies
  run: yarn --immutable
- id: check-constraints
  name: Check constraints
  run: yarn constraints
- id: check-circulardeps
  name: Check circular dependencies
  run: yarn check-circulardeps:all
- id: check-quality
  name: Check quality
  run: BASE=origin/${{ github.event.pull_request.base.ref }} yarn quality:affected
The quality check is designed to run only on the packages affected by the current pull request. Successful completion of these jobs signals the pull request is ready to merge (in addition to receiving code reviews).
If additional checks are required for your project, you can update your nx.json and quality scripts while keeping the workflow unchanged.
Publish Libraries and Features
After a PR is merged, the release workflow (as described in the Changesets chapter) is triggered. This workflow builds the affected packages and creates a PR with the version bumps. Once this PR is approved and merged, release.yaml runs again – this time, instead of creating a PR, it detects version changes and releases the updated packages to NPM:
- id: build-packages
  name: Build packages
  run: yarn build:affected
- id: create-release-pull-request-or-publish
  name: Create Release Pull Request or Publish to NPM
  uses: changesets/action@v1
  with:
    version: yarn changeset version
    publish: yarn release
    commit: "chore: publish new release"
    title: "chore: publish new release"
  env:
    GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
    NPM_TOKEN: ${{ secrets.NPM_TOKEN }}

release-apps:
  needs: release-libs-features
  uses: ./.github/workflows/release-apps.yaml
  with:
    publishedPackages: ${{ needs.release-libs-features.outputs.publishedPackages }}
Following this, a job called release-apps is executed, which is responsible for application deployments. It receives a list of published packages from the previous step, which brings us to the next chapter.
Publish Apps
The final part of the release process involves deploying your applications (applications are not published to NPM because they are marked private in their package.json). The release-apps.yaml workflow is automatically triggered by release.yaml, or can be executed directly from the Actions tab on GitHub:
name: Release Apps
on:
  workflow_call:
    inputs:
      publishedPackages:
        description: "List of published packages"
        required: false
        type: string
        default: "[]"
  workflow_dispatch:
    inputs:
      publishedPackages:
        description: "List of published packages (optional)"
        required: false
        type: string
        default: "[]"
This workflow accepts a publishedPackages input to determine which packages have been published. Using a matrix strategy, it checks each application in the matrix for the presence of published dependencies:
- id: check-dependency-published
  name: Check if any app dependency is published
  run: |
    PUBLISHED_PACKAGES="${{ inputs.publishedPackages }}"
    PACKAGE_NAME="${{ matrix.package }}"
    APP="${{ matrix.app }}"
    DEPENDENCIES=$(jq -r '.dependencies // {} | keys[]' "apps/$APP/package.json")
    for DEP in $DEPENDENCIES; do
      if echo "$PUBLISHED_PACKAGES" | grep -w "$DEP"; then
        echo "published=true" >> $GITHUB_OUTPUT
        exit 0
      fi
    done
    echo "published=false" >> $GITHUB_OUTPUT
This check is one condition for initiating an app deployment. The other condition ensures that the app’s version has been changed (indicating that a redeploy is necessary even if no dependencies have been updated):
- id: check-version-change
  name: Check if app version has changed
  run: |
    APP="${{ matrix.app }}"
    PACKAGE_JSON_PATH="apps/$APP/package.json"
    CURRENT_VERSION=$(jq -r '.version' "$PACKAGE_JSON_PATH")
    PREVIOUS_VERSION=$(git show HEAD~1:"$PACKAGE_JSON_PATH" | jq -r '.version' || echo "")
    if [[ "$CURRENT_VERSION" == "$PREVIOUS_VERSION" ]]; then
      echo "changed=false" >> $GITHUB_OUTPUT
    else
      echo "changed=true" >> $GITHUB_OUTPUT
    fi
Finally, after confirming that the app either has updated dependencies or its version has changed, the workflow retrieves the new version and proceeds to build and deploy the application:
- id: set-up-docker
  name: Set up Docker Buildx
  uses: docker/setup-buildx-action@v3
- id: get-app-version
  name: Get the app version from package.json
  run: echo "app-version=$(jq -r '.version' ./apps/${{ matrix.app }}/package.json)" >> $GITHUB_OUTPUT
- id: build-image
  name: Build image
  if: steps.check-dependency-published.outputs.published == 'true' || steps.check-version-change.outputs.changed == 'true'
  uses: docker/build-push-action@v4
  with:
    build-contexts: |
      workspace=./
    context: "./apps/${{ matrix.app }}"
    load: true
    push: false
    tags: |
      ${{ matrix.app }}:v${{ steps.get-app-version.outputs.app-version }}
In this example, we build the Docker image without pushing it to a registry. In your production workflow, replace this step with the actual deployment process.
Conclusion
Recap of Best Practices
Throughout this article, we explored the setup of a robust monorepo and the tools that help manage it efficiently. By centralizing your codebase, you not only simplify dependency management but also streamline collaboration across teams. We demonstrated how Yarn can be leveraged to share dependencies, accelerate installations with PnP, and improve overall project consistency. Additionally, integrating NX for targeted script execution keeps CI fast and efficient. Changesets helped automate versioning, reducing manual errors and streamlining releases. Finally, we built a production-ready CI/CD pipeline with GitHub Actions that performs only the necessary tasks.
Next Steps
- Experiment and Adapt: Begin by setting up a small-scale monorepo to test these best practices. Experiment with different folder structures, and gradually expand to include more packages as your confidence grows.
- Integrate Additional Tools: Consider integrating complementary tools like PNPM or Turborepo based on your project’s unique requirements and team preferences.
- Enhance CI/CD Pipelines: Fine-tune your GitHub Actions workflows to include additional quality checks, code coverage, and security scans tailored to your project.
- Community and Updates: Stay updated with the latest releases of Yarn, NX, and Changesets. Engage with the community to share insights and learn about emerging trends in monorepo management.
Resources
- Example Repository: Access the complete example repository created for this guide. Explore the project structure, code samples, and scripts that showcase the monorepo setup in action.
- Published NPM Packages: Check out the actual NPM packages published as part of this project. These packages demonstrate real-world usage and implementation of the concepts discussed in the article.