Motivation
After creating so many projects across many different languages (primarily Rust, TypeScript, JavaScript, Python, and C/C++), I started to live in an internal configuration hell, constantly copying build scripts, package.json, CI configs, linter rules, etc. between projects.
The worst part is, things change, so the “source of truth” for these configs is basically whichever project I am currently working on. This means I never “backport” these improvements to older projects. When I do need to upgrade other projects to “newer” standards, I again have to remember which project has the latest standard, and copy from that.
In the past, I didn't often switch between projects - maybe once or twice per year. However, another problem started to surface as I created more and more projects based on TypeScript/Web tech stacks. Recently I have even started using bun to run TypeScript on the server, while 5 years ago I would have been disgusted at the idea of running anything other than blazing fast, optimized, compiled languages on the server (I was a junior back then).
As I make these projects, common patterns and code emerge. At some point, I started creating internal libraries that all my projects would depend on. One example is pure, a pure-TypeScript library to bootstrap a web application with dark mode, multi-language support, a Result type, etc. Now I not only have a configuration hell, I also have my own internal dependency hell!!
The solution might be simple if I only wrote Rust code, or only TypeScript code, as that's basically what package managers do. However, keep in mind that I work with multiple languages and ecosystems, often in the same project. For example, my Breath of the Wild Weapon Modifier Corruption Database project had:
- Python for processing BOTW data and generating source code, along with other scripts
- Rust for building the fastest simulator for cooking in BOTW. For comparison, the first cooking simulator took 9 hours (on all 32 cores) to generate the database; my version can do that in under 30 seconds (on all 32 cores)
- C/C++ for a Switch mod that generates the same database by calling the function in the game’s binary, to validate the database
- TypeScript for making a nice frontend for my Rust database
As I take on more and more crazy projects like these, I need to enable config- and code-sharing. Present-me needs to build abstractions for these so future-me doesn't spend all their time copy-pasting <ChangeDarkModeButton /> and build scripts all over the place.
Essentially, I need:
- A system to manage dependencies, and enable code-sharing and config-sharing between projects
- A system or standard to build monorepos, like how to define build steps and dependencies between projects across ecosystem boundaries
Well, that is exactly what mono-dev is. It's not a single tool, system, or service, but a combination of a set of tools, a set of documentation, a set of scripts and config files, and finally, this website to document everything for future-me to understand.
Note that this standard is only possible because it's used by me alone. I make assumptions about how a developer (me) works on the project. While all the source code is available on GitHub and you are free to use it or make PRs, I will not be implementing any fixes to support scenarios outside of what I use the tools for.
As an example, I will not add a --config-path flag to some tool, because the tool assumes the project follows the standard, and the config is defined in the expected place.
The standard is also unstable. Every update is a breaking update.
The mono-dev Standard
This section contains important information about working on projects that follow the Standard. Please read it all and do the necessary setup if you are directed here by the contributing guide of one of my projects.
The mono-dev standard consists of:
- Tools (i.e. binaries) that must be available globally on the system and callable from the command line.
  - Project-specific tools, like build scripts, are not included
- Project structure
- Naming convention for build tasks (like check, test, fix, build)
- Shared configurations and wrappers that invoke the tools with those configurations
The last point - shared configurations - is covered by chapters in the rest of the book:
- Copy-paste instructions for setting up frameworks
- Instructions for setting up CI pipelines
System Tools
Operating System
My projects are usually not OS-specific. I daily-drive Windows and do my development on Linux, sometimes on Windows as well. Therefore, the Standard aims to support development on both Windows and Linux.
However, cross-platform build scripts are PAIN. The Standard transfers that pain to the developer by assuming the System Tools already exist in PATH. These tools are usually cross-platform, so scripts don't need to deal with platform-specific commands.
On Windows, the easiest way to meet the requirements is to install WSL, then install the tools inside WSL.
GNU Utils
GNU Coreutils like cp, rm, mkdir must be available on the system. The build scripts will almost always use these.
Because mkdir is a builtin command on Windows and cannot be overridden or aliased away, the which command is also required; the build scripts will call $(which mkdir) to resolve the GNU version.
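For example, a task that creates a directory might look like this (a minimal sketch; the task name and path are made up):
tasks:
  prepare:
    cmds:
      - $(which mkdir) -p dist/assets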
Other GNU utils like sed and wget might be needed for some projects.
Task Runner
Install task, the command runner to run, well, tasks. This is technically not always required, if you are willing to copy commands from Taskfile.yml to the terminal every time you want to run something.
Reason:
- Every project has some task runner, whether it is npm for ECMAScript/TypeScript projects, some .sh scripts in a scripts folder, or some .md doc that tells you what commands to run.
- The problems with each of those approaches are:
  - npm: not language-agnostic. I don't want to install node for Rust projects, for example.
  - .sh: not platform-agnostic. I don't want to gate-keep devs who only use Windows from contributing.
  - documentation: inconvenient. I don't want to copy commands every time I want to execute them.
- I used just before switching to task. It's a command runner written in Rust. The main things lacking are the ability to run tasks in parallel, and that it uses a DSL; task on the other hand uses YAML.
task replaces the scripting system in other package managers. For example, instead of npm run test, you would run task test.
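As a sketch, the Taskfile.yml behind that command can be as small as this (the cargo command stands in for whatever the project actually runs):
version: '3'
tasks:
  test:
    desc: Run the tests
    cmds:
      - cargo test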
Package Manager(s)
Modern ecosystems have package managers, like node_modules for JS or crates for Rust. However, I need something outside of these to enable dependency management in monorepos with multiple languages. The solution might be surprising - git submodules!
With git submodules, arbitrary git repositories can be integrated as members of the monorepo, then integrated with ecosystem-specific package managers like pnpm or cargo. For these tools, the package appears to be a local dependency, but for git, it's an external dependency.
I made a wrapper for git submodule called magoo to streamline the process of updating submodules (Rust is needed for installation).
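For example, adding a library repo as a member of the monorepo might look like this (hypothetical URL and path, mirroring the magoo invocations shown later in this book):
magoo install https://github.com/user/my-lib packages/my-lib --name my-lib --branch main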
Rust Toolchain
The version (or MSRV - Minimum Supported Rust Version) for the Standard is always the latest stable release of Rust. Projects will also begin migrating to the latest edition of Rust as soon as it's released.
Specific projects might require nightly features.
The Rust Toolchain is needed not only for Rust projects, but also to install various tools from crates.io - Rust's public registry.
You can install Rust from rustup.rs or from your distro’s package manager.
Windows users need to first install MSVC, which is included in the Visual Studio Build Tools (VSBT). This does not apply if you are using WSL.
You can either:
- Follow the installation in the Rustup Book, which has a step-by-step walkthrough with images, but installs the whole Visual Studio IDE, or:
- Download and install just the build tools from Microsoft
Either way, be sure to select the latest MSVC vXXX - VS XXXX C++ x64/x86 and Windows 11 SDK components when installing.
For projects, the root should define a task called install-cargo-extra-tools with the alias icets that invokes cargo install for all the tools needed for the most common development workflows. This serves as documentation for which cargo tools the project needs.
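A minimal sketch of such a task (the tool list is illustrative, not prescriptive):
tasks:
  install-cargo-extra-tools:
    aliases: [icets]
    cmds:
      - cargo install cargo-watch live-server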
Most of the time, a Rust Toolchain is also the only thing you need to work on a Rust project, thanks to cargo also being a linter and formatter for Rust.
ECMAScript ecosystem
NodeJS is basically still the source of truth for the industry. You can install it from NodeJS.org, from your distro's package manager, or use NVM (or NVM-Windows for Windows).
There are 2 additional tools needed globally:
- pnpm - the package manager, which works better with monorepos
- bun - a new tool that tries to be “everything you need to develop JS”. We use it for:
  - Bundling with zero config
  - Running TypeScript natively with zero config
Yes, I am aware NodeJS is adding native TypeScript support. However, there are TypeScript-only features that node cannot run without transforming the source. TypeScript has added an option to disable those features. Time will tell who wins in the end.
Both of these tools should always be updated to the latest version. Fortunately, this is very easy with NPM:
npm i -g pnpm bun
If you are using NVM or other version managers, the global packages are usually tied to the node version.
Other JS ecosystem tools used in development are managed as node dependencies, so they will automatically be installed local to the project.
Python
While TypeScript + Bun has decent DX as a general-purpose scripting language, to have proper type checking in the IDE, you still need tsconfig.json, eslint, and all of the bloat. Not to mention, you need to install all the tools the ECMAScript ecosystem uses.
Therefore, for non-ECMAScript projects, it's far more likely that someone has python installed, compared to node+npm+pnpm+bun. This is why Python is the preferred option in non-ECMAScript projects.
You can install Python from Python.org, from your distro’s package manager, or through a version manager.
My projects do not use Python in production, so the dependencies are also expected to be installed globally. 2 of the most common ones are tqdm and pyyaml.
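So a one-time global install along these lines is usually all that's needed (the exact command depends on how Python is installed on your system):
pip install tqdm pyyaml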
If a project does have a complicated Python setup (typically machine-learning-related), it will have a dedicated setup instruction.
C/C++
The Standard for C/C++ tooling is experimental, as it differs a lot between Linux and Windows. There’s also not an industry standard for packages.
Projects will most likely use clang-format for formatting and clangd for LSP. For the build system, it will probably be cmake or ninja.
Other Languages
No Standard exists for these languages yet, as I don't have in-production projects that use them. However, I am side-eyeing these and may look into them in the future:
- Go
- Zig
Docker
Usually, docker is only used in my projects if a server is needed. Even for those projects, there is a good chance the local development workflow does not need docker.
Project Structure
The Standard defines 3 possible project structures.
Monorepo
This is common for large projects, where multiple small projects and external dependencies are integrated.
mono-dev is installed into a monorepo as a git submodule:
magoo install https://github.com/Pistonight/mono-dev packages/mono-dev --name mono-dev --branch main
A typical monorepo should look like:
- .github/
- packages/
  - some-js-package/
    - src/
    - .gitignore
    - package.json
    - Taskfile.yml
  - some-rust-package/
    - src/
    - .gitignore
    - Cargo.toml
    - Taskfile.yml
  - mono-dev/ -> https://github.com/Pistonight/mono-dev
- .gitignore
- .gitmodules
- LICENSE
- README.md
- package.json
- pnpm-workspace.yaml
- pnpm-lock.yaml
- Cargo.toml
- Cargo.lock
- Taskfile.yml
The guidelines:
- All meaningful code and config (including build scripts) should be divided into packages in the packages directory.
- Each package should have a Taskfile.yml that defines tasks for the package.
- A package can depend on another package's tasks by including its Taskfile.yml.
- It's preferred for a package to depend on another package through the ecosystem, rather than copying files into other packages. For example, if a Rust package generates TypeScript code, it's preferred for the TypeScript code to be generated inside the Rust package's directory. The Rust package can add a package.json to double as a Node package and be installed via package.json.
- The root Taskfile.yml should include each package's Taskfile.yml under a namespace identical to the directory name (see the sketch after this list). Note that the directory name doesn't have to be the same as the package name; this is to save typing common prefixes. Aliasing packages is not recommended for projects with a lot of packages.
- The root Taskfile.yml should define check, test and build for checking, testing and building all packages. These tasks can be used in CI to simplify the setup.
- Additionally, the root Taskfile.yml should declare an install task for installing node modules, along with running post-install tasks. Declaring post-install in Taskfile.yml is recommended over NPM lifecycle scripts, as those are NPM-specific.
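A sketch of the root-level includes described above (package and namespace names are made up):
version: '3'
includes:
  client:
    taskfile: ./packages/client
    dir: ./packages/client
tasks:
  check:
    cmds:
      - task: client:check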
The root Cargo.toml should declare a Cargo Workspace like:
[workspace]
resolver = "2"
members = [
    "packages/some-rust-package"
]
The root pnpm-workspace.yaml should declare a PNPM Workspace like:
packages:
  - packages/some-js-package
catalog:
  react: ^18
The catalog protocol is a PNPM feature that allows dependency versions to be easily synced.
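A package then references the cataloged version through the catalog: protocol in its package.json (a minimal sketch):
{
  "dependencies": {
    "react": "catalog:"
  }
}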
Atomrepo
This is a term I made up. In the Standard, it refers to a repository that is only meant to be used within a monorepo. The most common scenario is when a project needs external dependencies that aren't covered by NPM or Cargo.
The biggest benefit of an Atomrepo, compared to publishing and then consuming the package through a public registry, is that updating code is VERY fast. I just need to edit the code locally, commit and push it to git, and run magoo update to update the submodule reference. This skips the need to wait for CI/publish, while maintaining a production-grade standard for the project.
Since an atomrepo is meant to be inside a monorepo, it doesn’t have any strict package structure. The only limitation is that it should not contain any submodules, as recursive submodules will be PAIN. Dependencies should also be installed into the monorepo. This friction helps ensure the submodule dependency chain doesn’t grow out of control.
Also, since an atomrepo is inside a monorepo, it can reference mono-dev in the same repo with the path ../mono-dev, or ./node_modules/mono-dev if PNPM is used.
Singlerepo
This is also a term I made up. This basically refers to simple projects that only have to deal with one ecosystem or one language, such as a CLI tool. Because the project structure is very simple, it doesn’t justify creating a monorepo.
mono-dev can still be helpful here, as it defines common build tasks that can be included in one line. If PNPM is used, mono-dev should be installed directly from GitHub:
pnpm i -D https://github.com/Pistonight/mono-dev
It's important to install mono-dev this way, as the ECMAScript configs assume mono-dev is found in node_modules!
If PNPM is not involved, then there are 2 options:
- Add it as a git submodule:
magoo install https://github.com/Pistonight/mono-dev mono-dev --name mono-dev --branch main
- Just clone and gitignore it:
tasks:
  install:
    cmds:
      - rm -rf mono-dev
      - git clone https://github.com/Pistonight/mono-dev --depth 1
The second option is better for Rust projects that are meant to be installed from git, so users don't have to clone the submodule.
Task Convention
This section defines a convention for task names, such as check, test, build.
Note that these are just a recommendation, not a limitation. Names outside of the ones defined here can still be used.
Task Description
task has a feature where a task can be given a description. Only tasks that have a description will show up when you run task --list:
tasks:
  check:
    desc: Check for issues
    cmds:
      - cargo check
Because of the Convention, common task names like check and fix don't need to have their descriptions repeated. Developers should expect that when they run check, a set of linters will run.
Suffixes
If there are multiple tasks of one type, such as installing multiple things, the tasks should be defined with a suffix after -, for example install-deps.
Tasks
list and exec (x)
The root level of a monorepo has list and exec tasks:
- list: List the tasks of the root level or of a package
- exec: Execute a task from a package
See Common tooling for how these tasks are defined.
install
At the root level, the install task should be defined to download external dependencies, and to make changes to the downloaded packages (a step typically known as post-install hooks).
At the package level, the install task can have the same functionality as a post-install hook. Essentially, it is the script to run to set up the package for local development.
This pattern is common at the root level:
tasks:
  install:
    cmds:
      - magoo install
      - pnpm install
      - task: post-install
  install-ci:
    cmds:
      - pnpm install --frozen-lockfile
      - task: post-install
  post-install:
    cmds:
      - task: my-package:install
The root should also define an install-cargo-extra-tools task to install the necessary development tools that can be installed with cargo.
pull
Like install, pull indicates downloading data externally. pull should be used for things that only need to be set up once (or infrequently). These typically include assets, data packs or generated files stored outside of the repo.
push
This is the opposite of pull - uploading assets, data packs or generated files to an external location. A related task is git-push, typically used in atomrepos to set the correct remote address before calling git push.
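A sketch of such a git-push task (the remote URL is a placeholder):
tasks:
  git-push:
    cmds:
      - git remote set-url origin git@github.com:user/my-atomrepo
      - git push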
dev
The dev task starts the most common inner-loop development workflow. This is typically used for running a process that watches for changes and re-builds/re-tests the code.
check
Checks for code issues with linters/formatters. Even though build may still check the code internally in the compiler, check is not meant to emit any files (unlike build).
fix
Fixes the issues found by check automatically. This should always include automatically fixing all formatting issues.
test
Runs the tests. This sits somewhere in between check and build: check mostly performs static checks, while test should actually run the code and run assertions on the outcome.
build
build should produce assets to be used outside of the package. Note the word outside. A task that generates code for use within the package should be part of install, not build.
Contributing Guidelines
These guidelines apply to all of my projects.
Code of Conduct
There is no official Code of Conduct in the Standard. Be reasonable and respectful.
Filing Issues
When you file a bug report or suggestion, you are contributing to the project :)
There is a sentiment spreading on the Internet that users of open-source like to ask for more without contributing. While people like that do exist, I prefer to think of more asks = more people caring about, liking, and using my project.
The catch is - you have to show that. If you are filing a bug report, please take the time to detail the scenario and the steps to reproduce, along with the expected vs. actual outcome. If it's a feature suggestion, give the background context or use cases for why you think the feature should be added.
Do not comment on whether you think something is simple (i.e. “Why is this simple fix not done yet?” or “This feature should be easy, why not just add it?”). As a user and not a maintainer of the project, you are simply not in a position to make such a call. If you have looked into the issue and implemented the “simple fix”, then prove it with a PR or a reply in the issue.
Communication
Communication is very important. When you are not sure, ask. Also reach out with your feature idea before you work on a PR. It’s also a good idea to let me know your availability if you are going to work on something big.
Coding
Follow each language's best practices and conventions when it comes to style. The easiest way to do this is to look at existing code and use the linters often.
Documentation
Code without documentation will quickly become unworkable. The rule of thumb is I should be able to tell what a function/component does without looking at the implementation.
This is something that requires a pretty high skill level and experience to do effectively. It's very common even for large, widely-used projects to not do this properly, and it's frustrating to have to dig into the code or set up a minimal environment to find out what the function returns for, say, an empty string.
For example, this function:
/** Parses a number from an input string */
function parseNumber(input: string): number {
    ... // implementation hidden
}
Yes - there is documentation, but it’s useless. I already know that before I look at the comment. Here are my questions:
- What’s the output? Integer? Float? Positive Numbers?
- What’s the input? Decimal? Hex? Math expression? Automatically detected?
- What if the input is invalid? Does it return 0? -1? NaN? throw an error?
Good documentation looks like this:
/**
 * Parses a number from an input string.
 *
 * The input should be the string representation of a positive integer in decimal.
 * If anything other than 0-9 is in the input, it returns 0.
 * Leading 0's are ignored, and an empty string also gives 0.
 */
function parseNumber(input: string): number {
    ... // implementation hidden
}
Another good rule of thumb is that one should be able to write tests for the function just by looking at the documentation. Actually, tests are documentation: they express in code, instead of natural language, what the behavior of a function or component is.
Another important thing is that LLMs love to generate useless documentation/comments; see Usage of AI for more information on my take on AI.
Pull Request
Always run checks and tests before you make a Pull Request. PR workflows need to be approved by me to run, because of security. If a very simple formatting issue fails the build and I am not there, you have to wait for another cycle of review.
I don’t use git hooks, also because of security.
When opening a PR, make sure the description is filled. Small PRs can have a one-to-two sentence description. If you are fixing a bug, describe what the bug is or link to the issue, explain the root cause, and explain the fix.
Also please explain how the changes have been validated, for example tested in local dev server. The automated tests can’t always catch all the bugs.
Usage of Artificial Intelligence (AI)
My take on AI/LLM is that it is a tool just like your IDE or your terminal. It’s up to you if you want to use it while contributing to this project.
The bottom line is that you are the author of your code and are responsible for making sure it is correct, readable, secure, and of the same quality that you would expect from a human author. I don't accept or reject contributions based on whether they were generated by AI. I do so based on the quality and context of the PR itself. Read the contributing guidelines for more.
The same thing goes for documentation. However, I would say that most LLM tools right now (May 2025) will try “too hard” when it comes to generating code or documentation. Just as an example, I prompted this to ChatGPT:
Can you generate documentation for this function? It reduces the payload to avoid hitting the payload limit:
function getMinimalPayload(): string;
This is the output:
/**
 * Generates a minimal payload to optimize data transmission and prevent exceeding server payload limits.
 *
 * This function constructs a compact string representation of the payload, ensuring
 * only essential data is included to reduce the request size.
 *
 * @returns {string} - A minimized payload string that the server can efficiently process.
 *
 * @example
 * const payload = getMinimalPayload();
 * console.log(payload); // Output: A compact string representation of the payload
 *
 * @remarks
 * - The function should be used when sending requests with strict payload size limits.
 * - Ensure that essential data is not lost while minimizing the payload.
 */
function getMinimalPayload(): string;
The function looks well documented, right? WRONG. The documentation is a bunch of bloat that wastes my brain resources parsing it. What is “minimal payload”? What's the format? What is left out? What is not left out? Documentation that pretends to be there is worse than no documentation.
See the contributing guidelines for more about writing documentation.
Setup HTTPS for Development
Some Web features require a secure context - for example, copying stuff into the clipboard.
Usually, for developer experience, the localhost host is considered secure. If you only use localhost when developing (i.e. you are running the dev server and visiting the page on the same machine), then you don't need HTTPS.
If you are like me, who uses a VM for development and hosts the dev server in local network, then your host computer needs to be configured to trust the web app hosted by the VM’s dev server.
The steps for Windows are currently documented here. I might move them to this page instead in the future.
The mono-dev Standard will look for .cert/cert.key and .cert/cert.pem 2 levels up, so the recommendation is to put the .cert folder in the repo root.
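That is, relative to a package 2 levels deep, the layout looks like this (following the monorepo structure above):
- .cert/
  - cert.key
  - cert.pem
- packages/
  - some-js-package/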
Signing
minisign is used to sign binary releases (e.g. on GitHub Releases).
If a signature file (*.sig) is present in the release, you can verify the signature with the following command:
minisign -Vm <file> -P RWThJQKJaXayoZBe0YV5LV4KFkQwcqQ6Fg9dJBz18JnpHGdf/cHUyKs+
cargo-binstall (This is for me, not for you)
To support signature checks with cargo-binstall, add the following to the Cargo.toml of the package to be published:
[package.metadata.binstall.signing]
algorithm = "minisign"
pubkey = "RWThJQKJaXayoZBe0YV5LV4KFkQwcqQ6Fg9dJBz18JnpHGdf/cHUyKs+"
Tooling
This section is documentation for setting up packages to use the Standard. You don't need to read this as a contributor.
Common
Each monorepo should include these common tasks at the root level by including the following in the root Taskfile.yml:
includes:
  common:
    taskfile: ./packages/mono-dev/task/common.yaml
    flatten: true
    optional: true
The usage is:
task list # same as task --list
task list -- <package> # list tasks from <package>
task exec -- <package>:<task> # execute <task> in <package>
The alias for list is ls, and the alias for exec is x.
TypeScript/ECMA
mono-dev packages itself as a node module. In TypeScript/ECMAScript projects, the package needs to be declared in package.json to be managed by the package manager. For example:
{
  "devDependencies": {
    "mono-dev": "workspace:*"
  }
}
The build scripts can then be included in Taskfile.yml:
version: '3'
includes:
  ecma:
    taskfile: ./node_modules/mono-dev/task/ecma.yaml
    internal: true
Check and Fix
mono-dev ships with its own “zero config” linter mono-lint that wraps tsc, eslint and prettier with pre-defined rules:
tasks:
  check:
    cmds:
      - task: ecma:mono-check
  fix:
    cmds:
      - task: ecma:mono-fix
It scans the project and generates config files on the fly. However, for other tools like tsserver to work properly, these config files need to be in the project rather than hidden away in some place already ignored by VCS. So, you have to add these entries to your .gitignore:
.prettierignore
/tsconfig*.json
/eslint.config.js
This is the behavior of mono-lint:
- TypeScript projects are directory-based. Each directory is type-checked separately and allows for different env configs (for example, src vs scripts).
- No DOM and no types exist by default. They need to be manually included in env.d.ts in each directory (see the sketch after this list). Only directories with env.d.ts will be checked.
- If the root directory contains any TypeScript stuff, it will be checked as well.
- ESLint only checks the TypeScript projects. If you use ECMAScript, you opt out of safety anyway.
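For example, a src/env.d.ts for a browser-facing directory might contain just the type references it needs (a sketch mirroring the Vite setup later in this chapter):
/// <reference types="dom" />
/// <reference types="dom.iterable" />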
For code that should be involved in type-checking, but ignored for other checks (usually generated code), a "nocheck" array at the top level of package.json can be provided, using the same syntax as .gitignore, for example:
{
  "devDependencies": {
    "mono-dev": "workspace:*"
  },
  "nocheck": ["/src/generated"]
}
Paths in nocheck will not be processed by eslint or prettier.
Remapping TS Import Path
TS import paths can be remapped in bundled apps to avoid the “relative parent import hell”. mono-lint automatically detects suitable scenarios and generates these import maps using a self:: prefix.
The conditions for the import map to be generated are:
- package.json must NOT contain a "name", "file" or "exports" key, AND there is no src/index.(c|m)?tsx? file. These would suggest the package is a library instead of a bundled app.
- The import paths can only be generated for the src directory.
- One import path is generated for the first index.(c|m)?tsx? found for each directory in src.
For max interop with tools such as bun, the same import map will appear in tsconfig.json and tsconfig.src.json.
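With the map in place, an import inside the app can be flattened like this (hypothetical module; assumes src/util/index.ts exists):
// instead of: import { foo } from "../../util";
import { foo } from "self::util";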
Test
To add testing support, add vitest to the downstream project as a devDependency:
pnpm i -D vitest
vitest supports zero config out-of-the-box; see its documentation for more. Use the ecma:vitest and ecma:vitest-watch tasks for running the tests once or in watch mode.
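A sketch of wiring those into the conventional task names (assuming the ecma include from the start of this chapter):
tasks:
  test:
    cmds:
      - task: ecma:vitest
  test-watch:
    cmds:
      - task: ecma:vitest-watch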
Vite
mono-dev ships a baseline vite config that adds common plugins and configs to my projects.
Add vite to the downstream project, with vite.config.ts at the root:
import { defineConfig } from "vite";
import monodev from "mono-dev/vite";

// see type definition for more info
const monodevConfig = monodev({
    https: true, // if secure context is needed, needs trusted certificate
    wasm: true, // if wasm is needed
    worker: "es", // if an ES worker is needed
});

// wrap vite config with monodevConfig
export default defineConfig(monodevConfig({
    /** normal vite config here */
}));
These plugins are automatically added:
- react
- yaml
- typescript paths
Define vite types in src/env.d.ts
:
/// <reference types="vite/client" />
/// <reference types="mono-dev/vite/client" />
/// <reference types="dom" />
/// <reference types="dom.iterable" />
Use the ecma:vite-dev and ecma:vite-build tasks to run the vite dev server and build.
Rust (Cargo)
Check and Fix
mono-dev provides a wrapper for clippy with common clippy flags:
version: '3'
includes:
  cargo:
    taskfile: ../mono-dev/task/cargo.yaml
    internal: true
tasks:
  check:
    cmds:
      - task: cargo:clippy-all
      - task: cargo:fmt-check
  fix:
    cmds:
      - task: cargo:fmt-fix
  dev-doc:
    cmds:
      - task: cargo:watch-serve-doc
Clippy fix is not automated because, IMO, the suggestions sometimes lead to worse code style and should be ignored.
For other clippy options, including feature flags and targets, see the included Taskfile.
The dev-doc task uses cargo-watch and live-server to serve the documentation generated by cargo doc and watch for changes.
Test
There’s no wrapper for test - just run cargo test
C/C++
A C/C++ codebase will typically have its own build system; mono-dev, on the other hand, ships the formatting config.
Check and Fix
version: '3'
includes:
  ccpp:
    taskfile: ../mono-dev/task/ccpp.yaml
    internal: true
tasks:
  check:
    cmds:
      - task: ccpp:fmt-check
  fix:
    cmds:
      - task: ccpp:fmt-fix
Go
Go support is experimental
Check and Fix
includes:
  go:
    taskfile: ../mono-dev/task/go.yaml
    internal: true
tasks:
  check:
    cmds:
      - go vet
      - task: go:fmt-check
  fix:
    cmds:
      - go fmt
Docker
Containerization steps can either be their own package or, for simple containers, part of the same package as the server.
The Dockerfile should be in the root of that package, and may look something like:
FROM alpine:latest
EXPOSE 80
ENV APP_DIR=/app
COPY ./dist $APP_DIR
RUN chmod +x $APP_DIR/bin/changeme
WORKDIR $APP_DIR
ENV FOO=BAR \
    BIZ=BAZ
ENTRYPOINT ["/app/bin/changeme"]
Usually, the docker workflow downloads artifacts from CI, then builds the container locally. It's recommended to set up a pull task which downloads from CI, and a pull-local task to copy local artifacts:
tasks:
  pull:
    desc: Pull build artifacts from CI using current commit
    cmds:
      - magnesis
  pull-local:
    desc: Copy build artifacts from local build output
    cmds:
      - cp ...
The actual container operations:
includes:
  docker:
    taskfile: ../mono-dev/task/docker.yaml
    internal: true
    vars:
      DOCKER_IMAGE: pistonite/foo
tasks:
  build:
    cmds:
      - task: docker:build
  run:
    cmds:
      - task: docker:run
        vars:
          DOCKER_RUN_FLAGS: >
            -p 8000:80
            -e FOO=BAR
            -e GIZ=BAZ
  connect:
    cmds:
      - task: docker:connect
  stop:
    cmds:
      - task: docker:stop
  clean:
    cmds:
      - task: docker:clean
Mdbook
mdbook is a Rust tool for generating a documentation website from Markdown.
Install theme
mono-dev ships an mdbook template with catppuccin themes. The theme files are forked and modified from the official catppuccin mdbook theme to my liking.
version: '3'
includes:
  mdbook:
    taskfile: ../mono-dev/task/mdbook.yaml
    internal: true
tasks:
  install:
    cmds:
      - task: mdbook:install-theme-monorepo
Because the task needs to copy theme files from mono-dev, it needs to know where mono-dev is; the -monorepo suffix uses ../mono-dev. Also ignore the generated theme directory.
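For example, in the book package's .gitignore (assuming the theme is copied to a theme directory next to the book config):
/theme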
GitHub Actions
Since the development activity is on GitHub, I use GitHub Actions to run CI. I use runners from Blacksmith to speed up some hot workflows.
The Standard provides composite actions for common workflows, as well as copy-paste configs for composing workflows.
File Structure
The workflow files should be placed in the following directory structure:
- .github/
  - steps/
    - setup/
      - action.yml
      - ... (more repo-specific reusable steps)
  - workflows/
    - pr.yml
    - ... (more .yml for workflows)
Action: Setup
jobs:
  change-me:
    runs-on: ubuntu-latest
    # runs-on: blacksmith-4vcpu-ubuntu-2404
    steps:
      - uses: Pistonight/mono-dev/actions/setup@main
        with:
          # ALL VALUES BELOW ARE OPTIONAL
          # setup mono-dev in the root of the repo
          # default is false
          mono-dev: true
          # whether to use blacksmith-specific steps
          # rather than github ones. Default is use github
          runner: blacksmith
          # whether to clone submodules
          # default is false
          submodules: true
          # setup NodeJS/PNPM
          # default false, ecma-pnpm will also setup NodeJS
          ecma-node: true
          ecma-pnpm: false
          # setup Bun, default is false
          ecma-bun: true
          # setup Rust, default is false
          # use `stable` or `nightly` value
          # When using `nightly`, the toolchain is pinned to the nightly
          # version on the 1st of the previous UTC month
          # - this is for efficient caching of CI
          # also setup at least one target below
          rust: stable
          # setup wasm32-unknown-unknown target for Rust, default false
          # also will install wasm-pack
          rust-wasm: true
          # setup native targets for the runner's OS
          # default is x64, only `x64` (x86_64) and `arm64` (aarch64)
          # are supported, arm is ignored on windows right now
          rust-native: x64,arm64
          # install the rust-src component, default false
          rust-src: true
          # installs mdbook and mdbook-admonish
          tool-mdbook: true
          # install extra tools with cargo-binstall
          # installed tools here are not cached and falling
          # back to compile from source is banned
          #
          # format: crate to use crates.io or crate=user/repo to use github
          # if the crate name is different from the CLI tool name,
          # use `cli-tool(crate)=user/repo`
          tool-cargo-binstall: ripgrep,workex=Pistonite/workex
          # same format as above, but uses cargo install
          # this is cached by rust-cache
          # note specifying anything here will also
          # setup rust for the current OS/Arch if not already done
          tool-cargo-install: ...
          # TODO: python support not here yet
          # setup latest version of python
          # python: true
          # python_pip: package1,package2
      # repo-specific setup
      - uses: ./.github/steps/setup
      - run: task install
Action: Permanent Cache
Use this action to permanently cache a directory until manually busted, with a task to generate the data to cache on a miss:
- uses: Pistonight/mono-dev/actions/permanent-cache@main
  with:
    path: |
      path/to/cache/dir1
      path/to/cache/dir2
    key: my-cache
    version: v1
    # task to run to generate the data to cache (task exec --)
    task: generate-data
    # If the runner is github hosted or blacksmith
    runner: blacksmith
Action: Rust
Use the first action to build a Rust CLI tool and upload it to CI artifacts, then use the second action to download the artifacts, sign them, and create a release:
jobs:
  build:
    strategy:
      fail-fast: false
      matrix:
        include:
          - image: ubuntu-latest
            target: x64
          - image: ubuntu-24.04-arm
            target: arm64
          - image: macos-latest
            target: x64
          - image: macos-latest
            target: arm64
          - image: windows-latest
            target: x64
          - image: windows-11-arm
            target: arm64
    runs-on: ${{ matrix.image }}
    steps:
      - uses: Pistonight/mono-dev/actions/setup@main
        with:
          rust: stable
          rust-native: ${{ matrix.target }}
          rust-src: true
      - uses: Pistonight/mono-dev/actions/rust-xplat@main
        with:
          arch: ${{ matrix.target }}
          binary: botwrdb
          # optional: install tauri build dependencies, default false
          tauri: true
          # optional: additional build arguments
          # default build args included:
          #   --bin <binary>
          #   --release
          #   --target <triple> (for apple only)
          build-args: "--features foo"
          # optional: target directory for the build (default is `target`)
          target-dir: my-target-dir
          # optional: Task to run before building
          pre-task: exec -- pre-build
          # optional: Task to run after building (not run if the build fails)
          post-task: exec -- post-build
          # optional: Task to run instead of cargo build. cargo build args are passed in as .CLI_ARGS
          build-task: exec -- build
name: Release
on:
  push:
    tags:
      - v*.*.*
jobs:
  release:
    runs-on: ubuntu-latest
    permissions:
      contents: write
    steps:
      - if: startsWith(github.ref, 'refs/tags/v')
        uses: Pistonight/mono-dev/actions/release@main
        with:
          artifacts-workflow: build.yml
          artifacts-path: release
          pack: botwrdb-*
          minisign-key: ${{ secrets.MINISIGN_KEY }}
Action: Docker
Use this action to download artifacts from a previous workflow, build a docker image, and publish it to GitHub Packages.
# permission for publishing docker image
permissions:
  contents: read
  packages: write
steps:
  - uses: Pistonight/mono-dev/actions/docker-image@main
    with:
      # optional: download artifacts from a previous workflow
      artifacts-workflow: build.yml
      artifacts-path: packages/server/dist
      # optional: run a task after downloading artifacts
      task: server:package-assets
      image: pistonite/skybook-server
      path: packages/server
      version: ${{ github.event.inputs.version }}
Action: Release
Use this action to create release notes automatically and draft a release with artifacts. It uses RELEASE_NOTES_HEADER.md and RELEASE_NOTES_FOOTER.md in the .github directory.
# permission for publish release
permissions:
  contents: write
steps:
  - uses: Pistonight/mono-dev/actions/release@main
    with:
      # optional: download artifacts from a previous workflow
      artifacts-workflow: build.yml
      artifacts-name: packages/server/dist
      # optional: run a task after downloading artifacts
      task: server:package-assets
      # optional: determine how releases are packed
      # by default, each artifact is packed into an archive
      # use a pattern with * in the beginning or end to match the artifact
      # name. ** or true matches everything (the default). false disables packing
      # and only files in `files` are uploaded
      pack: server-*
      # whether to append the version tag to the archive name
      # default is true
      append-version: true
      # optional. if provided, release artifacts will be signed
      minisign-key: ${{ secrets.MINISIGN_KEY }}
      files: |
        path/to/file1.zip
        path/to/file2.tar.gz
      # default
      tag: ${{ github.ref_name }}
Action: GCloud (Pistonite Storage)
Use this action to setup gcloud for Pistonite Storage.
# permissions for gcloud
permissions:
  contents: read
  id-token: write
steps:
  - uses: Pistonight/mono-dev/actions/pistonstor@main
Workflow Templates
Copy to a workflow .yml file, and delete things not needed.
General Config
name: _____CHANGEME_____
on:
  pull_request:
    branches:
      - main
  push:
    branches:
      - main
    tags:
      - v*.*.*
  workflow_dispatch:
    inputs:
      version:
        description: "Version tag of the image (e.g. 0.2.0-beta)"
        required: true
Full Job > GitHub Pages
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: Pistonight/mono-dev/actions/setup@main
        with:
          tool-mdbook: true
      - run: task build-pages
      - uses: actions/upload-pages-artifact@v3
        with:
          path: packages/manual/book
          retention-days: 3
  deploy-pages:
    needs:
      - build
    if: github.event_name != 'pull_request'
    permissions:
      pages: write
      id-token: write
    environment:
      name: github-pages
      url: ${{ steps.deployment.outputs.page_url }}
    runs-on: ubuntu-latest
    steps:
      - id: deployment
        uses: actions/deploy-pages@v4
Full Job > Rust CLI Build & Release
See Rust Action
Job > Single Platform
jobs:
  _____CHANGEME_____:
    runs-on: ubuntu-latest
    # runs-on: blacksmith-4vcpu-ubuntu-2404
Job > Multiple Platforms
jobs:
  _____CHANGEME_____:
    strategy: { matrix: { os: [ ubuntu-latest, macos-latest, windows-latest ] } }
    runs-on: ${{ matrix.os }}
Job > Multiple Platforms + Different Args
jobs:
  _____CHANGEME_____:
    strategy:
      fail-fast: false
      matrix:
        include:
          - os: ubuntu-latest
            target: x64
            build_args: "--target x86_64-unknown-linux-gnu"
            build_output: target/x86_64-unknown-linux-gnu/release/botwrdb
          - os: ubuntu-latest
            target: arm64
            build_args: "--target aarch64-unknown-linux-gnu"
            build_output: target/aarch64-unknown-linux-gnu/release/botwrdb
          - os: macos-latest
            target: x64
            build_args: "--target x86_64-apple-darwin"
            build_output: target/x86_64-apple-darwin/release/botwrdb
          - os: macos-latest
            target: arm64
            build_args: "--target aarch64-apple-darwin"
            build_output: target/aarch64-apple-darwin/release/botwrdb
          - os: windows-latest
            target: x64
            build_output: target/release/botwrdb.exe
    runs-on: ${{ matrix.os }}
    # use the args like ${{ matrix.target }} or ${{ matrix.build_args }}
Permissions
# permission for publish release
permissions:
  contents: write

# permission for publishing docker image
permissions:
  contents: read
  packages: write

# permission for gcloud
permissions:
  contents: read
  id-token: write
Steps > Setup
See Setup Action
Steps > Upload Artifacts
- uses: actions/upload-artifact@v4
  with:
    path: dist/foo
    name: foo
    retention-days: 3
Steps > Download Artifacts
Note that the Docker Image Action automatically downloads artifacts
- run: mkdir -p package
  shell: bash
- uses: dawidd6/action-download-artifact@v6
  with:
    github_token: ${{ github.token }}
    workflow: CHANGEME.yml
    commit: ${{ github.sha }}
    path: package
Steps > Download Release
This is helpful if there are data files in previous releases
# download release
- uses: robinraju/release-downloader@v1
  with:
    tag: CHANGEME
    fileName: CHANGEME.7z
    out-file-path: package
    extract: false
Ifs > Pull Request
Only run the step if the event is or is not a pull request
- if: github.event_name == 'pull_request'
- if: github.event_name != 'pull_request'
Ifs > Release Tag
- if: startsWith(github.ref, 'refs/tags/v')