Frontend as a Platform: Designing Shared Infrastructure for 20+ Teams
Building Developer Experience at Scale
When frontend engineering scales beyond a single team, individual applications become secondary to the infrastructure that enables teams to build them efficiently. Platform engineering for frontend means providing shared tooling, libraries, and governance that accelerate development while maintaining consistency and quality across dozens of teams and hundreds of developers.
This article presents patterns for building frontend platform infrastructure that serves 20+ teams, covering CLI tooling, shared libraries, governance models, developer experience optimizations, and release pipeline design.
Platform Architecture Overview
┌─────────────────────────────────────────────────────────────────────────────┐
│ Frontend Platform Architecture │
├─────────────────────────────────────────────────────────────────────────────┤
│ │
│ Application Teams (20+) │
│ ─────────────────────── │
│ ┌─────────┐ ┌─────────┐ ┌─────────┐ ┌─────────┐ ┌─────────┐ │
│ │ Team A │ │ Team B │ │ Team C │ │ Team D │ │ ... │ │
│ │ App 1 │ │ App 2 │ │ App 3 │ │ App 4 │ │ │ │
│ └────┬────┘ └────┬────┘ └────┬────┘ └────┬────┘ └────┬────┘ │
│ │ │ │ │ │ │
│ └───────────┴───────────┴───────────┴───────────┘ │
│ │ │
│ ▼ │
│ ┌─────────────────────────────────────────────────────────────────────┐ │
│ │ Platform Layer │ │
│ │ │ │
│ │ ┌──────────────┐ ┌──────────────┐ ┌──────────────┐ ┌────────────┐ │ │
│ │ │ CLI Tools │ │ Shared │ │ Build │ │ Governance │ │ │
│ │ │ (scaffold, │ │ Libraries │ │ Pipeline │ │ & Linting │ │ │
│ │ │ upgrade) │ │ (UI, utils) │ │ Templates │ │ │ │ │
│ │ └──────────────┘ └──────────────┘ └──────────────┘ └────────────┘ │ │
│ │ │ │
│ │ ┌──────────────┐ ┌──────────────┐ ┌──────────────┐ ┌────────────┐ │ │
│ │ │ Design │ │ Telemetry │ │ Feature │ │ Release │ │ │
│ │ │ System │ │ & Analytics │ │ Flags │ │ Pipeline │ │ │
│ │ └──────────────┘ └──────────────┘ └──────────────┘ └────────────┘ │ │
│ │ │ │
│ └─────────────────────────────────────────────────────────────────────┘ │
│ │ │
│ ▼ │
│ ┌─────────────────────────────────────────────────────────────────────┐ │
│ │ Infrastructure Layer │ │
│ │ ┌──────────────┐ ┌──────────────┐ ┌──────────────┐ ┌────────────┐ │ │
│ │ │ CDN │ │ CI/CD │ │ Registry │ │ Monitoring │ │ │
│ │ │ (Cloudflare) │ │ (Actions) │ │ (npm) │ │ (Datadog) │ │ │
│ │ └──────────────┘ └──────────────┘ └──────────────┘ └────────────┘ │ │
│ └─────────────────────────────────────────────────────────────────────┘ │
│ │
└─────────────────────────────────────────────────────────────────────────────┘
CLI Scaffolding System
Project Generator Architecture
// packages/platform-cli/src/commands/create.ts
import { Command } from 'commander';
import inquirer from 'inquirer';
import { execSync } from 'child_process';
import * as fs from 'fs-extra';
import * as path from 'path';
import Handlebars from 'handlebars';
interface ProjectConfig {
name: string;
type: 'app' | 'library' | 'microfrontend';
framework: 'react' | 'next' | 'remix';
features: {
auth: boolean;
analytics: boolean;
featureFlags: boolean;
i18n: boolean;
storybook: boolean;
testing: 'jest' | 'vitest';
};
teamOwner: string;
ciPipeline: 'github-actions' | 'gitlab-ci' | 'circleci';
}
interface TemplateContext extends ProjectConfig {
platformVersion: string;
generatedAt: string;
packageScope: string;
}
const TEMPLATE_DIR = path.join(__dirname, '../templates');
const PLATFORM_VERSION = require('../../package.json').version;
export const createCommand = new Command('create')
.description('Create a new frontend project')
.argument('[name]', 'Project name')
.option('-t, --type <type>', 'Project type (app, library, microfrontend)')
.option('-f, --framework <framework>', 'Framework (react, next, remix)')
.option('--skip-install', 'Skip dependency installation')
.option('--skip-git', 'Skip git initialization')
.action(async (name, options) => {
const config = await gatherProjectConfig(name, options);
await generateProject(config, options);
});
async function gatherProjectConfig(
name: string | undefined,
options: Record<string, unknown>
): Promise<ProjectConfig> {
const answers = await inquirer.prompt([
{
type: 'input',
name: 'name',
message: 'Project name:',
default: name,
when: !name,
validate: (input: string) => {
if (!/^[a-z][a-z0-9-]*$/.test(input)) {
return 'Project name must be lowercase, start with a letter, and contain only letters, numbers, and hyphens';
}
if (fs.existsSync(input)) {
return `Directory ${input} already exists`;
}
return true;
},
},
{
type: 'list',
name: 'type',
message: 'Project type:',
choices: [
{ name: 'Application (standalone SPA)', value: 'app' },
{ name: 'Library (shared package)', value: 'library' },
{ name: 'Microfrontend (module federation)', value: 'microfrontend' },
],
when: !options.type,
},
{
type: 'list',
name: 'framework',
message: 'Framework:',
choices: (answers: { type: string }) => {
if (answers.type === 'library') {
return [{ name: 'React (component library)', value: 'react' }];
}
return [
{ name: 'Next.js (recommended)', value: 'next' },
{ name: 'Remix', value: 'remix' },
{ name: 'React (Vite)', value: 'react' },
];
},
when: !options.framework,
},
{
type: 'checkbox',
name: 'features',
message: 'Select features:',
choices: [
{ name: 'Authentication (@platform/auth)', value: 'auth', checked: true },
{ name: 'Analytics (@platform/analytics)', value: 'analytics', checked: true },
{ name: 'Feature Flags (@platform/flags)', value: 'featureFlags', checked: true },
{ name: 'Internationalization (i18n)', value: 'i18n' },
{ name: 'Storybook', value: 'storybook' },
],
},
{
type: 'list',
name: 'testing',
message: 'Testing framework:',
choices: [
{ name: 'Vitest (recommended)', value: 'vitest' },
{ name: 'Jest', value: 'jest' },
],
},
{
type: 'input',
name: 'teamOwner',
message: 'Team owner (for CODEOWNERS):',
default: '@platform/frontend',
},
{
type: 'list',
name: 'ciPipeline',
message: 'CI Pipeline:',
choices: [
{ name: 'GitHub Actions', value: 'github-actions' },
{ name: 'GitLab CI', value: 'gitlab-ci' },
{ name: 'CircleCI', value: 'circleci' },
],
},
]);
return {
name: name || answers.name,
type: options.type as ProjectConfig['type'] || answers.type,
framework: options.framework as ProjectConfig['framework'] || answers.framework,
features: {
auth: answers.features?.includes('auth') ?? false,
analytics: answers.features?.includes('analytics') ?? false,
featureFlags: answers.features?.includes('featureFlags') ?? false,
i18n: answers.features?.includes('i18n') ?? false,
storybook: answers.features?.includes('storybook') ?? false,
testing: answers.testing,
},
teamOwner: answers.teamOwner,
ciPipeline: answers.ciPipeline,
};
}
async function generateProject(
config: ProjectConfig,
options: { skipInstall?: boolean; skipGit?: boolean }
): Promise<void> {
const projectDir = path.join(process.cwd(), config.name);
console.log(`\n📦 Creating ${config.type} project: ${config.name}\n`);
// Create project directory
await fs.ensureDir(projectDir);
// Build template context
const context: TemplateContext = {
...config,
platformVersion: PLATFORM_VERSION,
generatedAt: new Date().toISOString(),
packageScope: '@company',
};
// Copy and process templates
const templatePath = path.join(TEMPLATE_DIR, config.type, config.framework);
await processTemplates(templatePath, projectDir, context);
// Add optional features
if (config.features.auth) {
await addFeature('auth', projectDir, context);
}
if (config.features.analytics) {
await addFeature('analytics', projectDir, context);
}
if (config.features.featureFlags) {
await addFeature('feature-flags', projectDir, context);
}
if (config.features.i18n) {
await addFeature('i18n', projectDir, context);
}
if (config.features.storybook) {
await addFeature('storybook', projectDir, context);
}
// Add CI configuration
await addCIConfig(config.ciPipeline, projectDir, context);
// Initialize git
if (!options.skipGit) {
console.log('📝 Initializing git repository...');
execSync('git init', { cwd: projectDir, stdio: 'pipe' });
execSync('git add .', { cwd: projectDir, stdio: 'pipe' });
execSync('git commit -m "Initial commit from @platform/cli"', {
cwd: projectDir,
stdio: 'pipe',
});
}
// Install dependencies
if (!options.skipInstall) {
console.log('📥 Installing dependencies...');
execSync('npm install', { cwd: projectDir, stdio: 'inherit' });
}
console.log(`\n✅ Project created successfully!`);
console.log(`\n cd ${config.name}`);
console.log(` npm run dev\n`);
}
async function processTemplates(
templateDir: string,
outputDir: string,
context: TemplateContext
): Promise<void> {
const files = await fs.readdir(templateDir, { withFileTypes: true });
for (const file of files) {
const sourcePath = path.join(templateDir, file.name);
let targetName = file.name.replace(/\.hbs$/, '');
// Handle special file names
if (targetName === '_package.json') targetName = 'package.json';
if (targetName === '_gitignore') targetName = '.gitignore';
if (targetName === '_eslintrc.js') targetName = '.eslintrc.js';
const targetPath = path.join(outputDir, targetName);
if (file.isDirectory()) {
await fs.ensureDir(targetPath);
await processTemplates(sourcePath, targetPath, context);
} else if (file.name.endsWith('.hbs')) {
// Process handlebars template
const template = await fs.readFile(sourcePath, 'utf-8');
const compiled = Handlebars.compile(template);
await fs.writeFile(targetPath, compiled(context));
} else {
// Copy file as-is
await fs.copy(sourcePath, targetPath);
}
}
}
async function addFeature(
feature: string,
projectDir: string,
context: TemplateContext
): Promise<void> {
const featurePath = path.join(TEMPLATE_DIR, 'features', feature);
if (await fs.pathExists(featurePath)) {
await processTemplates(featurePath, projectDir, context);
}
}
async function addCIConfig(
ciType: string,
projectDir: string,
context: TemplateContext
): Promise<void> {
const ciPath = path.join(TEMPLATE_DIR, 'ci', ciType);
if (await fs.pathExists(ciPath)) {
await processTemplates(ciPath, projectDir, context);
}
}
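The scaffolder's name validation is worth keeping as a pure function so it can be tested without driving the prompts. A minimal sketch — `isValidProjectName` is an illustrative extraction of the prompt's validate logic, not part of the actual CLI:

```typescript
// Illustrative extraction of the create prompt's name validation.
// Mirrors the rule: lowercase, starts with a letter, then letters/digits/hyphens only.
function isValidProjectName(input: string): boolean {
  return /^[a-z][a-z0-9-]*$/.test(input);
}

console.log(isValidProjectName('checkout-web')); // valid kebab-case name
console.log(isValidProjectName('My-App'));       // rejected: uppercase
console.log(isValidProjectName('1api'));         // rejected: starts with a digit
```

Keeping the regex in one place also lets other commands (rename, clone) enforce the same constraint.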
Upgrade Command for Version Migration
// packages/platform-cli/src/commands/upgrade.ts
import { Command } from 'commander';
import * as fs from 'fs-extra';
import * as path from 'path';
import semver from 'semver';
import { execSync } from 'child_process';
interface Migration {
fromVersion: string;
toVersion: string;
description: string;
steps: MigrationStep[];
}
interface MigrationStep {
type: 'codemod' | 'dependency' | 'config' | 'manual';
description: string;
action: (projectDir: string) => Promise<void>;
}
const MIGRATIONS: Migration[] = [
{
fromVersion: '2.0.0',
toVersion: '3.0.0',
description: 'Upgrade to React 18 and new hooks API',
steps: [
{
type: 'dependency',
description: 'Update React to v18',
action: async (projectDir) => {
const pkgPath = path.join(projectDir, 'package.json');
const pkg = await fs.readJson(pkgPath);
pkg.dependencies.react = '^18.2.0';
pkg.dependencies['react-dom'] = '^18.2.0';
await fs.writeJson(pkgPath, pkg, { spaces: 2 });
},
},
{
type: 'codemod',
description: 'Update createRoot usage',
action: async (projectDir) => {
execSync(
`npx jscodeshift -t @platform/codemods/react-18-root.js ${projectDir}/src`,
{ stdio: 'inherit' }
);
},
},
{
type: 'codemod',
description: 'Replace deprecated hooks',
action: async (projectDir) => {
execSync(
`npx jscodeshift -t @platform/codemods/deprecated-hooks.js ${projectDir}/src`,
{ stdio: 'inherit' }
);
},
},
],
},
{
fromVersion: '3.0.0',
toVersion: '4.0.0',
description: 'Migrate to new design system tokens',
steps: [
{
type: 'dependency',
description: 'Update @platform/design-system',
action: async (projectDir) => {
const pkgPath = path.join(projectDir, 'package.json');
const pkg = await fs.readJson(pkgPath);
pkg.dependencies['@platform/design-system'] = '^4.0.0';
await fs.writeJson(pkgPath, pkg, { spaces: 2 });
},
},
{
type: 'codemod',
description: 'Migrate color tokens',
action: async (projectDir) => {
execSync(
`npx jscodeshift -t @platform/codemods/color-tokens-v4.js ${projectDir}/src`,
{ stdio: 'inherit' }
);
},
},
{
type: 'config',
description: 'Update Tailwind config',
action: async (projectDir) => {
const configPath = path.join(projectDir, 'tailwind.config.js');
if (await fs.pathExists(configPath)) {
let config = await fs.readFile(configPath, 'utf-8');
config = config.replace(
'@platform/design-system/tailwind-v3',
'@platform/design-system/tailwind-v4'
);
await fs.writeFile(configPath, config);
}
},
},
],
},
];
export const upgradeCommand = new Command('upgrade')
.description('Upgrade platform dependencies and apply migrations')
.option('--dry-run', 'Show what would be done without making changes')
.option('--to <version>', 'Target platform version')
.option('--skip-codemods', 'Skip codemod transformations')
.action(async (options) => {
const projectDir = process.cwd();
const currentVersion = await detectCurrentVersion(projectDir);
const targetVersion = options.to || await getLatestVersion();
console.log(`\n📦 Platform Upgrade`);
console.log(` Current: v${currentVersion}`);
console.log(` Target: v${targetVersion}\n`);
const migrations = getMigrationsInRange(currentVersion, targetVersion);
if (migrations.length === 0) {
console.log('✅ Already on latest version');
return;
}
console.log(`Found ${migrations.length} migration(s) to apply:\n`);
for (const migration of migrations) {
console.log(` ${migration.fromVersion} → ${migration.toVersion}: ${migration.description}`);
}
if (options.dryRun) {
console.log('\n--dry-run specified, no changes made');
return;
}
// Create backup alongside the project, not inside it
// (fs-extra refuses to copy a directory into a subdirectory of itself)
console.log('\n📋 Creating backup...');
const backupDir = path.join(path.dirname(projectDir), `.platform-backup-${Date.now()}`);
await fs.copy(projectDir, backupDir, {
filter: (src) => !src.includes('node_modules') && !src.includes('.git'),
});
// Apply migrations
for (const migration of migrations) {
console.log(`\n🔄 Applying: ${migration.description}`);
for (const step of migration.steps) {
if (step.type === 'codemod' && options.skipCodemods) {
console.log(` ⏭️ Skipping codemod: ${step.description}`);
continue;
}
console.log(` → ${step.description}`);
try {
await step.action(projectDir);
} catch (error) {
console.error(` ❌ Failed: ${error}`);
console.log(`\n⚠️ Migration failed. Restore from: ${backupDir}`);
process.exit(1);
}
}
}
// Update platform version in package.json
await updatePlatformVersion(projectDir, targetVersion);
// Reinstall dependencies from a clean slate (fs.remove is portable, unlike rm -rf)
console.log('\n📥 Reinstalling dependencies...');
await fs.remove(path.join(projectDir, 'node_modules'));
await fs.remove(path.join(projectDir, 'package-lock.json'));
execSync('npm install', { cwd: projectDir, stdio: 'inherit' });
console.log(`\n✅ Successfully upgraded to v${targetVersion}`);
console.log(` Backup saved to: ${backupDir}`);
console.log('\n Run tests to verify the upgrade:');
console.log(' npm test\n');
});
async function detectCurrentVersion(projectDir: string): Promise<string> {
const pkgPath = path.join(projectDir, 'package.json');
const pkg = await fs.readJson(pkgPath);
// Check for platform version marker
if (pkg.platformVersion) {
return pkg.platformVersion;
}
// Infer from design-system version
const dsVersion = pkg.dependencies?.['@platform/design-system'];
if (dsVersion) {
return semver.minVersion(dsVersion)?.version || '1.0.0';
}
return '1.0.0';
}
async function getLatestVersion(): Promise<string> {
const result = execSync('npm view @platform/cli version', { encoding: 'utf-8' });
return result.trim();
}
function getMigrationsInRange(from: string, to: string): Migration[] {
return MIGRATIONS.filter(m =>
semver.gt(m.toVersion, from) && semver.lte(m.toVersion, to)
).sort((a, b) => semver.compare(a.toVersion, b.toVersion));
}
async function updatePlatformVersion(projectDir: string, version: string): Promise<void> {
const pkgPath = path.join(projectDir, 'package.json');
const pkg = await fs.readJson(pkgPath);
pkg.platformVersion = version;
await fs.writeJson(pkgPath, pkg, { spaces: 2 });
}
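The range selection in `getMigrationsInRange` is the crux of chained upgrades: every migration whose target version lies in `(current, target]` is applied, in ascending order, so a project can jump several major versions in one run. A dependency-free sketch of the same contract, inlining a minimal numeric version comparison in place of `semver` (pre-release tags ignored):

```typescript
// Minimal x.y.z comparison standing in for semver.compare; pre-release tags are ignored.
function cmp(a: string, b: string): number {
  const pa = a.split('.').map(Number);
  const pb = b.split('.').map(Number);
  for (let i = 0; i < 3; i++) {
    if (pa[i] !== pb[i]) return pa[i] - pb[i];
  }
  return 0;
}

interface MigrationRef { fromVersion: string; toVersion: string; }

// Select migrations with toVersion in (from, to], sorted ascending --
// the same contract as getMigrationsInRange above.
function migrationsInRange(all: MigrationRef[], from: string, to: string): MigrationRef[] {
  return all
    .filter(m => cmp(m.toVersion, from) > 0 && cmp(m.toVersion, to) <= 0)
    .sort((a, b) => cmp(a.toVersion, b.toVersion));
}

const catalog: MigrationRef[] = [
  { fromVersion: '2.0.0', toVersion: '3.0.0' },
  { fromVersion: '3.0.0', toVersion: '4.0.0' },
];

// Upgrading 2.0.0 → 4.0.0 applies both migrations, in order.
console.log(migrationsInRange(catalog, '2.0.0', '4.0.0').map(m => m.toVersion));
```

Note the half-open interval: a project already on 3.0.0 upgrading to 4.0.0 picks up only the 3.0.0 → 4.0.0 migration.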
Shared Library Architecture
Monorepo Package Structure
┌─────────────────────────────────────────────────────────────────────────────┐
│ Platform Package Structure │
├─────────────────────────────────────────────────────────────────────────────┤
│ │
│ packages/ │
│ ├── core/ # Framework-agnostic utilities │
│ │ ├── utils/ # Pure functions, helpers │
│ │ ├── types/ # Shared TypeScript types │
│ │ └── constants/ # Shared constants │
│ │ │
│ ├── react/ # React-specific packages │
│ │ ├── design-system/ # UI components │
│ │ ├── hooks/ # Shared hooks │
│ │ └── providers/ # Context providers │
│ │ │
│ ├── integrations/ # Third-party integrations │
│ │ ├── auth/ # Auth0, Okta integration │
│ │ ├── analytics/ # Segment, Amplitude │
│ │ ├── feature-flags/ # LaunchDarkly, Split │
│ │ └── monitoring/ # Datadog, Sentry │
│ │ │
│ ├── build/ # Build tooling │
│ │ ├── webpack-config/ # Shared webpack config │
│ │ ├── eslint-config/ # ESLint presets │
│ │ ├── tsconfig/ # TypeScript base configs │
│ │ └── vite-config/ # Vite presets │
│ │ │
│ └── cli/ # Platform CLI │
│ ├── commands/ # CLI commands │
│ ├── templates/ # Project templates │
│ └── codemods/ # Migration transformations │
│ │
└─────────────────────────────────────────────────────────────────────────────┘
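A workspace manifest matching this layout might look like the following. This assumes pnpm (which the release script later in this article invokes); the globs are illustrative and assume each leaf directory carries its own package.json:

```yaml
# pnpm-workspace.yaml -- every matched directory with a package.json becomes a workspace package
packages:
  - 'packages/core/*'
  - 'packages/react/*'
  - 'packages/integrations/*'
  - 'packages/build/*'
  - 'packages/cli'
```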
Package Design Principles
// packages/react/design-system/src/index.ts
// Explicit exports for tree-shaking
export { Button } from './components/Button';
export type { ButtonProps } from './components/Button';
export { Input } from './components/Input';
export type { InputProps } from './components/Input';
export { Modal } from './components/Modal';
export type { ModalProps } from './components/Modal';
// Theme exports
export { ThemeProvider, useTheme } from './theme';
export type { Theme, ThemeConfig } from './theme';
// Token exports
export * as tokens from './tokens';
// packages/react/design-system/package.json
{
"name": "@platform/design-system",
"version": "4.0.0",
"main": "./dist/index.js",
"module": "./dist/index.mjs",
"types": "./dist/index.d.ts",
"sideEffects": false,
"exports": {
".": {
"types": "./dist/index.d.ts",
"import": "./dist/index.mjs",
"require": "./dist/index.js"
},
"./styles.css": "./dist/styles.css",
"./tailwind": {
"types": "./dist/tailwind.d.ts",
"import": "./dist/tailwind.mjs",
"require": "./dist/tailwind.js"
},
"./components/*": {
"types": "./dist/components/*/index.d.ts",
"import": "./dist/components/*/index.mjs",
"require": "./dist/components/*/index.js"
}
},
"peerDependencies": {
"react": "^18.0.0",
"react-dom": "^18.0.0"
},
"dependencies": {
"@radix-ui/react-dialog": "^1.0.0",
"@radix-ui/react-dropdown-menu": "^2.0.0",
"class-variance-authority": "^0.7.0",
"clsx": "^2.0.0",
"tailwind-merge": "^2.0.0"
},
"devDependencies": {
"@platform/tsconfig": "workspace:*",
"@platform/eslint-config": "workspace:*"
}
}
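The wildcard entry under `exports` lets consumers deep-import a single component without pulling in the whole barrel. A simplified sketch of how a resolver maps such subpaths — real Node.js resolution handles more cases (nested conditions, pattern specificity), so treat this as an approximation:

```typescript
type ExportsMap = Record<string, Record<string, string> | string>;

// Resolve a package subpath against a simplified exports map.
// Supports exact keys and a single '*' wildcard, like the map above.
function resolveExport(
  exportsMap: ExportsMap,
  subpath: string,
  condition: 'import' | 'require'
): string | undefined {
  const pick = (entry: Record<string, string> | string) =>
    typeof entry === 'string' ? entry : entry[condition];

  if (exportsMap[subpath]) return pick(exportsMap[subpath]);

  for (const [key, entry] of Object.entries(exportsMap)) {
    const star = key.indexOf('*');
    if (star === -1) continue;
    const prefix = key.slice(0, star);
    const suffix = key.slice(star + 1);
    if (subpath.startsWith(prefix) && subpath.endsWith(suffix)) {
      const matched = subpath.slice(prefix.length, subpath.length - suffix.length);
      const target = pick(entry);
      return target ? target.replace('*', matched) : undefined;
    }
  }
  return undefined;
}

const exportsMap: ExportsMap = {
  '.': { import: './dist/index.mjs', require: './dist/index.js' },
  './styles.css': './dist/styles.css',
  './components/*': {
    import: './dist/components/*/index.mjs',
    require: './dist/components/*/index.js',
  },
};

console.log(resolveExport(exportsMap, './components/Button', 'import'));
// → './dist/components/Button/index.mjs'
```

Combined with `"sideEffects": false`, this structure gives bundlers everything they need to tree-shake unused components.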
Versioning and Release Strategy
// scripts/release.ts
import { execSync } from 'child_process';
import * as fs from 'fs-extra';
import * as path from 'path';
import semver from 'semver';
interface PackageInfo {
name: string;
version: string;
path: string;
dependencies: Record<string, string>;
changed: boolean;
}
interface ReleaseConfig {
packages: string[];
conventionalCommits: boolean;
prereleaseId?: string;
}
async function release(config: ReleaseConfig) {
// 1. Detect changed packages
const changedPackages = await detectChangedPackages();
if (changedPackages.length === 0) {
console.log('No packages have changed since last release');
return;
}
console.log('Changed packages:');
changedPackages.forEach(pkg => console.log(` - ${pkg.name}`));
// 2. Determine version bumps from conventional commits
const versionBumps = await determineVersionBumps(changedPackages);
// 3. Update package versions
for (const pkg of changedPackages) {
const bump = versionBumps.get(pkg.name) || 'patch';
const newVersion = semver.inc(pkg.version, bump, config.prereleaseId);
if (!newVersion) continue;
await updatePackageVersion(pkg.path, newVersion);
console.log(`${pkg.name}: ${pkg.version} → ${newVersion} (${bump})`);
}
// 4. Update internal dependencies
await updateInternalDependencies(changedPackages);
// 5. Generate changelogs
await generateChangelogs(changedPackages);
// 6. Build packages
console.log('\nBuilding packages...');
execSync('pnpm build', { stdio: 'inherit' });
// 7. Run tests
console.log('\nRunning tests...');
execSync('pnpm test', { stdio: 'inherit' });
// 8. Publish to registry
console.log('\nPublishing...');
for (const pkg of changedPackages) {
execSync(`pnpm publish --filter ${pkg.name} --no-git-checks`, {
stdio: 'inherit',
});
}
// 9. Create git tags and commit
await createGitRelease(changedPackages);
}
async function detectChangedPackages(): Promise<PackageInfo[]> {
const lastTag = execSync('git describe --tags --abbrev=0', {
encoding: 'utf-8',
}).trim();
const changedFiles = execSync(
`git diff --name-only ${lastTag}..HEAD`,
{ encoding: 'utf-8' }
).split('\n').filter(Boolean);
const packages = await getAllPackages();
const changedPackages: PackageInfo[] = [];
for (const pkg of packages) {
const pkgRelativePath = path.relative(process.cwd(), pkg.path);
const hasChanges = changedFiles.some(file =>
file.startsWith(pkgRelativePath)
);
if (hasChanges) {
changedPackages.push({ ...pkg, changed: true });
}
}
return changedPackages;
}
async function determineVersionBumps(
packages: PackageInfo[]
): Promise<Map<string, 'major' | 'minor' | 'patch'>> {
const bumps = new Map<string, 'major' | 'minor' | 'patch'>();
for (const pkg of packages) {
const commits = execSync(
`git log --format=%s --no-merges $(git describe --tags --abbrev=0)..HEAD -- ${pkg.path}`,
{ encoding: 'utf-8' }
).split('\n').filter(Boolean);
let bump: 'major' | 'minor' | 'patch' = 'patch';
for (const commit of commits) {
// Conventional commits mark breaking changes as "type!:" (e.g. "feat!:"), not a leading "!"
if (commit.includes('BREAKING CHANGE') || /^\w+(\(.+\))?!:/.test(commit)) {
bump = 'major';
break;
}
if (commit.startsWith('feat')) {
bump = 'minor';
}
}
bumps.set(pkg.name, bump);
}
return bumps;
}
async function getAllPackages(): Promise<PackageInfo[]> {
const packagesDir = path.join(process.cwd(), 'packages');
const packages: PackageInfo[] = [];
async function scanDir(dir: string) {
const entries = await fs.readdir(dir, { withFileTypes: true });
for (const entry of entries) {
if (!entry.isDirectory()) continue;
const pkgJsonPath = path.join(dir, entry.name, 'package.json');
if (await fs.pathExists(pkgJsonPath)) {
const pkg = await fs.readJson(pkgJsonPath);
packages.push({
name: pkg.name,
version: pkg.version,
path: path.join(dir, entry.name),
dependencies: pkg.dependencies || {},
changed: false,
});
} else {
// Check subdirectories
await scanDir(path.join(dir, entry.name));
}
}
}
await scanDir(packagesDir);
return packages;
}
async function updatePackageVersion(pkgPath: string, version: string) {
const pkgJsonPath = path.join(pkgPath, 'package.json');
const pkg = await fs.readJson(pkgJsonPath);
pkg.version = version;
await fs.writeJson(pkgJsonPath, pkg, { spaces: 2 });
}
async function updateInternalDependencies(changedPackages: PackageInfo[]) {
const allPackages = await getAllPackages();
const versionMap = new Map<string, string>();
// Build version map
for (const pkg of changedPackages) {
const pkgJson = await fs.readJson(path.join(pkg.path, 'package.json'));
versionMap.set(pkg.name, pkgJson.version);
}
// Update dependencies in all packages
for (const pkg of allPackages) {
const pkgJsonPath = path.join(pkg.path, 'package.json');
const pkgJson = await fs.readJson(pkgJsonPath);
let updated = false;
for (const depType of ['dependencies', 'devDependencies', 'peerDependencies']) {
const deps = pkgJson[depType];
if (!deps) continue;
for (const [name, version] of versionMap) {
if (deps[name] && deps[name].startsWith('workspace:')) {
continue; // Skip workspace protocol
}
if (deps[name]) {
deps[name] = `^${version}`;
updated = true;
}
}
}
if (updated) {
await fs.writeJson(pkgJsonPath, pkgJson, { spaces: 2 });
}
}
}
async function generateChangelogs(packages: PackageInfo[]) {
for (const pkg of packages) {
const changelogPath = path.join(pkg.path, 'CHANGELOG.md');
execSync(
`npx conventional-changelog -p angular -i ${changelogPath} -s --commit-path ${pkg.path}`,
{ stdio: 'inherit' }
);
}
}
async function createGitRelease(packages: PackageInfo[]) {
// Stage changes
execSync('git add .', { stdio: 'inherit' });
// Create commit
const packageList = packages.map(p => p.name).join(', ');
execSync(`git commit -m "chore(release): publish ${packageList}"`, {
stdio: 'inherit',
});
// Create tags
for (const pkg of packages) {
const pkgJson = await fs.readJson(path.join(pkg.path, 'package.json'));
const tag = `${pkg.name}@${pkgJson.version}`;
execSync(`git tag ${tag}`, { stdio: 'inherit' });
}
// Push
execSync('git push --follow-tags', { stdio: 'inherit' });
}
export { release };
export type { ReleaseConfig };
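The bump inference in `determineVersionBumps` boils down to a fold over commit subject lines. Extracted as a pure function — illustrative, since the real script shells out to git — using the conventional-commits `type!:` breaking marker:

```typescript
type Bump = 'major' | 'minor' | 'patch';

// Infer a semver bump from conventional-commit subject lines.
// 'type!:' (e.g. 'feat!:') or a BREAKING CHANGE note wins outright;
// any 'feat' commit lifts patch to minor.
function bumpFromCommits(subjects: string[]): Bump {
  let bump: Bump = 'patch';
  for (const subject of subjects) {
    if (subject.includes('BREAKING CHANGE') || /^\w+(\(.+\))?!:/.test(subject)) {
      return 'major';
    }
    if (subject.startsWith('feat')) {
      bump = 'minor';
    }
  }
  return bump;
}

console.log(bumpFromCommits(['fix: null check', 'docs: readme']));    // patch
console.log(bumpFromCommits(['fix: null check', 'feat: add toast'])); // minor
console.log(bumpFromCommits(['feat(core)!: drop Node 16 support']));  // major
```

One caveat: `git log --format=%s` emits only subjects, so a `BREAKING CHANGE:` footer in the commit body is invisible here; the `!` marker is the reliable signal under this setup.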
Governance and Linting
Platform ESLint Configuration
// packages/build/eslint-config/src/index.ts
import type { Linter } from 'eslint';
interface PlatformESLintConfig {
type: 'app' | 'library';
react: boolean;
typescript: boolean;
strictMode?: boolean;
}
export function createConfig(options: PlatformESLintConfig): Linter.Config {
const config: Linter.Config = {
root: true,
parser: options.typescript ? '@typescript-eslint/parser' : undefined,
parserOptions: {
ecmaVersion: 2022,
sourceType: 'module',
ecmaFeatures: {
jsx: options.react,
},
project: options.typescript ? './tsconfig.json' : undefined,
},
env: {
browser: true,
es2022: true,
node: true,
},
plugins: [
options.typescript && '@typescript-eslint',
options.react && 'react',
options.react && 'react-hooks',
'@platform',
].filter(Boolean) as string[],
extends: [
'eslint:recommended',
options.typescript && 'plugin:@typescript-eslint/recommended',
options.react && 'plugin:react/recommended',
options.react && 'plugin:react-hooks/recommended',
].filter(Boolean) as string[],
rules: {
// Platform-specific rules
'@platform/no-direct-api-calls': 'error',
'@platform/use-platform-analytics': 'warn',
'@platform/use-design-system': 'warn',
'@platform/no-hardcoded-strings': options.strictMode ? 'error' : 'warn',
'@platform/require-error-boundary': 'warn',
// TypeScript rules
...(options.typescript && {
'@typescript-eslint/explicit-function-return-type': 'off',
'@typescript-eslint/no-explicit-any': 'warn',
'@typescript-eslint/no-unused-vars': ['error', {
argsIgnorePattern: '^_',
varsIgnorePattern: '^_',
}],
'@typescript-eslint/consistent-type-imports': ['error', {
prefer: 'type-imports',
}],
}),
// React rules
...(options.react && {
'react/react-in-jsx-scope': 'off',
'react/prop-types': 'off',
'react-hooks/rules-of-hooks': 'error',
'react-hooks/exhaustive-deps': 'warn',
}),
// General rules
'no-console': ['warn', { allow: ['warn', 'error'] }],
'prefer-const': 'error',
'no-var': 'error',
},
settings: {
react: options.react ? {
version: 'detect',
} : undefined,
},
overrides: [
// Test files
{
files: ['**/*.test.{ts,tsx}', '**/*.spec.{ts,tsx}'],
env: {
jest: true,
},
rules: {
'@typescript-eslint/no-explicit-any': 'off',
'@platform/no-hardcoded-strings': 'off',
},
},
// Storybook files
{
files: ['**/*.stories.{ts,tsx}'],
rules: {
'@platform/use-design-system': 'off',
},
},
],
};
return config;
}
// Custom ESLint rules
// packages/build/eslint-config/src/rules/no-direct-api-calls.ts
import type { Rule } from 'eslint';
export const noDirectApiCalls: Rule.RuleModule = {
meta: {
type: 'problem',
docs: {
description: 'Disallow direct fetch/axios calls, use @platform/api-client instead',
recommended: true,
},
fixable: undefined,
schema: [],
messages: {
noDirectFetch: 'Use @platform/api-client instead of direct fetch calls',
noDirectAxios: 'Use @platform/api-client instead of axios',
},
},
create(context) {
return {
CallExpression(node) {
// Check for fetch()
if (
node.callee.type === 'Identifier' &&
node.callee.name === 'fetch'
) {
context.report({
node,
messageId: 'noDirectFetch',
});
}
// Check for axios()
if (
node.callee.type === 'Identifier' &&
node.callee.name === 'axios'
) {
context.report({
node,
messageId: 'noDirectAxios',
});
}
// Check for axios.get(), axios.post(), etc.
if (
node.callee.type === 'MemberExpression' &&
node.callee.object.type === 'Identifier' &&
node.callee.object.name === 'axios'
) {
context.report({
node,
messageId: 'noDirectAxios',
});
}
},
ImportDeclaration(node) {
if (node.source.value === 'axios') {
context.report({
node,
messageId: 'noDirectAxios',
});
}
},
};
},
};
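The rule's detection logic reduces to a predicate over ESTree call nodes. A dependency-free sketch — the hand-built node literals stand in for a real parse, and in practice you would test the actual rule with ESLint's `RuleTester`:

```typescript
// Minimal shape of the ESTree nodes the rule inspects.
interface Identifier { type: 'Identifier'; name: string; }
interface MemberExpression { type: 'MemberExpression'; object: Identifier; }
interface CallExpression { type: 'CallExpression'; callee: Identifier | MemberExpression; }

// True when a call expression is a direct fetch()/axios() or axios.method() call --
// the three cases the rule above reports on.
function isDirectApiCall(node: CallExpression): boolean {
  const { callee } = node;
  if (callee.type === 'Identifier') {
    return callee.name === 'fetch' || callee.name === 'axios';
  }
  return callee.object.type === 'Identifier' && callee.object.name === 'axios';
}

const fetchCall: CallExpression = {
  type: 'CallExpression',
  callee: { type: 'Identifier', name: 'fetch' },
};
const axiosGet: CallExpression = {
  type: 'CallExpression',
  callee: { type: 'MemberExpression', object: { type: 'Identifier', name: 'axios' } },
};
const clientGet: CallExpression = {
  type: 'CallExpression',
  callee: { type: 'MemberExpression', object: { type: 'Identifier', name: 'apiClient' } },
};

console.log(isDirectApiCall(fetchCall), isDirectApiCall(axiosGet), isDirectApiCall(clientGet));
// → true true false
```

Note what the rule does not catch: aliased imports (`import http from 'axios'`) slip through, which is why the `ImportDeclaration` check on the module specifier is the stronger guard.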
Architectural Decision Records (ADRs)
// packages/cli/src/commands/adr.ts
import { Command } from 'commander';
import * as fs from 'fs-extra';
import * as path from 'path';
import inquirer from 'inquirer';
interface ADR {
number: number;
title: string;
status: 'proposed' | 'accepted' | 'deprecated' | 'superseded';
date: string;
context: string;
decision: string;
consequences: string;
supersedes?: number;
supersededBy?: number;
}
const ADR_DIR = 'docs/adr';
export const adrCommand = new Command('adr')
.description('Manage Architecture Decision Records');
adrCommand
.command('new <title>')
.description('Create a new ADR')
.action(async (title: string) => {
const adrDir = path.join(process.cwd(), ADR_DIR);
await fs.ensureDir(adrDir);
const existingAdrs = await getExistingAdrs(adrDir);
const nextNumber = existingAdrs.length > 0
? Math.max(...existingAdrs.map(a => a.number)) + 1
: 1;
const answers = await inquirer.prompt([
{
type: 'editor',
name: 'context',
message: 'Context (what is the issue?):',
},
{
type: 'editor',
name: 'decision',
message: 'Decision (what have we decided?):',
},
{
type: 'editor',
name: 'consequences',
message: 'Consequences (what are the implications?):',
},
]);
const adr: ADR = {
number: nextNumber,
title,
status: 'proposed',
date: new Date().toISOString().split('T')[0],
...answers,
};
const filename = `${String(nextNumber).padStart(4, '0')}-${slugify(title)}.md`;
const filepath = path.join(adrDir, filename);
await fs.writeFile(filepath, formatAdr(adr));
console.log(`Created: ${filepath}`);
});
adrCommand
.command('list')
.description('List all ADRs')
.action(async () => {
const adrDir = path.join(process.cwd(), ADR_DIR);
const adrs = await getExistingAdrs(adrDir);
if (adrs.length === 0) {
console.log('No ADRs found');
return;
}
console.log('\nArchitecture Decision Records:\n');
for (const adr of adrs) {
const statusEmoji = {
proposed: '📝',
accepted: '✅',
deprecated: '⚠️',
superseded: '🔄',
}[adr.status];
console.log(` ${statusEmoji} ${String(adr.number).padStart(4, '0')} - ${adr.title} (${adr.status})`);
}
});
adrCommand
.command('accept <number>')
.description('Accept a proposed ADR')
.action(async (numberStr: string) => {
const number = parseInt(numberStr, 10);
await updateAdrStatus(number, 'accepted');
console.log(`ADR ${number} accepted`);
});
adrCommand
.command('supersede <number> <newNumber>')
.description('Mark an ADR as superseded by another')
.action(async (oldNumberStr: string, newNumberStr: string) => {
const oldNumber = parseInt(oldNumberStr, 10);
const newNumber = parseInt(newNumberStr, 10);
await updateAdrStatus(oldNumber, 'superseded', { supersededBy: newNumber });
await updateAdrField(newNumber, 'supersedes', oldNumber);
console.log(`ADR ${oldNumber} superseded by ADR ${newNumber}`);
});
async function getExistingAdrs(adrDir: string): Promise<ADR[]> {
if (!await fs.pathExists(adrDir)) {
return [];
}
const files = await fs.readdir(adrDir);
const adrs: ADR[] = [];
for (const file of files) {
if (!file.endsWith('.md')) continue;
const content = await fs.readFile(path.join(adrDir, file), 'utf-8');
const adr = parseAdr(content, file);
if (adr) adrs.push(adr);
}
return adrs.sort((a, b) => a.number - b.number);
}
function parseAdr(content: string, filename: string): ADR | null {
const numberMatch = filename.match(/^(\d+)-/);
if (!numberMatch) return null;
const number = parseInt(numberMatch[1], 10);
const titleMatch = content.match(/^# (.+)$/m);
const statusMatch = content.match(/Status: (\w+)/i);
const dateMatch = content.match(/Date: ([\d-]+)/i);
return {
number,
title: titleMatch?.[1] || 'Untitled',
status: (statusMatch?.[1]?.toLowerCase() || 'proposed') as ADR['status'],
date: dateMatch?.[1] || '',
context: '',
decision: '',
consequences: '',
};
}
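Note that `parseAdr` relies on three loose header conventions rather than full front-matter parsing. A hypothetical file satisfying all three regexes (the filename and content here are illustrative, not from the platform):

```typescript
// A hypothetical ADR file in the shape parseAdr expects
// (filename: 0007-use-vite-for-builds.md)
const sampleAdr = `# Use Vite for builds

Date: 2024-03-01
Status: accepted

## Context
`;

// The same three patterns parseAdr applies:
const title = sampleAdr.match(/^# (.+)$/m)?.[1];
const status = sampleAdr.match(/Status: (\w+)/i)?.[1]?.toLowerCase();
const date = sampleAdr.match(/Date: ([\d-]+)/i)?.[1];
```

If a team edits an ADR by hand and drops one of these lines, the parser falls back to `'Untitled'`, `'proposed'`, or an empty date rather than failing.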
function formatAdr(adr: ADR): string {
return `# ${adr.title}
Date: ${adr.date}
Status: ${adr.status}
${adr.supersedes ? `\nSupersedes: ADR-${String(adr.supersedes).padStart(4, '0')}` : ''}
${adr.supersededBy ? `\nSuperseded by: ADR-${String(adr.supersededBy).padStart(4, '0')}` : ''}
## Context
${adr.context}
## Decision
${adr.decision}
## Consequences
${adr.consequences}
`;
}
function slugify(text: string): string {
return text
.toLowerCase()
.replace(/[^a-z0-9]+/g, '-')
.replace(/^-|-$/g, '');
}
async function updateAdrStatus(
  number: number,
  status: ADR['status'],
  extra: Partial<ADR> = {}
) {
  const adrDir = path.join(process.cwd(), ADR_DIR);
  const files = await fs.readdir(adrDir);
  for (const file of files) {
    if (file.startsWith(String(number).padStart(4, '0'))) {
      const filepath = path.join(adrDir, file);
      let content = await fs.readFile(filepath, 'utf-8');
      // Build the replacement once instead of rewriting the status line twice
      const replacement = extra.supersededBy
        ? `Status: ${status}\n\nSuperseded by: ADR-${String(extra.supersededBy).padStart(4, '0')}`
        : `Status: ${status}`;
      content = content.replace(/Status: \w+/i, replacement);
      await fs.writeFile(filepath, content);
      break;
    }
  }
}
async function updateAdrField(number: number, field: string, value: unknown) {
const adrDir = path.join(process.cwd(), ADR_DIR);
const files = await fs.readdir(adrDir);
for (const file of files) {
if (file.startsWith(String(number).padStart(4, '0'))) {
const filepath = path.join(adrDir, file);
let content = await fs.readFile(filepath, 'utf-8');
if (field === 'supersedes') {
content = content.replace(
/Status: (\w+)/i,
`Status: $1\n\nSupersedes: ADR-${String(value).padStart(4, '0')}`
);
}
await fs.writeFile(filepath, content);
break;
}
}
}
Developer Experience Tooling
Local Development Server Proxy
// packages/cli/src/commands/dev.ts
import { Command } from 'commander';
import { createServer, InlineConfig } from 'vite';
import { createProxyMiddleware } from 'http-proxy-middleware';
import express from 'express';
import * as fs from 'fs-extra';
import * as path from 'path';
interface DevConfig {
port: number;
apiProxy: Record<string, string>;
mockServer: boolean;
https: boolean;
open: boolean;
}
const DEFAULT_CONFIG: DevConfig = {
port: 3000,
apiProxy: {},
mockServer: false,
https: false,
open: true,
};
export const devCommand = new Command('dev')
.description('Start development server with platform features')
.option('-p, --port <port>', 'Port number', '3000')
.option('--mock', 'Enable mock server')
.option('--https', 'Enable HTTPS')
.option('--no-open', 'Do not open browser')
.action(async (options) => {
const projectDir = process.cwd();
const config = await loadDevConfig(projectDir);
    const mergedConfig: DevConfig = {
      ...DEFAULT_CONFIG,
      ...config,
      port: parseInt(options.port, 10) || config.port || DEFAULT_CONFIG.port,
      mockServer: options.mock || config.mockServer || DEFAULT_CONFIG.mockServer,
      https: options.https || config.https || DEFAULT_CONFIG.https,
      open: options.open ?? config.open ?? DEFAULT_CONFIG.open,
    };
await startDevServer(projectDir, mergedConfig);
});
async function loadDevConfig(projectDir: string): Promise<Partial<DevConfig>> {
const configPath = path.join(projectDir, 'platform.config.js');
if (await fs.pathExists(configPath)) {
const config = require(configPath);
return config.dev || {};
}
return {};
}
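The `dev` section of `platform.config.js` that `loadDevConfig` reads might look like the following (the values and proxy target here are hypothetical):

```typescript
// platform.config.js (hypothetical) — the `dev` section consumed by loadDevConfig.
// In the real file this object would be assigned to module.exports.
const config = {
  dev: {
    port: 4000,
    // Requests under /api are forwarded to the team's backend; the prefix is
    // stripped by the pathRewrite rule in startDevServer
    apiProxy: {
      '/api': 'https://backend.staging.example.com',
    },
    mockServer: true, // serve definitions from ./mocks alongside the proxy
    https: false,
    open: true,
  },
};
```

CLI flags take precedence over this file, which in turn takes precedence over platform defaults.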
async function startDevServer(projectDir: string, config: DevConfig) {
console.log('\n🚀 Starting Platform Development Server\n');
// Create express app for middleware
const app = express();
// Add API proxies
for (const [pathPrefix, target] of Object.entries(config.apiProxy)) {
console.log(` 📡 Proxying ${pathPrefix} → ${target}`);
app.use(
pathPrefix,
createProxyMiddleware({
target,
changeOrigin: true,
pathRewrite: { [`^${pathPrefix}`]: '' },
onError: (err, req, res) => {
console.error(`Proxy error: ${err.message}`);
res.status(502).json({ error: 'Proxy error', message: err.message });
},
})
);
}
// Add mock server if enabled
if (config.mockServer) {
console.log(' 🎭 Mock server enabled');
await setupMockServer(app, projectDir);
}
// Add platform middleware
app.use('/__platform__', platformMiddleware(projectDir));
  // Create Vite server in middleware mode: the Express app below owns the
  // port, so Vite's own server.port/open/https options are not used here
  const viteConfig: InlineConfig = {
    configFile: path.join(projectDir, 'vite.config.ts'),
    server: {
      middlewareMode: true,
    },
  };
  const vite = await createServer(viteConfig);
  app.use(vite.middlewares);
// Start server
const server = app.listen(config.port, () => {
console.log(`\n ✅ Server running at http${config.https ? 's' : ''}://localhost:${config.port}\n`);
console.log(' Platform features:');
console.log(' - /__platform__/info → App information');
console.log(' - /__platform__/config → Runtime configuration');
console.log(' - /__platform__/health → Health check');
if (config.mockServer) {
console.log(' - /__platform__/mocks → Mock server status');
}
console.log('');
});
  // Graceful shutdown on Ctrl+C and orchestrator stop signals
  const shutdown = () => {
    console.log('\n👋 Shutting down...');
    server.close();
    vite.close();
    process.exit(0);
  };
  process.on('SIGINT', shutdown);
  process.on('SIGTERM', shutdown);
}
async function setupMockServer(app: express.Application, projectDir: string) {
const mocksDir = path.join(projectDir, 'mocks');
if (!await fs.pathExists(mocksDir)) {
return;
}
// Load mock definitions
const mockFiles = await fs.readdir(mocksDir);
for (const file of mockFiles) {
if (!file.endsWith('.json') && !file.endsWith('.js')) continue;
const mockPath = path.join(mocksDir, file);
const mocks = file.endsWith('.json')
? await fs.readJson(mockPath)
: require(mockPath);
    for (const mock of Array.isArray(mocks) ? mocks : [mocks]) {
      // `routePath` avoids shadowing the `mockPath` file path above
      const { method = 'GET', path: routePath, response, delay = 0 } = mock;
      app[method.toLowerCase() as 'get' | 'post'](routePath, (req, res) => {
        setTimeout(() => {
          if (typeof response === 'function') {
            res.json(response(req));
          } else {
            res.json(response);
          }
        }, delay);
      });
      console.log(`  Mock: ${method} ${routePath}`);
    }
}
}
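A mock definition file consumed by `setupMockServer` supports both static JSON responses and request-aware functions. A hypothetical `mocks/users.js` (the routes and payloads are illustrative):

```typescript
// mocks/users.js (hypothetical) — definitions picked up by setupMockServer.
// In the real file this array would be assigned to module.exports.
const userMocks = [
  {
    method: 'GET',
    path: '/api/users',
    delay: 250, // simulated network latency in ms
    response: [{ id: 1, name: 'Ada' }],
  },
  {
    method: 'POST',
    path: '/api/users',
    // Function responses receive the Express request and can echo its body
    response: (req: { body: Record<string, unknown> }) => ({ id: 2, ...req.body }),
  },
];
```

Static responses suit happy-path development; function responses let teams simulate validation or stateful behavior without running a backend.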
function platformMiddleware(projectDir: string): express.Router {
const router = express.Router();
const pkgJson = require(path.join(projectDir, 'package.json'));
router.get('/info', (req, res) => {
res.json({
name: pkgJson.name,
version: pkgJson.version,
platformVersion: pkgJson.platformVersion,
nodeVersion: process.version,
environment: 'development',
});
});
router.get('/config', async (req, res) => {
const configPath = path.join(projectDir, 'platform.config.js');
if (await fs.pathExists(configPath)) {
const config = require(configPath);
res.json({
features: config.features || {},
analytics: config.analytics || {},
auth: config.auth || {},
});
} else {
res.json({});
}
});
router.get('/health', (req, res) => {
res.json({
status: 'healthy',
timestamp: new Date().toISOString(),
});
});
return router;
}
Bundle Analysis and Optimization
// packages/cli/src/commands/analyze.ts
import { Command } from 'commander';
import { execSync } from 'child_process';
import * as fs from 'fs-extra';
import * as path from 'path';
interface BundleStats {
totalSize: number;
gzipSize: number;
chunks: ChunkStats[];
dependencies: DependencyStats[];
duplicates: string[];
recommendations: string[];
}
interface ChunkStats {
name: string;
size: number;
gzipSize: number;
modules: number;
isAsync: boolean;
}
interface DependencyStats {
name: string;
size: number;
version: string;
usedBy: string[];
}
export const analyzeCommand = new Command('analyze')
.description('Analyze bundle size and composition')
.option('--json', 'Output as JSON')
.option('--ci', 'Fail if budget exceeded')
.option('--compare <baseline>', 'Compare against baseline')
.action(async (options) => {
const projectDir = process.cwd();
console.log('📊 Analyzing bundle...\n');
// Build with stats
execSync('npm run build -- --stats', {
cwd: projectDir,
stdio: 'inherit',
});
// Load stats file
const statsPath = path.join(projectDir, 'dist', 'stats.json');
const stats = await fs.readJson(statsPath);
// Analyze
const analysis = analyzeBundle(stats);
if (options.json) {
console.log(JSON.stringify(analysis, null, 2));
return;
}
// Print report
printAnalysisReport(analysis);
// Compare with baseline
if (options.compare) {
const baseline = await fs.readJson(options.compare);
printComparison(analysis, baseline);
}
// Check budgets
if (options.ci) {
const budgets = await loadBudgets(projectDir);
const violations = checkBudgets(analysis, budgets);
if (violations.length > 0) {
console.error('\n❌ Budget violations:');
violations.forEach(v => console.error(` - ${v}`));
process.exit(1);
}
}
// Save for comparison
const analysisPath = path.join(projectDir, 'dist', 'bundle-analysis.json');
await fs.writeJson(analysisPath, analysis, { spaces: 2 });
console.log(`\n📁 Analysis saved to: ${analysisPath}`);
});
function analyzeBundle(stats: any): BundleStats {
const chunks: ChunkStats[] = [];
const dependencies = new Map<string, DependencyStats>();
const moduleSet = new Set<string>();
const duplicates: string[] = [];
// Process chunks
for (const chunk of stats.chunks || []) {
const chunkStat: ChunkStats = {
name: chunk.names?.[0] || chunk.id,
size: chunk.size,
gzipSize: estimateGzipSize(chunk.size),
modules: chunk.modules?.length || 0,
isAsync: !chunk.initial,
};
chunks.push(chunkStat);
// Process modules in chunk
for (const module of chunk.modules || []) {
if (module.name.includes('node_modules')) {
const depName = extractPackageName(module.name);
const existing = dependencies.get(depName);
if (existing) {
existing.size += module.size;
existing.usedBy.push(chunkStat.name);
} else {
dependencies.set(depName, {
name: depName,
size: module.size,
version: '',
usedBy: [chunkStat.name],
});
}
}
// Detect duplicates
const modulePath = normalizeModulePath(module.name);
if (moduleSet.has(modulePath)) {
duplicates.push(modulePath);
}
moduleSet.add(modulePath);
}
}
const totalSize = chunks.reduce((sum, c) => sum + c.size, 0);
const gzipSize = chunks.reduce((sum, c) => sum + c.gzipSize, 0);
return {
totalSize,
gzipSize,
chunks: chunks.sort((a, b) => b.size - a.size),
dependencies: Array.from(dependencies.values()).sort((a, b) => b.size - a.size),
duplicates: [...new Set(duplicates)],
recommendations: generateRecommendations(chunks, dependencies, duplicates),
};
}
function extractPackageName(modulePath: string): string {
const match = modulePath.match(/node_modules[/\\](@[^/\\]+[/\\][^/\\]+|[^/\\]+)/);
return match?.[1]?.replace(/\\/g, '/') || 'unknown';
}
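The `extractPackageName` regex is easy to get wrong for scoped packages, so it is worth seeing the alternation at work. The pattern is restated here so the snippet is self-contained; the module paths are hypothetical:

```typescript
// Same pattern as extractPackageName: the alternation tries a scoped name
// (@org/pkg, two path segments) before falling back to a single segment
const pkgPattern = /node_modules[/\\](@[^/\\]+[/\\][^/\\]+|[^/\\]+)/;

const scoped = 'node_modules/@tanstack/react-query/build/index.js'.match(pkgPattern)?.[1];
const plain = 'node_modules\\lodash\\index.js'.match(pkgPattern)?.[1];
```

Ordering matters: if the single-segment branch came first, a scoped path would be truncated to just `@tanstack`.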
function normalizeModulePath(modulePath: string): string {
return modulePath.replace(/\\/g, '/').replace(/\?.*$/, '');
}
function estimateGzipSize(size: number): number {
  // Rough estimate: gzip typically shrinks JS bundles to ~30-40% of their raw size
  return Math.round(size * 0.35);
}
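When built assets are available on disk, the heuristic can be replaced with a real measurement using Node's `zlib`. A sketch (the helper names are ours, not part of the platform CLI):

```typescript
import { gzipSync } from 'zlib';
import { readFileSync } from 'fs';

// Exact gzip size of raw contents; level 9 roughly matches what CDNs
// apply when compressing static assets
function gzipSizeOf(contents: Buffer | string): number {
  return gzipSync(contents, { level: 9 }).length;
}

// Measure a built asset on disk instead of estimating from its raw size
function measureGzipSize(filePath: string): number {
  return gzipSizeOf(readFileSync(filePath));
}
```

The estimate is fine for relative comparisons between builds, but exact measurement is preferable when enforcing budgets in CI.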
function generateRecommendations(
chunks: ChunkStats[],
dependencies: Map<string, DependencyStats>,
duplicates: string[]
): string[] {
const recommendations: string[] = [];
// Check for large dependencies
for (const dep of dependencies.values()) {
if (dep.size > 100000) {
recommendations.push(
`Consider lazy-loading '${dep.name}' (${formatSize(dep.size)})`
);
}
}
// Check for duplicates
if (duplicates.length > 0) {
recommendations.push(
`Found ${duplicates.length} duplicate module(s). Run 'npm dedupe'`
);
}
// Check for large initial chunks
const initialChunks = chunks.filter(c => !c.isAsync);
const initialSize = initialChunks.reduce((sum, c) => sum + c.size, 0);
if (initialSize > 300000) {
recommendations.push(
`Initial bundle (${formatSize(initialSize)}) exceeds 300KB. Consider code splitting.`
);
}
// Check common large packages
const largePackages = ['moment', 'lodash', 'rxjs'];
for (const pkg of largePackages) {
if (dependencies.has(pkg)) {
recommendations.push(
`Consider replacing '${pkg}' with a lighter alternative`
);
}
}
return recommendations;
}
function printAnalysisReport(analysis: BundleStats) {
console.log('═══════════════════════════════════════════════════════════');
console.log(' Bundle Analysis Report ');
console.log('═══════════════════════════════════════════════════════════\n');
console.log(`Total Size: ${formatSize(analysis.totalSize)}`);
console.log(`Gzip Size: ${formatSize(analysis.gzipSize)}\n`);
console.log('Top Chunks:');
analysis.chunks.slice(0, 10).forEach(chunk => {
const asyncLabel = chunk.isAsync ? ' (async)' : '';
console.log(` ${chunk.name}${asyncLabel}: ${formatSize(chunk.size)}`);
});
console.log('\nTop Dependencies:');
analysis.dependencies.slice(0, 10).forEach(dep => {
console.log(` ${dep.name}: ${formatSize(dep.size)}`);
});
if (analysis.duplicates.length > 0) {
console.log(`\n⚠️ Duplicates Found: ${analysis.duplicates.length}`);
}
if (analysis.recommendations.length > 0) {
console.log('\n💡 Recommendations:');
analysis.recommendations.forEach(rec => {
console.log(` • ${rec}`);
});
}
}
function printComparison(current: BundleStats, baseline: BundleStats) {
console.log('\n📊 Comparison with Baseline:\n');
const sizeDiff = current.totalSize - baseline.totalSize;
const sizePercent = ((sizeDiff / baseline.totalSize) * 100).toFixed(1);
const arrow = sizeDiff > 0 ? '↑' : sizeDiff < 0 ? '↓' : '→';
const color = sizeDiff > 0 ? '\x1b[31m' : sizeDiff < 0 ? '\x1b[32m' : '';
const reset = '\x1b[0m';
console.log(
`Total Size: ${formatSize(baseline.totalSize)} → ${formatSize(current.totalSize)} ` +
`${color}(${arrow} ${formatSize(Math.abs(sizeDiff))}, ${sizePercent}%)${reset}`
);
}
async function loadBudgets(projectDir: string): Promise<Record<string, number>> {
  const defaults = {
    total: 500000, // 500KB total
    initial: 300000, // 300KB initial
    chunk: 150000, // 150KB per chunk
  };
  const configPath = path.join(projectDir, 'platform.config.js');
  if (await fs.pathExists(configPath)) {
    const config = require(configPath);
    // Merge over defaults so --ci still enforces budgets when a project
    // config exists but defines none of its own
    return { ...defaults, ...(config.budgets || {}) };
  }
  return defaults;
}
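A project overrides the defaults with a `budgets` section in `platform.config.js`; each key maps to one of the checks in `checkBudgets`. The values below are hypothetical:

```typescript
// platform.config.js (hypothetical) — budgets in bytes, read by loadBudgets.
// In the real file this object would be assigned to module.exports.
const config = {
  budgets: {
    total: 450_000,   // all chunks combined
    initial: 250_000, // synchronously-loaded chunks only
    chunk: 120_000,   // any single chunk
  },
};
```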
function checkBudgets(
analysis: BundleStats,
budgets: Record<string, number>
): string[] {
const violations: string[] = [];
if (budgets.total && analysis.totalSize > budgets.total) {
violations.push(
`Total bundle (${formatSize(analysis.totalSize)}) exceeds budget (${formatSize(budgets.total)})`
);
}
if (budgets.initial) {
const initialSize = analysis.chunks
.filter(c => !c.isAsync)
.reduce((sum, c) => sum + c.size, 0);
if (initialSize > budgets.initial) {
violations.push(
`Initial bundle (${formatSize(initialSize)}) exceeds budget (${formatSize(budgets.initial)})`
);
}
}
if (budgets.chunk) {
for (const chunk of analysis.chunks) {
if (chunk.size > budgets.chunk) {
violations.push(
`Chunk '${chunk.name}' (${formatSize(chunk.size)}) exceeds budget (${formatSize(budgets.chunk)})`
);
}
}
}
return violations;
}
function formatSize(bytes: number): string {
if (bytes < 1024) return `${bytes}B`;
if (bytes < 1024 * 1024) return `${(bytes / 1024).toFixed(1)}KB`;
return `${(bytes / 1024 / 1024).toFixed(2)}MB`;
}
Release Pipeline Design
Multi-Environment Deployment
┌─────────────────────────────────────────────────────────────────────────────┐
│ Release Pipeline Architecture │
├─────────────────────────────────────────────────────────────────────────────┤
│ │
│ Developer │
│ │ │
│ │ git push feature/xyz │
│ ▼ │
│ ┌──────────────────────────────────────────────────────────────────────┐ │
│ │ Pull Request │ │
│ │ ┌────────────┐ ┌────────────┐ ┌────────────┐ ┌────────────────┐ │ │
│ │ │ Build │─▶│ Tests │─▶│ Lint │─▶│ Preview Deploy │ │ │
│ │ └────────────┘ └────────────┘ └────────────┘ └────────────────┘ │ │
│ │ │ │ │
│ │ unique URL ─┘ │ │
│ └──────────────────────────────────────────────────────────────────────┘ │
│ │
│ Merge to main │
│ │ │
│ ▼ │
│ ┌──────────────────────────────────────────────────────────────────────┐ │
│ │ Staging Pipeline │ │
│ │ ┌────────────┐ ┌────────────┐ ┌────────────┐ ┌────────────────┐ │ │
│ │ │ Build │─▶│ Tests │─▶│ E2E Tests │─▶│ Deploy Staging │ │ │
│ │ │ (prod) │ │ (full) │ │ │ │ │ │ │
│ │ └────────────┘ └────────────┘ └────────────┘ └───────┬────────┘ │ │
│ └──────────────────────────────────────────────────────────┼───────────┘ │
│ │ │
│ ┌─────────────────────────┘ │
│ │ │
│ ▼ │
│ ┌──────────────────────────────────────────────────────────────────────┐ │
│ │ Production Release (manual) │ │
│ │ │ │
│ │ ┌───────────────┐ ┌───────────────┐ ┌───────────────────┐ │ │
│ │ │ Canary (5%) │───▶│ Canary (25%) │───▶│ Full Rollout │ │ │
│ │ │ + monitor │ │ + monitor │ │ (100%) │ │ │
│ │ └───────────────┘ └───────────────┘ └───────────────────┘ │ │
│ │ │ │ │ │
│ │ ▼ ▼ │ │
│ │ ┌──────────────────────────────────────┐ │ │
│ │ │ Automatic Rollback │ │ │
│ │ │ if error rate > threshold │ │ │
│ │ └──────────────────────────────────────┘ │ │
│ │ │ │
│ └──────────────────────────────────────────────────────────────────────┘ │
│ │
└─────────────────────────────────────────────────────────────────────────────┘
GitHub Actions Workflow
# .github/workflows/platform-pipeline.yml
name: Platform CI/CD Pipeline
on:
push:
branches: [main]
pull_request:
branches: [main]
env:
NODE_VERSION: '20'
PNPM_VERSION: '8'
jobs:
# ==================== LINT & TYPE CHECK ====================
lint:
name: Lint & Type Check
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- name: Setup pnpm
uses: pnpm/action-setup@v2
with:
version: ${{ env.PNPM_VERSION }}
- name: Setup Node.js
uses: actions/setup-node@v4
with:
node-version: ${{ env.NODE_VERSION }}
cache: 'pnpm'
- name: Install dependencies
run: pnpm install --frozen-lockfile
- name: Lint
run: pnpm lint
- name: Type check
run: pnpm typecheck
# ==================== UNIT TESTS ====================
test:
name: Unit Tests
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- name: Setup pnpm
uses: pnpm/action-setup@v2
with:
version: ${{ env.PNPM_VERSION }}
- name: Setup Node.js
uses: actions/setup-node@v4
with:
node-version: ${{ env.NODE_VERSION }}
cache: 'pnpm'
- name: Install dependencies
run: pnpm install --frozen-lockfile
- name: Run tests
run: pnpm test --coverage
- name: Upload coverage
uses: codecov/codecov-action@v3
with:
files: ./coverage/lcov.info
# ==================== BUILD ====================
build:
name: Build
runs-on: ubuntu-latest
needs: [lint, test]
steps:
- uses: actions/checkout@v4
- name: Setup pnpm
uses: pnpm/action-setup@v2
with:
version: ${{ env.PNPM_VERSION }}
- name: Setup Node.js
uses: actions/setup-node@v4
with:
node-version: ${{ env.NODE_VERSION }}
cache: 'pnpm'
- name: Install dependencies
run: pnpm install --frozen-lockfile
- name: Build
run: pnpm build
- name: Analyze bundle
run: pnpm analyze --ci
- name: Upload build artifacts
uses: actions/upload-artifact@v4
with:
name: build
path: dist/
retention-days: 7
# ==================== PREVIEW DEPLOY (PR only) ====================
preview:
name: Preview Deploy
runs-on: ubuntu-latest
needs: build
if: github.event_name == 'pull_request'
    environment:
      name: preview
      url: ${{ steps.deploy.outputs.url }}
    outputs:
      url: ${{ steps.deploy.outputs.url }}
steps:
- uses: actions/checkout@v4
- name: Download build
uses: actions/download-artifact@v4
with:
name: build
path: dist/
- name: Deploy to Cloudflare Pages
id: deploy
uses: cloudflare/pages-action@v1
with:
apiToken: ${{ secrets.CLOUDFLARE_API_TOKEN }}
accountId: ${{ secrets.CLOUDFLARE_ACCOUNT_ID }}
projectName: ${{ github.repository }}-preview
directory: dist
gitHubToken: ${{ secrets.GITHUB_TOKEN }}
- name: Comment PR with preview URL
uses: actions/github-script@v7
with:
script: |
github.rest.issues.createComment({
issue_number: context.issue.number,
owner: context.repo.owner,
repo: context.repo.repo,
body: `🚀 Preview deployed: ${{ steps.deploy.outputs.url }}`
})
# ==================== E2E TESTS ====================
e2e:
name: E2E Tests
runs-on: ubuntu-latest
needs: preview
if: github.event_name == 'pull_request'
steps:
- uses: actions/checkout@v4
- name: Setup pnpm
uses: pnpm/action-setup@v2
with:
version: ${{ env.PNPM_VERSION }}
- name: Setup Node.js
uses: actions/setup-node@v4
with:
node-version: ${{ env.NODE_VERSION }}
cache: 'pnpm'
- name: Install dependencies
run: pnpm install --frozen-lockfile
- name: Install Playwright
run: pnpm exec playwright install --with-deps
- name: Run E2E tests
run: pnpm e2e
env:
BASE_URL: ${{ needs.preview.outputs.url }}
- name: Upload test results
uses: actions/upload-artifact@v4
if: always()
with:
name: playwright-report
path: playwright-report/
# ==================== STAGING DEPLOY ====================
staging:
name: Deploy to Staging
runs-on: ubuntu-latest
needs: build
if: github.event_name == 'push' && github.ref == 'refs/heads/main'
environment:
name: staging
url: https://staging.example.com
steps:
- uses: actions/checkout@v4
- name: Download build
uses: actions/download-artifact@v4
with:
name: build
path: dist/
- name: Deploy to Staging
run: |
# Deploy to staging CDN
aws s3 sync dist/ s3://staging-bucket/ --delete
aws cloudfront create-invalidation --distribution-id ${{ secrets.STAGING_CF_ID }} --paths "/*"
env:
AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
- name: Run smoke tests
run: pnpm test:smoke
env:
BASE_URL: https://staging.example.com
# ==================== PRODUCTION DEPLOY ====================
production:
name: Deploy to Production
runs-on: ubuntu-latest
needs: staging
if: github.event_name == 'push' && github.ref == 'refs/heads/main'
environment:
name: production
url: https://www.example.com
steps:
- uses: actions/checkout@v4
- name: Download build
uses: actions/download-artifact@v4
with:
name: build
path: dist/
      - name: Deploy Canary (5%)
        run: |
          # Deploy to canary bucket
          aws s3 sync dist/ s3://canary-bucket/
          # Shift 5% of traffic to the canary origin (simplified: in practice
          # update-distribution requires the full distribution config and an ETag)
          aws cloudfront update-distribution --id ${{ secrets.PROD_CF_ID }} \
            --origin-groups '{"Items":[{"Id":"main","Members":{"Items":[{"OriginId":"canary","Weight":5},{"OriginId":"production","Weight":95}]}}]}'
- name: Monitor Canary (5 min)
run: |
sleep 300
# Check error rates
ERROR_RATE=$(curl -s "https://api.datadog.com/api/v1/query?query=avg:frontend.error.rate{env:canary}" | jq '.series[0].pointlist[-1][1]')
if (( $(echo "$ERROR_RATE > 0.01" | bc -l) )); then
echo "Error rate too high: $ERROR_RATE"
exit 1
fi
      - name: Deploy Canary (25%)
        run: |
          aws cloudfront update-distribution --id ${{ secrets.PROD_CF_ID }} \
            --origin-groups '{"Items":[{"Id":"main","Members":{"Items":[{"OriginId":"canary","Weight":25},{"OriginId":"production","Weight":75}]}}]}'
- name: Monitor Canary (5 min)
run: |
sleep 300
ERROR_RATE=$(curl -s "https://api.datadog.com/api/v1/query?query=avg:frontend.error.rate{env:canary}" | jq '.series[0].pointlist[-1][1]')
if (( $(echo "$ERROR_RATE > 0.01" | bc -l) )); then
echo "Error rate too high: $ERROR_RATE"
exit 1
fi
      - name: Full Rollout
        run: |
          # Deploy to production bucket
          aws s3 sync dist/ s3://production-bucket/ --delete
          # Route all traffic to production
          aws cloudfront update-distribution --id ${{ secrets.PROD_CF_ID }} \
            --origin-groups '{"Items":[{"Id":"main","Members":{"Items":[{"OriginId":"production","Weight":100}]}}]}'
          aws cloudfront create-invalidation --distribution-id ${{ secrets.PROD_CF_ID }} --paths "/*"
- name: Create release
uses: actions/github-script@v7
with:
script: |
const { data: release } = await github.rest.repos.createRelease({
owner: context.repo.owner,
repo: context.repo.repo,
tag_name: `v${Date.now()}`,
name: `Release ${new Date().toISOString().split('T')[0]}`,
body: 'Automated production release',
draft: false,
prerelease: false
});
console.log(`Created release: ${release.html_url}`);
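The inline `jq` in the canary-monitoring steps is hard to unit-test; the same gate can live in a small script the workflow calls instead. A sketch (the response shape mirrors the `.series[0].pointlist[-1][1]` path and 1% threshold used above; the type and function names are ours):

```typescript
// Shape of the Datadog metrics query response the workflow consumes
interface DatadogQueryResponse {
  series: { pointlist: [number, number][] }[]; // [timestamp, value] pairs
}

// Latest error-rate sample, same path as the jq expression:
// .series[0].pointlist[-1][1]
function latestErrorRate(resp: DatadogQueryResponse): number | undefined {
  const points = resp.series[0]?.pointlist;
  return points && points.length > 0 ? points[points.length - 1][1] : undefined;
}

// Gate between canary stages: proceed only below the 1% error-rate threshold,
// and treat a missing sample as unhealthy rather than silently passing
function canaryHealthy(rate: number | undefined, threshold = 0.01): boolean {
  return rate !== undefined && rate <= threshold;
}
```

Failing closed on a missing sample matters: the bash version would compare an empty string and can pass when the metrics query itself breaks.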
Key Takeaways
- CLI is the developer's gateway: Invest heavily in scaffolding, upgrades, and developer workflows through CLI tooling
- Packages should be composable: Small, focused packages with explicit exports enable tree-shaking and independent versioning
- Governance through automation: ESLint rules and CI checks enforce standards without manual review burden
- ADRs document decisions: Architecture Decision Records create institutional memory and onboarding context
- Upgrade paths must be automated: Codemods and migration scripts make version upgrades tractable across many teams
- Local development should mirror production: Proxy configuration, mock servers, and platform middleware bridge dev/prod gaps
- Bundle analysis prevents bloat: Automated analysis and budgets catch size regressions before production
- Progressive rollouts reduce risk: Canary deployments with monitoring enable confident releases
- Platform teams serve other teams: Success is measured by team adoption, developer productivity, and time-to-production
- Documentation is infrastructure: Treat docs, examples, and guides with the same rigor as code
Building a frontend platform is building a product for developers. The best platform is one that teams don't think about—it just works.