# atBB Monorepo

atBB is a decentralized BB-style forum built on the AT Protocol. Users own their posts on their own PDS; the forum's AppView indexes and serves them. Lexicon namespace: `space.atbb.*` (the domain atbb.space is owned). License: AGPL-3.0.

The master project plan with MVP phases and progress tracking lives at `docs/atproto-forum-plan.md`.
## Apps & Packages

### Apps (`apps/`)

Servers and applications that are deployed or run as services.

| App | Description | Port |
|---|---|---|
| `@atbb/appview` | Hono JSON API server — indexes forum data, serves API | 3000 |
| `@atbb/web` | Hono JSX + HTMX server-rendered web UI — calls appview API | 3001 |
### Packages (`packages/`)

Shared libraries, tools, and utilities consumed by apps or used standalone.

| Package | Description |
|---|---|
| `@atbb/db` | Drizzle ORM schema and connection factory for PostgreSQL |
| `@atbb/lexicon` | AT Proto lexicon definitions (YAML) + generated TypeScript types |
Dependency chain: `@atbb/lexicon` and `@atbb/db` build first, then `@atbb/appview` and `@atbb/web` build in parallel. Turbo handles this via `^build`.
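Roughly, that `^build` dependency looks like this in `turbo.json` (a minimal sketch; the repo's actual file may declare more tasks and different outputs):

```json
{
  "tasks": {
    "build": {
      "dependsOn": ["^build"],
      "outputs": ["dist/**"]
    }
  }
}
```

The `^` prefix means "run this task in all workspace dependencies first", which is what forces `@atbb/lexicon` and `@atbb/db` to finish before `@atbb/appview` and `@atbb/web` start.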
## Development

### Setup

```sh
devenv shell          # enter Nix dev shell (Node.js, pnpm, turbo)
pnpm install          # install all workspace dependencies
cp .env.example .env  # configure environment variables
```
### Commands

```sh
pnpm build   # build all packages (lexicon → appview + web)
pnpm dev     # start all dev servers with hot reload
pnpm test    # run all tests across all packages
pnpm clean   # remove all dist/ directories
devenv up    # start appview + web servers via process manager

pnpm --filter @atbb/appview db:migrate  # run database migrations
pnpm --filter @atbb/appview dev         # run a single package
pnpm --filter @atbb/appview test        # run tests for a single package
```
### Environment Variables

See `.env.example`. Key variables:

- `PORT` — server port (appview: 3000, web: 3001)
- `FORUM_DID` — the forum's AT Proto DID
- `PDS_URL` — URL of the forum's PDS
- `APPVIEW_URL` — URL the web package uses to reach the appview API
- `FORUM_HANDLE`, `FORUM_PASSWORD` — forum service account credentials
OAuth & session management (required for production):

- `OAUTH_PUBLIC_URL` — public URL where the AppView is accessible (used for `client_id` and `redirect_uri`)
- `SESSION_SECRET` — signing key for session tokens (generate with `openssl rand -hex 32`)
- `SESSION_TTL_DAYS` — session lifetime in days (default: 7)
- `REDIS_URL` — optional Redis URL for session storage (recommended for multi-instance deployments)
## Deployment

### Docker

The project includes production-ready Docker infrastructure for single-container deployment:

```sh
# Build the Docker image
docker build -t atbb:latest .

# Run with docker compose (recommended)
cp docker-compose.example.yml docker-compose.yml
# Edit docker-compose.yml with your DATABASE_URL, FORUM_DID, etc.
docker compose up -d
```
What's included:

- Multi-stage Dockerfile (Node 22 Alpine, ~200 MB final image)
- Nginx reverse proxy serving both appview (port 3000) and web (port 3001) on port 80
- Non-root user (`atbb:atbb`) for security
- Health checks on `/api/healthz`
- Production-ready entrypoint script
Key files:

- `Dockerfile` — multi-stage build definition
- `entrypoint.sh` — startup script (nginx + node servers)
- `nginx.conf` — reverse proxy configuration
- `docker-compose.example.yml` — orchestration template
- `docs/deployment-guide.md` — comprehensive deployment instructions
Database migrations: the container does NOT auto-run migrations. Run them manually before starting:

```sh
docker compose run --rm atbb pnpm --filter @atbb/appview db:migrate
```
## Pre-Commit Checks

Every commit automatically runs three checks in parallel via lefthook:

- **Lint** — oxlint scans staged TypeScript/JavaScript files for code quality issues
- **Typecheck** — `pnpm turbo lint` runs type checking on affected packages
- **Test** — Vitest runs tests in packages with staged changes
### Auto-Fixing Lint Issues

Before committing, auto-fix safe lint violations:

```sh
# Fix all packages
pnpm turbo lint:fix

# Fix a specific package
pnpm --filter @atbb/appview lint:fix
```
### Bypassing Hooks (Emergency Only)

In urgent situations, bypass hooks with:

```sh
git commit --no-verify -m "emergency: your message"
```

Use sparingly — hooks catch issues that would fail in CI.
## CI/CD

### GitHub Actions Workflows

`.github/workflows/ci.yml` — runs on all pull requests (parallel jobs):

- Lint: `pnpm exec oxlint .` — catches code quality issues
- Type Check: `pnpm turbo lint` — verifies TypeScript types across all packages
- Test: `pnpm test` — runs all tests with a PostgreSQL 17 service container
- Build: `pnpm build` — verifies compilation succeeds

`.github/workflows/publish.yml` — runs on pushes to the main branch:

- Builds the Docker image and publishes it to GitHub Container Registry (GHCR)
- Tags: `latest` (main branch) and `sha-<commit>` (specific commit)
- Image: `ghcr.io/atbb-community/atbb:latest`

All checks must pass before merging a PR.
### How Hooks Work

- Lefthook manages git hooks (`lefthook.yml`)
- Oxlint provides fast linting (`.oxlintrc.json`)
- Turbo filters checks to affected packages only
- Hooks auto-install after `pnpm install` via the `prepare` script
## Testing Standards
CRITICAL: Always run tests before committing code or requesting code review.
### Running Tests

```sh
# Run all tests
pnpm test

# Run tests for a specific package
pnpm --filter @atbb/appview test

# Run tests in watch mode during development
pnpm --filter @atbb/appview test --watch

# Run a specific test file
pnpm --filter @atbb/appview test src/lib/__tests__/config.test.ts
```
### Environment Variables in Tests

CRITICAL: Turbo blocks environment variables by default for cache safety. Tests requiring env vars must declare them in `turbo.json`:

```json
{
  "tasks": {
    "test": {
      "dependsOn": ["^build"],
      "env": ["DATABASE_URL"]
    }
  }
}
```
Symptoms of a missing declaration:

- Tests pass when run directly (`pnpm --filter @atbb/appview test`)
- Tests fail when run via Turbo (`pnpm test`) with undefined env vars
- CI fails even though env vars are set in the workflow
- Database errors like `database "username" does not exist` (postgres defaults to the system username when `DATABASE_URL` is unset)
Why this matters: Turbo's caching requires deterministic inputs. Environment variables that leak into tasks without declaration would make cache hits unpredictable. Declaring env vars in `turbo.json` tells Turbo to include them in the task's input hash and pass them through to the test process.

When adding new env vars to tests: update `turbo.json` immediately, or tests will fail when run via Turbo but pass when run directly.
### When to Run Tests

Before every commit:

```sh
pnpm test   # Verify all tests pass
git add .
git commit -m "feat: your changes"
```

Before requesting code review:

```sh
pnpm build  # Ensure a clean build
pnpm test   # Verify all tests pass
# Only then push and request review
```

After fixing review feedback:

```sh
# Make fixes
pnpm test   # Verify tests still pass
# Push updates
```
### Test Requirements
All new features must include tests:
- API endpoints: Test success cases, error cases, edge cases
- Business logic: Test all code paths and error conditions
- Error handling: Test that errors are caught and logged appropriately
- Security features: Test authentication, authorization, input validation
Test quality standards:
- Tests must be independent (no shared state between tests)
- Use descriptive test names that explain what is being tested
- Mock external dependencies (databases, APIs, network calls)
- Test error paths, not just happy paths
- Verify logging and error messages are correct
Red flags (do not commit):

- Skipped tests (`test.skip`, `it.skip`) without a Linear issue tracking why
- Tests that pass locally but fail in CI
- Tests that require manual setup or specific data
- Tests with hardcoded timing (`setTimeout`, `sleep`) - use proper mocks
### Example Test Structure

```ts
describe("createForumRoutes", () => {
  it("returns forum metadata when forum exists", async () => {
    // Arrange: Set up test context with mock data
    const ctx = await createTestContext();

    // Act: Call the endpoint
    const res = await app.request("/api/forum");

    // Assert: Verify response
    expect(res.status).toBe(200);
    const data = await res.json();
    expect(data.name).toBe("Test Forum");
  });

  it("returns 404 when forum does not exist", async () => {
    // Test error case
    const ctx = await createTestContext({ emptyDb: true });
    const res = await app.request("/api/forum");
    expect(res.status).toBe(404);
  });
});
```
### Test Coverage Expectations
While we don't enforce strict coverage percentages, aim for:
- Critical paths: 100% coverage (authentication, authorization, data integrity)
- Error handling: All catch blocks should be tested
- API endpoints: All routes should have tests
- Business logic: All functions with branching logic should be tested
Do not:
- Skip writing tests to "move faster" - untested code breaks in production
- Write tests after requesting review - tests inform implementation
- Rely on manual testing alone - automated tests catch regressions
### Before Requesting Code Review

CRITICAL: Run this checklist before requesting review to catch issues early:

```sh
# 1. Verify all tests pass
pnpm test

# 2. Check runtime dependencies are correctly placed
#    (Runtime imports must be in dependencies, not devDependencies)
grep -r "from 'drizzle-orm'" apps/*/src  # If found, verify in dependencies
grep -r "from 'postgres'" apps/*/src     # If found, verify in dependencies

# 3. Verify error test coverage is comprehensive
#    For API endpoints, ensure you have tests for:
#    - Input validation (missing fields, wrong types, malformed JSON)
#    - Error classification (network -> 503, server -> 500)
#    - Error message clarity (user-friendly, no stack traces)
```
Common mistake: Adding error tests AFTER review feedback instead of DURING implementation. Write error tests immediately after implementing the happy path — they often reveal bugs in error classification and input validation that are better caught before review.
## Lexicon Conventions

- Source of truth is YAML in `packages/lexicon/lexicons/`. Never edit generated JSON or TypeScript.
- Build pipeline: YAML → JSON (`scripts/build.ts`) → TypeScript (`@atproto/lex-cli gen-api`).
- Adding a new lexicon: create a `.yaml` file under `lexicons/space/atbb/`, then run `pnpm --filter @atbb/lexicon build`.
- Record keys: use `key: tid` for collections (multiple records per repo). Use `key: literal:self` for singletons.
- References: use `com.atproto.repo.strongRef` wrapped in named defs (e.g., `forumRef`, `subjectRef`).
- Extensible fields: use `knownValues` (not `enum`) for strings that may grow (permissions, reaction types, mod actions).
- Record ownership:
  - Forum DID owns: `forum.forum`, `forum.category`, `forum.role`, `modAction`
  - User DID owns: `post`, `membership`, `reaction`
## AT Protocol Conventions

- Unified post model: there is no separate "topic" type. A `space.atbb.post` without a `reply` ref is a topic starter; one with a `reply` ref is a reply.
- Reply chains: `replyRef` has both `root` (thread starter) and `parent` (direct parent) — same pattern as `app.bsky.feed.post`.
- MVP trust model: the AppView holds the Forum DID's signing keys directly and writes forum-level records on behalf of admins/mods after verifying their role. This will be replaced by AT Protocol privilege delegation post-MVP.
## TypeScript / Hono Gotchas

- `@types/node` is required as a devDependency in every package that uses `process.env` or other Node APIs. `tsx` doesn't need it at runtime, but `tsc` builds will fail without it.
- Hono JSX children: use `PropsWithChildren<T>` from `hono/jsx` for components that accept children. Unlike React, Hono's `FC<T>` does not include `children` implicitly.
- HTMX attributes in JSX: the `typed-htmx` package provides types for `hx-*` attributes. See `apps/web/src/global.d.ts` for the augmentation.
- Glob expansion in npm scripts: `@atproto/lex-cli` needs file paths, not globs. Use `bash -c 'shopt -s globstar && ...'` to expand `**/*.json` in npm scripts.
- `.env` loading: dev scripts use Node's `--env-file=../../.env` flag to load the root `.env` file. No `dotenv` dependency needed.
- API endpoint parameter type guards: never trust TypeScript types for user input. Change handler parameter types from `string` to `unknown` and add explicit `typeof` checks. TypeScript types are erased at runtime — a request missing the `text` field will pass type checking but crash with `TypeError: text.trim is not a function`.

  ```ts
  // ❌ BAD: Assumes text is always a string at runtime
  export function validatePostText(text: string): { valid: boolean; error?: string } {
    const trimmed = text.trim(); // Crashes if text is undefined!
    // ...
  }

  // ✅ GOOD: Type guard protects against runtime type mismatches
  export function validatePostText(text: unknown): { valid: boolean; error?: string } {
    if (typeof text !== "string") {
      return { valid: false, error: "Text is required and must be a string" };
    }
    const trimmed = text.trim(); // Safe - text is proven to be a string
    // ...
  }
  ```

- Hono JSON parsing safety: `await c.req.json()` throws a `SyntaxError` for malformed JSON. Always wrap it in try-catch and return 400 for client errors:

  ```ts
  let body: unknown;
  try {
    body = await c.req.json();
  } catch {
    return c.json({ error: "Invalid JSON in request body" }, 400);
  }
  ```
## Error Handling Standards

Follow these patterns for robust, debuggable production code:

### API Route Handlers

Required for all database-backed endpoints:
- Validate input parameters before database queries (return 400 for invalid input)
- Wrap database queries in try-catch with structured logging
- Check resource existence explicitly (return 404 for missing resources)
- Return proper HTTP status codes (400/404/500, not always 500)
Example pattern:
```ts
export function createForumRoutes(ctx: AppContext) {
  return new Hono().get("/", async (c) => {
    try {
      const [forum] = await ctx.db
        .select()
        .from(forums)
        .where(eq(forums.rkey, "self"))
        .limit(1);

      if (!forum) {
        return c.json({ error: "Forum not found" }, 404);
      }

      return c.json({ /* success response */ });
    } catch (error) {
      console.error("Failed to query forum metadata", {
        operation: "GET /api/forum",
        error: error instanceof Error ? error.message : String(error),
      });
      return c.json(
        { error: "Failed to retrieve forum metadata. Please try again later." },
        500
      );
    }
  });
}
```
### Catch Block Guidelines

DO:

- Catch specific error types when possible (`instanceof RangeError`, `instanceof SyntaxError`)
- Re-throw unexpected errors (don't swallow programming bugs like `TypeError`)
- Log with structured context: operation name, relevant IDs, error message
- Return user-friendly messages (no stack traces in production)

DON'T:

- Use bare `catch` blocks that hide all error types
- Return the same generic "try again later" message for client errors (400) and server errors (500)
- Fabricate data in catch blocks (return null or fail explicitly)
- Use empty catch blocks or catch without logging
### Helper Functions

Validation helpers should:

- Return `null` for invalid input (not throw)
- Re-throw unexpected errors
- Use specific error type checking

Example:
```ts
export function parseBigIntParam(value: string): bigint | null {
  try {
    return BigInt(value);
  } catch (error) {
    if (error instanceof RangeError || error instanceof SyntaxError) {
      return null; // Expected error for invalid input
    }
    throw error; // Unexpected error - let it bubble up
  }
}
```
Serialization helpers should:

- Avoid silent fallbacks (log warnings if fabricating data)
- Prefer returning `null` over fake values (`"0"`, `new Date()`)
- Document fallback behavior in JSDoc if unavoidable
### Defensive Programming

All list queries must have defensive limits:

```ts
.from(categories)
.orderBy(categories.sortOrder)
.limit(1000); // Prevent memory exhaustion on unbounded queries
```

Filter deleted/soft-deleted records:

```ts
.where(and(
  eq(posts.rootPostId, topicId),
  eq(posts.deleted, false) // Never show deleted content to users
))
```

Use ordering for consistent results:

```ts
.orderBy(asc(posts.createdAt)) // Chronological order for replies
```
### Global Error Handler

The Hono app must have a global error handler as a safety net:

```ts
app.onError((err, c) => {
  console.error("Unhandled error in route handler", {
    path: c.req.path,
    method: c.req.method,
    error: err.message,
    stack: err.stack,
  });
  return c.json(
    {
      error: "An internal error occurred. Please try again later.",
      ...(process.env.NODE_ENV !== "production" && {
        details: err.message,
      }),
    },
    500
  );
});
```
### Testing Error Handling

Test error classification, not just error catching. Users need actionable feedback: "retry later" (503) vs "report this bug" (500).

```ts
// ✅ Test network errors return 503 (retry later)
it("returns 503 when PDS connection fails", async () => {
  mockPutRecord.mockRejectedValueOnce(new Error("fetch failed"));
  const res = await app.request("/api/topics", {
    method: "POST",
    body: JSON.stringify({ text: "Test" }),
  });
  expect(res.status).toBe(503); // Not 500!
  const data = await res.json();
  expect(data.error).toContain("Unable to reach your PDS");
});

// ✅ Test server errors return 500 (bug report)
it("returns 500 for unexpected database errors", async () => {
  mockPutRecord.mockRejectedValueOnce(new Error("Database connection lost"));
  const res = await app.request("/api/topics", {
    method: "POST",
    body: JSON.stringify({ text: "Test" }),
  });
  expect(res.status).toBe(500); // Not 503!
  const data = await res.json();
  expect(data.error).not.toContain("PDS"); // Generic message for server errors
});

// ✅ Test input validation returns 400
it("returns 400 for malformed JSON", async () => {
  const res = await app.request("/api/topics", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: "{ invalid json }",
  });
  expect(res.status).toBe(400);
  const data = await res.json();
  expect(data.error).toContain("Invalid JSON");
});
```
Error classification patterns to test:
- 400 (Bad Request): Invalid input, missing required fields, malformed JSON
- 404 (Not Found): Resource doesn't exist (forum, post, user)
- 503 (Service Unavailable): Network errors, PDS connection failures, timeouts — user should retry
- 500 (Internal Server Error): Unexpected errors, database errors — needs bug investigation
## Documentation & Project Tracking

Keep these synchronized when completing work:

- `docs/atproto-forum-plan.md` — master project plan with phase checklist
  - Mark items complete (`[x]`) when implementation is done and tested
  - Add brief status notes with file references and Linear issue IDs
  - Update immediately after completing milestones
- Linear issues — task tracker at https://linear.app/atbb
  - Update status: Backlog → In Progress → Done
  - Add comments documenting implementation details when marking Done
  - Keep status in sync with the actual codebase state, not planning estimates

Workflow when finishing a task:

```sh
# 1. Run tests to verify implementation is correct
pnpm test

# 2. If tests pass, commit your changes
git add .
git commit -m "feat: your changes"

# 3. Update plan document: mark [x] and add completion note
# 4. Update Linear: change status to Done, add implementation comment
# 5. Push and request code review
# 6. After review approval: include "docs:" prefix when committing plan updates
```
Why this matters: The plan document and Linear can drift from reality as code evolves. Regular synchronization prevents rediscovering completed work and ensures accurate project status.
## Bruno API Collections

CRITICAL: Keep Bruno collections synchronized with API changes.

The `bruno/` directory contains Bruno collections that serve a dual purpose:

- Interactive API testing during development
- Version-controlled API documentation that stays in sync with the code
### When to Update Bruno Collections

When adding a new API endpoint:

- Create a new `.bru` file in the appropriate `bruno/AppView API/` subdirectory
- Follow the naming pattern: use descriptive names like `Create Topic.bru`, `Get Forum Metadata.bru`
- Include all request details: method, URL with variables, headers, body (if POST/PUT)
- Add comprehensive documentation in the `docs` block explaining:
  - Required/optional parameters
  - Expected response format with an example
  - All possible error codes (400, 401, 404, 500, 503)
  - Authentication requirements
  - Validation rules
- Add assertions to validate responses automatically

When modifying an existing endpoint:

- Update the corresponding `.bru` file in `bruno/AppView API/`
- Update parameter descriptions if inputs changed
- Update response documentation if the output format changed
- Update error documentation if new error cases were added
- Update assertions if validation logic changed

When removing an endpoint:

- Delete the corresponding `.bru` file
- Update `bruno/README.md` if it referenced the removed endpoint

When adding new environment variables:

- Update `bruno/environments/local.bru` with local development values
- Update `bruno/environments/dev.bru` with deployment values
- Document the variable in `bruno/README.md` under "Environment Variables Reference"
### Bruno File Template

When creating new `.bru` files, follow this template:

```
meta {
  name: Endpoint Name
  type: http
  seq: 1
}

get {
  url: {{appview_url}}/api/path
}

params:query {
  param1: {{variable}}
}

headers {
  Content-Type: application/json
}

body:json {
  {
    "field": "value"
  }
}

assert {
  res.status: eq 200
  res.body.field: isDefined
}

docs {
  Brief description of what this endpoint does.

  Path/query/body params:
  - param1: Description (type, required/optional)

  Returns:
  {
    "field": "value"
  }

  Error codes:
  - 400: Bad request (invalid input)
  - 401: Unauthorized (requires auth)
  - 404: Not found
  - 500: Server error

  Notes:
  - Any special considerations or validation rules
}
```
### Workflow Integration

When committing API changes, update Bruno collections in the SAME commit:

```sh
# Example: Adding a new endpoint
git add apps/appview/src/routes/my-route.ts
git add apps/appview/src/routes/__tests__/my-route.test.ts
git add bruno/AppView\ API/MyRoute/New\ Endpoint.bru

git commit -m "feat: add new endpoint for X

- Implements POST /api/my-endpoint
- Adds validation for Y
- Updates Bruno collection with request documentation"
```
Why commit together: Bruno collections are API documentation. Keeping them in the same commit ensures the documentation is never out of sync with the implementation.
### Testing Bruno Collections

Before committing:

- Open the collection in Bruno
- Test each modified request against your local dev server (`pnpm dev`)
- Verify assertions pass (green checkmarks)
- Verify documentation is accurate and complete
- Check that error scenarios are documented (not just the happy path)
### Common Mistakes

DON'T:

- Commit API changes without updating Bruno collections
- Use hardcoded URLs instead of environment variables (`{{appview_url}}`)
- Skip documenting error cases (only document 200 responses)
- Leave placeholder/example data that doesn't match actual API behavior
- Forget to update assertions when the response format changes

DO:

- Update Bruno files in the same commit as the route implementation
- Use environment variables for all URLs and test data
- Document all HTTP status codes the endpoint can return
- Include example request/response bodies that match actual behavior
- Test requests locally before committing
## Git Conventions

- Do not include `Co-Authored-By` lines in commit messages.
- `prior-art/` contains git submodules (Rust AppView, original lexicons, delegation spec) — reference material only, not used at build time.
- Worktrees with submodules need `git submodule deinit --all -f` then `git worktree remove --force` to clean up.