WIP! A BB-style forum, on the ATmosphere! We're still working... we'll be back soon when we have something to show off!

atBB Monorepo#

atBB is a decentralized BB-style forum built on the AT Protocol. Users own their posts on their own PDS; the forum's AppView indexes and serves them. Lexicon namespace: space.atbb.* (domain atbb.space is owned). License: AGPL-3.0.

The master project plan with MVP phases and progress tracking lives at docs/atproto-forum-plan.md.

Apps & Packages#

Apps (apps/)#

Servers and applications that are deployed or run as services.

App             Description                                                  Port
@atbb/appview   Hono JSON API server — indexes forum data, serves API        3000
@atbb/web       Hono JSX + HTMX server-rendered web UI — calls appview API   3001

Packages (packages/)#

Shared libraries, tools, and utilities consumed by apps or used standalone.

Package Description
@atbb/db Drizzle ORM schema and connection factory for PostgreSQL
@atbb/lexicon AT Proto lexicon definitions (YAML) + generated TypeScript types
@atbb/spike PDS read/write test script for validating AT Proto operations

Dependency chain: @atbb/lexicon and @atbb/db build first, then @atbb/appview and @atbb/web build in parallel. Turbo handles this via ^build.
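The dependency chain above corresponds to a Turborepo config roughly like the following sketch (the repo's actual turbo.json may differ; note that Turborepo 1.x uses a top-level pipeline key instead of tasks):

```json
{
  "$schema": "https://turbo.build/schema.json",
  "tasks": {
    "build": {
      "dependsOn": ["^build"],
      "outputs": ["dist/**"]
    }
  }
}
```

The `^build` entry tells Turbo to build a package's workspace dependencies first, which is how @atbb/lexicon and @atbb/db end up ahead of the two apps without any explicit ordering.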

Development#

Setup#

devenv shell                    # enter Nix dev shell (Node.js, pnpm, turbo)
pnpm install                    # install all workspace dependencies
cp .env.example .env            # configure environment variables

Commands#

pnpm build                      # build all packages (lexicon + db → appview + web)
pnpm dev                        # start all dev servers with hot reload
pnpm test                       # run all tests across all packages
pnpm clean                      # remove all dist/ directories
devenv up                       # start appview + web servers via process manager
pnpm --filter @atbb/appview dev # run a single package
pnpm --filter @atbb/appview test # run tests for a single package
pnpm --filter @atbb/spike spike # run the PDS spike script

Environment Variables#

See .env.example. Key variables:

  • PORT — server port (appview: 3000, web: 3001)
  • FORUM_DID — the forum's AT Proto DID
  • PDS_URL — URL of the forum's PDS
  • APPVIEW_URL — URL the web package uses to reach the appview API
  • FORUM_HANDLE, FORUM_PASSWORD — forum service account credentials (for spike/writes)
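A minimal sketch of how an app might read and validate these variables at startup (the loadConfig helper and AppConfig shape are hypothetical, not the repo's actual code; the variable names match .env.example):

```typescript
// Hypothetical config loader; fail fast on missing required variables.
interface AppConfig {
  port: number;
  forumDid: string;
  pdsUrl: string;
}

export function loadConfig(env: Record<string, string | undefined>): AppConfig {
  const forumDid = env.FORUM_DID;
  const pdsUrl = env.PDS_URL;
  if (!forumDid || !pdsUrl) {
    throw new Error("FORUM_DID and PDS_URL must be set (see .env.example)");
  }
  return {
    port: Number(env.PORT ?? "3000"), // appview default; web would use 3001
    forumDid,
    pdsUrl,
  };
}
```

Called as `loadConfig(process.env)` once at boot, so route handlers never touch process.env directly.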

Testing Standards#

CRITICAL: Always run tests before committing code or requesting code review.

Running Tests#

# Run all tests
pnpm test

# Run tests for a specific package
pnpm --filter @atbb/appview test

# Run tests in watch mode during development
pnpm --filter @atbb/appview test --watch

# Run a specific test file
pnpm --filter @atbb/appview test src/lib/__tests__/config.test.ts

When to Run Tests#

Before every commit:

pnpm test  # Verify all tests pass
git add .
git commit -m "feat: your changes"

Before requesting code review:

pnpm build  # Ensure clean build
pnpm test   # Verify all tests pass
# Only then push and request review

After fixing review feedback:

# Make fixes
pnpm test   # Verify tests still pass
# Push updates

Test Requirements#

All new features must include tests:

  • API endpoints: Test success cases, error cases, edge cases
  • Business logic: Test all code paths and error conditions
  • Error handling: Test that errors are caught and logged appropriately
  • Security features: Test authentication, authorization, input validation

Test quality standards:

  • Tests must be independent (no shared state between tests)
  • Use descriptive test names that explain what is being tested
  • Mock external dependencies (databases, APIs, network calls)
  • Test error paths, not just happy paths
  • Verify logging and error messages are correct

Red flags (do not commit):

  • Skipped tests (test.skip, it.skip) without a Linear issue tracking why
  • Tests that pass locally but fail in CI
  • Tests that require manual setup or specific data
  • Tests with hardcoded timing (setTimeout, sleep); use proper mocks instead
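One way to avoid hardcoded timing is to inject the clock as a parameter so tests pass a fixed value instead of sleeping. A sketch (the isExpired helper is hypothetical, not code from this repo):

```typescript
// Inject the clock so tests never need real setTimeout/sleep.
type Clock = () => number;

function isExpired(createdAtMs: number, ttlMs: number, now: Clock = Date.now): boolean {
  return now() - createdAtMs >= ttlMs;
}

// In a test, pass a fixed clock instead of sleeping:
const fixedNow: Clock = () => 10_000;
isExpired(1_000, 5_000, fixedNow); // → true, deterministic and instant
```

Production callers omit the last argument and get the real clock; tests stay fast and never flake on machine load.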

Example Test Structure#

describe("createForumRoutes", () => {
  it("returns forum metadata when forum exists", async () => {
    // Arrange: Set up test context with mock data
    const ctx = await createTestContext();

    // Act: Call the endpoint
    const res = await app.request("/api/forum");

    // Assert: Verify response
    expect(res.status).toBe(200);
    const data = await res.json();
    expect(data.name).toBe("Test Forum");
  });

  it("returns 404 when forum does not exist", async () => {
    // Test error case
    const ctx = await createTestContext({ emptyDb: true });
    const res = await app.request("/api/forum");
    expect(res.status).toBe(404);
  });
});

Test Coverage Expectations#

While we don't enforce strict coverage percentages, aim for:

  • Critical paths: 100% coverage (authentication, authorization, data integrity)
  • Error handling: All catch blocks should be tested
  • API endpoints: All routes should have tests
  • Business logic: All functions with branching logic should be tested

Do not:

  • Skip writing tests to "move faster" - untested code breaks in production
  • Write tests after requesting review - tests inform implementation
  • Rely on manual testing alone - automated tests catch regressions

Before Requesting Code Review#

CRITICAL: Run this checklist before requesting review to catch issues early:

# 1. Verify all tests pass
pnpm test

# 2. Check runtime dependencies are correctly placed
# (Runtime imports must be in dependencies, not devDependencies)
grep -r "from 'drizzle-orm'" apps/*/src  # If found, verify in dependencies
grep -r "from 'postgres'" apps/*/src    # If found, verify in dependencies

# 3. Verify error test coverage is comprehensive
# For API endpoints, ensure you have tests for:
# - Input validation (missing fields, wrong types, malformed JSON)
# - Error classification (network→503, server→500)
# - Error message clarity (user-friendly, no stack traces)

Common mistake: Adding error tests AFTER review feedback instead of DURING implementation. Write error tests immediately after implementing the happy path — they often reveal bugs in error classification and input validation that are better caught before review.

Lexicon Conventions#

  • Source of truth is YAML in packages/lexicon/lexicons/. Never edit generated JSON or TypeScript.
  • Build pipeline: YAML → JSON (scripts/build.ts) → TypeScript (@atproto/lex-cli gen-api).
  • Adding a new lexicon: Create a .yaml file under lexicons/space/atbb/, run pnpm --filter @atbb/lexicon build.
  • Record keys: Use key: tid for collections (multiple records per repo). Use key: literal:self for singletons.
  • References: Use com.atproto.repo.strongRef wrapped in named defs (e.g., forumRef, subjectRef).
  • Extensible fields: Use knownValues (not enum) for strings that may grow (permissions, reaction types, mod actions).
  • Record ownership:
    • Forum DID owns: forum.forum, forum.category, forum.role, modAction
    • User DID owns: post, membership, reaction
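A sketch of what a collection lexicon following these conventions might look like (field names and constraints are illustrative only; the real schemas live in packages/lexicon/lexicons/):

```yaml
# Hypothetical lexicons/space/atbb/post.yaml sketch, not the real schema.
lexicon: 1
id: space.atbb.post
defs:
  main:
    type: record
    key: tid              # collection: many posts per repo
    record:
      type: object
      required: [text, createdAt]
      properties:
        text:
          type: string
          maxLength: 3000
        createdAt:
          type: string
          format: datetime
```

A singleton such as the forum record itself would instead use key: literal:self, yielding exactly one record per repo at rkey "self".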

AT Protocol Conventions#

  • Unified post model: There is no separate "topic" type. A space.atbb.post without a reply ref is a topic starter; one with a reply ref is a reply.
  • Reply chains: replyRef has both root (thread starter) and parent (direct parent) — same pattern as app.bsky.feed.post.
  • MVP trust model: The AppView holds the Forum DID's signing keys directly and writes forum-level records on behalf of admins/mods after verifying their role. This will be replaced by AT Protocol privilege delegation post-MVP.
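The unified post model can be sketched in TypeScript (the PostRecord shape here is illustrative; the generated types in @atbb/lexicon are the source of truth):

```typescript
// Sketch only: field names beyond `reply` are illustrative.
interface StrongRef {
  uri: string;
  cid: string;
}

interface PostRecord {
  $type: "space.atbb.post";
  text: string;
  createdAt: string;
  // Same root/parent shape as app.bsky.feed.post's reply ref.
  reply?: { root: StrongRef; parent: StrongRef };
}

// A post with no reply ref starts a topic; one with a reply ref is a reply.
function isTopicStarter(post: PostRecord): boolean {
  return post.reply === undefined;
}
```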

TypeScript / Hono Gotchas#

  • @types/node is required as a devDependency in every package that uses process.env or other Node APIs. tsx doesn't need it at runtime, but tsc builds will fail without it.
  • Hono JSX children: Use PropsWithChildren<T> from hono/jsx for components that accept children. Unlike React, Hono's FC<T> does not include children implicitly.
  • HTMX attributes in JSX: The typed-htmx package provides types for hx-* attributes. See apps/web/src/global.d.ts for the augmentation.
  • Glob expansion in npm scripts: @atproto/lex-cli needs file paths, not globs. Use bash -c 'shopt -s globstar && ...' to expand **/*.json in npm scripts.
  • .env loading: Dev and spike scripts use Node's --env-file=../../.env flag to load the root .env file. No dotenv dependency needed.
  • API endpoint parameter type guards: Never trust TypeScript types for user input. Change handler parameter types from string to unknown and add explicit typeof checks. TypeScript types are erased at runtime — a request missing the text field will pass type checking but crash with TypeError: text.trim is not a function.
    // ❌ BAD: Assumes text is always a string at runtime
    export function validatePostText(text: string): { valid: boolean; error?: string } {
      const trimmed = text.trim();  // Crashes if text is undefined!
      // ...
    }
    
    // ✅ GOOD: Type guard protects against runtime type mismatches
    export function validatePostText(text: unknown): { valid: boolean; error?: string } {
      if (typeof text !== "string") {
        return { valid: false, error: "Text is required and must be a string" };
      }
      const trimmed = text.trim();  // Safe - text is proven to be a string
      // ...
    }
    
  • Hono JSON parsing safety: await c.req.json() throws SyntaxError for malformed JSON. Always wrap in try-catch and return 400 for client errors:
    let body: any;
    try {
      body = await c.req.json();
    } catch {
      return c.json({ error: "Invalid JSON in request body" }, 400);
    }
    

Error Handling Standards#

Follow these patterns for robust, debuggable production code:

API Route Handlers#

Required for all database-backed endpoints:

  1. Validate input parameters before database queries (return 400 for invalid input)
  2. Wrap database queries in try-catch with structured logging
  3. Check resource existence explicitly (return 404 for missing resources)
  4. Return proper HTTP status codes (400/404/500, not always 500)

Example pattern:

export function createForumRoutes(ctx: AppContext) {
  return new Hono().get("/", async (c) => {
    try {
      const [forum] = await ctx.db
        .select()
        .from(forums)
        .where(eq(forums.rkey, "self"))
        .limit(1);

      if (!forum) {
        return c.json({ error: "Forum not found" }, 404);
      }

      return c.json({ /* success response */ });
    } catch (error) {
      console.error("Failed to query forum metadata", {
        operation: "GET /api/forum",
        error: error instanceof Error ? error.message : String(error),
      });
      return c.json(
        { error: "Failed to retrieve forum metadata. Please try again later." },
        500
      );
    }
  });
}

Catch Block Guidelines#

DO:

  • Catch specific error types when possible (instanceof RangeError, instanceof SyntaxError)
  • Re-throw unexpected errors (don't swallow programming bugs like TypeError)
  • Log with structured context: operation name, relevant IDs, error message
  • Return user-friendly messages (no stack traces in production)

DON'T:

  • Use bare catch blocks that hide all error types
  • Return the same generic "try again later" message for client errors (400) as for server errors (500); classify and respond accordingly
  • Fabricate data in catch blocks (return null or fail explicitly instead)
  • Use empty catch blocks or catch without logging

Helper Functions#

Validation helpers should:

  • Return null for invalid input (not throw)
  • Re-throw unexpected errors
  • Use specific error type checking

Example:

export function parseBigIntParam(value: string): bigint | null {
  try {
    return BigInt(value);
  } catch (error) {
    if (error instanceof RangeError || error instanceof SyntaxError) {
      return null;  // Expected error for invalid input
    }
    throw error;  // Unexpected error - let it bubble up
  }
}

Serialization helpers should:

  • Avoid silent fallbacks (log warnings if fabricating data)
  • Prefer returning null over fake values ("0", new Date())
  • Document fallback behavior in JSDoc if unavoidable
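A sketch of a serialization helper following these rules (the serializeCreatedAt name is hypothetical, not code from this repo):

```typescript
// Prefer returning null over fabricating a timestamp like `new Date()`.
function serializeCreatedAt(value: unknown): string | null {
  if (value instanceof Date && !Number.isNaN(value.getTime())) {
    return value.toISOString();
  }
  // Log the fallback loudly instead of silently inventing data.
  console.warn("serializeCreatedAt: missing or invalid date, returning null", { value });
  return null;
}
```

Callers then decide whether a null timestamp is acceptable in their response, rather than every consumer unknowingly receiving a fake "now".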

Defensive Programming#

All list queries must have defensive limits:

.from(categories)
.orderBy(categories.sortOrder)
.limit(1000);  // Prevent memory exhaustion on unbounded queries

Filter deleted/soft-deleted records:

.where(and(
  eq(posts.rootPostId, topicId),
  eq(posts.deleted, false)  // Never show deleted content to users
))

Use ordering for consistent results:

.orderBy(asc(posts.createdAt))  // Chronological order for replies

Global Error Handler#

The Hono app must have a global error handler as a safety net:

app.onError((err, c) => {
  console.error("Unhandled error in route handler", {
    path: c.req.path,
    method: c.req.method,
    error: err.message,
    stack: err.stack,
  });
  return c.json(
    {
      error: "An internal error occurred. Please try again later.",
      ...(process.env.NODE_ENV !== "production" && {
        details: err.message,
      }),
    },
    500
  );
});

Testing Error Handling#

Test error classification, not just error catching. Users need actionable feedback: "retry later" (503) vs "report this bug" (500).

// ✅ Test network errors return 503 (retry later)
it("returns 503 when PDS connection fails", async () => {
  mockPutRecord.mockRejectedValueOnce(new Error("fetch failed"));
  const res = await app.request("/api/topics", {
    method: "POST",
    body: JSON.stringify({ text: "Test" })
  });
  expect(res.status).toBe(503);  // Not 500!
  const data = await res.json();
  expect(data.error).toContain("Unable to reach your PDS");
});

// ✅ Test server errors return 500 (bug report)
it("returns 500 for unexpected database errors", async () => {
  mockPutRecord.mockRejectedValueOnce(new Error("Database connection lost"));
  const res = await app.request("/api/topics", {
    method: "POST",
    body: JSON.stringify({ text: "Test" })
  });
  expect(res.status).toBe(500);  // Not 503!
  const data = await res.json();
  expect(data.error).not.toContain("PDS");  // Generic message for server errors
});

// ✅ Test input validation returns 400
it("returns 400 for malformed JSON", async () => {
  const res = await app.request("/api/topics", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: "{ invalid json }"
  });
  expect(res.status).toBe(400);
  const data = await res.json();
  expect(data.error).toContain("Invalid JSON");
});

Error classification patterns to test:

  • 400 (Bad Request): Invalid input, missing required fields, malformed JSON
  • 404 (Not Found): Resource doesn't exist (forum, post, user)
  • 503 (Service Unavailable): Network errors, PDS connection failures, timeouts — user should retry
  • 500 (Internal Server Error): Unexpected errors, database errors — needs bug investigation

Documentation & Project Tracking#

Keep these synchronized when completing work:

  1. docs/atproto-forum-plan.md — Master project plan with phase checklist

    • Mark items complete [x] when implementation is done and tested
    • Add brief status notes with file references and Linear issue IDs
    • Update immediately after completing milestones
  2. Linear issues — Task tracker at https://linear.app/atbb

    • Update status: Backlog → In Progress → Done
    • Add comments documenting implementation details when marking Done
    • Keep status in sync with actual codebase state, not planning estimates
  3. Workflow: When finishing a task:

    # 1. Run tests to verify implementation is correct
    pnpm test
    
    # 2. If tests pass, commit your changes
    git add .
    git commit -m "feat: your changes"
    
    # 3. Update plan document: mark [x] and add completion note
    # 4. Update Linear: change status to Done, add implementation comment
    # 5. Push and request code review
    # 6. After review approval: include "docs:" prefix when committing plan updates
    

Why this matters: The plan document and Linear can drift from reality as code evolves. Regular synchronization prevents rediscovering completed work and ensures accurate project status.

Git Conventions#

  • Do not include Co-Authored-By lines in commit messages.
  • prior-art/ contains git submodules (Rust AppView, original lexicons, delegation spec) — reference material only, not used at build time.
  • Cleaning up a worktree that contains submodules requires git submodule deinit --all -f followed by git worktree remove --force.