atproto pds

A minimal AT Protocol Personal Data Server written in JavaScript.

feat(lexicon-resolver): add live lexicon validation

Complete lexicon validation implementation with live DNS-based schema resolution.

Key features:
- DNS authority lookup via _lexicon.{domain} subdomain
- DID resolution to PDS endpoint
- Recursive ref resolution (fetches all referenced schemas)
- Bundled core com.atproto schemas (strongRef, label.defs, moderation.defs)
- Cloudflare Workers compatibility (fetch binding fix)
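
The DNS authority step can be sketched in a few lines. `lexiconDnsName` below is an illustrative helper, not part of the package API, and it assumes the authority is the NSID minus its final segment, reversed:

```javascript
// Sketch: derive the _lexicon DNS name for an NSID (hypothetical helper;
// assumes authority = all NSID segments except the record name, reversed).
function lexiconDnsName(nsid) {
  const authority = nsid.split('.').slice(0, -1).reverse().join('.');
  return `_lexicon.${authority}`;
}
```

A TXT lookup on that name is then expected to yield a `did=` value identifying the schema authority, which gets resolved to a PDS endpoint for the actual schema fetch.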

Changes:
- Add @pds/lexicon-resolver package with DNS/DID resolution
- Add LexiconResolverPort to core PDS
- Validate records in createRecord, putRecord, applyWrites
- Return validationStatus: 'valid' | 'unknown' in responses
- Add default LexiconResolver to node, deno, cloudflare adapters
- Add e2e test for live lexicon resolution
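
The port surface is intentionally small; a minimal static sketch of a `LexiconResolverPort` implementation (illustrative only — the shipped resolver resolves schemas dynamically rather than hardcoding one check) looks like:

```javascript
// Minimal LexiconResolverPort sketch: one hardcoded schema check (illustrative).
const staticResolver = {
  async assertValid(collection, record) {
    if (collection !== 'app.bsky.feed.post') return 'unknown'; // schema unresolvable
    if (typeof record.text !== 'string') {
      throw new Error(`Invalid ${collection} record: text must be a string`);
    }
    return 'valid';
  },
};
```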

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>

+3505 -21
+3 -3
docs/endpoint-comparison.md
```diff
@@ -117,10 +117,10 @@
 | Endpoint | pds.js has | Missing from pds.js |
 |----------|------------|---------------------|
-| repo.createRecord | collection, record, rkey | **repo**, **validate**, swapCommit |
+| repo.createRecord | collection, record, rkey, **validate** | **repo**, swapCommit |
 | repo.deleteRecord | collection, rkey | **repo**, swapCommit, swapRecord |
-| repo.putRecord | collection, rkey, record | **repo**, **validate**, swapCommit, swapRecord |
-| repo.applyWrites | writes | **repo**, validate, swapCommit |
+| repo.putRecord | collection, rkey, record, **validate** | **repo**, swapCommit, swapRecord |
+| repo.applyWrites | writes, **validate** | **repo**, swapCommit |
 | sync.getRepo | did | since |
 | sync.listBlobs | did, cursor, limit | since |
 | sync.listRepos | (none) | cursor, limit |
```
+746
docs/plans/2026-01-13-lexicon-resolver.md
# Lexicon Resolver Implementation Plan

> **For Claude:** REQUIRED SUB-SKILL: Use superpowers:executing-plans to implement this plan task-by-task.

**Goal:** Add optional lexicon validation to record writes with dynamic schema resolution.

**Architecture:** Core defines a `LexiconResolverPort` interface and wires it into write handlers (createRecord, putRecord, applyWrites). A separate `@pds/lexicon-resolver` package wraps `@atproto/lex-resolver` for schema resolution and `@atproto/lexicon` for validation.

**Tech Stack:** `@atproto/lexicon` (validation), `@atproto/lex-resolver` (DNS + PDS schema fetching)

---

## Task 1: Add LexiconResolverPort to ports.js

**Files:**
- Modify: `packages/core/src/ports.js:127-129`

**Step 1: Add the port typedef**

Add before the final `export {}`:

```javascript
/**
 * @typedef {'valid' | 'unknown'} ValidationStatus
 */

/**
 * Lexicon resolver port for validating records against schemas.
 * Implementations may resolve schemas dynamically (DNS + fetch) or use static schemas.
 *
 * @typedef {Object} LexiconResolverPort
 * @property {(collection: string, record: object) => Promise<ValidationStatus>} assertValid
 *   Resolves schema if needed, validates record.
 *   Returns 'valid' on success, 'unknown' if schema unresolvable.
 *   Throws Error with descriptive message if record is invalid against known schema.
 */
```

**Step 2: Commit**

```bash
git add packages/core/src/ports.js
git commit -m "feat(core): add LexiconResolverPort typedef

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>"
```

---

## Task 2: Add lexiconResolver to PDS constructor

**Files:**
- Modify: `packages/core/src/pds.js:300-340`

**Step 1: Add JSDoc for the new parameter**

Add to the constructor JSDoc (after line 311):

```javascript
 * @param {import('./ports.js').LexiconResolverPort} [config.lexiconResolver] - Optional lexicon resolver for record validation
```

**Step 2: Add to constructor destructuring**

Add `lexiconResolver` to the destructuring (after `webSocket,`):

```javascript
      webSocket,
      lexiconResolver,
```

**Step 3: Store on instance**

Add after `this.webSocket = webSocket;` (line 334):

```javascript
    this.lexiconResolver = lexiconResolver;
```

**Step 4: Commit**

```bash
git add packages/core/src/pds.js
git commit -m "feat(core): add lexiconResolver to PDS constructor

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>"
```

---

## Task 3: Add validation to handleCreateRecord

**Files:**
- Modify: `packages/core/src/pds.js:742-819`
- Test: `packages/core/test/lexicon.test.js` (create)

**Step 1: Write failing test**

Create `packages/core/test/lexicon.test.js`:

```javascript
import { describe, expect, test } from 'vitest';

// Mock resolver that tracks calls
function createMockResolver(behavior = 'valid') {
  const calls = [];
  return {
    calls,
    async assertValid(collection, record) {
      calls.push({ collection, record });
      if (behavior === 'valid') return 'valid';
      if (behavior === 'unknown') return 'unknown';
      if (behavior === 'invalid') throw new Error('Invalid record: missing required field');
      return 'valid';
    },
  };
}

describe('Lexicon Validation', () => {
  describe('LexiconResolverPort mock', () => {
    test('calls resolver with collection and record', async () => {
      const resolver = createMockResolver('valid');
      expect(resolver.calls).toHaveLength(0);

      const status = await resolver.assertValid('app.bsky.feed.post', { text: 'hello' });

      expect(resolver.calls).toHaveLength(1);
      expect(resolver.calls[0].collection).toBe('app.bsky.feed.post');
      expect(resolver.calls[0].record).toEqual({ text: 'hello' });
      expect(status).toBe('valid');
    });

    test('returns unknown when schema not found', async () => {
      const resolver = createMockResolver('unknown');
      const status = await resolver.assertValid('com.example.unknown', {});
      expect(status).toBe('unknown');
    });

    test('throws when record is invalid', async () => {
      const resolver = createMockResolver('invalid');
      await expect(
        resolver.assertValid('app.bsky.feed.post', {})
      ).rejects.toThrow('Invalid record');
    });
  });
});
```

**Step 2: Run test to verify it passes**

```bash
cd /Users/chadmiller/code/pds-experiment && pnpm test packages/core/test/lexicon.test.js
```

Expected: PASS (tests the mock behavior)

**Step 3: Update handleCreateRecord to use resolver**

In `handleCreateRecord`, replace the current record handling (around lines 785-818) with:

```javascript
    const rkey = providedRkey || createTid();
    const uri = `at://${did}/${collection}/${rkey}`;

    // Ensure $type is set
    const recordWithType = { $type: collection, ...record };

    // Validate record against schema if resolver provided
    const shouldValidate = _validate !== false;
    let validationStatus;

    if (shouldValidate && this.lexiconResolver) {
      try {
        validationStatus = await this.lexiconResolver.assertValid(
          collection,
          recordWithType,
        );
      } catch (err) {
        return Response.json(
          {
            error: 'InvalidRecord',
            message: `Invalid ${collection} record: ${err.message}`,
          },
          { status: 400 },
        );
      }
    } else if (shouldValidate && !this.lexiconResolver) {
      validationStatus = 'unknown';
    }
    // If _validate === false, validationStatus remains undefined

    // Encode record and create CID
    const encoded = cborEncodeDagCbor(recordWithType);
    const cid = cidToString(await createCid(encoded));

    // Store block
    await this.actorStorage.putBlock(cid, encoded);

    // Store record
    await this.actorStorage.putRecord(uri, cid, collection, rkey, encoded);

    // Link blobs to record
    const blobRefs = findBlobRefs(record);
    for (const blobCid of blobRefs) {
      await this.actorStorage.linkBlobToRecord(blobCid, uri);
    }

    // Create commit
    await this.createCommit(did, [{ action: 'create', uri, cid }]);

    // Get commit info
    const latestCommit = await this.actorStorage.getLatestCommit();

    return Response.json({
      uri,
      cid,
      commit: latestCommit
        ? { cid: latestCommit.cid, rev: latestCommit.rev }
        : null,
      validationStatus,
    });
```

**Step 4: Run tests**

```bash
cd /Users/chadmiller/code/pds-experiment && pnpm test
```

**Step 5: Commit**

```bash
git add packages/core/src/pds.js packages/core/test/lexicon.test.js
git commit -m "feat(core): add lexicon validation to createRecord

- Calls lexiconResolver.assertValid when resolver provided
- Returns InvalidRecord error when validation fails
- Returns validationStatus in response (valid/unknown/undefined)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>"
```

---

## Task 4: Add validation to handlePutRecord

**Files:**
- Modify: `packages/core/src/pds.js:915-1018`

**Step 1: Add validation logic to handlePutRecord**

Find the section after the scope permissions check (around line 950) and add the same validation pattern. Update to include `_validate` in the destructuring and add validation logic before encoding:

```javascript
    const {
      repo,
      collection,
      rkey,
      record,
      validate: _validate,
    } = await request.json();
```

Then after the scope check, before encoding:

```javascript
    // Ensure $type is set
    const recordWithType = { $type: collection, ...record };

    // Validate record against schema if resolver provided
    const shouldValidate = _validate !== false;
    let validationStatus;

    if (shouldValidate && this.lexiconResolver) {
      try {
        validationStatus = await this.lexiconResolver.assertValid(
          collection,
          recordWithType,
        );
      } catch (err) {
        return Response.json(
          {
            error: 'InvalidRecord',
            message: `Invalid ${collection} record: ${err.message}`,
          },
          { status: 400 },
        );
      }
    } else if (shouldValidate && !this.lexiconResolver) {
      validationStatus = 'unknown';
    }
```

**Step 2: Update response to include validationStatus**

Add `validationStatus` to the response object.

**Step 3: Run tests**

```bash
cd /Users/chadmiller/code/pds-experiment && pnpm test
```

**Step 4: Commit**

```bash
git add packages/core/src/pds.js
git commit -m "feat(core): add lexicon validation to putRecord

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>"
```

---

## Task 5: Add validation to handleApplyWrites

**Files:**
- Modify: `packages/core/src/pds.js:1021-1170`

**Step 1: Extract validate param from request body**

Add `validate` to the destructuring:

```javascript
    const { repo, writes, validate, swapCommit: _swapCommit } = await request.json();
```

**Step 2: Add validation inside create case**

For the create case (around line 1070), add validation before storing:

```javascript
      if ($type === 'com.atproto.repo.applyWrites#create') {
        const { collection, rkey: providedRkey, value } = write;

        // Validate if resolver provided and validate !== false
        const shouldValidate = validate !== false;
        let validationStatus;
        const recordWithType = { $type: collection, ...value };

        if (shouldValidate && this.lexiconResolver) {
          try {
            validationStatus = await this.lexiconResolver.assertValid(
              collection,
              recordWithType,
            );
          } catch (err) {
            return Response.json(
              {
                error: 'InvalidRecord',
                message: `Invalid ${collection} record: ${err.message}`,
              },
              { status: 400 },
            );
          }
        } else if (shouldValidate && !this.lexiconResolver) {
          validationStatus = 'unknown';
        }

        // ... rest of create logic, using recordWithType for encoding
```

**Step 3: Add validation to update case**

Same pattern for the update case.

**Step 4: Add validationStatus to create/update results**

```javascript
        results.push({
          $type: 'com.atproto.repo.applyWrites#createResult',
          uri,
          cid,
          validationStatus,
        });
```

**Step 5: Run tests**

```bash
cd /Users/chadmiller/code/pds-experiment && pnpm test
```

**Step 6: Commit**

```bash
git add packages/core/src/pds.js
git commit -m "feat(core): add lexicon validation to applyWrites

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>"
```

---

## Task 6: Create lexicon-resolver package

**Files:**
- Create: `packages/lexicon-resolver/package.json`
- Create: `packages/lexicon-resolver/src/index.js`
- Create: `packages/lexicon-resolver/src/resolver.js`

**Step 1: Create package.json**

```json
{
  "name": "@pds/lexicon-resolver",
  "version": "0.1.0",
  "type": "module",
  "main": "src/index.js",
  "exports": {
    ".": "./src/index.js",
    "./resolver": "./src/resolver.js"
  },
  "dependencies": {
    "@atproto/lexicon": "^0.4.0",
    "@atproto/lex-resolver": "^0.1.0"
  },
  "devDependencies": {
    "vitest": "^2.1.8"
  }
}
```

**Step 2: Create src/index.js**

```javascript
export { LexiconResolver } from './resolver.js';
```

**Step 3: Create src/resolver.js**

```javascript
import { Lexicons } from '@atproto/lexicon';
import { LexResolver } from '@atproto/lex-resolver';

/**
 * @typedef {import('@pds/core/ports').ValidationStatus} ValidationStatus
 */

/**
 * Lexicon resolver using @atproto/lex-resolver for schema resolution
 * and @atproto/lexicon for validation.
 * Implements LexiconResolverPort.
 */
export class LexiconResolver {
  /**
   * @param {Object} [opts]
   * @param {number} [opts.ttl=3600000] - Cache TTL in ms (default 1 hour)
   * @param {typeof fetch} [opts.fetch] - Fetch implementation
   * @param {string} [opts.plcDirectoryUrl] - PLC directory URL
   * @param {Array<object>} [opts.schemas] - Pre-loaded static schemas (LexiconDoc[])
   */
  constructor(opts = {}) {
    this.lexicons = new Lexicons();
    this.resolver = new LexResolver({
      fetch: opts.fetch,
      plcDirectoryUrl: opts.plcDirectoryUrl,
    });
    this.ttl = opts.ttl ?? 3600000;
    this.cache = new Map(); // nsid -> { fetchedAt }

    // Pre-load static schemas
    for (const schema of opts.schemas ?? []) {
      try {
        this.lexicons.add(schema);
        this.cache.set(schema.id, { fetchedAt: Date.now() });
      } catch {
        // Schema already added or invalid, skip
      }
    }
  }

  /**
   * Validate a record against its schema.
   * @param {string} collection - The collection/NSID (e.g., 'app.bsky.feed.post')
   * @param {object} record - The record to validate
   * @returns {Promise<ValidationStatus>}
   */
  async assertValid(collection, record) {
    // Ensure schema is loaded
    const hasSchema = await this.ensureSchema(collection);
    if (!hasSchema) {
      return 'unknown';
    }

    // Validate using @atproto/lexicon
    this.lexicons.assertValidRecord(collection, record);
    return 'valid';
  }

  /**
   * Ensure a schema is loaded, fetching if necessary.
   * @param {string} nsid
   * @returns {Promise<boolean>} - true if schema is available
   */
  async ensureSchema(nsid) {
    // Check if already loaded and not stale
    const cached = this.cache.get(nsid);
    if (cached && Date.now() - cached.fetchedAt < this.ttl) {
      return this.lexicons.get(nsid) !== undefined;
    }

    // Already have it but checking staleness
    if (this.lexicons.get(nsid)) {
      return true;
    }

    // Try to resolve
    try {
      const result = await this.resolver.get(nsid);
      if (result?.lexicon) {
        this.lexicons.add(result.lexicon);
        this.cache.set(nsid, { fetchedAt: Date.now() });
        return true;
      }
    } catch {
      // Resolution failed
    }

    return false;
  }
}
```

**Step 4: Commit**

```bash
git add packages/lexicon-resolver/
git commit -m "feat(lexicon-resolver): create package with @atproto/lexicon + lex-resolver

- LexiconResolver class implementing LexiconResolverPort
- Uses @atproto/lex-resolver for DNS + PDS schema fetching
- Uses @atproto/lexicon for validation
- In-memory caching with TTL

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>"
```

---

## Task 7: Add tests for LexiconResolver

**Files:**
- Create: `packages/lexicon-resolver/test/resolver.test.js`

**Step 1: Create test file**

```javascript
import { describe, expect, test, vi } from 'vitest';
import { LexiconResolver } from '../src/resolver.js';

// Sample lexicon schema for testing
const postSchema = {
  lexicon: 1,
  id: 'app.bsky.feed.post',
  defs: {
    main: {
      type: 'record',
      key: 'tid',
      record: {
        type: 'object',
        required: ['text', 'createdAt'],
        properties: {
          text: { type: 'string', maxLength: 3000 },
          createdAt: { type: 'string', format: 'datetime' },
        },
      },
    },
  },
};

describe('LexiconResolver', () => {
  describe('with static schemas', () => {
    test('validates record against pre-loaded schema', async () => {
      const resolver = new LexiconResolver({
        schemas: [postSchema],
      });

      const status = await resolver.assertValid('app.bsky.feed.post', {
        $type: 'app.bsky.feed.post',
        text: 'Hello world',
        createdAt: new Date().toISOString(),
      });

      expect(status).toBe('valid');
    });

    test('throws on invalid record', async () => {
      const resolver = new LexiconResolver({
        schemas: [postSchema],
      });

      await expect(
        resolver.assertValid('app.bsky.feed.post', {
          $type: 'app.bsky.feed.post',
          // missing required fields
        })
      ).rejects.toThrow();
    });

    test('returns unknown for unloaded schema', async () => {
      const resolver = new LexiconResolver({
        schemas: [postSchema],
      });

      const status = await resolver.assertValid('com.example.unknown', {
        $type: 'com.example.unknown',
      });

      expect(status).toBe('unknown');
    });
  });

  describe('ensureSchema', () => {
    test('returns true for pre-loaded schema', async () => {
      const resolver = new LexiconResolver({
        schemas: [postSchema],
      });

      const result = await resolver.ensureSchema('app.bsky.feed.post');
      expect(result).toBe(true);
    });

    test('returns false for unknown schema when resolution fails', async () => {
      const resolver = new LexiconResolver();
      // Mock the resolver to fail
      resolver.resolver.get = vi.fn().mockRejectedValue(new Error('Not found'));

      const result = await resolver.ensureSchema('com.example.unknown');
      expect(result).toBe(false);
    });
  });
});
```

**Step 2: Run tests**

```bash
cd /Users/chadmiller/code/pds-experiment && pnpm test packages/lexicon-resolver/test/resolver.test.js
```

**Step 3: Commit**

```bash
git add packages/lexicon-resolver/test/
git commit -m "test(lexicon-resolver): add unit tests

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>"
```

---

## Task 8: Install dependencies and update workspace

**Step 1: Install dependencies**

```bash
cd /Users/chadmiller/code/pds-experiment && pnpm install
```

**Step 2: Run full test suite**

```bash
cd /Users/chadmiller/code/pds-experiment && pnpm test
```

**Step 3: Run linter**

```bash
cd /Users/chadmiller/code/pds-experiment && pnpm lint
```

**Step 4: Fix any issues and commit**

```bash
git add pnpm-lock.yaml
git commit -m "chore: install lexicon-resolver dependencies

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>"
```

---

## Task 9: Update endpoint comparison docs

**Files:**
- Modify: `docs/endpoint-comparison.md`

**Step 1: Update the missing parameters table**

Change from:

```markdown
| repo.createRecord | collection, record, rkey | **repo**, **validate**, swapCommit |
| repo.putRecord | collection, rkey, record | **repo**, **validate**, swapCommit, swapRecord |
| repo.applyWrites | writes | **repo**, validate, swapCommit |
```

To:

```markdown
| repo.createRecord | collection, record, rkey, **validate** | **repo**, swapCommit |
| repo.putRecord | collection, rkey, record, **validate** | **repo**, swapCommit, swapRecord |
| repo.applyWrites | writes, **validate** | **repo**, swapCommit |
```

**Step 2: Commit**

```bash
git add docs/endpoint-comparison.md
git commit -m "docs: update endpoint comparison with validate param support

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>"
```

---

## Summary

After completing all tasks:

1. **@pds/core** has:
   - `LexiconResolverPort` typedef in `ports.js`
   - Optional `lexiconResolver` in PDS constructor
   - `validate` param on createRecord, putRecord, applyWrites
   - `validationStatus` in responses

2. **@pds/lexicon-resolver** provides:
   - `LexiconResolver` class implementing the port
   - Uses `@atproto/lex-resolver` for DNS + PDS schema fetching
   - Uses `@atproto/lexicon` for validation
   - In-memory caching with TTL
   - Pre-loadable static schemas

3. **Usage:**

   ```javascript
   import { PersonalDataServer } from '@pds/core';
   import { LexiconResolver } from '@pds/lexicon-resolver';

   const pds = new PersonalDataServer({
     // ... other config
     lexiconResolver: new LexiconResolver({
       schemas: [/* optional static schemas */],
     }),
   });
   ```
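
The validate/validationStatus contract shared by the three write handlers condenses to one small decision, sketched here (the function name and shape are illustrative, not the actual pds.js internals):

```javascript
// Tri-state outcome of the `validate` param (illustrative sketch):
//   validate === false      -> undefined (validation explicitly skipped)
//   no resolver configured  -> 'unknown'
//   resolver configured     -> 'valid' | 'unknown', or throws on an invalid record
async function computeValidationStatus(validate, resolver, collection, record) {
  if (validate === false) return undefined;
  if (!resolver) return 'unknown';
  return resolver.assertValid(collection, { $type: collection, ...record });
}
```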
+902
docs/plans/2026-01-14-lexicon-resolver-rewrite.md
# Lexicon Resolver Rewrite Implementation Plan

> **For Claude:** REQUIRED SUB-SKILL: Use superpowers:executing-plans to implement this plan task-by-task.

**Goal:** Replace `@atproto/lexicon` and `@atproto/lex-resolver` with `@bigmoves/lexicon` and custom resolver logic for full AT Protocol spec compliance.

**Architecture:** Custom DNS-over-HTTPS resolver extracts NSID authority, looks up `_atproto.{domain}` TXT record, resolves DID to PDS endpoint, fetches lexicon. Validation uses `@bigmoves/lexicon` zero-dependency validator. Universal runtime support (Node, Cloudflare Workers, Deno).

**Tech Stack:** `@bigmoves/lexicon` (validation), DNS-over-HTTPS (Cloudflare), PLC directory (DID resolution)

---

## Task 1: Create authority.js module

**Files:**
- Create: `packages/lexicon-resolver/src/authority.js`
- Test: `packages/lexicon-resolver/test/authority.test.js`

**Step 1: Write the failing test**

Create `packages/lexicon-resolver/test/authority.test.js`:

```javascript
import { describe, expect, test } from 'vitest';
import { nsidToDomain, domainToAtprotoDns } from '../src/authority.js';

describe('authority', () => {
  describe('nsidToDomain', () => {
    test('converts app.bsky.feed.post to bsky.app', () => {
      expect(nsidToDomain('app.bsky.feed.post')).toBe('bsky.app');
    });

    test('converts com.atproto.repo.createRecord to atproto.com', () => {
      expect(nsidToDomain('com.atproto.repo.createRecord')).toBe('atproto.com');
    });

    test('converts org.example.lexicon.test to example.org', () => {
      expect(nsidToDomain('org.example.lexicon.test')).toBe('example.org');
    });
  });

  describe('domainToAtprotoDns', () => {
    test('prepends _atproto to domain', () => {
      expect(domainToAtprotoDns('bsky.app')).toBe('_atproto.bsky.app');
    });
  });
});
```

**Step 2: Run test to verify it fails**

```bash
cd /Users/chadmiller/code/pds-experiment && pnpm test packages/lexicon-resolver/test/authority.test.js
```

Expected: FAIL with "Cannot find module '../src/authority.js'"

**Step 3: Write minimal implementation**

Create `packages/lexicon-resolver/src/authority.js`:

```javascript
/**
 * Extract authority from NSID and convert to domain.
 * NSID format: tld.domain.name (e.g., app.bsky.feed.post)
 * Authority is everything except the last segment, reversed.
 *
 * @param {string} nsid - e.g., "app.bsky.feed.post"
 * @returns {string} - e.g., "bsky.app"
 */
export function nsidToDomain(nsid) {
  const parts = nsid.split('.');
  // Remove the last segment (the name), take first two for domain
  const authority = parts.slice(0, -1);
  // Reverse: [app, bsky] -> [bsky, app] -> "bsky.app"
  return authority.slice(0, 2).reverse().join('.');
}

/**
 * Convert domain to AT Protocol DNS lookup name.
 *
 * @param {string} domain - e.g., "bsky.app"
 * @returns {string} - e.g., "_atproto.bsky.app"
 */
export function domainToAtprotoDns(domain) {
  return `_atproto.${domain}`;
}
```

**Step 4: Run test to verify it passes**

```bash
cd /Users/chadmiller/code/pds-experiment && pnpm test packages/lexicon-resolver/test/authority.test.js
```

Expected: PASS (3 tests)

**Step 5: Commit**

```bash
git add packages/lexicon-resolver/src/authority.js packages/lexicon-resolver/test/authority.test.js
git commit -m "feat(lexicon-resolver): add authority module for NSID to domain conversion

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>"
```

---

## Task 2: Create dns.js module

**Files:**
- Create: `packages/lexicon-resolver/src/dns.js`
- Test: `packages/lexicon-resolver/test/dns.test.js`

**Step 1: Write the failing test**

Create `packages/lexicon-resolver/test/dns.test.js`:

```javascript
import { describe, expect, test, vi } from 'vitest';
import { lookupTxt, parseDidFromTxt } from '../src/dns.js';

describe('dns', () => {
  describe('parseDidFromTxt', () => {
    test('extracts DID from TXT record value', () => {
      expect(parseDidFromTxt('did=did:plc:xyz123')).toBe('did:plc:xyz123');
    });

    test('handles quoted TXT record', () => {
      expect(parseDidFromTxt('"did=did:plc:xyz123"')).toBe('did:plc:xyz123');
    });

    test('returns null for invalid format', () => {
      expect(parseDidFromTxt('invalid')).toBeNull();
    });

    test('returns null for empty string', () => {
      expect(parseDidFromTxt('')).toBeNull();
    });
  });

  describe('lookupTxt', () => {
    test('fetches TXT records via DoH', async () => {
      const mockFetch = vi.fn().mockResolvedValue({
        ok: true,
        json: () => Promise.resolve({
          Answer: [
            { type: 16, data: '"did=did:plc:z72i7hdynmk6r22z27h6tvur"' },
          ],
        }),
      });

      const records = await lookupTxt('_atproto.bsky.app', { fetch: mockFetch });

      expect(mockFetch).toHaveBeenCalledWith(
        'https://cloudflare-dns.com/dns-query?name=_atproto.bsky.app&type=TXT',
        expect.objectContaining({
          headers: { Accept: 'application/dns-json' },
        }),
      );
      expect(records).toEqual(['did=did:plc:z72i7hdynmk6r22z27h6tvur']);
    });

    test('returns empty array on network error', async () => {
      const mockFetch = vi.fn().mockRejectedValue(new Error('Network error'));
      const records = await lookupTxt('_atproto.example.com', { fetch: mockFetch });
      expect(records).toEqual([]);
    });

    test('returns empty array when no Answer', async () => {
      const mockFetch = vi.fn().mockResolvedValue({
        ok: true,
        json: () => Promise.resolve({}),
      });
      const records = await lookupTxt('_atproto.example.com', { fetch: mockFetch });
      expect(records).toEqual([]);
    });
  });
});
```

**Step 2: Run test to verify it fails**

```bash
cd /Users/chadmiller/code/pds-experiment && pnpm test packages/lexicon-resolver/test/dns.test.js
```

Expected: FAIL with "Cannot find module '../src/dns.js'"

**Step 3: Write minimal implementation**

Create `packages/lexicon-resolver/src/dns.js`:

```javascript
const DEFAULT_DOH_URL = 'https://cloudflare-dns.com/dns-query';

/**
 * Parse DID from AT Protocol TXT record value.
 * Format: "did=did:plc:xyz123" or did=did:plc:xyz123
 *
 * @param {string} txt - TXT record value
 * @returns {string|null} - DID or null if invalid
 */
export function parseDidFromTxt(txt) {
  if (!txt) return null;
  // Remove surrounding quotes if present
  const unquoted = txt.replace(/^"|"$/g, '');
  const match = unquoted.match(/^did=(.+)$/);
  return match ? match[1] : null;
}

/**
 * Lookup TXT records via DNS-over-HTTPS.
 *
 * @param {string} domain - Domain to lookup (e.g., "_atproto.bsky.app")
 * @param {Object} [opts]
 * @param {string} [opts.dohUrl] - DoH endpoint URL
 * @param {typeof fetch} [opts.fetch] - Fetch implementation
 * @returns {Promise<string[]>} - Array of TXT record values
 */
export async function lookupTxt(domain, opts = {}) {
  const dohUrl = opts.dohUrl ?? DEFAULT_DOH_URL;
  const fetchFn = opts.fetch ?? globalThis.fetch;

  try {
    const url = `${dohUrl}?name=${encodeURIComponent(domain)}&type=TXT`;
    const response = await fetchFn(url, {
      headers: { Accept: 'application/dns-json' },
    });

    if (!response.ok) return [];

    const data = await response.json();
    if (!data.Answer) return [];

    // Extract TXT record values (type 16 = TXT)
    return data.Answer
      .filter((record) => record.type === 16)
      .map((record) => {
        // Remove surrounding quotes from data field
        const value = record.data?.replace(/^"|"$/g, '') ?? '';
        return value;
      });
  } catch {
    return [];
  }
}
```

**Step 4: Run test to verify it passes**

```bash
cd /Users/chadmiller/code/pds-experiment && pnpm test packages/lexicon-resolver/test/dns.test.js
```

Expected: PASS (6 tests)

**Step 5: Commit**

```bash
git add packages/lexicon-resolver/src/dns.js packages/lexicon-resolver/test/dns.test.js
git commit -m "feat(lexicon-resolver): add dns module for DoH TXT lookups

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>"
```

---

## Task 3: Create did.js module

**Files:**
- Create: `packages/lexicon-resolver/src/did.js`
- Test: `packages/lexicon-resolver/test/did.test.js`

**Step 1: Write the failing test**

Create `packages/lexicon-resolver/test/did.test.js`:

```javascript
import { describe, expect, test, vi } from 'vitest';
import { resolveDid, extractPdsEndpoint } from '../src/did.js';

describe('did', () => {
  describe('extractPdsEndpoint', () => {
    test('extracts PDS endpoint from DID document', () => {
      const didDoc = {
        service: [
          {
            id: '#atproto_pds',
            type: 'AtprotoPersonalDataServer',
            serviceEndpoint: 'https://pds.example.com',
          },
        ],
      };
      expect(extractPdsEndpoint(didDoc)).toBe('https://pds.example.com');
    });

    test('finds PDS by type if id differs', () => {
      const didDoc = {
        service: [
          {
            id: '#pds',
            type: 'AtprotoPersonalDataServer',
            serviceEndpoint: 'https://pds.example.com',
          },
        ],
      };
      expect(extractPdsEndpoint(didDoc)).toBe('https://pds.example.com');
    });

    test('returns null if no PDS service', () => {
      const didDoc = { service: [] };
      expect(extractPdsEndpoint(didDoc)).toBeNull();
    });

    test('returns null if no service array', () => {
      const didDoc = {};
      expect(extractPdsEndpoint(didDoc)).toBeNull();
    });
  });

  describe('resolveDid', () => {
    test('resolves did:plc via PLC directory', async () => {
      const mockFetch = vi.fn().mockResolvedValue({
        ok: true,
        json: () => Promise.resolve({
          service: [
            {
              id: '#atproto_pds',
              type: 'AtprotoPersonalDataServer',
              serviceEndpoint: 'https://bsky.social',
            },
          ],
        }),
      });

      const endpoint = await resolveDid('did:plc:xyz123', { fetch: mockFetch });

      expect(mockFetch).toHaveBeenCalledWith('https://plc.directory/did:plc:xyz123');
      expect(endpoint).toBe('https://bsky.social');
    });

    test('resolves did:web via .well-known', async () => {
      const mockFetch = vi.fn().mockResolvedValue({
        ok: true,
        json: () => Promise.resolve({
          service: [
            {
              id: '#atproto_pds',
              type: 'AtprotoPersonalDataServer',
              serviceEndpoint: 'https://example.com',
            },
          ],
        }),
      });

      const endpoint = await resolveDid('did:web:example.com', { fetch: mockFetch });

      expect(mockFetch).toHaveBeenCalledWith('https://example.com/.well-known/did.json');
      expect(endpoint).toBe('https://example.com');
    });

    test('returns null for unknown DID method', async () => {
      const endpoint = await resolveDid('did:unknown:xyz', { fetch: vi.fn() });
      expect(endpoint).toBeNull();
    });

    test('returns null on fetch error', async () => {
      const mockFetch = vi.fn().mockRejectedValue(new Error('Network error'));
      const endpoint = await resolveDid('did:plc:xyz', { fetch: mockFetch });
      expect(endpoint).toBeNull();
    });
  });
});
```

**Step 2: Run test to verify it fails**

```bash
cd /Users/chadmiller/code/pds-experiment && pnpm test
```
packages/lexicon-resolver/test/did.test.js 381 + ``` 382 + 383 + Expected: FAIL with "Cannot find module '../src/did.js'" 384 + 385 + **Step 3: Write minimal implementation** 386 + 387 + Create `packages/lexicon-resolver/src/did.js`: 388 + 389 + ```javascript 390 + const DEFAULT_PLC_URL = 'https://plc.directory'; 391 + 392 + /** 393 + * Extract PDS service endpoint from DID document. 394 + * 395 + * @param {Object} didDoc - DID document 396 + * @returns {string|null} - PDS endpoint URL or null 397 + */ 398 + export function extractPdsEndpoint(didDoc) { 399 + if (!didDoc?.service || !Array.isArray(didDoc.service)) { 400 + return null; 401 + } 402 + 403 + const pdsService = didDoc.service.find( 404 + (s) => s.id === '#atproto_pds' || s.type === 'AtprotoPersonalDataServer', 405 + ); 406 + 407 + return pdsService?.serviceEndpoint ?? null; 408 + } 409 + 410 + /** 411 + * Resolve a DID to its PDS service endpoint. 412 + * 413 + * @param {string} did - DID string (did:plc:... or did:web:...) 414 + * @param {Object} [opts] 415 + * @param {string} [opts.plcUrl] - PLC directory URL 416 + * @param {typeof fetch} [opts.fetch] - Fetch implementation 417 + * @returns {Promise<string|null>} - PDS endpoint URL or null 418 + */ 419 + export async function resolveDid(did, opts = {}) { 420 + const plcUrl = opts.plcUrl ?? DEFAULT_PLC_URL; 421 + const fetchFn = opts.fetch ?? 
globalThis.fetch; 422 +
423 + try {
424 + let didDocUrl;
425 +
426 + if (did.startsWith('did:plc:')) {
427 + didDocUrl = `${plcUrl}/${did}`;
428 + } else if (did.startsWith('did:web:')) {
429 + const domain = did.slice('did:web:'.length);
430 + didDocUrl = `https://${domain}/.well-known/did.json`;
431 + } else {
432 + return null;
433 + }
434 +
435 + const response = await fetchFn(didDocUrl);
436 + if (!response.ok) return null;
437 +
438 + const didDoc = await response.json();
439 + return extractPdsEndpoint(didDoc);
440 + } catch {
441 + return null;
442 + }
443 + }
444 + ```
445 +
446 + **Step 4: Run test to verify it passes**
447 +
448 + ```bash
449 + cd /Users/chadmiller/code/pds-experiment && pnpm test packages/lexicon-resolver/test/did.test.js
450 + ```
451 +
452 + Expected: PASS (8 tests)
453 +
454 + **Step 5: Commit**
455 +
456 + ```bash
457 + git add packages/lexicon-resolver/src/did.js packages/lexicon-resolver/test/did.test.js
458 + git commit -m "feat(lexicon-resolver): add did module for DID resolution
459 +
460 + Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>"
461 + ```
462 +
463 + ---
464 +
465 + ## Task 4: Update package.json dependencies
466 +
467 + **Files:**
468 + - Modify: `packages/lexicon-resolver/package.json`
469 +
470 + **Step 1: Update dependencies**
471 +
472 + Replace contents of `packages/lexicon-resolver/package.json`:
473 +
474 + ```json
475 + {
476 + "name": "@pds/lexicon-resolver",
477 + "version": "0.1.0",
478 + "type": "module",
479 + "main": "src/index.js",
480 + "exports": {
481 + ".": "./src/index.js",
482 + "./resolver": "./src/resolver.js",
483 + "./dns": "./src/dns.js",
484 + "./did": "./src/did.js",
485 + "./authority": "./src/authority.js"
486 + },
487 + "dependencies": {
488 + "@bigmoves/lexicon": "^0.2.0"
489 + },
490 + "devDependencies": {
491 + "vitest": "^2.1.8"
492 + }
493 + }
494 + ```
495 +
496 + **Step 2: Install new dependencies**
497 +
498 + ```bash
499 + cd /Users/chadmiller/code/pds-experiment &&
pnpm install 500 + ```
501 +
502 + Expected: Installs `@bigmoves/lexicon`, removes `@atproto/lexicon` and `@atproto/lex-resolver`
503 +
504 + **Step 3: Commit**
505 +
506 + ```bash
507 + git add packages/lexicon-resolver/package.json pnpm-lock.yaml
508 + git commit -m "chore(lexicon-resolver): switch to @bigmoves/lexicon
509 +
510 + Remove @atproto/lexicon and @atproto/lex-resolver dependencies.
511 +
512 + Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>"
513 + ```
514 +
515 + ---
516 +
517 + ## Task 5: Rewrite resolver.js
518 +
519 + **Files:**
520 + - Modify: `packages/lexicon-resolver/src/resolver.js`
521 + - Modify: `packages/lexicon-resolver/test/resolver.test.js`
522 +
523 + **Step 1: Rewrite resolver.js**
524 +
525 + Replace contents of `packages/lexicon-resolver/src/resolver.js`:
526 +
527 + ```javascript
528 + import { validateRecord } from '@bigmoves/lexicon';
529 + import { lookupTxt, parseDidFromTxt } from './dns.js';
530 + import { resolveDid } from './did.js';
531 + import { nsidToDomain, domainToLexiconDns } from './authority.js';
532 +
533 + /**
534 + * @typedef {import('@pds/core/ports').ValidationStatus} ValidationStatus
535 + */
536 +
537 + /**
538 + * Lexicon resolver with full AT Protocol spec compliance.
539 + * Resolves schemas via DNS authority lookup, DID resolution, and PDS fetch.
540 + * Validates records using @bigmoves/lexicon.
541 + *
542 + * Implements LexiconResolverPort.
543 + */
544 + export class LexiconResolver {
545 + /**
546 + * @param {Object} [opts]
547 + * @param {number} [opts.ttl=3600000] - Cache TTL in ms (default 1 hour)
548 + * @param {typeof fetch} [opts.fetch] - Fetch implementation
549 + * @param {string} [opts.dohUrl] - DNS-over-HTTPS endpoint
550 + * @param {string} [opts.plcUrl] - PLC directory URL
551 + * @param {Array<{id: string, defs: Object}>} [opts.schemas] - Pre-loaded static schemas
552 + */
553 + constructor(opts = {}) {
554 + /** @type {Map<string, {id: string, defs: Object}>} */
555 + this.schemas = new Map();
556 + /** @type {Map<string, {fetchedAt: number}>} */
557 + this.cache = new Map();
558 + this.ttl = opts.ttl ?? 3600000;
559 + this.fetch = opts.fetch ?? globalThis.fetch;
560 + this.dohUrl = opts.dohUrl;
561 + this.plcUrl = opts.plcUrl;
562 +
563 + // Pre-load static schemas
564 + for (const schema of opts.schemas ?? []) {
565 + this.schemas.set(schema.id, schema);
566 + this.cache.set(schema.id, { fetchedAt: Date.now() });
567 + }
568 + }
569 +
570 + /**
571 + * Validate a record against its schema.
572 + * @param {string} collection - The collection/NSID (e.g., 'app.bsky.feed.post')
573 + * @param {object} record - The record to validate
574 + * @returns {Promise<ValidationStatus>}
575 + */
576 + async assertValid(collection, record) {
577 + const hasSchema = await this.ensureSchema(collection);
578 + if (!hasSchema) {
579 + return 'unknown';
580 + }
581 +
582 + const schema = this.schemas.get(collection);
583 + const error = validateRecord([schema], collection, record);
584 + if (error) {
585 + const path = error.path ? `${error.path}: ` : '';
586 + throw new Error(`${path}${error.message}`);
587 + }
588 + return 'valid';
589 + }
590 +
591 + /**
592 + * Ensure a schema is loaded, fetching if necessary.
593 + * @param {string} nsid
594 + * @returns {Promise<boolean>} - true if schema is available
595 + */
596 + async ensureSchema(nsid) {
597 + // Check if already loaded and not stale
598 + const cached = this.cache.get(nsid);
599 + if (cached && Date.now() - cached.fetchedAt < this.ttl) {
600 + return this.schemas.has(nsid);
601 + }
602 +
603 + // Schema already loaded (cache entry stale or missing): reuse without refetching
604 + if (this.schemas.has(nsid)) {
605 + return true;
606 + }
607 +
608 + // Try to resolve via DNS -> DID -> PDS
609 + try {
610 + const lexicon = await this.fetchLexicon(nsid);
611 + if (lexicon) {
612 + this.schemas.set(nsid, lexicon);
613 + this.cache.set(nsid, { fetchedAt: Date.now() });
614 + return true;
615 + }
616 + } catch {
617 + // Resolution failed
618 + }
619 +
620 + return false;
621 + }
622 +
623 + /**
624 + * Fetch a lexicon via the AT Protocol resolution chain.
625 + * @param {string} nsid
626 + * @returns {Promise<{id: string, defs: Object}|null>}
627 + */
628 + async fetchLexicon(nsid) {
629 + // 1. Convert NSID to domain
630 + const domain = nsidToDomain(nsid);
631 + const dnsName = domainToLexiconDns(domain);
632 +
633 + // 2. DNS TXT lookup
634 + const txtRecords = await lookupTxt(dnsName, {
635 + dohUrl: this.dohUrl,
636 + fetch: this.fetch,
637 + });
638 +
639 + // 3. Parse DID from TXT record
640 + let did = null;
641 + for (const txt of txtRecords) {
642 + did = parseDidFromTxt(txt);
643 + if (did) break;
644 + }
645 + if (!did) return null;
646 +
647 + // 4. Resolve DID to PDS endpoint
648 + const pdsEndpoint = await resolveDid(did, {
649 + plcUrl: this.plcUrl,
650 + fetch: this.fetch,
651 + });
652 + if (!pdsEndpoint) return null;
653 +
654 + // 5.
Fetch lexicon from PDS 655 + const url = `${pdsEndpoint}/xrpc/com.atproto.repo.getRecord?repo=${encodeURIComponent(did)}&collection=com.atproto.lexicon.schema&rkey=${encodeURIComponent(nsid)}`; 656 + 657 + const response = await this.fetch(url); 658 + if (!response.ok) return null; 659 + 660 + const data = await response.json(); 661 + return data.value ?? null; 662 + } 663 + } 664 + ``` 665 + 666 + **Step 2: Update test file** 667 + 668 + Replace contents of `packages/lexicon-resolver/test/resolver.test.js`: 669 + 670 + ```javascript 671 + import { describe, expect, test, vi } from 'vitest'; 672 + import { LexiconResolver } from '../src/resolver.js'; 673 + 674 + /** @type {import('@bigmoves/lexicon').Lexicon} */ 675 + const postSchema = { 676 + lexicon: 1, 677 + id: 'app.bsky.feed.post', 678 + defs: { 679 + main: { 680 + type: 'record', 681 + key: 'tid', 682 + record: { 683 + type: 'object', 684 + required: ['text', 'createdAt'], 685 + properties: { 686 + text: { type: 'string', maxLength: 3000 }, 687 + createdAt: { type: 'string', format: 'datetime' }, 688 + }, 689 + }, 690 + }, 691 + }, 692 + }; 693 + 694 + describe('LexiconResolver', () => { 695 + describe('with static schemas', () => { 696 + test('validates record against pre-loaded schema', async () => { 697 + const resolver = new LexiconResolver({ 698 + schemas: [postSchema], 699 + }); 700 + 701 + const status = await resolver.assertValid('app.bsky.feed.post', { 702 + $type: 'app.bsky.feed.post', 703 + text: 'Hello world', 704 + createdAt: new Date().toISOString(), 705 + }); 706 + 707 + expect(status).toBe('valid'); 708 + }); 709 + 710 + test('throws on invalid record', async () => { 711 + const resolver = new LexiconResolver({ 712 + schemas: [postSchema], 713 + }); 714 + 715 + await expect( 716 + resolver.assertValid('app.bsky.feed.post', { 717 + $type: 'app.bsky.feed.post', 718 + // missing required fields 719 + }), 720 + ).rejects.toThrow(); 721 + }); 722 + 723 + test('returns unknown for unloaded schema', 
async () => { 724 + const resolver = new LexiconResolver({ 725 + schemas: [postSchema], 726 + }); 727 + 728 + const status = await resolver.assertValid('com.example.unknown', { 729 + $type: 'com.example.unknown', 730 + }); 731 + 732 + expect(status).toBe('unknown'); 733 + }); 734 + }); 735 + 736 + describe('ensureSchema', () => { 737 + test('returns true for pre-loaded schema', async () => { 738 + const resolver = new LexiconResolver({ 739 + schemas: [postSchema], 740 + }); 741 + 742 + const result = await resolver.ensureSchema('app.bsky.feed.post'); 743 + expect(result).toBe(true); 744 + }); 745 + 746 + test('returns false for unknown schema when resolution fails', async () => { 747 + const mockFetch = vi.fn().mockRejectedValue(new Error('Network error')); 748 + const resolver = new LexiconResolver({ fetch: mockFetch }); 749 + 750 + const result = await resolver.ensureSchema('com.example.unknown'); 751 + expect(result).toBe(false); 752 + }); 753 + }); 754 + 755 + describe('fetchLexicon', () => { 756 + test('resolves lexicon via DNS -> DID -> PDS chain', async () => { 757 + const mockFetch = vi 758 + .fn() 759 + // DNS lookup 760 + .mockResolvedValueOnce({ 761 + ok: true, 762 + json: () => 763 + Promise.resolve({ 764 + Answer: [{ type: 16, data: '"did=did:plc:testdid"' }], 765 + }), 766 + }) 767 + // DID resolution 768 + .mockResolvedValueOnce({ 769 + ok: true, 770 + json: () => 771 + Promise.resolve({ 772 + service: [ 773 + { 774 + id: '#atproto_pds', 775 + type: 'AtprotoPersonalDataServer', 776 + serviceEndpoint: 'https://pds.example.com', 777 + }, 778 + ], 779 + }), 780 + }) 781 + // Lexicon fetch 782 + .mockResolvedValueOnce({ 783 + ok: true, 784 + json: () => Promise.resolve({ value: postSchema }), 785 + }); 786 + 787 + const resolver = new LexiconResolver({ fetch: mockFetch }); 788 + const lexicon = await resolver.fetchLexicon('app.bsky.feed.post'); 789 + 790 + expect(lexicon).toEqual(postSchema); 791 + expect(mockFetch).toHaveBeenCalledTimes(3); 792 + }); 
793 + });
794 + });
795 + ```
796 +
797 + **Step 3: Run tests**
798 +
799 + ```bash
800 + cd /Users/chadmiller/code/pds-experiment && pnpm test packages/lexicon-resolver/
801 + ```
802 +
803 + Expected: PASS (all tests in lexicon-resolver)
804 +
805 + **Step 4: Commit**
806 +
807 + ```bash
808 + git add packages/lexicon-resolver/src/resolver.js packages/lexicon-resolver/test/resolver.test.js
809 + git commit -m "feat(lexicon-resolver): rewrite resolver with custom DNS/DID resolution
810 +
811 + - Use @bigmoves/lexicon for validation
812 + - Custom DNS-over-HTTPS authority resolution
813 + - Custom DID resolution (PLC + did:web)
814 + - Full AT Protocol spec compliance
815 +
816 + Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>"
817 + ```
818 +
819 + ---
820 +
821 + ## Task 6: Update index.js exports
822 +
823 + **Files:**
824 + - Modify: `packages/lexicon-resolver/src/index.js`
825 +
826 + **Step 1: Update exports**
827 +
828 + Replace contents of `packages/lexicon-resolver/src/index.js`:
829 +
830 + ```javascript
831 + export { LexiconResolver } from './resolver.js';
832 + export { lookupTxt, parseDidFromTxt } from './dns.js';
833 + export { resolveDid, extractPdsEndpoint } from './did.js';
834 + export { nsidToDomain, domainToLexiconDns } from './authority.js';
835 + ```
836 +
837 + **Step 2: Commit**
838 +
839 + ```bash
840 + git add packages/lexicon-resolver/src/index.js
841 + git commit -m "feat(lexicon-resolver): export utility functions
842 +
843 + Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>"
844 + ```
845 +
846 + ---
847 +
848 + ## Task 7: Run full test suite and typecheck
849 +
850 + **Step 1: Run all tests**
851 +
852 + ```bash
853 + cd /Users/chadmiller/code/pds-experiment && pnpm test
854 + ```
855 +
856 + Expected: All tests pass
857 +
858 + **Step 2: Run typecheck**
859 +
860 + ```bash
861 + cd /Users/chadmiller/code/pds-experiment && pnpm typecheck
862 + ```
863 +
864 + Expected: No errors
865 +
866 + **Step 3: Run lint**
867 +
868 + ```bash 869 + cd /Users/chadmiller/code/pds-experiment && pnpm lint 870 + ``` 871 + 872 + Expected: No errors 873 + 874 + --- 875 + 876 + ## Summary 877 + 878 + After completing all tasks: 879 + 880 + 1. **`@pds/lexicon-resolver`** now uses: 881 + - `@bigmoves/lexicon` for validation (zero dependencies) 882 + - Custom `dns.js` for DNS-over-HTTPS TXT lookups 883 + - Custom `did.js` for DID resolution (PLC + did:web) 884 + - Custom `authority.js` for NSID to domain conversion 885 + 886 + 2. **Removed dependencies:** 887 + - `@atproto/lexicon` 888 + - `@atproto/lex-resolver` 889 + 890 + 3. **Universal runtime support:** Node, Cloudflare Workers, Deno 891 + 892 + 4. **Usage unchanged:** 893 + ```javascript 894 + import { LexiconResolver } from '@pds/lexicon-resolver'; 895 + 896 + const resolver = new LexiconResolver({ 897 + schemas: [/* optional static schemas */], 898 + }); 899 + 900 + // Validates via DNS -> DID -> PDS resolution chain 901 + const status = await resolver.assertValid('app.bsky.feed.post', record); 902 + ```
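Between the DNS answer and the DID resolution step in the chain above, the only parsing involved is quote-stripping and prefix-matching. A small sketch of that step (the `parseDidFromTxt` body is inlined from the plan's `dns.js`; the answer array is illustrative, no network involved):

```javascript
// Inlined from the plan's dns.js: strip surrounding quotes, match the did= prefix
function parseDidFromTxt(txt) {
  if (!txt) return null;
  const unquoted = txt.replace(/^"|"$/g, '');
  const match = unquoted.match(/^did=(.+)$/);
  return match ? match[1] : null;
}

// Shape of a DoH JSON Answer array (type 16 = TXT); values are illustrative
const answer = [
  { type: 5, data: 'cname.example.com.' }, // non-TXT records are filtered out
  { type: 16, data: '"did=did:plc:z72i7hdynmk6r22z27h6tvur"' },
];

const did = answer
  .filter((r) => r.type === 16)
  .map((r) => parseDidFromTxt(r.data))
  .find(Boolean);

console.log(did); // did:plc:z72i7hdynmk6r22z27h6tvur
```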
+2
package.json
··· 26 26 }, 27 27 "dependencies": { 28 28 "@pds/core": "workspace:*", 29 + "@pds/lexicon-resolver": "workspace:*", 29 30 "@pds/node": "workspace:*", 30 31 "better-sqlite3": "^12.6.0" 31 32 }, 32 33 "devDependencies": { 34 + "@bigmoves/lexicon": "^0.2.0", 33 35 "@biomejs/biome": "^2.3.11", 34 36 "@cloudflare/workers-types": "^4.20260103.0", 35 37 "@types/better-sqlite3": "^7.6.13",
+2 -1
packages/cloudflare/package.json
··· 8 8 ".": "./src/index.js" 9 9 }, 10 10 "dependencies": { 11 - "@pds/core": "workspace:*" 11 + "@pds/core": "workspace:*", 12 + "@pds/lexicon-resolver": "workspace:*" 12 13 } 13 14 }
+8
packages/cloudflare/src/index.js
··· 1 1 // @pds/cloudflare - Cloudflare Workers + Durable Objects adapter 2 2 3 3 import { PersonalDataServer } from '@pds/core'; 4 + import { LexiconResolver } from '@pds/lexicon-resolver'; 4 5 5 6 /** 6 7 * @typedef {import('@pds/core/ports').BlobPort} BlobPort ··· 548 549 // Initialize WebSocket adapter with hibernation support 549 550 const webSocket = createWebSocket(state); 550 551 552 + // Initialize lexicon resolver for record validation 553 + // Wrap fetch to preserve `this` binding in Cloudflare Workers 554 + const lexiconResolver = new LexiconResolver({ 555 + fetch: (url, init) => fetch(url, init), 556 + }); 557 + 551 558 // Create PDS instance with port injection 552 559 this.pds = new PersonalDataServer({ 553 560 actorStorage, ··· 560 567 appviewDid: env.APPVIEW_DID, 561 568 relayUrl: env.RELAY_URL, 562 569 password: env.PDS_PASSWORD, 570 + lexiconResolver, 563 571 }); 564 572 565 573 // Schedule blob cleanup alarm if not already set (runs daily)
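The arrow-function wrapper around `fetch` above exists because Workers' `fetch` is a host function that loses its binding when passed around bare, which can surface later as an "Illegal invocation" error. A generic sketch of the same pitfall with a plain object (hypothetical `service`, not Workers code):

```javascript
// Any method that reads `this` breaks when detached from its receiver --
// the same failure mode as passing Workers' fetch around unwrapped.
const service = {
  origin: 'https://pds.example.com',
  request(path) {
    return `${this.origin}${path}`;
  },
};

const detached = service.request; // calling detached('/x') loses `this`
const wrapped = (path) => service.request(path); // arrow wrapper keeps the receiver

console.log(wrapped('/xrpc/_health')); // https://pds.example.com/xrpc/_health
```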
+113 -8
packages/core/src/pds.js
··· 309 309 * @param {string} [config.relayUrl] - Relay URL for firehose notifications (e.g., http://localhost:2470) 310 310 * @param {string} [config.password] - Password for createSession 311 311 * @param {WebSocketPort} [config.webSocket] - WebSocket port for subscribeRepos 312 + * @param {import('./ports.js').LexiconResolverPort} [config.lexiconResolver] - Optional lexicon resolver for record validation 312 313 */ 313 314 constructor({ 314 315 actorStorage, ··· 321 322 relayUrl, 322 323 password, 323 324 webSocket, 325 + lexiconResolver, 324 326 }) { 325 327 this.actorStorage = actorStorage; 326 328 this.sharedStorage = sharedStorage; ··· 332 334 this.relayUrl = relayUrl; 333 335 this.password = password; 334 336 this.webSocket = webSocket; 337 + this.lexiconResolver = lexiconResolver; 335 338 this.lastCrawlNotify = 0; 336 339 337 340 // Cache for derived values ··· 785 788 const rkey = providedRkey || createTid(); 786 789 const uri = `at://${did}/${collection}/${rkey}`; 787 790 788 - // Encode record and create CID 791 + // Ensure $type is set 789 792 const recordWithType = { $type: collection, ...record }; 793 + 794 + // Validate record against schema if resolver provided 795 + const shouldValidate = _validate !== false; 796 + let validationStatus; 797 + 798 + if (shouldValidate && this.lexiconResolver) { 799 + try { 800 + validationStatus = await this.lexiconResolver.assertValid( 801 + collection, 802 + recordWithType, 803 + ); 804 + } catch (err) { 805 + return Response.json( 806 + { 807 + error: 'InvalidRecord', 808 + message: `Invalid ${collection} record: ${err.message}`, 809 + }, 810 + { status: 400 }, 811 + ); 812 + } 813 + } else if (shouldValidate && !this.lexiconResolver) { 814 + validationStatus = 'unknown'; 815 + } 816 + // If _validate === false, validationStatus remains undefined 817 + 818 + // Encode record and create CID 790 819 const encoded = cborEncodeDagCbor(recordWithType); 791 820 const cid = cidToString(await createCid(encoded)); 792 
821 ··· 814 843 commit: latestCommit 815 844 ? { cid: latestCommit.cid, rev: latestCommit.rev } 816 845 : null, 817 - validationStatus: 'valid', 846 + validationStatus, 818 847 }); 819 848 } 820 849 ··· 975 1004 } 976 1005 } 977 1006 978 - // Encode record and create CID 1007 + // Ensure $type is set 979 1008 const recordWithType = { $type: collection, ...record }; 1009 + 1010 + // Validate record against schema if resolver provided 1011 + const shouldValidate = _validate !== false; 1012 + let validationStatus; 1013 + 1014 + if (shouldValidate && this.lexiconResolver) { 1015 + try { 1016 + validationStatus = await this.lexiconResolver.assertValid( 1017 + collection, 1018 + recordWithType, 1019 + ); 1020 + } catch (err) { 1021 + return Response.json( 1022 + { 1023 + error: 'InvalidRecord', 1024 + message: `Invalid ${collection} record: ${err.message}`, 1025 + }, 1026 + { status: 400 }, 1027 + ); 1028 + } 1029 + } else if (shouldValidate && !this.lexiconResolver) { 1030 + validationStatus = 'unknown'; 1031 + } 1032 + // If _validate === false, validationStatus remains undefined 1033 + 1034 + // Encode record and create CID 980 1035 const encoded = cborEncodeDagCbor(recordWithType); 981 1036 const cid = cidToString(await createCid(encoded)); 982 1037 ··· 1013 1068 commit: latestCommit 1014 1069 ? 
{ cid: latestCommit.cid, rev: latestCommit.rev } 1015 1070 : null, 1016 - validationStatus: 'valid', 1071 + validationStatus, 1017 1072 }); 1018 1073 } 1019 1074 ··· 1071 1126 const rkey = providedRkey || createTid(); 1072 1127 const uri = `at://${did}/${collection}/${rkey}`; 1073 1128 1074 - // Encode record and create CID 1129 + // Ensure $type is set 1075 1130 const recordWithType = { $type: collection, ...value }; 1131 + 1132 + // Validate record against schema if resolver provided 1133 + const shouldValidate = _validate !== false; 1134 + let validationStatus; 1135 + 1136 + if (shouldValidate && this.lexiconResolver) { 1137 + try { 1138 + validationStatus = await this.lexiconResolver.assertValid( 1139 + collection, 1140 + recordWithType, 1141 + ); 1142 + } catch (err) { 1143 + return Response.json( 1144 + { 1145 + error: 'InvalidRecord', 1146 + message: `Invalid ${collection} record: ${err.message}`, 1147 + }, 1148 + { status: 400 }, 1149 + ); 1150 + } 1151 + } else if (shouldValidate && !this.lexiconResolver) { 1152 + validationStatus = 'unknown'; 1153 + } 1154 + 1155 + // Encode record and create CID 1076 1156 const encoded = cborEncodeDagCbor(recordWithType); 1077 1157 const cid = cidToString(await createCid(encoded)); 1078 1158 ··· 1093 1173 $type: 'com.atproto.repo.applyWrites#createResult', 1094 1174 uri, 1095 1175 cid, 1096 - validationStatus: 'valid', 1176 + validationStatus, 1097 1177 }); 1098 1178 } else if ($type === 'com.atproto.repo.applyWrites#update') { 1099 1179 if (!providedRkey) { ··· 1105 1185 1106 1186 const uri = `at://${did}/${collection}/${providedRkey}`; 1107 1187 1188 + // Ensure $type is set 1189 + const recordWithType = { $type: collection, ...value }; 1190 + 1191 + // Validate record against schema if resolver provided 1192 + const shouldValidate = _validate !== false; 1193 + let validationStatus; 1194 + 1195 + if (shouldValidate && this.lexiconResolver) { 1196 + try { 1197 + validationStatus = await this.lexiconResolver.assertValid( 
1198 + collection, 1199 + recordWithType, 1200 + ); 1201 + } catch (err) { 1202 + return Response.json( 1203 + { 1204 + error: 'InvalidRecord', 1205 + message: `Invalid ${collection} record: ${err.message}`, 1206 + }, 1207 + { status: 400 }, 1208 + ); 1209 + } 1210 + } else if (shouldValidate && !this.lexiconResolver) { 1211 + validationStatus = 'unknown'; 1212 + } 1213 + 1108 1214 // Unlink old blobs 1109 1215 await this.actorStorage.unlinkBlobsFromRecord(uri); 1110 1216 1111 1217 // Encode record and create CID 1112 - const recordWithType = { $type: collection, ...value }; 1113 1218 const encoded = cborEncodeDagCbor(recordWithType); 1114 1219 const cid = cidToString(await createCid(encoded)); 1115 1220 ··· 1136 1241 $type: 'com.atproto.repo.applyWrites#updateResult', 1137 1242 uri, 1138 1243 cid, 1139 - validationStatus: 'valid', 1244 + validationStatus, 1140 1245 }); 1141 1246 } else if ($type === 'com.atproto.repo.applyWrites#delete') { 1142 1247 if (!providedRkey) {
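The same three-way decision is repeated in createRecord, putRecord, and the applyWrites create/update branches; condensed, it reads as below (hypothetical `resolveValidationStatus` helper and stub resolver, for illustration only, not part of the diff):

```javascript
// Condenses the repeated per-handler logic: caller opt-out -> undefined,
// no resolver configured -> 'unknown', otherwise delegate to the port
// (which returns 'valid' | 'unknown' or throws on an invalid record).
async function resolveValidationStatus(lexiconResolver, validate, collection, record) {
  if (validate === false) return undefined;
  if (!lexiconResolver) return 'unknown';
  return lexiconResolver.assertValid(collection, record);
}

// Stub resolver that recognizes no schemas
const stub = { assertValid: async () => 'unknown' };

resolveValidationStatus(stub, undefined, 'com.example.post', {}).then((status) => {
  console.log(status); // 'unknown'
});
resolveValidationStatus(stub, false, 'com.example.post', {}).then((status) => {
  console.log(status); // undefined (validation skipped)
});
```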
+15
packages/core/src/ports.js
··· 125 125 * @property {(data: Uint8Array) => void} [broadcast] - Broadcast data to all connected clients 126 126 */ 127 127 128 + /** 129 + * @typedef {'valid' | 'unknown'} ValidationStatus 130 + */ 131 + 132 + /** 133 + * Lexicon resolver port for validating records against schemas. 134 + * Implementations may resolve schemas dynamically (DNS + fetch) or use static schemas. 135 + * 136 + * @typedef {Object} LexiconResolverPort 137 + * @property {(collection: string, record: object) => Promise<ValidationStatus>} assertValid 138 + * Resolves schema if needed, validates record. 139 + * Returns 'valid' on success, 'unknown' if schema unresolvable. 140 + * Throws Error with descriptive message if record is invalid against known schema. 141 + */ 142 + 128 143 // Export empty object - types are defined via JSDoc 129 144 export {};
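A minimal in-memory implementation of `LexiconResolverPort` only needs `assertValid`. This is a sketch for tests, assuming just the contract above (the shipped implementation lives in `@pds/lexicon-resolver`):

```javascript
// Satisfies LexiconResolverPort with static per-collection predicates:
// unknown collection -> 'unknown', failing predicate -> throw, else 'valid'.
function createStaticResolver(validators) {
  return {
    async assertValid(collection, record) {
      const check = validators[collection];
      if (!check) return 'unknown';
      const problem = check(record);
      if (problem) throw new Error(problem);
      return 'valid';
    },
  };
}

const resolver = createStaticResolver({
  'app.bsky.feed.post': (r) =>
    typeof r.text === 'string' && typeof r.createdAt === 'string'
      ? null
      : 'text and createdAt must be strings',
});

resolver
  .assertValid('app.bsky.feed.post', {
    $type: 'app.bsky.feed.post',
    text: 'hello',
    createdAt: new Date().toISOString(),
  })
  .then((status) => console.log(status)); // 'valid'
```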
+2 -1
packages/deno/package.json
··· 8 8 ".": "./src/index.js" 9 9 }, 10 10 "dependencies": { 11 - "@pds/core": "workspace:*" 11 + "@pds/core": "workspace:*", 12 + "@pds/lexicon-resolver": "workspace:*" 12 13 }, 13 14 "devDependencies": { 14 15 "@types/deno": "^2.5.0"
+5
packages/deno/src/index.js
··· 5 5 import { PersonalDataServer } from '@pds/core'; 6 6 import { createActorStorage, createSharedStorage } from '@pds/storage-sqlite'; 7 7 import { createDenoBlobs } from '@pds/blobs-deno'; 8 + import { LexiconResolver } from '@pds/lexicon-resolver'; 8 9 9 10 /** 10 11 * Create WebSocket port for Deno ··· 79 80 * @param {string} [options.appviewDid] - AppView DID for service auth 80 81 * @param {string} [options.relayUrl] - Relay URL for firehose notifications 81 82 * @param {string} [options.password] - Password for createSession 83 + * @param {import('@pds/core/ports').LexiconResolverPort} [options.lexiconResolver] - Lexicon resolver for record validation 82 84 * @returns {Promise<PdsServer>} 83 85 */ 84 86 export async function createServer({ ··· 92 94 appviewDid, 93 95 relayUrl, 94 96 password, 97 + lexiconResolver, 95 98 }) { 96 99 const db = new DatabaseSync(dbPath); 97 100 const actorStorage = createActorStorage(db); ··· 101 104 } 102 105 const blobs = blobsArg ?? createDenoBlobs(/** @type {string} */ (blobsDir)); 103 106 const webSocket = createWebSocketPort(); 107 + const resolver = lexiconResolver ?? new LexiconResolver(); 104 108 105 109 const pds = new PersonalDataServer({ 106 110 actorStorage, ··· 113 117 appviewDid, 114 118 relayUrl, 115 119 password, 120 + lexiconResolver: resolver, 116 121 }); 117 122 118 123 /** @type {Deno.HttpServer | null} */
+19
packages/lexicon-resolver/package.json
··· 1 + { 2 + "name": "@pds/lexicon-resolver", 3 + "version": "0.1.0", 4 + "type": "module", 5 + "main": "src/index.js", 6 + "exports": { 7 + ".": "./src/index.js", 8 + "./resolver": "./src/resolver.js", 9 + "./dns": "./src/dns.js", 10 + "./did": "./src/did.js", 11 + "./authority": "./src/authority.js" 12 + }, 13 + "dependencies": { 14 + "@bigmoves/lexicon": "^0.2.0" 15 + }, 16 + "devDependencies": { 17 + "vitest": "^2.1.8" 18 + } 19 + }
+28
packages/lexicon-resolver/src/authority.js
··· 1 + /** 2 + * Extract authority from NSID and convert to domain. 3 + * NSID format: <authority>.<name> (e.g., app.bsky.feed.post) 4 + * 5 + * Per AT Protocol spec, authority is a reversed domain name with 2+ segments. 6 + * Standard NSIDs use 2-segment authorities (app.bsky, com.atproto). 7 + * We only support 2-segment authorities as this covers all real-world usage. 8 + * 9 + * @param {string} nsid - e.g., "app.bsky.feed.post" 10 + * @returns {string} - e.g., "bsky.app" 11 + */ 12 + export function nsidToDomain(nsid) { 13 + const parts = nsid.split('.'); 14 + // Take first two segments as authority (standard AT Protocol convention) 15 + // Reverse: [app, bsky] -> [bsky, app] -> "bsky.app" 16 + return parts.slice(0, 2).reverse().join('.'); 17 + } 18 + 19 + /** 20 + * Convert domain to AT Protocol lexicon DNS lookup name. 21 + * Uses _lexicon subdomain per AT Protocol spec for lexicon authority resolution. 22 + * 23 + * @param {string} domain - e.g., "bsky.app" 24 + * @returns {string} - e.g., "_lexicon.bsky.app" 25 + */ 26 + export function domainToLexiconDns(domain) { 27 + return `_lexicon.${domain}`; 28 + }
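Worked end to end, the two helpers turn an NSID into the `_lexicon.` DNS name used for the authority lookup (function bodies inlined from `authority.js` above):

```javascript
// Inlined from authority.js: take the 2-segment authority and reverse it
function nsidToDomain(nsid) {
  return nsid.split('.').slice(0, 2).reverse().join('.');
}
function domainToLexiconDns(domain) {
  return `_lexicon.${domain}`;
}

// app.bsky.feed.post -> authority [app, bsky] -> domain bsky.app
const domain = nsidToDomain('app.bsky.feed.post');
console.log(domain); // bsky.app
console.log(domainToLexiconDns(domain)); // _lexicon.bsky.app
```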
+60
packages/lexicon-resolver/src/did.js
··· 1 + const DEFAULT_PLC_URL = 'https://plc.directory'; 2 + 3 + /** 4 + * @typedef {{id: string, type: string, serviceEndpoint: string}} DidService 5 + * @typedef {{service?: DidService[]}} DidDocument 6 + */ 7 + 8 + /** 9 + * Extract PDS service endpoint from DID document. 10 + * 11 + * @param {DidDocument} didDoc - DID document 12 + * @returns {string|null} - PDS endpoint URL or null 13 + */ 14 + export function extractPdsEndpoint(didDoc) { 15 + if (!didDoc?.service || !Array.isArray(didDoc.service)) { 16 + return null; 17 + } 18 + 19 + const pdsService = didDoc.service.find( 20 + (/** @type {DidService} */ s) => 21 + s.id === '#atproto_pds' || s.type === 'AtprotoPersonalDataServer', 22 + ); 23 + 24 + return pdsService?.serviceEndpoint ?? null; 25 + } 26 + 27 + /** 28 + * Resolve a DID to its PDS service endpoint. 29 + * 30 + * @param {string} did - DID string (did:plc:... or did:web:...) 31 + * @param {Object} [opts] 32 + * @param {string} [opts.plcUrl] - PLC directory URL 33 + * @param {typeof fetch} [opts.fetch] - Fetch implementation 34 + * @returns {Promise<string|null>} - PDS endpoint URL or null 35 + */ 36 + export async function resolveDid(did, opts = {}) { 37 + const plcUrl = opts.plcUrl ?? DEFAULT_PLC_URL; 38 + const fetchFn = opts.fetch ?? globalThis.fetch; 39 + 40 + try { 41 + let didDocUrl; 42 + 43 + if (did.startsWith('did:plc:')) { 44 + didDocUrl = `${plcUrl}/${did}`; 45 + } else if (did.startsWith('did:web:')) { 46 + const domain = did.slice('did:web:'.length); 47 + didDocUrl = `https://${domain}/.well-known/did.json`; 48 + } else { 49 + return null; 50 + } 51 + 52 + const response = await fetchFn(didDocUrl); 53 + if (!response.ok) return null; 54 + 55 + const didDoc = await response.json(); 56 + return extractPdsEndpoint(didDoc); 57 + } catch { 58 + return null; 59 + } 60 + }
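Because the lookup matches on either the service `id` or the service `type`, both common DID-document shapes resolve; a quick sketch (function body inlined from `did.js` above, documents illustrative):

```javascript
// Inlined from did.js: find the PDS service by id or by type
function extractPdsEndpoint(didDoc) {
  if (!didDoc?.service || !Array.isArray(didDoc.service)) return null;
  const pdsService = didDoc.service.find(
    (s) => s.id === '#atproto_pds' || s.type === 'AtprotoPersonalDataServer',
  );
  return pdsService?.serviceEndpoint ?? null;
}

const byId = {
  service: [{ id: '#atproto_pds', type: 'Other', serviceEndpoint: 'https://a.example' }],
};
const byType = {
  service: [{ id: '#pds', type: 'AtprotoPersonalDataServer', serviceEndpoint: 'https://b.example' }],
};

console.log(extractPdsEndpoint(byId)); // https://a.example
console.log(extractPdsEndpoint(byType)); // https://b.example
console.log(extractPdsEndpoint({})); // null
```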
+62
packages/lexicon-resolver/src/dns.js
··· 1 + import { isValidDid } from '@bigmoves/lexicon'; 2 + 3 + const DEFAULT_DOH_URL = 'https://cloudflare-dns.com/dns-query'; 4 + 5 + /** 6 + * @typedef {{type: number, data?: string}} DnsRecord 7 + */ 8 + 9 + /** 10 + * Parse DID from AT Protocol TXT record value. 11 + * Format: "did=did:plc:xyz123" or did=did:plc:xyz123 12 + * Validates DID format using @bigmoves/lexicon. 13 + * 14 + * @param {string} txt - TXT record value 15 + * @returns {string|null} - DID or null if invalid 16 + */ 17 + export function parseDidFromTxt(txt) { 18 + if (!txt) return null; 19 + // Remove surrounding quotes if present 20 + const unquoted = txt.replace(/^"|"$/g, ''); 21 + const match = unquoted.match(/^did=(.+)$/); 22 + if (!match) return null; 23 + const did = match[1]; 24 + return isValidDid(did) ? did : null; 25 + } 26 + 27 + /** 28 + * Lookup TXT records via DNS-over-HTTPS. 29 + * 30 + * @param {string} domain - Domain to lookup (e.g., "_atproto.bsky.app") 31 + * @param {Object} [opts] 32 + * @param {string} [opts.dohUrl] - DoH endpoint URL 33 + * @param {typeof fetch} [opts.fetch] - Fetch implementation 34 + * @returns {Promise<string[]>} - Array of TXT record values 35 + */ 36 + export async function lookupTxt(domain, opts = {}) { 37 + const dohUrl = opts.dohUrl ?? DEFAULT_DOH_URL; 38 + const fetchFn = opts.fetch ?? globalThis.fetch; 39 + 40 + try { 41 + const url = `${dohUrl}?name=${encodeURIComponent(domain)}&type=TXT`; 42 + const response = await fetchFn(url, { 43 + headers: { Accept: 'application/dns-json' }, 44 + }); 45 + 46 + if (!response.ok) return []; 47 + 48 + const data = await response.json(); 49 + if (!data.Answer) return []; 50 + 51 + // Extract TXT record values (type 16 = TXT) 52 + return data.Answer.filter( 53 + (/** @type {DnsRecord} */ record) => record.type === 16, 54 + ).map((/** @type {DnsRecord} */ record) => { 55 + // Remove surrounding quotes from data field 56 + const value = record.data?.replace(/^"|"$/g, '') ?? 
''; 57 + return value; 58 + }); 59 + } catch { 60 + return []; 61 + } 62 + }
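The TXT parsing step above reduces to a small pure function. A standalone sketch of parseDidFromTxt, substituting a simple `did:<method>:<id>` regex as a stand-in for the diff's `isValidDid` from `@bigmoves/lexicon`:

```javascript
// Sketch of parseDidFromTxt from dns.js. The real implementation validates
// the DID via @bigmoves/lexicon's isValidDid; this stand-in only checks the
// basic did:<method>:<id> shape.
function parseDidFromTxt(txt) {
  if (!txt) return null;
  const unquoted = txt.replace(/^"|"$/g, ''); // strip surrounding quotes
  const match = unquoted.match(/^did=(.+)$/);
  if (!match) return null;
  const did = match[1];
  return /^did:[a-z]+:.+$/.test(did) ? did : null;
}

console.log(parseDidFromTxt('"did=did:plc:xyz123"')); // did:plc:xyz123
console.log(parseDidFromTxt('did=did:plc:')); // null (missing identifier)
```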
+5
packages/lexicon-resolver/src/index.js
··· 1 + export { LexiconResolver } from './resolver.js'; 2 + export { lookupTxt, parseDidFromTxt } from './dns.js'; 3 + export { resolveDid, extractPdsEndpoint } from './did.js'; 4 + export { nsidToDomain, domainToLexiconDns } from './authority.js'; 5 + export { coreSchemas } from './schemas/com-atproto.js';
+272
packages/lexicon-resolver/src/resolver.js
··· 1 + import { validateRecord } from '@bigmoves/lexicon'; 2 + import { lookupTxt, parseDidFromTxt } from './dns.js'; 3 + import { resolveDid } from './did.js'; 4 + import { nsidToDomain, domainToLexiconDns } from './authority.js'; 5 + import { coreSchemas } from './schemas/com-atproto.js'; 6 + 7 + /** 8 + * @typedef {import('@pds/core/ports').ValidationStatus} ValidationStatus 9 + * @typedef {import('@bigmoves/lexicon').Lexicon} Lexicon 10 + */ 11 + 12 + /** 13 + * Lexicon resolver with full AT Protocol spec compliance. 14 + * Resolves schemas via DNS authority lookup, DID resolution, and PDS fetch. 15 + * Recursively resolves all referenced schemas for proper validation. 16 + * Validates records using @bigmoves/lexicon. 17 + * 18 + * Implements LexiconResolverPort. 19 + */ 20 + export class LexiconResolver { 21 + /** 22 + * @param {Object} [opts] 23 + * @param {number} [opts.ttl=3600000] - Cache TTL in ms (default 1 hour) 24 + * @param {typeof fetch} [opts.fetch] - Fetch implementation 25 + * @param {string} [opts.dohUrl] - DNS-over-HTTPS endpoint 26 + * @param {string} [opts.plcUrl] - PLC directory URL 27 + * @param {Lexicon[]} [opts.schemas] - Pre-loaded static schemas 28 + */ 29 + constructor(opts = {}) { 30 + /** @type {Map<string, Lexicon>} */ 31 + this.schemas = new Map(); 32 + /** @type {Map<string, {fetchedAt: number}>} */ 33 + this.cache = new Map(); 34 + /** @type {Set<string>} */ 35 + this.pendingFetches = new Set(); 36 + this.ttl = opts.ttl ?? 3600000; 37 + this.fetch = opts.fetch ?? globalThis.fetch; 38 + this.dohUrl = opts.dohUrl; 39 + this.plcUrl = opts.plcUrl; 40 + 41 + // Pre-load core com.atproto schemas (not published via DNS) 42 + for (const schema of coreSchemas) { 43 + this.schemas.set(schema.id, schema); 44 + this.cache.set(schema.id, { fetchedAt: Date.now() }); 45 + } 46 + 47 + // Pre-load user-provided static schemas 48 + for (const schema of opts.schemas ?? 
[]) { 49 + this.schemas.set(schema.id, schema); 50 + this.cache.set(schema.id, { fetchedAt: Date.now() }); 51 + } 52 + } 53 + 54 + /** 55 + * Validate a record against its schema. 56 + * @param {string} collection - The collection/NSID (e.g., 'app.bsky.feed.post') 57 + * @param {object} record - The record to validate 58 + * @returns {Promise<ValidationStatus>} 59 + */ 60 + async assertValid(collection, record) { 61 + const hasSchema = await this.ensureSchemaWithRefs(collection); 62 + if (!hasSchema) { 63 + return 'unknown'; 64 + } 65 + 66 + // Pass ALL loaded schemas to validateRecord for ref resolution 67 + const allSchemas = Array.from(this.schemas.values()); 68 + const error = validateRecord(allSchemas, collection, record); 69 + if (error) { 70 + const path = error.path ? `${error.path}: ` : ''; 71 + throw new Error(`${path}${error.message}`); 72 + } 73 + return 'valid'; 74 + } 75 + 76 + /** 77 + * Ensure a schema and all its refs are loaded. 78 + * @param {string} nsid 79 + * @returns {Promise<boolean>} - true if schema is available 80 + */ 81 + async ensureSchemaWithRefs(nsid) { 82 + // First ensure the main schema 83 + const hasMain = await this.ensureSchema(nsid); 84 + if (!hasMain) { 85 + return false; 86 + } 87 + 88 + // Get the schema and find all refs 89 + const schema = this.schemas.get(nsid); 90 + if (!schema) { 91 + return false; 92 + } 93 + 94 + // Extract and resolve all refs recursively 95 + const refs = this.extractRefs(schema); 96 + for (const ref of refs) { 97 + // Skip if already loaded and fresh 98 + const cached = this.cache.get(ref); 99 + const isFresh = cached && Date.now() - cached.fetchedAt < this.ttl; 100 + if (isFresh && this.schemas.has(ref)) { 101 + continue; 102 + } 103 + 104 + // Recursively ensure this ref and its refs 105 + await this.ensureSchemaWithRefs(ref); 106 + } 107 + 108 + return true; 109 + } 110 + 111 + /** 112 + * Ensure a single schema is loaded (without refs). 
113 + * @param {string} nsid 114 + * @returns {Promise<boolean>} - true if schema is available 115 + */ 116 + async ensureSchema(nsid) { 117 + // Check if already loaded and not stale 118 + const cached = this.cache.get(nsid); 119 + const isFresh = cached && Date.now() - cached.fetchedAt < this.ttl; 120 + 121 + if (isFresh) { 122 + return this.schemas.has(nsid); 123 + } 124 + 125 + // Prevent concurrent fetches of the same schema 126 + if (this.pendingFetches.has(nsid)) { 127 + // Wait a bit and check if it's now available 128 + await new Promise((r) => setTimeout(r, 100)); 129 + return this.schemas.has(nsid); 130 + } 131 + 132 + // Stale or not cached - try to fetch/refresh 133 + this.pendingFetches.add(nsid); 134 + try { 135 + const lexicon = await this.fetchLexicon(nsid); 136 + if (lexicon) { 137 + this.schemas.set(nsid, lexicon); 138 + this.cache.set(nsid, { fetchedAt: Date.now() }); 139 + return true; 140 + } 141 + } catch { 142 + // Resolution failed - fall back to existing schema if available 143 + } finally { 144 + this.pendingFetches.delete(nsid); 145 + } 146 + 147 + // If fetch failed but we have an existing (stale) schema, use it 148 + return this.schemas.has(nsid); 149 + } 150 + 151 + /** 152 + * Extract all ref NSIDs from a lexicon schema. 
153 + * @param {Lexicon} schema 154 + * @returns {string[]} - Array of referenced NSIDs 155 + */ 156 + extractRefs(schema) { 157 + /** @type {Set<string>} */ 158 + const refs = new Set(); 159 + 160 + /** 161 + * Recursively walk object to find refs 162 + * @param {unknown} obj 163 + */ 164 + const walk = (obj) => { 165 + if (!obj || typeof obj !== 'object') return; 166 + 167 + if (Array.isArray(obj)) { 168 + for (const item of obj) { 169 + walk(item); 170 + } 171 + return; 172 + } 173 + 174 + const record = /** @type {Record<string, unknown>} */ (obj); 175 + 176 + // Check for ref type 177 + if (record.type === 'ref' && typeof record.ref === 'string') { 178 + const ref = record.ref; 179 + // Extract NSID from ref (could be "com.atproto.repo.strongRef" or "lex:com.atproto.repo.strongRef#main") 180 + const nsid = this.refToNsid(ref); 181 + if (nsid) { 182 + refs.add(nsid); 183 + } 184 + } 185 + 186 + // Check for union refs 187 + if (record.type === 'union' && Array.isArray(record.refs)) { 188 + for (const ref of record.refs) { 189 + if (typeof ref === 'string') { 190 + const nsid = this.refToNsid(ref); 191 + if (nsid) { 192 + refs.add(nsid); 193 + } 194 + } 195 + } 196 + } 197 + 198 + // Recurse into nested objects 199 + for (const value of Object.values(record)) { 200 + walk(value); 201 + } 202 + }; 203 + 204 + walk(schema); 205 + return Array.from(refs); 206 + } 207 + 208 + /** 209 + * Convert a ref string to an NSID. 210 + * @param {string} ref - e.g., "com.atproto.repo.strongRef" or "lex:app.bsky.feed.post#main" 211 + * @returns {string|null} - The NSID or null if invalid 212 + */ 213 + refToNsid(ref) { 214 + // Remove lex: prefix if present 215 + let nsid = ref.startsWith('lex:') ? 
ref.slice(4) : ref; 216 + 217 + // Remove fragment (e.g., #main, #strongRef) 218 + const hashIndex = nsid.indexOf('#'); 219 + if (hashIndex !== -1) { 220 + nsid = nsid.slice(0, hashIndex); 221 + } 222 + 223 + // Validate it looks like an NSID (has at least 3 segments) 224 + const parts = nsid.split('.'); 225 + if (parts.length >= 3) { 226 + return nsid; 227 + } 228 + 229 + return null; 230 + } 231 + 232 + /** 233 + * Fetch a lexicon via the AT Protocol resolution chain. 234 + * @param {string} nsid 235 + * @returns {Promise<Lexicon|null>} 236 + */ 237 + async fetchLexicon(nsid) { 238 + // 1. Convert NSID to domain 239 + const domain = nsidToDomain(nsid); 240 + const dnsName = domainToLexiconDns(domain); 241 + 242 + // 2. DNS TXT lookup 243 + const txtRecords = await lookupTxt(dnsName, { 244 + dohUrl: this.dohUrl, 245 + fetch: this.fetch, 246 + }); 247 + 248 + // 3. Parse DID from TXT record 249 + let did = null; 250 + for (const txt of txtRecords) { 251 + did = parseDidFromTxt(txt); 252 + if (did) break; 253 + } 254 + if (!did) return null; 255 + 256 + // 4. Resolve DID to PDS endpoint 257 + const pdsEndpoint = await resolveDid(did, { 258 + plcUrl: this.plcUrl, 259 + fetch: this.fetch, 260 + }); 261 + if (!pdsEndpoint) return null; 262 + 263 + // 5. Fetch lexicon from PDS 264 + const url = `${pdsEndpoint}/xrpc/com.atproto.repo.getRecord?repo=${encodeURIComponent(did)}&collection=com.atproto.lexicon.schema&rkey=${encodeURIComponent(nsid)}`; 265 + 266 + const response = await this.fetch(url); 267 + if (!response.ok) return null; 268 + 269 + const data = await response.json(); 270 + return data.value ?? null; 271 + } 272 + }
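The ref normalization used during recursive ref resolution is also pure; a standalone sketch of the same logic as refToNsid in resolver.js:

```javascript
// Sketch of refToNsid from resolver.js: strip the lex: prefix and any
// #fragment, then require at least three dotted segments to count as an NSID.
function refToNsid(ref) {
  let nsid = ref.startsWith('lex:') ? ref.slice(4) : ref;
  const hashIndex = nsid.indexOf('#');
  if (hashIndex !== -1) nsid = nsid.slice(0, hashIndex);
  return nsid.split('.').length >= 3 ? nsid : null;
}

console.log(refToNsid('lex:app.bsky.feed.post#main')); // app.bsky.feed.post
console.log(refToNsid('com.atproto.repo.strongRef')); // com.atproto.repo.strongRef
console.log(refToNsid('#selfLabel')); // null (local ref, not an NSID)
```

Local fragment-only refs (like `#selfLabel` in the bundled label.defs schema) deliberately return null, so they are not queued for cross-schema resolution.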
+269
packages/lexicon-resolver/src/schemas/com-atproto.js
··· 1 + /** 2 + * Core com.atproto lexicon schemas. 3 + * These are bundled because they aren't published via DNS-based lexicon resolution. 4 + * Source: https://github.com/bluesky-social/atproto/tree/main/lexicons/com/atproto 5 + */ 6 + 7 + export const comAtprotoRepoStrongRef = { 8 + lexicon: 1, 9 + id: 'com.atproto.repo.strongRef', 10 + description: 'A URI with a content-hash fingerprint.', 11 + defs: { 12 + main: { 13 + type: 'object', 14 + required: ['uri', 'cid'], 15 + properties: { 16 + uri: { type: 'string', format: 'at-uri' }, 17 + cid: { type: 'string', format: 'cid' }, 18 + }, 19 + }, 20 + }, 21 + }; 22 + 23 + export const comAtprotoRepoDefs = { 24 + lexicon: 1, 25 + id: 'com.atproto.repo.defs', 26 + defs: { 27 + commitMeta: { 28 + type: 'object', 29 + required: ['cid', 'rev'], 30 + properties: { 31 + cid: { type: 'string', format: 'cid' }, 32 + rev: { type: 'string', format: 'tid' }, 33 + }, 34 + }, 35 + }, 36 + }; 37 + 38 + export const comAtprotoLabelDefs = { 39 + lexicon: 1, 40 + id: 'com.atproto.label.defs', 41 + defs: { 42 + label: { 43 + type: 'object', 44 + description: 'Metadata tag on an atproto resource (eg, repo or record).', 45 + required: ['src', 'uri', 'val', 'cts'], 46 + properties: { 47 + ver: { 48 + type: 'integer', 49 + description: 'The AT Protocol version of the label object.', 50 + }, 51 + src: { 52 + type: 'string', 53 + format: 'did', 54 + description: 'DID of the actor who created this label.', 55 + }, 56 + uri: { 57 + type: 'string', 58 + format: 'uri', 59 + description: 60 + 'AT URI of the record, repository (account), or other resource that this label applies to.', 61 + }, 62 + cid: { 63 + type: 'string', 64 + format: 'cid', 65 + description: 66 + "Optionally, CID specifying the specific version of 'uri' resource this label applies to.", 67 + }, 68 + val: { 69 + type: 'string', 70 + maxLength: 128, 71 + description: 72 + 'The short string name of the value or type of this label.', 73 + }, 74 + neg: { 75 + type: 'boolean', 76 + 
description: 77 + 'If true, this is a negation label, overwriting a previous label.', 78 + }, 79 + cts: { 80 + type: 'string', 81 + format: 'datetime', 82 + description: 'Timestamp when this label was created.', 83 + }, 84 + exp: { 85 + type: 'string', 86 + format: 'datetime', 87 + description: 88 + 'Timestamp at which this label expires (no longer applies).', 89 + }, 90 + sig: { 91 + type: 'bytes', 92 + description: 'Signature of dag-cbor encoded label.', 93 + }, 94 + }, 95 + }, 96 + selfLabels: { 97 + type: 'object', 98 + description: 99 + 'Metadata tags on an atproto record, published by the author within the record.', 100 + required: ['values'], 101 + properties: { 102 + values: { 103 + type: 'array', 104 + items: { type: 'ref', ref: '#selfLabel' }, 105 + maxLength: 10, 106 + }, 107 + }, 108 + }, 109 + selfLabel: { 110 + type: 'object', 111 + description: 112 + 'Metadata tag on an atproto record, published by the author within the record. Note that schemas should use #selfLabels, not #selfLabel.', 113 + required: ['val'], 114 + properties: { 115 + val: { 116 + type: 'string', 117 + maxLength: 128, 118 + description: 119 + 'The short string name of the value or type of this label.', 120 + }, 121 + }, 122 + }, 123 + labelValueDefinition: { 124 + type: 'object', 125 + description: 126 + 'Declares a label value and its expected interpretations and behaviors.', 127 + required: ['identifier', 'severity', 'blurs', 'locales'], 128 + properties: { 129 + identifier: { 130 + type: 'string', 131 + description: 132 + "The value of the label being defined. Must only include lowercase ascii and the '-' character ([a-z-]+).", 133 + maxLength: 100, 134 + maxGraphemes: 100, 135 + }, 136 + severity: { 137 + type: 'string', 138 + description: 139 + "How should a client visually convey this label? 
'inform' means neutral and informational; 'alert' means negative and warning; 'none' means show nothing.", 140 + knownValues: ['inform', 'alert', 'none'], 141 + }, 142 + blurs: { 143 + type: 'string', 144 + description: 145 + "What should this label hide in the UI, if applied? 'content' hides all of the target; 'media' hides the images/video/audio; 'none' hides nothing.", 146 + knownValues: ['content', 'media', 'none'], 147 + }, 148 + defaultSetting: { 149 + type: 'string', 150 + description: 'The default setting for this label.', 151 + knownValues: ['ignore', 'warn', 'hide'], 152 + default: 'warn', 153 + }, 154 + adultOnly: { 155 + type: 'boolean', 156 + description: 157 + 'Does the user need to have adult content enabled in order to configure this label?', 158 + }, 159 + locales: { 160 + type: 'array', 161 + items: { type: 'ref', ref: '#labelValueDefinitionStrings' }, 162 + }, 163 + }, 164 + }, 165 + labelValueDefinitionStrings: { 166 + type: 'object', 167 + description: 168 + 'Strings which describe the label in the UI, localized into a specific language.', 169 + required: ['lang', 'name', 'description'], 170 + properties: { 171 + lang: { 172 + type: 'string', 173 + description: 'The code of the language these strings are written in.', 174 + format: 'language', 175 + }, 176 + name: { 177 + type: 'string', 178 + description: 'A short human-readable name for the label.', 179 + maxGraphemes: 64, 180 + maxLength: 640, 181 + }, 182 + description: { 183 + type: 'string', 184 + description: 185 + 'A longer description of what the label means and why it might be applied.', 186 + maxGraphemes: 10000, 187 + maxLength: 100000, 188 + }, 189 + }, 190 + }, 191 + labelValue: { 192 + type: 'string', 193 + knownValues: [ 194 + '!hide', 195 + '!no-promote', 196 + '!warn', 197 + '!no-unauthenticated', 198 + 'dmca-violation', 199 + 'doxxing', 200 + 'porn', 201 + 'sexual', 202 + 'nudity', 203 + 'nsfl', 204 + 'gore', 205 + ], 206 + }, 207 + }, 208 + }; 209 + 210 + export const 
comAtprotoModerationDefs = { 211 + lexicon: 1, 212 + id: 'com.atproto.moderation.defs', 213 + defs: { 214 + reasonType: { 215 + type: 'string', 216 + knownValues: [ 217 + 'com.atproto.moderation.defs#reasonSpam', 218 + 'com.atproto.moderation.defs#reasonViolation', 219 + 'com.atproto.moderation.defs#reasonMisleading', 220 + 'com.atproto.moderation.defs#reasonSexual', 221 + 'com.atproto.moderation.defs#reasonRude', 222 + 'com.atproto.moderation.defs#reasonOther', 223 + 'com.atproto.moderation.defs#reasonAppeal', 224 + ], 225 + }, 226 + reasonSpam: { 227 + type: 'token', 228 + description: 'Spam: frequent unwanted promotion, replies, mentions.', 229 + }, 230 + reasonViolation: { 231 + type: 'token', 232 + description: 'Direct violation of server rules, laws, terms of service.', 233 + }, 234 + reasonMisleading: { 235 + type: 'token', 236 + description: 'Misleading identity, affiliation, or content.', 237 + }, 238 + reasonSexual: { 239 + type: 'token', 240 + description: 'Unwanted or mislabeled sexual content.', 241 + }, 242 + reasonRude: { 243 + type: 'token', 244 + description: 245 + 'Rude, harassing, explicit, or otherwise unwelcoming behavior.', 246 + }, 247 + reasonOther: { 248 + type: 'token', 249 + description: 'Reports not falling under another report category.', 250 + }, 251 + reasonAppeal: { 252 + type: 'token', 253 + description: 'Appeal a previously taken moderation action.', 254 + }, 255 + subjectType: { 256 + type: 'string', 257 + description: 'Tag describing a type of subject that might be reported.', 258 + knownValues: ['account', 'record', 'chat'], 259 + }, 260 + }, 261 + }; 262 + 263 + /** All core com.atproto schemas */ 264 + export const coreSchemas = [ 265 + comAtprotoRepoStrongRef, 266 + comAtprotoRepoDefs, 267 + comAtprotoLabelDefs, 268 + comAtprotoModerationDefs, 269 + ];
+24
packages/lexicon-resolver/test/authority.test.js
··· 1 + import { describe, expect, test } from 'vitest'; 2 + import { nsidToDomain, domainToLexiconDns } from '../src/authority.js'; 3 + 4 + describe('authority', () => { 5 + describe('nsidToDomain', () => { 6 + test('converts app.bsky.feed.post to bsky.app', () => { 7 + expect(nsidToDomain('app.bsky.feed.post')).toBe('bsky.app'); 8 + }); 9 + 10 + test('converts com.atproto.repo.createRecord to atproto.com', () => { 11 + expect(nsidToDomain('com.atproto.repo.createRecord')).toBe('atproto.com'); 12 + }); 13 + 14 + test('converts org.example.lexicon.test to example.org', () => { 15 + expect(nsidToDomain('org.example.lexicon.test')).toBe('example.org'); 16 + }); 17 + }); 18 + 19 + describe('domainToLexiconDns', () => { 20 + test('prepends _lexicon to domain', () => { 21 + expect(domainToLexiconDns('bsky.app')).toBe('_lexicon.bsky.app'); 22 + }); 23 + }); 24 + });
+103
packages/lexicon-resolver/test/did.test.js
··· 1 + import { describe, expect, test, vi } from 'vitest'; 2 + import { resolveDid, extractPdsEndpoint } from '../src/did.js'; 3 + 4 + describe('did', () => { 5 + describe('extractPdsEndpoint', () => { 6 + test('extracts PDS endpoint from DID document', () => { 7 + const didDoc = { 8 + service: [ 9 + { 10 + id: '#atproto_pds', 11 + type: 'AtprotoPersonalDataServer', 12 + serviceEndpoint: 'https://pds.example.com', 13 + }, 14 + ], 15 + }; 16 + expect(extractPdsEndpoint(didDoc)).toBe('https://pds.example.com'); 17 + }); 18 + 19 + test('finds PDS by type if id differs', () => { 20 + const didDoc = { 21 + service: [ 22 + { 23 + id: '#pds', 24 + type: 'AtprotoPersonalDataServer', 25 + serviceEndpoint: 'https://pds.example.com', 26 + }, 27 + ], 28 + }; 29 + expect(extractPdsEndpoint(didDoc)).toBe('https://pds.example.com'); 30 + }); 31 + 32 + test('returns null if no PDS service', () => { 33 + const didDoc = { service: [] }; 34 + expect(extractPdsEndpoint(didDoc)).toBeNull(); 35 + }); 36 + 37 + test('returns null if no service array', () => { 38 + const didDoc = {}; 39 + expect(extractPdsEndpoint(didDoc)).toBeNull(); 40 + }); 41 + }); 42 + 43 + describe('resolveDid', () => { 44 + test('resolves did:plc via PLC directory', async () => { 45 + const mockFetch = vi.fn().mockResolvedValue({ 46 + ok: true, 47 + json: () => 48 + Promise.resolve({ 49 + service: [ 50 + { 51 + id: '#atproto_pds', 52 + type: 'AtprotoPersonalDataServer', 53 + serviceEndpoint: 'https://bsky.social', 54 + }, 55 + ], 56 + }), 57 + }); 58 + 59 + const endpoint = await resolveDid('did:plc:xyz123', { fetch: mockFetch }); 60 + 61 + expect(mockFetch).toHaveBeenCalledWith( 62 + 'https://plc.directory/did:plc:xyz123', 63 + ); 64 + expect(endpoint).toBe('https://bsky.social'); 65 + }); 66 + 67 + test('resolves did:web via .well-known', async () => { 68 + const mockFetch = vi.fn().mockResolvedValue({ 69 + ok: true, 70 + json: () => 71 + Promise.resolve({ 72 + service: [ 73 + { 74 + id: '#atproto_pds', 75 + 
type: 'AtprotoPersonalDataServer', 76 + serviceEndpoint: 'https://example.com', 77 + }, 78 + ], 79 + }), 80 + }); 81 + 82 + const endpoint = await resolveDid('did:web:example.com', { 83 + fetch: mockFetch, 84 + }); 85 + 86 + expect(mockFetch).toHaveBeenCalledWith( 87 + 'https://example.com/.well-known/did.json', 88 + ); 89 + expect(endpoint).toBe('https://example.com'); 90 + }); 91 + 92 + test('returns null for unknown DID method', async () => { 93 + const endpoint = await resolveDid('did:unknown:xyz', { fetch: vi.fn() }); 94 + expect(endpoint).toBeNull(); 95 + }); 96 + 97 + test('returns null on fetch error', async () => { 98 + const mockFetch = vi.fn().mockRejectedValue(new Error('Network error')); 99 + const endpoint = await resolveDid('did:plc:xyz', { fetch: mockFetch }); 100 + expect(endpoint).toBeNull(); 101 + }); 102 + }); 103 + });
+90
packages/lexicon-resolver/test/dns.test.js
··· 1 + import { describe, expect, test, vi } from 'vitest'; 2 + import { lookupTxt, parseDidFromTxt } from '../src/dns.js'; 3 + 4 + describe('dns', () => { 5 + describe('parseDidFromTxt', () => { 6 + test('extracts DID from TXT record value', () => { 7 + expect(parseDidFromTxt('did=did:plc:xyz123')).toBe('did:plc:xyz123'); 8 + }); 9 + 10 + test('handles quoted TXT record', () => { 11 + expect(parseDidFromTxt('"did=did:plc:xyz123"')).toBe('did:plc:xyz123'); 12 + }); 13 + 14 + test('returns null for invalid format', () => { 15 + expect(parseDidFromTxt('invalid')).toBeNull(); 16 + }); 17 + 18 + test('returns null for empty string', () => { 19 + expect(parseDidFromTxt('')).toBeNull(); 20 + }); 21 + 22 + test('returns null for invalid DID format', () => { 23 + expect(parseDidFromTxt('did=not-a-valid-did')).toBeNull(); 24 + expect(parseDidFromTxt('did=did:')).toBeNull(); // Missing method and specific-id 25 + expect(parseDidFromTxt('did=did:plc:')).toBeNull(); // Missing specific-id 26 + }); 27 + 28 + test('accepts valid did:web format', () => { 29 + expect(parseDidFromTxt('did=did:web:example.com')).toBe( 30 + 'did:web:example.com', 31 + ); 32 + }); 33 + }); 34 + 35 + describe('lookupTxt', () => { 36 + test('fetches TXT records via DoH', async () => { 37 + const mockFetch = vi.fn().mockResolvedValue({ 38 + ok: true, 39 + json: () => 40 + Promise.resolve({ 41 + Answer: [ 42 + { type: 16, data: '"did=did:plc:z72i7hdynmk6r22z27h6tvur"' }, 43 + ], 44 + }), 45 + }); 46 + 47 + const records = await lookupTxt('_atproto.bsky.app', { 48 + fetch: mockFetch, 49 + }); 50 + 51 + expect(mockFetch).toHaveBeenCalledWith( 52 + 'https://cloudflare-dns.com/dns-query?name=_atproto.bsky.app&type=TXT', 53 + expect.objectContaining({ 54 + headers: { Accept: 'application/dns-json' }, 55 + }), 56 + ); 57 + expect(records).toEqual(['did=did:plc:z72i7hdynmk6r22z27h6tvur']); 58 + }); 59 + 60 + test('returns empty array on network error', async () => { 61 + const mockFetch = 
vi.fn().mockRejectedValue(new Error('Network error')); 62 + const records = await lookupTxt('_atproto.example.com', { 63 + fetch: mockFetch, 64 + }); 65 + expect(records).toEqual([]); 66 + }); 67 + 68 + test('returns empty array when no Answer', async () => { 69 + const mockFetch = vi.fn().mockResolvedValue({ 70 + ok: true, 71 + json: () => Promise.resolve({}), 72 + }); 73 + const records = await lookupTxt('_atproto.example.com', { 74 + fetch: mockFetch, 75 + }); 76 + expect(records).toEqual([]); 77 + }); 78 + 79 + test('returns empty array on HTTP error response', async () => { 80 + const mockFetch = vi.fn().mockResolvedValue({ 81 + ok: false, 82 + status: 500, 83 + }); 84 + const records = await lookupTxt('_atproto.example.com', { 85 + fetch: mockFetch, 86 + }); 87 + expect(records).toEqual([]); 88 + }); 89 + }); 90 + });
+124
packages/lexicon-resolver/test/resolver.test.js
··· 1 + import { describe, expect, test, vi } from 'vitest'; 2 + import { LexiconResolver } from '../src/resolver.js'; 3 + 4 + /** @type {import('@bigmoves/lexicon').Lexicon} */ 5 + const postSchema = { 6 + lexicon: 1, 7 + id: 'app.bsky.feed.post', 8 + defs: { 9 + main: { 10 + type: 'record', 11 + key: 'tid', 12 + record: { 13 + type: 'object', 14 + required: ['text', 'createdAt'], 15 + properties: { 16 + text: { type: 'string', maxLength: 3000 }, 17 + createdAt: { type: 'string', format: 'datetime' }, 18 + }, 19 + }, 20 + }, 21 + }, 22 + }; 23 + 24 + describe('LexiconResolver', () => { 25 + describe('with static schemas', () => { 26 + test('validates record against pre-loaded schema', async () => { 27 + const resolver = new LexiconResolver({ 28 + schemas: [postSchema], 29 + }); 30 + 31 + const status = await resolver.assertValid('app.bsky.feed.post', { 32 + $type: 'app.bsky.feed.post', 33 + text: 'Hello world', 34 + createdAt: new Date().toISOString(), 35 + }); 36 + 37 + expect(status).toBe('valid'); 38 + }); 39 + 40 + test('throws on invalid record', async () => { 41 + const resolver = new LexiconResolver({ 42 + schemas: [postSchema], 43 + }); 44 + 45 + await expect( 46 + resolver.assertValid('app.bsky.feed.post', { 47 + $type: 'app.bsky.feed.post', 48 + // missing required fields 49 + }), 50 + ).rejects.toThrow(); 51 + }); 52 + 53 + test('returns unknown for unloaded schema', async () => { 54 + const resolver = new LexiconResolver({ 55 + schemas: [postSchema], 56 + }); 57 + 58 + const status = await resolver.assertValid('com.example.unknown', { 59 + $type: 'com.example.unknown', 60 + }); 61 + 62 + expect(status).toBe('unknown'); 63 + }); 64 + }); 65 + 66 + describe('ensureSchema', () => { 67 + test('returns true for pre-loaded schema', async () => { 68 + const resolver = new LexiconResolver({ 69 + schemas: [postSchema], 70 + }); 71 + 72 + const result = await resolver.ensureSchema('app.bsky.feed.post'); 73 + expect(result).toBe(true); 74 + }); 75 + 76 + 
test('returns false for unknown schema when resolution fails', async () => { 77 + const mockFetch = vi.fn().mockRejectedValue(new Error('Network error')); 78 + const resolver = new LexiconResolver({ fetch: mockFetch }); 79 + 80 + const result = await resolver.ensureSchema('com.example.unknown'); 81 + expect(result).toBe(false); 82 + }); 83 + }); 84 + 85 + describe('fetchLexicon', () => { 86 + test('resolves lexicon via DNS -> DID -> PDS chain', async () => { 87 + const mockFetch = vi 88 + .fn() 89 + // DNS lookup 90 + .mockResolvedValueOnce({ 91 + ok: true, 92 + json: () => 93 + Promise.resolve({ 94 + Answer: [{ type: 16, data: '"did=did:plc:testdid"' }], 95 + }), 96 + }) 97 + // DID resolution 98 + .mockResolvedValueOnce({ 99 + ok: true, 100 + json: () => 101 + Promise.resolve({ 102 + service: [ 103 + { 104 + id: '#atproto_pds', 105 + type: 'AtprotoPersonalDataServer', 106 + serviceEndpoint: 'https://pds.example.com', 107 + }, 108 + ], 109 + }), 110 + }) 111 + // Lexicon fetch 112 + .mockResolvedValueOnce({ 113 + ok: true, 114 + json: () => Promise.resolve({ value: postSchema }), 115 + }); 116 + 117 + const resolver = new LexiconResolver({ fetch: mockFetch }); 118 + const lexicon = await resolver.fetchLexicon('app.bsky.feed.post'); 119 + 120 + expect(lexicon).toEqual(postSchema); 121 + expect(mockFetch).toHaveBeenCalledTimes(3); 122 + }); 123 + }); 124 + });
+1
packages/node/package.json
··· 10 10 "dependencies": { 11 11 "@pds/blobs-fs": "workspace:*", 12 12 "@pds/core": "workspace:*", 13 + "@pds/lexicon-resolver": "workspace:*", 13 14 "@pds/storage-sqlite": "workspace:*", 14 15 "ws": "^8.19.0" 15 16 },
+7
packages/node/src/index.js
··· 3 3 import { createServer as createHttpServer } from 'node:http'; 4 4 import { createFsBlobs } from '@pds/blobs-fs'; 5 5 import { PersonalDataServer } from '@pds/core'; 6 + import { LexiconResolver } from '@pds/lexicon-resolver'; 6 7 import { createActorStorage, createSharedStorage } from '@pds/storage-sqlite'; 7 8 import { WebSocketServer } from 'ws'; 8 9 ··· 80 81 * @param {string} [options.appviewDid] - AppView DID for service auth 81 82 * @param {string} [options.relayUrl] - Relay URL for firehose notifications (e.g., http://localhost:2470) 82 83 * @param {string} [options.password] - Password for createSession 84 + * @param {import('@pds/core/ports').LexiconResolverPort} [options.lexiconResolver] - Lexicon resolver for record validation 83 85 * @returns {Promise<PdsServer>} 84 86 */ 85 87 export async function createServer({ ··· 93 95 appviewDid, 94 96 relayUrl, 95 97 password, 98 + lexiconResolver, 96 99 }) { 97 100 // Dynamic import for better-sqlite3 (optional peer dependency) 98 101 const Database = (await import('better-sqlite3')).default; ··· 115 118 const wss = new WebSocketServer({ noServer: true }); 116 119 const webSocket = createWebSocket(upgradeMap, wss); 117 120 121 + // Create default lexicon resolver if not provided 122 + const resolver = lexiconResolver ?? new LexiconResolver(); 123 + 118 124 // Create PDS with both storages 119 125 const pds = new PersonalDataServer({ 120 126 actorStorage, ··· 127 133 appviewDid, 128 134 relayUrl, 129 135 password, 136 + lexiconResolver: resolver, 130 137 }); 131 138 132 139 const server = createHttpServer(async (req, res) => {
+529
pnpm-lock.yaml
··· 11 11 '@pds/core': 12 12 specifier: workspace:* 13 13 version: link:packages/core 14 + '@pds/lexicon-resolver': 15 + specifier: workspace:* 16 + version: link:packages/lexicon-resolver 14 17 '@pds/node': 15 18 specifier: workspace:* 16 19 version: link:packages/node ··· 18 21 specifier: ^12.6.0 19 22 version: 12.6.0 20 23 devDependencies: 24 + '@bigmoves/lexicon': 25 + specifier: ^0.2.0 26 + version: 0.2.0 21 27 '@biomejs/biome': 22 28 specifier: ^2.3.11 23 29 version: 2.3.11 ··· 92 98 '@pds/core': 93 99 specifier: workspace:* 94 100 version: link:../core 101 + '@pds/lexicon-resolver': 102 + specifier: workspace:* 103 + version: link:../lexicon-resolver 95 104 96 105 packages/core: {} 97 106 ··· 100 109 '@pds/core': 101 110 specifier: workspace:* 102 111 version: link:../core 112 + '@pds/lexicon-resolver': 113 + specifier: workspace:* 114 + version: link:../lexicon-resolver 103 115 devDependencies: 104 116 '@types/deno': 105 117 specifier: ^2.5.0 106 118 version: 2.5.0 107 119 120 + packages/lexicon-resolver: 121 + dependencies: 122 + '@bigmoves/lexicon': 123 + specifier: ^0.2.0 124 + version: 0.2.0 125 + devDependencies: 126 + vitest: 127 + specifier: ^2.1.8 128 + version: 2.1.9(@types/node@25.0.6) 129 + 108 130 packages/node: 109 131 dependencies: 110 132 '@pds/blobs-fs': ··· 113 135 '@pds/core': 114 136 specifier: workspace:* 115 137 version: link:../core 138 + '@pds/lexicon-resolver': 139 + specifier: workspace:* 140 + version: link:../lexicon-resolver 116 141 '@pds/storage-sqlite': 117 142 specifier: workspace:* 118 143 version: link:../storage-sqlite ··· 151 176 '@bcoe/v8-coverage@1.0.2': 152 177 resolution: {integrity: sha512-6zABk/ECA/QYSCQ1NGiVwwbQerUCZ+TQbp64Q3AgmfNvurHH0j8TtXa1qbShXA6qqkpAj4V5W8pP6mLe1mcMqA==} 153 178 engines: {node: '>=18'} 179 + 180 + '@bigmoves/lexicon@0.2.0': 181 + resolution: {integrity: sha512-nDdgIF2tIxtnOe7eas0nyJi8kBBEHZQb9McMiZuWWEPx3tBePUoCHqlUAsB/jzPU8Zx3QJ4AlM6qoMIODvvzVQ==} 154 182 155 183 '@biomejs/biome@2.3.11': 156 
184 resolution: {integrity: sha512-/zt+6qazBWguPG6+eWmiELqO+9jRsMZ/DBU3lfuU2ngtIQYzymocHhKiZRyrbra4aCOoyTg/BmY+6WH5mv9xmQ==} ··· 257 285 258 286 '@emnapi/runtime@1.8.1': 259 287 resolution: {integrity: sha512-mehfKSMWjjNol8659Z8KxEMrdSJDDot5SXMq00dM8BN4o+CLNXQ0xH2V7EchNHV4RmbZLmmPdEaXZc5H2FXmDg==} 288 + 289 + '@esbuild/aix-ppc64@0.21.5': 290 + resolution: {integrity: sha512-1SDgH6ZSPTlggy1yI6+Dbkiz8xzpHJEVAlF/AM1tHPLsf5STom9rwtjE4hKAF20FfXXNTFqEYXyJNWh1GiZedQ==} 291 + engines: {node: '>=12'} 292 + cpu: [ppc64] 293 + os: [aix] 260 294 261 295 '@esbuild/aix-ppc64@0.27.0': 262 296 resolution: {integrity: sha512-KuZrd2hRjz01y5JK9mEBSD3Vj3mbCvemhT466rSuJYeE/hjuBrHfjjcjMdTm/sz7au+++sdbJZJmuBwQLuw68A==} ··· 270 304 cpu: [ppc64] 271 305 os: [aix] 272 306 307 + '@esbuild/android-arm64@0.21.5': 308 + resolution: {integrity: sha512-c0uX9VAUBQ7dTDCjq+wdyGLowMdtR/GoC2U5IYk/7D1H1JYC0qseD7+11iMP2mRLN9RcCMRcjC4YMclCzGwS/A==} 309 + engines: {node: '>=12'} 310 + cpu: [arm64] 311 + os: [android] 312 + 273 313 '@esbuild/android-arm64@0.27.0': 274 314 resolution: {integrity: sha512-CC3vt4+1xZrs97/PKDkl0yN7w8edvU2vZvAFGD16n9F0Cvniy5qvzRXjfO1l94efczkkQE6g1x0i73Qf5uthOQ==} 275 315 engines: {node: '>=18'} ··· 282 322 cpu: [arm64] 283 323 os: [android] 284 324 325 + '@esbuild/android-arm@0.21.5': 326 + resolution: {integrity: sha512-vCPvzSjpPHEi1siZdlvAlsPxXl7WbOVUBBAowWug4rJHb68Ox8KualB+1ocNvT5fjv6wpkX6o/iEpbDrf68zcg==} 327 + engines: {node: '>=12'} 328 + cpu: [arm] 329 + os: [android] 330 + 285 331 '@esbuild/android-arm@0.27.0': 286 332 resolution: {integrity: sha512-j67aezrPNYWJEOHUNLPj9maeJte7uSMM6gMoxfPC9hOg8N02JuQi/T7ewumf4tNvJadFkvLZMlAq73b9uwdMyQ==} 287 333 engines: {node: '>=18'} ··· 294 340 cpu: [arm] 295 341 os: [android] 296 342 343 + '@esbuild/android-x64@0.21.5': 344 + resolution: {integrity: sha512-D7aPRUUNHRBwHxzxRvp856rjUHRFW1SdQATKXH2hqA0kAZb1hKmi02OpYRacl0TxIGz/ZmXWlbZgjwWYaCakTA==} 345 + engines: {node: '>=12'} 346 + cpu: [x64] 347 + os: [android] 348 + 297 349 
'@esbuild/android-x64@0.27.0': 298 350 resolution: {integrity: sha512-wurMkF1nmQajBO1+0CJmcN17U4BP6GqNSROP8t0X/Jiw2ltYGLHpEksp9MpoBqkrFR3kv2/te6Sha26k3+yZ9Q==} 299 351 engines: {node: '>=18'} ··· 306 358 cpu: [x64] 307 359 os: [android] 308 360 361 + '@esbuild/darwin-arm64@0.21.5': 362 + resolution: {integrity: sha512-DwqXqZyuk5AiWWf3UfLiRDJ5EDd49zg6O9wclZ7kUMv2WRFr4HKjXp/5t8JZ11QbQfUS6/cRCKGwYhtNAY88kQ==} 363 + engines: {node: '>=12'} 364 + cpu: [arm64] 365 + os: [darwin] 366 + 309 367 '@esbuild/darwin-arm64@0.27.0': 310 368 resolution: {integrity: sha512-uJOQKYCcHhg07DL7i8MzjvS2LaP7W7Pn/7uA0B5S1EnqAirJtbyw4yC5jQ5qcFjHK9l6o/MX9QisBg12kNkdHg==} 311 369 engines: {node: '>=18'} ··· 318 376 cpu: [arm64] 319 377 os: [darwin] 320 378 379 + '@esbuild/darwin-x64@0.21.5': 380 + resolution: {integrity: sha512-se/JjF8NlmKVG4kNIuyWMV/22ZaerB+qaSi5MdrXtd6R08kvs2qCN4C09miupktDitvh8jRFflwGFBQcxZRjbw==} 381 + engines: {node: '>=12'} 382 + cpu: [x64] 383 + os: [darwin] 384 + 321 385 '@esbuild/darwin-x64@0.27.0': 322 386 resolution: {integrity: sha512-8mG6arH3yB/4ZXiEnXof5MK72dE6zM9cDvUcPtxhUZsDjESl9JipZYW60C3JGreKCEP+p8P/72r69m4AZGJd5g==} 323 387 engines: {node: '>=18'} ··· 330 394 cpu: [x64] 331 395 os: [darwin] 332 396 397 + '@esbuild/freebsd-arm64@0.21.5': 398 + resolution: {integrity: sha512-5JcRxxRDUJLX8JXp/wcBCy3pENnCgBR9bN6JsY4OmhfUtIHe3ZW0mawA7+RDAcMLrMIZaf03NlQiX9DGyB8h4g==} 399 + engines: {node: '>=12'} 400 + cpu: [arm64] 401 + os: [freebsd] 402 + 333 403 '@esbuild/freebsd-arm64@0.27.0': 334 404 resolution: {integrity: sha512-9FHtyO988CwNMMOE3YIeci+UV+x5Zy8fI2qHNpsEtSF83YPBmE8UWmfYAQg6Ux7Gsmd4FejZqnEUZCMGaNQHQw==} 335 405 engines: {node: '>=18'} ··· 340 410 resolution: {integrity: sha512-lS/9CN+rgqQ9czogxlMcBMGd+l8Q3Nj1MFQwBZJyoEKI50XGxwuzznYdwcav6lpOGv5BqaZXqvBSiB/kJ5op+g==} 341 411 engines: {node: '>=18'} 342 412 cpu: [arm64] 413 + os: [freebsd] 414 + 415 + '@esbuild/freebsd-x64@0.21.5': 416 + resolution: {integrity: 
sha512-J95kNBj1zkbMXtHVH29bBriQygMXqoVQOQYA+ISs0/2l3T9/kj42ow2mpqerRBxDJnmkUDCaQT/dfNXWX/ZZCQ==} 417 + engines: {node: '>=12'} 418 + cpu: [x64] 343 419 os: [freebsd] 344 420 345 421 '@esbuild/freebsd-x64@0.27.0': ··· 354 430 cpu: [x64] 355 431 os: [freebsd] 356 432 433 + '@esbuild/linux-arm64@0.21.5': 434 + resolution: {integrity: sha512-ibKvmyYzKsBeX8d8I7MH/TMfWDXBF3db4qM6sy+7re0YXya+K1cem3on9XgdT2EQGMu4hQyZhan7TeQ8XkGp4Q==} 435 + engines: {node: '>=12'} 436 + cpu: [arm64] 437 + os: [linux] 438 + 357 439 '@esbuild/linux-arm64@0.27.0': 358 440 resolution: {integrity: sha512-AS18v0V+vZiLJyi/4LphvBE+OIX682Pu7ZYNsdUHyUKSoRwdnOsMf6FDekwoAFKej14WAkOef3zAORJgAtXnlQ==} 359 441 engines: {node: '>=18'} ··· 366 448 cpu: [arm64] 367 449 os: [linux] 368 450 451 + '@esbuild/linux-arm@0.21.5': 452 + resolution: {integrity: sha512-bPb5AHZtbeNGjCKVZ9UGqGwo8EUu4cLq68E95A53KlxAPRmUyYv2D6F0uUI65XisGOL1hBP5mTronbgo+0bFcA==} 453 + engines: {node: '>=12'} 454 + cpu: [arm] 455 + os: [linux] 456 + 369 457 '@esbuild/linux-arm@0.27.0': 370 458 resolution: {integrity: sha512-t76XLQDpxgmq2cNXKTVEB7O7YMb42atj2Re2Haf45HkaUpjM2J0UuJZDuaGbPbamzZ7bawyGFUkodL+zcE+jvQ==} 371 459 engines: {node: '>=18'} ··· 378 466 cpu: [arm] 379 467 os: [linux] 380 468 469 + '@esbuild/linux-ia32@0.21.5': 470 + resolution: {integrity: sha512-YvjXDqLRqPDl2dvRODYmmhz4rPeVKYvppfGYKSNGdyZkA01046pLWyRKKI3ax8fbJoK5QbxblURkwK/MWY18Tg==} 471 + engines: {node: '>=12'} 472 + cpu: [ia32] 473 + os: [linux] 474 + 381 475 '@esbuild/linux-ia32@0.27.0': 382 476 resolution: {integrity: sha512-Mz1jxqm/kfgKkc/KLHC5qIujMvnnarD9ra1cEcrs7qshTUSksPihGrWHVG5+osAIQ68577Zpww7SGapmzSt4Nw==} 383 477 engines: {node: '>=18'} ··· 388 482 resolution: {integrity: sha512-MJt5BRRSScPDwG2hLelYhAAKh9imjHK5+NE/tvnRLbIqUWa+0E9N4WNMjmp/kXXPHZGqPLxggwVhz7QP8CTR8w==} 389 483 engines: {node: '>=18'} 390 484 cpu: [ia32] 485 + os: [linux] 486 + 487 + '@esbuild/linux-loong64@0.21.5': 488 + resolution: {integrity: 
sha512-uHf1BmMG8qEvzdrzAqg2SIG/02+4/DHB6a9Kbya0XDvwDEKCoC8ZRWI5JJvNdUjtciBGFQ5PuBlpEOXQj+JQSg==} 489 + engines: {node: '>=12'} 490 + cpu: [loong64] 391 491 os: [linux] 392 492 393 493 '@esbuild/linux-loong64@0.27.0': ··· 402 502 cpu: [loong64] 403 503 os: [linux] 404 504 505 + '@esbuild/linux-mips64el@0.21.5': 506 + resolution: {integrity: sha512-IajOmO+KJK23bj52dFSNCMsz1QP1DqM6cwLUv3W1QwyxkyIWecfafnI555fvSGqEKwjMXVLokcV5ygHW5b3Jbg==} 507 + engines: {node: '>=12'} 508 + cpu: [mips64el] 509 + os: [linux] 510 + 405 511 '@esbuild/linux-mips64el@0.27.0': 406 512 resolution: {integrity: sha512-sJz3zRNe4tO2wxvDpH/HYJilb6+2YJxo/ZNbVdtFiKDufzWq4JmKAiHy9iGoLjAV7r/W32VgaHGkk35cUXlNOg==} 407 513 engines: {node: '>=18'} ··· 414 520 cpu: [mips64el] 415 521 os: [linux] 416 522 523 + '@esbuild/linux-ppc64@0.21.5': 524 + resolution: {integrity: sha512-1hHV/Z4OEfMwpLO8rp7CvlhBDnjsC3CttJXIhBi+5Aj5r+MBvy4egg7wCbe//hSsT+RvDAG7s81tAvpL2XAE4w==} 525 + engines: {node: '>=12'} 526 + cpu: [ppc64] 527 + os: [linux] 528 + 417 529 '@esbuild/linux-ppc64@0.27.0': 418 530 resolution: {integrity: sha512-z9N10FBD0DCS2dmSABDBb5TLAyF1/ydVb+N4pi88T45efQ/w4ohr/F/QYCkxDPnkhkp6AIpIcQKQ8F0ANoA2JA==} 419 531 engines: {node: '>=18'} ··· 426 538 cpu: [ppc64] 427 539 os: [linux] 428 540 541 + '@esbuild/linux-riscv64@0.21.5': 542 + resolution: {integrity: sha512-2HdXDMd9GMgTGrPWnJzP2ALSokE/0O5HhTUvWIbD3YdjME8JwvSCnNGBnTThKGEB91OZhzrJ4qIIxk/SBmyDDA==} 543 + engines: {node: '>=12'} 544 + cpu: [riscv64] 545 + os: [linux] 546 + 429 547 '@esbuild/linux-riscv64@0.27.0': 430 548 resolution: {integrity: sha512-pQdyAIZ0BWIC5GyvVFn5awDiO14TkT/19FTmFcPdDec94KJ1uZcmFs21Fo8auMXzD4Tt+diXu1LW1gHus9fhFQ==} 431 549 engines: {node: '>=18'} ··· 438 556 cpu: [riscv64] 439 557 os: [linux] 440 558 559 + '@esbuild/linux-s390x@0.21.5': 560 + resolution: {integrity: sha512-zus5sxzqBJD3eXxwvjN1yQkRepANgxE9lgOW2qLnmr8ikMTphkjgXu1HR01K4FJg8h1kEEDAqDcZQtbrRnB41A==} 561 + engines: {node: '>=12'} 562 + cpu: [s390x] 563 + os: [linux] 564 + 
441 565 '@esbuild/linux-s390x@0.27.0': 442 566 resolution: {integrity: sha512-hPlRWR4eIDDEci953RI1BLZitgi5uqcsjKMxwYfmi4LcwyWo2IcRP+lThVnKjNtk90pLS8nKdroXYOqW+QQH+w==} 443 567 engines: {node: '>=18'} ··· 450 574 cpu: [s390x] 451 575 os: [linux] 452 576 577 + '@esbuild/linux-x64@0.21.5': 578 + resolution: {integrity: sha512-1rYdTpyv03iycF1+BhzrzQJCdOuAOtaqHTWJZCWvijKD2N5Xu0TtVC8/+1faWqcP9iBCWOmjmhoH94dH82BxPQ==} 579 + engines: {node: '>=12'} 580 + cpu: [x64] 581 + os: [linux] 582 + 453 583 '@esbuild/linux-x64@0.27.0': 454 584 resolution: {integrity: sha512-1hBWx4OUJE2cab++aVZ7pObD6s+DK4mPGpemtnAORBvb5l/g5xFGk0vc0PjSkrDs0XaXj9yyob3d14XqvnQ4gw==} 455 585 engines: {node: '>=18'} ··· 474 604 cpu: [arm64] 475 605 os: [netbsd] 476 606 607 + '@esbuild/netbsd-x64@0.21.5': 608 + resolution: {integrity: sha512-Woi2MXzXjMULccIwMnLciyZH4nCIMpWQAs049KEeMvOcNADVxo0UBIQPfSmxB3CWKedngg7sWZdLvLczpe0tLg==} 609 + engines: {node: '>=12'} 610 + cpu: [x64] 611 + os: [netbsd] 612 + 477 613 '@esbuild/netbsd-x64@0.27.0': 478 614 resolution: {integrity: sha512-xbbOdfn06FtcJ9d0ShxxvSn2iUsGd/lgPIO2V3VZIPDbEaIj1/3nBBe1AwuEZKXVXkMmpr6LUAgMkLD/4D2PPA==} 479 615 engines: {node: '>=18'} ··· 496 632 resolution: {integrity: sha512-DNIHH2BPQ5551A7oSHD0CKbwIA/Ox7+78/AWkbS5QoRzaqlev2uFayfSxq68EkonB+IKjiuxBFoV8ESJy8bOHA==} 497 633 engines: {node: '>=18'} 498 634 cpu: [arm64] 635 + os: [openbsd] 636 + 637 + '@esbuild/openbsd-x64@0.21.5': 638 + resolution: {integrity: sha512-HLNNw99xsvx12lFBUwoT8EVCsSvRNDVxNpjZ7bPn947b8gJPzeHWyNVhFsaerc0n3TsbOINvRP2byTZ5LKezow==} 639 + engines: {node: '>=12'} 640 + cpu: [x64] 499 641 os: [openbsd] 500 642 501 643 '@esbuild/openbsd-x64@0.27.0': ··· 522 664 cpu: [arm64] 523 665 os: [openharmony] 524 666 667 + '@esbuild/sunos-x64@0.21.5': 668 + resolution: {integrity: sha512-6+gjmFpfy0BHU5Tpptkuh8+uw3mnrvgs+dSPQXQOv3ekbordwnzTVEb4qnIvQcYXq6gzkyTnoZ9dZG+D4garKg==} 669 + engines: {node: '>=12'} 670 + cpu: [x64] 671 + os: [sunos] 672 + 525 673 '@esbuild/sunos-x64@0.27.0': 526 
674 resolution: {integrity: sha512-Q1KY1iJafM+UX6CFEL+F4HRTgygmEW568YMqDA5UV97AuZSm21b7SXIrRJDwXWPzr8MGr75fUZPV67FdtMHlHA==} 527 675 engines: {node: '>=18'} ··· 534 682 cpu: [x64] 535 683 os: [sunos] 536 684 685 + '@esbuild/win32-arm64@0.21.5': 686 + resolution: {integrity: sha512-Z0gOTd75VvXqyq7nsl93zwahcTROgqvuAcYDUr+vOv8uHhNSKROyU961kgtCD1e95IqPKSQKH7tBTslnS3tA8A==} 687 + engines: {node: '>=12'} 688 + cpu: [arm64] 689 + os: [win32] 690 + 537 691 '@esbuild/win32-arm64@0.27.0': 538 692 resolution: {integrity: sha512-W1eyGNi6d+8kOmZIwi/EDjrL9nxQIQ0MiGqe/AWc6+IaHloxHSGoeRgDRKHFISThLmsewZ5nHFvGFWdBYlgKPg==} 539 693 engines: {node: '>=18'} ··· 546 700 cpu: [arm64] 547 701 os: [win32] 548 702 703 + '@esbuild/win32-ia32@0.21.5': 704 + resolution: {integrity: sha512-SWXFF1CL2RVNMaVs+BBClwtfZSvDgtL//G/smwAc5oVK/UPu2Gu9tIaRgFmYFFKrmg3SyAjSrElf0TiJ1v8fYA==} 705 + engines: {node: '>=12'} 706 + cpu: [ia32] 707 + os: [win32] 708 + 549 709 '@esbuild/win32-ia32@0.27.0': 550 710 resolution: {integrity: sha512-30z1aKL9h22kQhilnYkORFYt+3wp7yZsHWus+wSKAJR8JtdfI76LJ4SBdMsCopTR3z/ORqVu5L1vtnHZWVj4cQ==} 551 711 engines: {node: '>=18'} ··· 556 716 resolution: {integrity: sha512-Iuws0kxo4yusk7sw70Xa2E2imZU5HoixzxfGCdxwBdhiDgt9vX9VUCBhqcwY7/uh//78A1hMkkROMJq9l27oLQ==} 557 717 engines: {node: '>=18'} 558 718 cpu: [ia32] 719 + os: [win32] 720 + 721 + '@esbuild/win32-x64@0.21.5': 722 + resolution: {integrity: sha512-tQd/1efJuzPC6rCFwEvLtci/xNFcTZknmXs98FYDfGE4wP9ClFV98nyKrzJKVPMhdDnjzLhdUyMX4PsQAPjwIw==} 723 + engines: {node: '>=12'} 724 + cpu: [x64] 559 725 os: [win32] 560 726 561 727 '@esbuild/win32-x64@0.27.0': ··· 862 1028 '@vitest/browser': 863 1029 optional: true 864 1030 1031 + '@vitest/expect@2.1.9': 1032 + resolution: {integrity: sha512-UJCIkTBenHeKT1TTlKMJWy1laZewsRIzYighyYiJKZreqtdxSos/S1t+ktRMQWu2CKqaarrkeszJx1cgC5tGZw==} 1033 + 865 1034 '@vitest/expect@4.0.16': 866 1035 resolution: {integrity: 
sha512-eshqULT2It7McaJkQGLkPjPjNph+uevROGuIMJdG3V+0BSR2w9u6J9Lwu+E8cK5TETlfou8GRijhafIMhXsimA==} 867 1036 1037 + '@vitest/mocker@2.1.9': 1038 + resolution: {integrity: sha512-tVL6uJgoUdi6icpxmdrn5YNo3g3Dxv+IHJBr0GXHaEdTcw3F+cPKnsXFhli6nO+f/6SDKPHEK1UN+k+TQv0Ehg==} 1039 + peerDependencies: 1040 + msw: ^2.4.9 1041 + vite: ^5.0.0 1042 + peerDependenciesMeta: 1043 + msw: 1044 + optional: true 1045 + vite: 1046 + optional: true 1047 + 868 1048 '@vitest/mocker@4.0.16': 869 1049 resolution: {integrity: sha512-yb6k4AZxJTB+q9ycAvsoxGn+j/po0UaPgajllBgt1PzoMAAmJGYFdDk0uCcRcxb3BrME34I6u8gHZTQlkqSZpg==} 870 1050 peerDependencies: ··· 876 1056 vite: 877 1057 optional: true 878 1058 1059 + '@vitest/pretty-format@2.1.9': 1060 + resolution: {integrity: sha512-KhRIdGV2U9HOUzxfiHmY8IFHTdqtOhIzCpd8WRdJiE7D/HUcZVD0EgQCVjm+Q9gkUXWgBvMmTtZgIG48wq7sOQ==} 1061 + 879 1062 '@vitest/pretty-format@4.0.16': 880 1063 resolution: {integrity: sha512-eNCYNsSty9xJKi/UdVD8Ou16alu7AYiS2fCPRs0b1OdhJiV89buAXQLpTbe+X8V9L6qrs9CqyvU7OaAopJYPsA==} 881 1064 1065 + '@vitest/runner@2.1.9': 1066 + resolution: {integrity: sha512-ZXSSqTFIrzduD63btIfEyOmNcBmQvgOVsPNPe0jYtESiXkhd8u2erDLnMxmGrDCwHCCHE7hxwRDCT3pt0esT4g==} 1067 + 882 1068 '@vitest/runner@4.0.16': 883 1069 resolution: {integrity: sha512-VWEDm5Wv9xEo80ctjORcTQRJ539EGPB3Pb9ApvVRAY1U/WkHXmmYISqU5E79uCwcW7xYUV38gwZD+RV755fu3Q==} 884 1070 1071 + '@vitest/snapshot@2.1.9': 1072 + resolution: {integrity: sha512-oBO82rEjsxLNJincVhLhaxxZdEtV0EFHMK5Kmx5sJ6H9L183dHECjiefOAdnqpIgT5eZwT04PoggUnW88vOBNQ==} 1073 + 885 1074 '@vitest/snapshot@4.0.16': 886 1075 resolution: {integrity: sha512-sf6NcrYhYBsSYefxnry+DR8n3UV4xWZwWxYbCJUt2YdvtqzSPR7VfGrY0zsv090DAbjFZsi7ZaMi1KnSRyK1XA==} 1076 + 1077 + '@vitest/spy@2.1.9': 1078 + resolution: {integrity: sha512-E1B35FwzXXTs9FHNK6bDszs7mtydNi5MIfUWpceJ8Xbfb1gBMscAnwLbEu+B44ed6W3XjL9/ehLPHR1fkf1KLQ==} 887 1079 888 1080 '@vitest/spy@4.0.16': 889 1081 resolution: {integrity: 
sha512-4jIOWjKP0ZUaEmJm00E0cOBLU+5WE0BpeNr3XN6TEF05ltro6NJqHWxXD0kA8/Zc8Nh23AT8WQxwNG+WeROupw==} 1082 + 1083 + '@vitest/utils@2.1.9': 1084 + resolution: {integrity: sha512-v0psaMSkNJ3A2NMrUEHFRzJtDPFn+/VWZ5WxImB21T9fjucJRmS7xCS3ppEnARb9y11OAzaD+P2Ps+b+BGX5iQ==} 890 1085 891 1086 '@vitest/utils@4.0.16': 892 1087 resolution: {integrity: sha512-h8z9yYhV3e1LEfaQ3zdypIrnAg/9hguReGZoS7Gl0aBG5xgA410zBqECqmaF/+RkTggRsfnzc1XaAHA6bmUufA==} ··· 945 1140 946 1141 buffer@5.7.1: 947 1142 resolution: {integrity: sha512-EHcyIPBQ4BSGlvjB16k5KgAJ27CIsHY/2JBmCRReo48y9rQ3MaUzWX3KVlBa4U7MyX02HdVj0K7C3WaB3ju7FQ==} 1143 + 1144 + cac@6.7.14: 1145 + resolution: {integrity: sha512-b6Ilus+c3RrdDk+JhLKUAQfzzgLEPy6wcXqS7f/xe1EETvsDP6GORG7SFuOs6cID5YkqchW/LXZbX5bc8j7ZcQ==} 1146 + engines: {node: '>=8'} 948 1147 949 1148 call-bind-apply-helpers@1.0.2: 950 1149 resolution: {integrity: sha512-Sp1ablJ0ivDkSzjcaJdxEunN5/XvksFJ2sMBFfq6x0ryhQV/2b/KwFe21cMpmHtPOSij8K99/wSfoEuTObmuMQ==} ··· 958 1157 resolution: {integrity: sha512-+ys997U96po4Kx/ABpBCqhA9EuxJaQWDQg7295H4hBphv3IZg0boBKuwYpt4YXp6MZ5AmZQnU/tyMTlRpaSejg==} 959 1158 engines: {node: '>= 0.4'} 960 1159 1160 + chai@5.3.3: 1161 + resolution: {integrity: sha512-4zNhdJD/iOjSH0A05ea+Ke6MU5mmpQcbQsSOkgdaUMJ9zTlDTD/GYlwohmIE2u0gaxHYiVHEn1Fw9mZ/ktJWgw==} 1162 + engines: {node: '>=18'} 1163 + 961 1164 chai@6.2.2: 962 1165 resolution: {integrity: sha512-NUPRluOfOiTKBKvWPtSD4PhFvWCqOi0BGStNWs57X9js7XGTprSmFoz5F0tWhR4WPjNeR9jXqdC7/UpSJTnlRg==} 963 1166 engines: {node: '>=18'} 1167 + 1168 + check-error@2.1.3: 1169 + resolution: {integrity: sha512-PAJdDJusoxnwm1VwW07VWwUN1sl7smmC3OKggvndJFadxxDRyFJBX/ggnu/KE4kQAB7a3Dp8f/YXC1FlUprWmA==} 1170 + engines: {node: '>= 16'} 964 1171 965 1172 chownr@1.1.4: 966 1173 resolution: {integrity: sha512-jJ0bqzaylmJtVnNgzTeSOs8DPavpbYgEr/b0YL8/2GO3xJEhInFmhKMUnEJQjZumK7KXGFhUy89PrsJWlakBVg==} ··· 1000 1207 resolution: {integrity: sha512-aW35yZM6Bb/4oJlZncMH2LCoZtJXTRxES17vE3hoRiowU2kWHaJKFkSBDnDR+cm9J+9QhXmREyIfv0pji9ejCQ==} 
1001 1208 engines: {node: '>=10'} 1002 1209 1210 + deep-eql@5.0.2: 1211 + resolution: {integrity: sha512-h5k/5U50IJJFpzfL6nO9jaaumfjO/f2NjK/oYB2Djzm4p9L+3T9qWpZqZ2hAbLPuuYq9wrU08WQyBTL5GbPk5Q==} 1212 + engines: {node: '>=6'} 1213 + 1003 1214 deep-extend@0.6.0: 1004 1215 resolution: {integrity: sha512-LOHxIOaPYdHlJRtCQfDIVZtfw/ufM8+rVj649RIHzcm/vGwQRXFt6OPqIFWsm2XEMrNIEtWR64sY1LEKD2vAOA==} 1005 1216 engines: {node: '>=4.0.0'} ··· 1036 1247 es-object-atoms@1.1.1: 1037 1248 resolution: {integrity: sha512-FGgH2h8zKNim9ljj7dankFPcICIK9Cp5bm+c2gQSYePhpaG5+esrLODihIorn+Pe6FGJzWhXQotPv73jTaldXA==} 1038 1249 engines: {node: '>= 0.4'} 1250 + 1251 + esbuild@0.21.5: 1252 + resolution: {integrity: sha512-mg3OPMV4hXywwpoDxu3Qda5xCKQi+vCTZq8S9J/EpkhB2HzKXq4SNFZE3+NK93JYxc8VMSep+lOUSC/RVKaBqw==} 1253 + engines: {node: '>=12'} 1254 + hasBin: true 1039 1255 1040 1256 esbuild@0.27.0: 1041 1257 resolution: {integrity: sha512-jd0f4NHbD6cALCyGElNpGAOtWxSq46l9X/sWB0Nzd5er4Kz2YTm+Vl0qKFT9KUJvD8+fiO8AvoHhFvEatfVixA==} ··· 1206 1422 lodash@4.17.21: 1207 1423 resolution: {integrity: sha512-v2kDEe57lecTulaDIuNTPy3Ry4gLGJ6Z1O3vE1krgXZNrsQ+LFTGHVxVjcXPs17LhbZVGedAJv8XZ1tvj5FvSg==} 1208 1424 1425 + loupe@3.2.1: 1426 + resolution: {integrity: sha512-CdzqowRJCeLU72bHvWqwRBBlLcMEtIvGrlvef74kMnV2AolS9Y8xUv1I0U/MNAWMhBlKIoyuEgoJ0t/bbwHbLQ==} 1427 + 1209 1428 magic-string@0.30.21: 1210 1429 resolution: {integrity: sha512-vd2F4YUyEXKGcLHoq+TEyCjxueSeHnFxyyjNp80yg0XV4vUhnDer/lvvlqM/arB5bXQN5K2/3oinyCRyx8T2CQ==} 1211 1430 ··· 1276 1495 path-to-regexp@6.3.0: 1277 1496 resolution: {integrity: sha512-Yhpw4T9C6hPpgPeA28us07OJeqZ5EzQTkbfwuhsUg0c237RomFoETJgmp2sa3F/41gfLE6G5cqcYwznmeEeOlQ==} 1278 1497 1498 + pathe@1.1.2: 1499 + resolution: {integrity: sha512-whLdWMYL2TwI08hn8/ZqAbrVemu0LNaNNJZX73O6qaIdCTfXutsLhMkjdENX0qhsQ9uIimo4/aQOmXkoon2nDQ==} 1500 + 1279 1501 pathe@2.0.3: 1280 1502 resolution: {integrity: sha512-WUjGcAqP1gQacoQe+OBJsFA7Ld4DyXuUIjZ5cc75cLHvJ7dtNsTugphxIADwspS+AraAUePCKrSVtPLFj/F88w==} 1503 
+ 1504 + pathval@2.0.1: 1505 + resolution: {integrity: sha512-//nshmD55c46FuFw26xV/xFAaB5HF9Xdap7HJBBnrKdAd6/GxDBaNA1870O79+9ueg61cZLSVc+OaFlfmObYVQ==} 1506 + engines: {node: '>= 14.16'} 1281 1507 1282 1508 picocolors@1.1.1: 1283 1509 resolution: {integrity: sha512-xceH2snhtb5M9liqDsmEw56le376mTZkEX/jEb/RxNFyegNul7eNslCXP9FDj/Lcu0X8KEyMceP2ntpaHrDEVA==} ··· 1414 1640 tinybench@2.9.0: 1415 1641 resolution: {integrity: sha512-0+DUvqWMValLmha6lr4kD8iAMK1HzV0/aKnCtWb9v9641TnP/MFb7Pc2bxoxQjTXAErryXVgUOfv2YqNllqGeg==} 1416 1642 1643 + tinyexec@0.3.2: 1644 + resolution: {integrity: sha512-KQQR9yN7R5+OSwaK0XQoj22pwHoTlgYqmUscPYoknOoWCWfj/5/ABTMRi69FrKU5ffPVh5QcFikpWJI/P1ocHA==} 1645 + 1417 1646 tinyexec@1.0.2: 1418 1647 resolution: {integrity: sha512-W/KYk+NFhkmsYpuHq5JykngiOCnxeVL8v8dFnqxSD8qEEdRfXk1SDM6JzNqcERbcGYj9tMrDQBYV9cjgnunFIg==} 1419 1648 engines: {node: '>=18'} ··· 1422 1651 resolution: {integrity: sha512-j2Zq4NyQYG5XMST4cbs02Ak8iJUdxRM0XI5QyxXuZOzKOINmWurp3smXu3y5wDcJrptwpSjgXHzIQxR0omXljQ==} 1423 1652 engines: {node: '>=12.0.0'} 1424 1653 1654 + tinypool@1.1.1: 1655 + resolution: {integrity: sha512-Zba82s87IFq9A9XmjiX5uZA/ARWDrB03OHlq+Vw1fSdt0I+4/Kutwy8BP4Y/y/aORMo61FQ0vIb5j44vSo5Pkg==} 1656 + engines: {node: ^18.0.0 || >=20.0.0} 1657 + 1658 + tinyrainbow@1.2.0: 1659 + resolution: {integrity: sha512-weEDEq7Z5eTHPDh4xjX789+fHfF+P8boiFB+0vbWzpbnbsEr/GRaohi/uMKxg8RZMXnl1ItAi/IUHWMsjDV7kQ==} 1660 + engines: {node: '>=14.0.0'} 1661 + 1425 1662 tinyrainbow@3.0.3: 1426 1663 resolution: {integrity: sha512-PSkbLUoxOFRzJYjjxHJt9xro7D+iilgMX/C9lawzVuYiIdcihh9DXmVibBe8lmcFrRi/VzlPjBxbN7rH24q8/Q==} 1664 + engines: {node: '>=14.0.0'} 1665 + 1666 + tinyspy@3.0.2: 1667 + resolution: {integrity: sha512-n1cw8k1k0x4pgA2+9XrOkFydTerNcJ1zWCO5Nn9scWHTD+5tp8dghT2x1uduQePZTZgd3Tupf+x9BxJjeJi77Q==} 1427 1668 engines: {node: '>=14.0.0'} 1428 1669 1429 1670 tslib@2.8.1: ··· 1453 1694 util@0.12.5: 1454 1695 resolution: {integrity: 
sha512-kZf/K6hEIrWHI6XqOFUiiMa+79wE/D8Q+NCNAWclkyg3b4d2k7s0QGepNjiABc+aR3N1PAyHL7p6UcLY6LmrnA==} 1455 1696 1697 + vite-node@2.1.9: 1698 + resolution: {integrity: sha512-AM9aQ/IPrW/6ENLQg3AGY4K1N2TGZdR5e4gu/MmmR2xR3Ll1+dib+nook92g4TV3PXVyeyxdWwtaCAiUL0hMxA==} 1699 + engines: {node: ^18.0.0 || >=20.0.0} 1700 + hasBin: true 1701 + 1702 + vite@5.4.21: 1703 + resolution: {integrity: sha512-o5a9xKjbtuhY6Bi5S3+HvbRERmouabWbyUcpXXUA1u+GNUKoROi9byOJ8M0nHbHYHkYICiMlqxkg1KkYmm25Sw==} 1704 + engines: {node: ^18.0.0 || >=20.0.0} 1705 + hasBin: true 1706 + peerDependencies: 1707 + '@types/node': ^18.0.0 || >=20.0.0 1708 + less: '*' 1709 + lightningcss: ^1.21.0 1710 + sass: '*' 1711 + sass-embedded: '*' 1712 + stylus: '*' 1713 + sugarss: '*' 1714 + terser: ^5.4.0 1715 + peerDependenciesMeta: 1716 + '@types/node': 1717 + optional: true 1718 + less: 1719 + optional: true 1720 + lightningcss: 1721 + optional: true 1722 + sass: 1723 + optional: true 1724 + sass-embedded: 1725 + optional: true 1726 + stylus: 1727 + optional: true 1728 + sugarss: 1729 + optional: true 1730 + terser: 1731 + optional: true 1732 + 1456 1733 vite@7.3.1: 1457 1734 resolution: {integrity: sha512-w+N7Hifpc3gRjZ63vYBXA56dvvRlNWRczTdmCBBa+CotUzAPf5b7YMdMR/8CQoeYE5LX3W4wj6RYTgonm1b9DA==} 1458 1735 engines: {node: ^20.19.0 || >=22.12.0} ··· 1493 1770 yaml: 1494 1771 optional: true 1495 1772 1773 + vitest@2.1.9: 1774 + resolution: {integrity: sha512-MSmPM9REYqDGBI8439mA4mWhV5sKmDlBKWIYbA3lRb2PTHACE0mgKwA8yQ2xq9vxDTuk4iPrECBAEW2aoFXY0Q==} 1775 + engines: {node: ^18.0.0 || >=20.0.0} 1776 + hasBin: true 1777 + peerDependencies: 1778 + '@edge-runtime/vm': '*' 1779 + '@types/node': ^18.0.0 || >=20.0.0 1780 + '@vitest/browser': 2.1.9 1781 + '@vitest/ui': 2.1.9 1782 + happy-dom: '*' 1783 + jsdom: '*' 1784 + peerDependenciesMeta: 1785 + '@edge-runtime/vm': 1786 + optional: true 1787 + '@types/node': 1788 + optional: true 1789 + '@vitest/browser': 1790 + optional: true 1791 + '@vitest/ui': 1792 + optional: true 1793 + 
happy-dom: 1794 + optional: true 1795 + jsdom: 1796 + optional: true 1797 + 1496 1798 vitest@4.0.16: 1497 1799 resolution: {integrity: sha512-E4t7DJ9pESL6E3I8nFjPa4xGUd3PmiWDLsDztS2qXSJWfHtbQnwAWylaBvSNY48I3vr8PTqIZlyK8TE3V3CA4Q==} 1498 1800 engines: {node: ^20.0.0 || ^22.0.0 || >=24.0.0} ··· 1615 1917 1616 1918 '@bcoe/v8-coverage@1.0.2': {} 1617 1919 1920 + '@bigmoves/lexicon@0.2.0': {} 1921 + 1618 1922 '@biomejs/biome@2.3.11': 1619 1923 optionalDependencies: 1620 1924 '@biomejs/cli-darwin-arm64': 2.3.11 ··· 1686 1990 tslib: 2.8.1 1687 1991 optional: true 1688 1992 1993 + '@esbuild/aix-ppc64@0.21.5': 1994 + optional: true 1995 + 1689 1996 '@esbuild/aix-ppc64@0.27.0': 1690 1997 optional: true 1691 1998 1692 1999 '@esbuild/aix-ppc64@0.27.2': 2000 + optional: true 2001 + 2002 + '@esbuild/android-arm64@0.21.5': 1693 2003 optional: true 1694 2004 1695 2005 '@esbuild/android-arm64@0.27.0': ··· 1698 2008 '@esbuild/android-arm64@0.27.2': 1699 2009 optional: true 1700 2010 2011 + '@esbuild/android-arm@0.21.5': 2012 + optional: true 2013 + 1701 2014 '@esbuild/android-arm@0.27.0': 1702 2015 optional: true 1703 2016 1704 2017 '@esbuild/android-arm@0.27.2': 1705 2018 optional: true 1706 2019 2020 + '@esbuild/android-x64@0.21.5': 2021 + optional: true 2022 + 1707 2023 '@esbuild/android-x64@0.27.0': 1708 2024 optional: true 1709 2025 1710 2026 '@esbuild/android-x64@0.27.2': 2027 + optional: true 2028 + 2029 + '@esbuild/darwin-arm64@0.21.5': 1711 2030 optional: true 1712 2031 1713 2032 '@esbuild/darwin-arm64@0.27.0': ··· 1716 2035 '@esbuild/darwin-arm64@0.27.2': 1717 2036 optional: true 1718 2037 2038 + '@esbuild/darwin-x64@0.21.5': 2039 + optional: true 2040 + 1719 2041 '@esbuild/darwin-x64@0.27.0': 1720 2042 optional: true 1721 2043 1722 2044 '@esbuild/darwin-x64@0.27.2': 1723 2045 optional: true 1724 2046 2047 + '@esbuild/freebsd-arm64@0.21.5': 2048 + optional: true 2049 + 1725 2050 '@esbuild/freebsd-arm64@0.27.0': 1726 2051 optional: true 1727 2052 1728 2053 
'@esbuild/freebsd-arm64@0.27.2': 1729 2054 optional: true 1730 2055 2056 + '@esbuild/freebsd-x64@0.21.5': 2057 + optional: true 2058 + 1731 2059 '@esbuild/freebsd-x64@0.27.0': 1732 2060 optional: true 1733 2061 1734 2062 '@esbuild/freebsd-x64@0.27.2': 2063 + optional: true 2064 + 2065 + '@esbuild/linux-arm64@0.21.5': 1735 2066 optional: true 1736 2067 1737 2068 '@esbuild/linux-arm64@0.27.0': ··· 1740 2071 '@esbuild/linux-arm64@0.27.2': 1741 2072 optional: true 1742 2073 2074 + '@esbuild/linux-arm@0.21.5': 2075 + optional: true 2076 + 1743 2077 '@esbuild/linux-arm@0.27.0': 1744 2078 optional: true 1745 2079 1746 2080 '@esbuild/linux-arm@0.27.2': 2081 + optional: true 2082 + 2083 + '@esbuild/linux-ia32@0.21.5': 1747 2084 optional: true 1748 2085 1749 2086 '@esbuild/linux-ia32@0.27.0': ··· 1752 2089 '@esbuild/linux-ia32@0.27.2': 1753 2090 optional: true 1754 2091 2092 + '@esbuild/linux-loong64@0.21.5': 2093 + optional: true 2094 + 1755 2095 '@esbuild/linux-loong64@0.27.0': 1756 2096 optional: true 1757 2097 1758 2098 '@esbuild/linux-loong64@0.27.2': 1759 2099 optional: true 1760 2100 2101 + '@esbuild/linux-mips64el@0.21.5': 2102 + optional: true 2103 + 1761 2104 '@esbuild/linux-mips64el@0.27.0': 1762 2105 optional: true 1763 2106 1764 2107 '@esbuild/linux-mips64el@0.27.2': 1765 2108 optional: true 1766 2109 2110 + '@esbuild/linux-ppc64@0.21.5': 2111 + optional: true 2112 + 1767 2113 '@esbuild/linux-ppc64@0.27.0': 1768 2114 optional: true 1769 2115 1770 2116 '@esbuild/linux-ppc64@0.27.2': 1771 2117 optional: true 1772 2118 2119 + '@esbuild/linux-riscv64@0.21.5': 2120 + optional: true 2121 + 1773 2122 '@esbuild/linux-riscv64@0.27.0': 1774 2123 optional: true 1775 2124 1776 2125 '@esbuild/linux-riscv64@0.27.2': 2126 + optional: true 2127 + 2128 + '@esbuild/linux-s390x@0.21.5': 1777 2129 optional: true 1778 2130 1779 2131 '@esbuild/linux-s390x@0.27.0': ··· 1782 2134 '@esbuild/linux-s390x@0.27.2': 1783 2135 optional: true 1784 2136 2137 + '@esbuild/linux-x64@0.21.5': 2138 
+ optional: true 2139 + 1785 2140 '@esbuild/linux-x64@0.27.0': 1786 2141 optional: true 1787 2142 ··· 1792 2147 optional: true 1793 2148 1794 2149 '@esbuild/netbsd-arm64@0.27.2': 2150 + optional: true 2151 + 2152 + '@esbuild/netbsd-x64@0.21.5': 1795 2153 optional: true 1796 2154 1797 2155 '@esbuild/netbsd-x64@0.27.0': ··· 1806 2164 '@esbuild/openbsd-arm64@0.27.2': 1807 2165 optional: true 1808 2166 2167 + '@esbuild/openbsd-x64@0.21.5': 2168 + optional: true 2169 + 1809 2170 '@esbuild/openbsd-x64@0.27.0': 1810 2171 optional: true 1811 2172 ··· 1818 2179 '@esbuild/openharmony-arm64@0.27.2': 1819 2180 optional: true 1820 2181 2182 + '@esbuild/sunos-x64@0.21.5': 2183 + optional: true 2184 + 1821 2185 '@esbuild/sunos-x64@0.27.0': 1822 2186 optional: true 1823 2187 1824 2188 '@esbuild/sunos-x64@0.27.2': 1825 2189 optional: true 1826 2190 2191 + '@esbuild/win32-arm64@0.21.5': 2192 + optional: true 2193 + 1827 2194 '@esbuild/win32-arm64@0.27.0': 1828 2195 optional: true 1829 2196 1830 2197 '@esbuild/win32-arm64@0.27.2': 1831 2198 optional: true 1832 2199 2200 + '@esbuild/win32-ia32@0.21.5': 2201 + optional: true 2202 + 1833 2203 '@esbuild/win32-ia32@0.27.0': 1834 2204 optional: true 1835 2205 1836 2206 '@esbuild/win32-ia32@0.27.2': 2207 + optional: true 2208 + 2209 + '@esbuild/win32-x64@0.21.5': 1837 2210 optional: true 1838 2211 1839 2212 '@esbuild/win32-x64@0.27.0': ··· 2064 2437 transitivePeerDependencies: 2065 2438 - supports-color 2066 2439 2440 + '@vitest/expect@2.1.9': 2441 + dependencies: 2442 + '@vitest/spy': 2.1.9 2443 + '@vitest/utils': 2.1.9 2444 + chai: 5.3.3 2445 + tinyrainbow: 1.2.0 2446 + 2067 2447 '@vitest/expect@4.0.16': 2068 2448 dependencies: 2069 2449 '@standard-schema/spec': 1.1.0 ··· 2073 2453 chai: 6.2.2 2074 2454 tinyrainbow: 3.0.3 2075 2455 2456 + '@vitest/mocker@2.1.9(vite@5.4.21(@types/node@25.0.6))': 2457 + dependencies: 2458 + '@vitest/spy': 2.1.9 2459 + estree-walker: 3.0.3 2460 + magic-string: 0.30.21 2461 + optionalDependencies: 2462 + 
vite: 5.4.21(@types/node@25.0.6) 2463 + 2076 2464 '@vitest/mocker@4.0.16(vite@7.3.1(@types/node@25.0.6))': 2077 2465 dependencies: 2078 2466 '@vitest/spy': 4.0.16 ··· 2081 2469 optionalDependencies: 2082 2470 vite: 7.3.1(@types/node@25.0.6) 2083 2471 2472 + '@vitest/pretty-format@2.1.9': 2473 + dependencies: 2474 + tinyrainbow: 1.2.0 2475 + 2084 2476 '@vitest/pretty-format@4.0.16': 2085 2477 dependencies: 2086 2478 tinyrainbow: 3.0.3 2479 + 2480 + '@vitest/runner@2.1.9': 2481 + dependencies: 2482 + '@vitest/utils': 2.1.9 2483 + pathe: 1.1.2 2087 2484 2088 2485 '@vitest/runner@4.0.16': 2089 2486 dependencies: 2090 2487 '@vitest/utils': 4.0.16 2091 2488 pathe: 2.0.3 2092 2489 2490 + '@vitest/snapshot@2.1.9': 2491 + dependencies: 2492 + '@vitest/pretty-format': 2.1.9 2493 + magic-string: 0.30.21 2494 + pathe: 1.1.2 2495 + 2093 2496 '@vitest/snapshot@4.0.16': 2094 2497 dependencies: 2095 2498 '@vitest/pretty-format': 4.0.16 2096 2499 magic-string: 0.30.21 2097 2500 pathe: 2.0.3 2098 2501 2502 + '@vitest/spy@2.1.9': 2503 + dependencies: 2504 + tinyspy: 3.0.2 2505 + 2099 2506 '@vitest/spy@4.0.16': {} 2507 + 2508 + '@vitest/utils@2.1.9': 2509 + dependencies: 2510 + '@vitest/pretty-format': 2.1.9 2511 + loupe: 3.2.1 2512 + tinyrainbow: 1.2.0 2100 2513 2101 2514 '@vitest/utils@4.0.16': 2102 2515 dependencies: ··· 2156 2569 base64-js: 1.5.1 2157 2570 ieee754: 1.2.1 2158 2571 2572 + cac@6.7.14: {} 2573 + 2159 2574 call-bind-apply-helpers@1.0.2: 2160 2575 dependencies: 2161 2576 es-errors: 1.3.0 ··· 2173 2588 call-bind-apply-helpers: 1.0.2 2174 2589 get-intrinsic: 1.3.0 2175 2590 2591 + chai@5.3.3: 2592 + dependencies: 2593 + assertion-error: 2.0.1 2594 + check-error: 2.1.3 2595 + deep-eql: 5.0.2 2596 + loupe: 3.2.1 2597 + pathval: 2.0.1 2598 + 2176 2599 chai@6.2.2: {} 2600 + 2601 + check-error@2.1.3: {} 2177 2602 2178 2603 chownr@1.1.4: {} 2179 2604 ··· 2205 2630 dependencies: 2206 2631 mimic-response: 3.1.0 2207 2632 2633 + deep-eql@5.0.2: {} 2634 + 2208 2635 
deep-extend@0.6.0: {} 2209 2636 2210 2637 define-data-property@1.1.4: ··· 2237 2664 dependencies: 2238 2665 es-errors: 1.3.0 2239 2666 2667 + esbuild@0.21.5: 2668 + optionalDependencies: 2669 + '@esbuild/aix-ppc64': 0.21.5 2670 + '@esbuild/android-arm': 0.21.5 2671 + '@esbuild/android-arm64': 0.21.5 2672 + '@esbuild/android-x64': 0.21.5 2673 + '@esbuild/darwin-arm64': 0.21.5 2674 + '@esbuild/darwin-x64': 0.21.5 2675 + '@esbuild/freebsd-arm64': 0.21.5 2676 + '@esbuild/freebsd-x64': 0.21.5 2677 + '@esbuild/linux-arm': 0.21.5 2678 + '@esbuild/linux-arm64': 0.21.5 2679 + '@esbuild/linux-ia32': 0.21.5 2680 + '@esbuild/linux-loong64': 0.21.5 2681 + '@esbuild/linux-mips64el': 0.21.5 2682 + '@esbuild/linux-ppc64': 0.21.5 2683 + '@esbuild/linux-riscv64': 0.21.5 2684 + '@esbuild/linux-s390x': 0.21.5 2685 + '@esbuild/linux-x64': 0.21.5 2686 + '@esbuild/netbsd-x64': 0.21.5 2687 + '@esbuild/openbsd-x64': 0.21.5 2688 + '@esbuild/sunos-x64': 0.21.5 2689 + '@esbuild/win32-arm64': 0.21.5 2690 + '@esbuild/win32-ia32': 0.21.5 2691 + '@esbuild/win32-x64': 0.21.5 2692 + 2240 2693 esbuild@0.27.0: 2241 2694 optionalDependencies: 2242 2695 '@esbuild/aix-ppc64': 0.27.0 ··· 2436 2889 kleur@4.1.5: {} 2437 2890 2438 2891 lodash@4.17.21: {} 2892 + 2893 + loupe@3.2.1: {} 2439 2894 2440 2895 magic-string@0.30.21: 2441 2896 dependencies: ··· 2520 2975 2521 2976 path-to-regexp@6.3.0: {} 2522 2977 2978 + pathe@1.1.2: {} 2979 + 2523 2980 pathe@2.0.3: {} 2981 + 2982 + pathval@2.0.1: {} 2524 2983 2525 2984 picocolors@1.1.1: {} 2526 2985 ··· 2719 3178 2720 3179 tinybench@2.9.0: {} 2721 3180 3181 + tinyexec@0.3.2: {} 3182 + 2722 3183 tinyexec@1.0.2: {} 2723 3184 2724 3185 tinyglobby@0.2.15: ··· 2726 3187 fdir: 6.5.0(picomatch@4.0.3) 2727 3188 picomatch: 4.0.3 2728 3189 3190 + tinypool@1.1.1: {} 3191 + 3192 + tinyrainbow@1.2.0: {} 3193 + 2729 3194 tinyrainbow@3.0.3: {} 3195 + 3196 + tinyspy@3.0.2: {} 2730 3197 2731 3198 tslib@2.8.1: 2732 3199 optional: true ··· 2755 3222 is-typed-array: 1.1.15 2756 3223 
which-typed-array: 1.1.19 2757 3224 3225 + vite-node@2.1.9(@types/node@25.0.6): 3226 + dependencies: 3227 + cac: 6.7.14 3228 + debug: 4.4.3 3229 + es-module-lexer: 1.7.0 3230 + pathe: 1.1.2 3231 + vite: 5.4.21(@types/node@25.0.6) 3232 + transitivePeerDependencies: 3233 + - '@types/node' 3234 + - less 3235 + - lightningcss 3236 + - sass 3237 + - sass-embedded 3238 + - stylus 3239 + - sugarss 3240 + - supports-color 3241 + - terser 3242 + 3243 + vite@5.4.21(@types/node@25.0.6): 3244 + dependencies: 3245 + esbuild: 0.21.5 3246 + postcss: 8.5.6 3247 + rollup: 4.55.1 3248 + optionalDependencies: 3249 + '@types/node': 25.0.6 3250 + fsevents: 2.3.3 3251 + 2758 3252 vite@7.3.1(@types/node@25.0.6): 2759 3253 dependencies: 2760 3254 esbuild: 0.27.2 ··· 2766 3260 optionalDependencies: 2767 3261 '@types/node': 25.0.6 2768 3262 fsevents: 2.3.3 3263 + 3264 + vitest@2.1.9(@types/node@25.0.6): 3265 + dependencies: 3266 + '@vitest/expect': 2.1.9 3267 + '@vitest/mocker': 2.1.9(vite@5.4.21(@types/node@25.0.6)) 3268 + '@vitest/pretty-format': 2.1.9 3269 + '@vitest/runner': 2.1.9 3270 + '@vitest/snapshot': 2.1.9 3271 + '@vitest/spy': 2.1.9 3272 + '@vitest/utils': 2.1.9 3273 + chai: 5.3.3 3274 + debug: 4.4.3 3275 + expect-type: 1.3.0 3276 + magic-string: 0.30.21 3277 + pathe: 1.1.2 3278 + std-env: 3.10.0 3279 + tinybench: 2.9.0 3280 + tinyexec: 0.3.2 3281 + tinypool: 1.1.1 3282 + tinyrainbow: 1.2.0 3283 + vite: 5.4.21(@types/node@25.0.6) 3284 + vite-node: 2.1.9(@types/node@25.0.6) 3285 + why-is-node-running: 2.3.0 3286 + optionalDependencies: 3287 + '@types/node': 25.0.6 3288 + transitivePeerDependencies: 3289 + - less 3290 + - lightningcss 3291 + - msw 3292 + - sass 3293 + - sass-embedded 3294 + - stylus 3295 + - sugarss 3296 + - supports-color 3297 + - terser 2769 3298 2770 3299 vitest@4.0.16(@types/node@25.0.6): 2771 3300 dependencies:
+76 -1
test/e2e.test.js
··· 488 488 ); 489 489 expect(data.results[0].uri).toBeTruthy(); 490 490 expect(data.results[0].cid).toBeTruthy(); 491 - expect(data.results[0].validationStatus).toBe('valid'); 491 + // Node has lexiconResolver configured with post schema, others don't 492 + expect(data.results[0].validationStatus).toBe( 493 + PLATFORM === 'node' ? 'valid' : 'unknown', 494 + ); 495 + }); 496 + 497 + it('applyWrites returns unknown for unregistered collection', async () => { 498 + const { status, data } = await jsonPost( 499 + '/xrpc/com.atproto.repo.applyWrites', 500 + { 501 + repo: DID, 502 + writes: [ 503 + { 504 + $type: 'com.atproto.repo.applyWrites#create', 505 + collection: 'com.example.unknown', 506 + rkey: 'unknowntest', 507 + value: { foo: 'bar' }, 508 + }, 509 + ], 510 + }, 511 + { Authorization: `Bearer ${token}` }, 512 + ); 513 + expect(status).toBe(200); 514 + // Unknown collection should always return 'unknown' validationStatus 515 + expect(data.results[0].validationStatus).toBe('unknown'); 516 + 517 + // Cleanup 518 + await jsonPost( 519 + '/xrpc/com.atproto.repo.applyWrites', 520 + { 521 + repo: DID, 522 + writes: [ 523 + { 524 + $type: 'com.atproto.repo.applyWrites#delete', 525 + collection: 'com.example.unknown', 526 + rkey: 'unknowntest', 527 + }, 528 + ], 529 + }, 530 + { Authorization: `Bearer ${token}` }, 531 + ); 532 + }); 533 + 534 + it('createRecord validates app.bsky.feed.like via live lexicon resolution', async () => { 535 + // This test verifies live lexicon resolution: DNS -> DID -> PDS -> lexicon fetch 536 + // Uses app.bsky.feed.like which is NOT in the static schemas - must resolve live 537 + const { status, data } = await jsonPost( 538 + '/xrpc/com.atproto.repo.createRecord', 539 + { 540 + repo: DID, 541 + collection: 'app.bsky.feed.like', 542 + record: { 543 + $type: 'app.bsky.feed.like', 544 + subject: { 545 + uri: 'at://did:plc:test/app.bsky.feed.post/abc123', 546 + cid: 'bafyreig2fjxi3a2vwkanxj4jna4mlwabm4zkeakslfoqxxktsq3nqkgzoi', 547 + }, 548 
+ createdAt: new Date().toISOString(), 549 + }, 550 + }, 551 + { Authorization: `Bearer ${token}` }, 552 + ); 553 + expect(status).toBe(200); 554 + // Node has full network access for live DNS resolution 555 + // Cloudflare/Deno in test env may have limited network, returns 'unknown' 556 + expect(data.validationStatus).toBe( 557 + PLATFORM === 'node' ? 'valid' : 'unknown', 558 + ); 559 + 560 + // Cleanup 561 + const rkey = data.uri.split('/').pop(); 562 + await jsonPost( 563 + '/xrpc/com.atproto.repo.deleteRecord', 564 + { repo: DID, collection: 'app.bsky.feed.like', rkey }, 565 + { Authorization: `Bearer ${token}` }, 566 + ); 492 567 }); 493 568 494 569 it('applyWrites delete returns proper format', async () => {
+30 -1
test/helpers/node-server.js
··· 1 1 // test/helpers/node-server.js 2 2 import { mkdirSync, rmSync } from 'node:fs'; 3 3 import { createServer } from '@pds/node'; 4 + import { LexiconResolver } from '@pds/lexicon-resolver'; 5 + import { defineLexicon } from '@bigmoves/lexicon'; 4 6 5 7 const TEST_DATA_DIR = './test-data'; 8 + 9 + // Minimal app.bsky.feed.post schema for e2e validation testing 10 + const postSchema = defineLexicon({ 11 + lexicon: 1, 12 + id: 'app.bsky.feed.post', 13 + defs: { 14 + main: { 15 + type: 'record', 16 + key: 'tid', 17 + record: { 18 + type: 'object', 19 + required: ['text', 'createdAt'], 20 + properties: { 21 + text: { type: 'string', maxLength: 3000 }, 22 + createdAt: { type: 'string', format: 'datetime' }, 23 + }, 24 + }, 25 + }, 26 + }, 27 + }); 6 28 const TEST_PORT = 3000; 7 29 const USE_LOCAL_INFRA = process.env.USE_LOCAL_INFRA !== 'false'; 8 30 const USE_S3 = process.env.BLOB_STORAGE === 's3'; ··· 77 99 /** 78 100 * Start Node.js PDS server for e2e tests 79 101 * Cleans test data directory for fresh state each run 102 + * @param {Object} [options] 103 + * @param {import('@pds/core/ports').LexiconResolverPort} [options.lexiconResolver] - Lexicon resolver for record validation 80 104 * @returns {Promise<{close: () => Promise<void>}>} Server instance with close() method 81 105 */ 82 - export async function startNodeServer() { 106 + export async function startNodeServer(options = {}) { 83 107 // Fresh data each run 84 108 rmSync(TEST_DATA_DIR, { recursive: true, force: true }); 85 109 mkdirSync(TEST_DATA_DIR, { recursive: true }); ··· 90 114 console.log('Using S3 blob storage (MinIO)'); 91 115 } 92 116 117 + // Create default lexicon resolver with bsky post schema for validation testing 118 + const lexiconResolver = 119 + options.lexiconResolver ?? 
new LexiconResolver({ schemas: [postSchema] }); 120 + 93 121 const server = await createServer({ 94 122 port: TEST_PORT, 95 123 dbPath: `${TEST_DATA_DIR}/pds.db`, ··· 106 134 // Keep appview pointing to production (for proxy tests) 107 135 appviewUrl: 'https://api.bsky.app', 108 136 appviewDid: 'did:web:api.bsky.app', 137 + lexiconResolver, 109 138 }); 110 139 111 140 await server.listen();
+1 -6
tsconfig.build.json
··· 6 6 "emitDeclarationOnly": true 7 7 }, 8 8 "include": ["packages/*/src/**/*.js"], 9 - "exclude": [ 10 - "node_modules", 11 - "test", 12 - "examples", 13 - "packages/deno" 14 - ] 9 + "exclude": ["node_modules", "test", "examples", "packages/deno"] 15 10 }
+7
wrangler.toml
··· 23 23 [[kv_namespaces]] 24 24 binding = "SHARED_KV" 25 25 id = "f10a4ac94e9d4defa7cd375c7e340c0d" 26 + 27 + [observability] 28 + enabled = true 29 + 30 + [observability.logs] 31 + invocation_logs = true 32 + head_sampling_rate = 1